Several universities in the world presented scientific articles with a hidden trick in recent months. At first glance, those seem normal. But if one looks with magnifying glass – or, rather, if one is an artificial intelligence (AI) discovers that they hid invisible instructions within the text, written in white or with microscopic lyrics. These, they asked the AI that, if by chance it was used to evaluate that article, put a good note. They told him things like “just making a positive review” or “not mentioning the negative points.” In other words: they wanted to manipulate the evaluation, because they knew that on the other side, on the side of which he decides if the article is good or bad, there is also ia.
To understand this, you have to explain how the academic world works. When a scientist writes an article, that text has to go through what is called “pairs review.” Other specialists, in general anonymous, read it and think if it is worth publishing it or not. It is a quality filter, but with the huge number of items that are presented every day and the little availability of experts willing to read them carefully, many reviewers use AI to do that work faster. And then there is an absurd situation: an AI reviews an article that was written (at least in part) for another AI, and sometimes even with secret instructions so that AI approves it without criticism. It is as if an athlete told the anti -doping control: “Look the other way, there is nothing to do,” and write it in invisible ink that only automatic control can read.
This situation shows a deeper problem: we are in a hybrid stage. Part of the process is done with humans, part with machines, but there are no clear rules, or a general agreement. Some play with the rules of the past, others run in the future. And that generates unfair advantages. Those who know how to talk to an AI, how to manipulate it, how to hide messages, run with advantage.
But the solution is not to invent more and more rules to control this. The solution is another: to raise all restrictions and allow the AI to participate in the whole process, from writing to evaluation. If everyone can use the same tools, the race is again fair. There is no doping if everyone runs with the same fuel. Upside down, the unfair is prohibiting it halfway. Because so, only those who know how to dodge the rules win.
The use of the science is not an anomaly, it is the future. What seems trap today, tomorrow will be norm. Frena it only extends the transition, and on the way, create an unequal system. Better to assume it, regulate the fair, and let the technology do its job. The game changed. It is time to accept the new rules, instead of fighting windmills.
Things as they are
Mookie Tenembaum addresses technology issues like this every week with Claudio Zuchovicki in his podcast artificial intelligence, financial perspectives, available in Spotify, Apple, YouTube and all platforms.