Scientists have recorded a decrease in interest in ChatGPT in the socio-political sphere

ChatGPT and other artificial intelligence (AI) tools remain undervalued in Russia’s socio-political sphere, according to a report presented to journalists by the Center for Social Research and Technological Innovation (CITY Center) at the Higher School of Economics. Last year’s euphoria, with its high expectations, is giving way to indifference and, at times, skepticism. Overall, the study’s authors conclude that interest is stabilizing and reaching a plateau of sorts, while the state, which is making increasing use of AI, is emerging as a new driver of progress.

The CITY Center’s research examined the experience of using ChatGPT and other AI-based tools in politics and social research, assessing their applications, risks, and prospects for further use. The report’s authors conducted six online focus groups with ChatGPT users and interviewed federal experts, probing in particular the expectations and disappointments associated with the new technologies.

As it turned out, interest in AI-based tools is noticeably lower in the humanitarian sphere than in the technical one, stated Efim Fidrya, deputy director of the CITY Center. While these tools are used abroad in marketing, education, research, and even psychology (the Oxford Institute of Population Ageing recommends a chatbot counselor as a remedy for depression), in Russia the role of ChatGPT is often reduced to that of a search-engine assistant or translator.

This underestimation has several explanations, the sociologist notes: ChatGPT objectively handles requests in English better than in Russian, and access to its most advanced paid version is difficult for Russians because of payment problems and the blocking of Russian users. The problem of so-called AI hallucinations, that is, unreliable information in its answers, also remains. “For example, I give it the task of writing me a five-minute sermon on the Apocalypse of John on some topic. It makes up Scripture quotations that aren’t there. It shocked me,” the study’s authors quote one focus-group response. But the overall trend may yet change, they hope: the driver of growing interest could be the state, which actively uses AI, for example in “Safe City” systems or traffic-control systems on federal highways. The Ministry of Digital Development also plans to introduce ChatGPT technologies on the Gosuslugi portal for consulting citizens.

Mikhail Vinogradov, president of the St. Petersburg Politics Foundation, noted that examples of neural networks in political life (for instance, producing campaign materials) are gradually accumulating but remain fragmentary. Meanwhile, ChatGPT technologies can be used to develop various tools (for example, to generate messages in trade) and to integrate experience from other industries, while neural networks can be “trained” to form hypotheses or adapted to create content, the political scientist listed. Counter-propaganda and searching for opponents’ vulnerabilities are also quite feasible; the main thing is not to forget that quality control over the answers received remains with natural intelligence, Mr. Vinogradov emphasized.

Viktor Poturemsky, INSOMAR’s director of political analysis, recalled in turn that the Internet did not enter our lives all at once either. “You need to understand that AI is a projection space,” he explained. It is human nature to transfer our internal processes onto external objects, so we immediately humanize AI, and if we cannot fill that space with ourselves, we ignore it, the sociologist reasoned. The appeal of the Internet was that it was easy, accessible, and could be filled with whatever one wanted. AI, by contrast, resists attempts to project onto it and falls short of expectations: a survey of experts who declined to use neural networks showed that they consider the costs of obtaining results too high, Mr. Poturemsky noted. The need to train AI proved a barrier; people fail to animate this space, so it “gets ignored,” the expert stated. In his opinion, therefore, AI will for the near future remain a space for professionals, and that is where competencies will be built up.

Anastasia Kornya
