VK used comments to create a generative neural network

VK (MOEX: VKCO) has developed its own large language model (LLM) for text generation, trained in part on comments from open VKontakte groups. The first generative features are being tested in Mail.ru services: mail and calendar. Yandex is exploring a similar use of neural networks. Beyond Mail.ru’s consumer services, VK is also developing separate products for government employees. Experts see more promise in automating work tasks than in adding neural network features to social networks.

Kommersant has learned the details of the technology behind the text-generation features announced on February 26 for Mail.ru’s productivity services (mail, calendar, cloud storage and notes; owned by VK). VK explained that it built its own LLM for the purpose, training it on “open corpora of texts from the internet and public data from the social network VKontakte – posts and comments in open groups.”

Yandex (YandexGPT) and Sberbank (GigaChat) already have their own LLMs; both are embedding them in their consumer products while also giving third-party developers access to the generation results. An LLM is also being developed by MTS AI, a unit of MTS (see Kommersant of February 21). Since January 25, MTS AI has been headed by Andrey Kalinin, who until September 2023 was VK’s vice president for artificial intelligence (AI).

Under Mr. Kalinin, the LLM was developed by the Marusya assistant team, which “almost completely overlaps with the Mail.ru team,” a source familiar with the situation told Kommersant. The post of VK vice president for AI is currently held concurrently by Anton Frolov, CEO of Zen.

According to Kommersant’s source, under Mr. Kalinin VK’s divisions studied the prospects of creating an LLM independently: “Other VK business units were skeptical about the Marusya team’s work.”

Routine tasks, according to a VK representative, “are logically divided into categories,” which is why the neural network features are being tested on Mail.ru’s productivity services. In the same category, alongside the corporate messenger VK Teams, VK is developing an automated workstation for civil servants (AWC GS) under a government contract with the Ministry of Digital Development; it combines mail, a messenger, cloud storage and an internal portal. The decision on further use of the LLM will be made based on the results of testing in Mail.ru products, the company says. Yandex 360 (which brings together similar services, including mail) said it is also “working towards introducing neural networks into its products.”

For VK it makes sense to deploy the technology in productivity services specifically, since “there are not that many scenarios that yield significant time savings when creating content for social networks,” says Vasily Krikunov, an expert in AI and advanced analytics at Axenix: “in social networks, content is easy to consume.”

Technically, one of Kommersant’s sources says, VK is able to secure enough computing power to train an LLM: “But this could come with more serious costs than for Yandex and Sber. VK is primarily a content business, not a technology business, and before the AI boom it had no reason to make comparable investments in capacity.”

Alina Savelova, an NLP specialist at Just AI, notes that base LLMs can be created in two ways – from scratch, as Yandex and Sber do, or by further training third-party open-source models: “In the first case you need huge computing power, a strong team and significant time – from six months to a year. In the second case the requirements are more modest, but the quality is comparable.” Content from social networks can be used, in her view, but “there is no point in further training on it alone – the company will have to bring in data annotators.” In the long term, the expert believes, “it makes sense for VK to have its own model rather than an open-source one.”
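
The second route Savelova describes – adapting an existing open-source model rather than training from scratch – roughly corresponds to a standard causal-language-model fine-tuning loop. The sketch below is purely illustrative and assumes the Hugging Face transformers and datasets libraries, a placeholder base model and a hypothetical domain_corpus.txt file; it does not describe VK’s actual pipeline.

```python
# Illustrative sketch of adapting an open-source causal LM to a domain corpus
# instead of training from scratch. Model name, file path and hyperparameters
# are placeholder assumptions, not details of VK's pipeline.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

BASE_MODEL = "EleutherAI/pythia-410m"  # stand-in for any open-source base LLM

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.pad_token = tokenizer.eos_token  # causal LMs often lack a pad token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Hypothetical domain corpus: one cleaned text (post or comment) per line.
dataset = load_dataset("text", data_files={"train": "domain_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="llm-finetuned",
        per_device_train_batch_size=2,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        learning_rate=2e-5,
    ),
    train_dataset=tokenized["train"],
    # mlm=False selects the next-token (causal) language-modeling objective
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

In practice the expert’s caveat applies: a raw social-media corpus alone is not enough, and instruction-style fine-tuning would additionally require annotated data.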

Yuri Litvinenko
