Sber specialists: The domestic neural network has surpassed the American one in terms of the quality of answers in both Russian and English

Analysts ran the two neural networks, the Russian and the American one, through the MMLU (Massive Multitask Language Understanding) benchmark. The test draws questions from 57 areas of knowledge, including mathematics, history, medicine, physics, general world knowledge and problem solving. Sber’s GigaChat PRO model outperformed OpenAI’s GPT-3.5-turbo on the quality of answers in English by 6%, PRIME quotes Andrey Belevtsev, Senior Vice President and Head of the Technology Block at Sberbank.

“Sber’s artificial intelligence—the neural network model GigaChat PRO—has surpassed the currently available GPT-3.5-turbo model from OpenAI in the quality of answers in English. GigaChat has also surpassed its competitor in the quality of answers in Russian,” Belevtsev said.

He added that GigaChat can handle many intellectual and everyday tasks: it can hold a conversation, write texts or code, and answer questions. The inclusion of the Kandinsky model in the ensemble also gives it the ability to generate images.

“Sber also continues to develop GigaChat for business needs. Companies have access to the GigaChat API, a software interface to Sber’s neural network model. With it, businesses can build their own solutions and optimize internal processes. In addition, the service simplifies working with long texts and helps write articles in the required style and format, search for information and prepare analysis based on it,” Belevtsev noted.

As a reminder, last April Sberbank launched the multimodal neural network GigaChat in test mode; it can hold a dialogue with the user, write program code, and create texts and images. In September, the bank opened access to the service to everyone.
