The use of audio deepfakes for fraudulent calls under the FakeBoss scheme is on the rise.

Cybersecurity experts report a rise in the use of audio deepfakes for fraudulent calls in instant messengers: attackers fake the voice of a company executive to push through a transfer of funds, generating the voice with artificial intelligence. The risk is greatest for small and medium-sized businesses, where many issues are settled directly with the company owner, experts say. In their view, banks and the state anti-fraud system should combat the problem; the latter, however, does not currently analyze calls in instant messengers.

In Russia, since the beginning of the year, there has been an increase in the use of audio deepfakes in the FakeBoss ("fake boss") fraud scheme: calls from scammers using a substituted voice, FACCT (formerly Group IB) told Kommersant. Attackers create a fake account of a manager, usually on Telegram: the name and photo are real, taken from the official website, from social networks, or "from the huge number of personal data leaks," the company clarifies. At the end of 2023, cybersecurity experts were already reporting a wave of phishing attacks on Russian enterprises, with fraudulent emails sent allegedly on behalf of executives or representatives of law enforcement agencies.

“In the new scheme, the fraudster does not write a message to a subordinate, but immediately calls through the messenger, using an audio deepfake of the manager’s voice, simulated using artificial intelligence,” says Evgeniy Egorov, leading specialist of the Digital Risk Protection department at FACCT. The goal is to “gain trust and force a subordinate, such as the chief accountant of an organization, to make payments to the criminals’ accounts.”

Informzashita confirms the scheme: the use of deepfakes in calls to companies "can be called a new trend for Russia; previously they were used to pass voice identification in banks or in calls to private citizens." As a rule, such incidents are not disclosed, so accurate statistics are hard to come by, the company notes.

The growth compared to last year is estimated at 30%, and experts emphasize that the number of such attacks will increase as deepfake technology itself becomes more accessible.

Fraudulent groups in the West mastered audio deepfakes several years ago: back in 2019, attackers stole €220,000 from a UK energy company by posing as its CEO and persuading an employee over the phone to transfer the money.

In a fraud scheme that uses AI to substitute a voice, attackers first need a fragment of the victim's speech with which to train a neural network, FACCT explains: cybercriminals can, for example, record a conversation if they know the victim's phone number, or hack a messenger account and gain access to the user's voice messages.

At the beginning of 2023, Russia launched the Antifraud system, which is overseen by Roskomnadzor. All telecom operators must connect to it by the end of February; over the past year it has already blocked more than 622 million calls with spoofed numbers. But the system responds only to calls over mobile networks; calls in instant messengers are not monitored.

Alexey Voylukov, Vice President of the Association of Banks of Russia, believes the threat of such an attack is "very real" in small organizations, where many actions are often "carried out on verbal instructions" and there are no regulated processes with a well-established system of controls. "The growth of such fraud should lead to increased attention to operational risks and compliance procedures," says Mr. Voylukov.

The solution largely lies with the banks, argues SafeTech commercial director Daria Verestnikova, since the scheme is "similar to social engineering methods." Accordingly, the expert explains, banks "should not skimp on means of confirming transactions, including for legal entities, or on anti-fraud systems for countering fraudulent calls, and the head of the company must see exactly to whom the payment is going."

Tatiana Isakova, Yulia Poslavskaya
