Phone scammers now have a superweapon: artificial intelligence “clones” voices

A new form of “deepfake” has emerged, where hackers use artificial intelligence to clone your voice. To find out how convincing this is, a Daily Mail journalist allowed a hacker to clone her voice – with chilling results.

Our voices are almost as unique as our fingerprints – so how would you feel if your voice was cloned? In recent months, a new type of deepfake known as voice cloning has emerged, in which hackers use artificial intelligence (AI) to imitate your voice.

Famous faces including Stephen Fry, Sadiq Khan and Joe Biden have already fallen victim to voice cloning, while one unnamed CEO was even tricked into transferring $243,000 to a scammer after receiving a fake phone call, reports the Daily Mail.

“But how does it work, and how convincing is it? To find out, I allowed a professional hacker to clone my voice – with terrifying results,” writes Daily Mail journalist Shivali Best. Voice cloning is an AI technique that allows hackers to take an audio recording of someone, train an AI tool on their voice, and recreate it.

Speaking to MailOnline, Dane Sherrets, solutions architect at HackerOne, explained: “It was originally used to create audio books and help people who had lost their voice for medical reasons. But today it is increasingly being used by Hollywood and, unfortunately, scammers.”

When the technology first emerged in the late 1990s, its use was limited to experts with deep knowledge of artificial intelligence.

However, over the years, the technology has become more accessible to the point where, according to Dane Sherrets, almost anyone can use it.

“Someone with very limited experience can clone a voice,” he said. “It takes maybe less than five minutes using some tools that are free and open source.”

“To clone my voice, all Sherrets needed was a five-minute video of me speaking,” the Daily Mail journalist continues. “I decided to record myself reading an article from the Daily Mail, although Mr Sherrets says most hackers could simply extract the audio from a quick phone call or even a video posted on social media.”

“This can be done during a conversation, if there’s something being shared on social media, or even if someone is doing a podcast. It’s really just something we upload or record every day,” he said.

Once I sent Mr. Sherrets the clip, he simply loaded it into a tool (which he chose not to name), which he could then “train” on my voice.

“Once that was done, I was able to type text or even talk directly to the tool and have it output whatever I wanted the message to say in your voice,” he said. “What’s really crazy about the tools that exist now is that I can add extra intonation, pauses, or other things that make the speech more natural, which makes it much more convincing in a scam scenario.”

Despite the lack of pauses or additional inflections, the first clip of my voice clone created by Mr. Sherrets was amazingly convincing.

The robotic voice perfectly captured my American-Scottish hybrid accent as it said: “Hi Mom, this is Shivali. I’ve lost my bank card and I need to transfer some money. Could you please send some to the account in the message I’ve just sent you?”

However, the creepiness was heightened in the following clip, in which Mr. Sherrets added pauses.

“Towards the end you can hear a long pause and then a sigh, and it makes it sound a lot more natural,” the professional hacker explained.

While my experience with voice cloning was fortunately just a demonstration, Mr. Sherrets highlights some of the serious dangers of this technology.

“Some people have had fake kidnapping calls where their ‘child’ calls them and says, ‘I’ve been kidnapped, I need millions of dollars or they won’t let me go,’ and the child sounded very upset,” he said. “Today we’re increasingly seeing people make more targeted social engineering attempts against companies and organizations. I used the same technology to clone my CEO’s voice. CEOs often speak in public, so it’s very easy to get a high-quality audio recording of their voice and copy it. Having the CEO’s voice makes it much easier to quickly obtain a password or access to a system. Companies and organizations need to be aware of this risk.”

Fortunately, Mr. Sherrets says there are several key signs that indicate the voice is a clone.

“There are key signs,” he told MailOnline. “There are pauses, places where it doesn’t sound as natural, and there may be what are known as ‘artifacts’ in the background. For example, if a voice was cloned from a recording made in a crowded room with lots of other people talking, you will hear that background noise whenever the voice clone is used.”

However, as technology continues to develop, these signs will become more difficult to detect.

“People need to be aware of this technology and constantly be suspicious of anything that requires urgent action from them – this is often a warning sign,” the expert explained. “They should quickly ask questions that perhaps only the real person would know the answers to, and not be afraid to try to test things before taking action.”

Sherrets recommends agreeing on a “safe word” with your family and friends. “If you have a really urgent situation, you can say that safe word and they’ll immediately know it’s really you,” he said.

Finally, the expert advises being aware of your digital footprint and keeping an eye on how much you upload online. “Everything I upload now expands my audio attack surface and can be used to train an AI later,” he added. “There are trade-offs here that everyone will have to make, but it’s something to be aware of: your own audio recordings out there can be used against you.”
