Deputies want to label audio and visual fakes
“It could be a disclaimer, like on the radio.”
State Duma deputies are preparing a bill to protect people from deepfakes. A faked voice or a realistic synthetic image, tools created with the help of artificial intelligence, are already being used by scammers. The victims are mainly elderly people, who find it difficult to adapt to modern technologies.
"Technologies can now copy the voice of both a public figure and an ordinary person," Anton Tkachev, first deputy chairman of the Committee on Information Policy, Technology and Communications and a member of the New People faction, tells MK. "That is, an elderly woman can be sure she is talking on the phone with her granddaughter, when in fact she is talking to a machine. Under those conditions she will transfer all her savings wherever she is told. For now such cases are rare, but in a year or two they will become widespread. Neural networks in the hands of scammers are a ticking time bomb. We need to be ready to pass laws that will not allow people to be robbed. We cannot resist progress, but it must be steered in the right direction, not turned into a weapon in the hands of scammers. Many deepfakes today are also connected with the SVO, both recordings fabricated in the name of public figures and external attacks on the families of participants in the special operation."
Tkachev added that the young field of the data economy needs new legislative initiatives. Over the past few years Moscow has accumulated positive experience in regulating artificial intelligence, and this practice should be extended to the federal level to protect people from deepfakes.
– How, technically, do you propose to label deepfakes: images and audio tracks produced by artificial intelligence?
– Labeling similar to that used for advertising: in the text or in a preview, a "watermark" agreed upon with the government, plus a QR code containing information about the software that produced the content. For audio, it could be a disclaimer, like on the radio.
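The QR-code part of the scheme Tkachev describes implies some machine-readable provenance record identifying the generating software. A minimal sketch of what such a payload might look like, in Python; the field names, tool name, and structure here are purely illustrative assumptions, since the proposed bill specifies none of these details:

```python
import json

def make_provenance_payload(software: str, model: str, created_at: str) -> str:
    """Build a compact JSON record describing the tool that generated
    a piece of synthetic media. Field names are hypothetical; the
    proposal only says the QR code should identify the software."""
    record = {
        "synthetic": True,          # explicit "this is AI-generated" flag
        "software": software,       # name of the generator (assumed field)
        "model": model,             # model or version identifier (assumed field)
        "created_at": created_at,   # ISO-8601 timestamp (assumed field)
    }
    # Compact separators keep the string short enough to fit
    # comfortably inside a QR code.
    return json.dumps(record, separators=(",", ":"))

payload = make_provenance_payload("ExampleGen", "v1.2", "2024-05-01T12:00:00Z")
print(payload)
```

In practice the string returned here would be handed to any standard QR-code encoder and stamped onto the image or video preview alongside the visible watermark.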