Scammers could target Russians over video calls

According to Sergei Golovanov, chief expert at Kaspersky Lab, fraudsters will in the future be able to attack Russians on social networks in real time using fake voice messages and video calls powered by deepfake technology, RIA Novosti reports.

Attackers will display the image of another person on screen, with the fake face mimicking their lip movements as they speak in real time. Golovanov noted that this will become possible within a few years and will spread through messaging apps. He also recalled a current fraud scheme in which attackers send voice messages purportedly from acquaintances asking for money transfers; for now, such messages can be recognized by their robotic tone.

Golovanov believes the only way to protect yourself is to verify accounts, since it will be difficult to recognize AI-generated speech by ear.

Yesterday, a fake voice message imitating Biden caused alarm in the USA ahead of the New Hampshire primary. Experts say artificial intelligence will be used for political gain in elections around the world this year, with voice fraud being the biggest concern.

Earlier, a physiognomist clarified whether Tsyganova’s interview with Dud (designated a foreign-agent media outlet in the Russian Federation)* is in fact a deepfake.
