Russians given tips on how to detect fake audio and video made by scammers

Artificial intelligence is already being actively used in fraud schemes and will become one of attackers' preferred tools in the future, Timur Garayev, head of Banki.ru's information security department, told socialbites.ca.

“Even today, neural networks can be used by fraudsters to create more convincing phishing emails. They analyze data from social networks and other publicly available sources (including interests, contacts and online behavior) and use it to tailor messages to the victim's interests and needs,” he explained.

According to Garayev, neural networks can be used to create fake websites that appear to belong to legitimate organizations, as well as forms for collecting donations to various funds. All of this gives fraudsters access to confidential data. Fake video and voice messages created with neural networks are becoming an increasingly common tool for deceiving citizens. Neural networks are also used to create videos in which people's faces are replaced with those of loved ones, acquaintances and colleagues. Such videos can be used to extract information, demand money transfers or blackmail the victim.

“One phone-based deception method is voice phishing. Criminals use phone calls or voice messages to trick people out of money or data. These could be fake calls from financial institutions, government or technical services, or voice messages containing the voices of friends or loved ones,” he said.

To avoid falling into scammers' traps, it is important to follow a few pieces of advice. Be cautious when interacting with content from unreliable sources. Do not trust voice or video messages that claim to come from your friends but arrive from unknown numbers. Even if the message comes from a familiar number, or includes a video asking for a money transfer, do not hesitate to call back and clarify the details. You can also ask for information that only you and the other person would know.

“Check the authenticity of the content you receive. In voice messages generated by artificial intelligence, you can often hear complete silence or, on the contrary, unnatural sounds. Videos, in turn, show characteristic artifacts such as blurred facial edges or unnatural facial expressions. Learn about the methods scammers use and tell your loved ones, especially vulnerable audiences such as children and elderly parents. Use additional security measures such as two-factor authentication, which requires confirming a login from another device or entering a code from an SMS message or code generator. Do not trust unexpected calls and do not give out confidential information over the phone,” Garayev concluded.

Earlier it was reported that scammers began attacking Russian Telegram users five times more often.

Source: Gazeta
