A method has been created to protect video broadcasts from deepfakes


American scientists have developed a system that will protect video broadcasts from deepfakes and other types of tampering. The results of their study were published in the Proceedings of the 21st Annual International Conference on Mobile Systems, Applications and Services (MobiSys).

According to experts, in some cases, the authenticity of video and audio files can be verified using metadata. However, this is not the most secure authentication method as such information can be deleted. It’s also not very suitable for public recordings of live broadcasts.

“The biggest problem we face is what happens at live events like public speeches or press conferences. Anyone in the audience can technically record a video of the speech and upload it anywhere. And once it is out there, it can be downloaded and edited any number of times and then distributed to many people with malicious intent,” said Nirupam Roy, one of the researchers and an assistant professor of computer science.

To solve this problem, Roy and his team created TalkLock, a system that generates a QR code verifying the authenticity of a public figure’s image.

“The idea is to use a device such as a smartphone or tablet to continuously generate cryptographic sequences created from small pieces of live speech, which form a unique QR code. This code captures carefully extracted speech features,” the researcher explained.
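
To illustrate the general idea rather than the authors’ actual scheme, the sketch below chains a cryptographic digest over short, consecutive chunks of audio; in TalkLock the resulting codes would be rendered as QR codes on a screen beside the speaker. The chunk length, sample rate, hash function, and chaining scheme are all illustrative assumptions, and a real system would operate on robust speech features rather than raw samples.

# Minimal sketch (not the published TalkLock implementation): a rolling
# cryptographic digest over short, consecutive speech chunks. Each digest
# commits to the current chunk and the previous digest, so reordering or
# replacing any segment breaks the chain.
import hashlib
from typing import Iterable, List

CHUNK_SECONDS = 2          # assumed segment length of live speech
SAMPLE_RATE = 16_000       # assumed audio sample rate (samples per second)
CHUNK_BYTES = CHUNK_SECONDS * SAMPLE_RATE * 2   # 16-bit mono PCM

def rolling_digests(pcm_chunks: Iterable[bytes]) -> List[str]:
    """Chain a SHA-256 digest over successive speech chunks."""
    digests = []
    prev = b""
    for chunk in pcm_chunks:
        h = hashlib.sha256(prev + chunk).hexdigest()
        digests.append(h)
        prev = h.encode()
    return digests

if __name__ == "__main__":
    # Three fake 2-second chunks of silence stand in for live audio.
    fake_chunks = [bytes(CHUNK_BYTES) for _ in range(3)]
    for i, d in enumerate(rolling_digests(fake_chunks)):
        # In the real system, each code would be rendered as a QR code
        # (e.g. with a QR library) and displayed beside the speaker.
        print(f"segment {i}: {d[:16]}…")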

Since the QR code appears on a device screen next to the speaker, original recordings of the speech will also contain the QR code. The presence of the code makes a recording of the live broadcast verifiable even if it is rebroadcast in different formats, uploaded to various social networks, or shown on television.

In addition to placing a unique marker on a video or audio clip, TalkLock can systematically analyze features of a recording and compare them to the code sequence generated from the original live version. Any inconsistencies detected by TalkLock indicate that the content has been modified.
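
Verification can then be imagined as a segment-by-segment comparison: recompute the code sequence from the clip under test and check it against the sequence recovered from the QR codes in the original footage. The sketch below is a hypothetical illustration of that comparison, not the published TalkLock pipeline; the function name and toy values are assumptions.

# Hypothetical verification sketch: compare the reference code sequence
# (decoded from the on-screen QR codes) with the sequence recomputed from
# the recording under test. Any divergence flags the point where the
# content stops matching the live speech.
from typing import List, Optional

def first_mismatch(reference: List[str], recomputed: List[str]) -> Optional[int]:
    """Return the index of the first segment whose code differs, or None."""
    for i, (ref, got) in enumerate(zip(reference, recomputed)):
        if ref != got:
            return i
    if len(reference) != len(recomputed):
        # A missing or extra segment also counts as a mismatch.
        return min(len(reference), len(recomputed))
    return None

reference_codes = ["a1b2", "c3d4", "e5f6"]     # decoded from QR codes (toy values)
recomputed_codes = ["a1b2", "ffff", "e5f6"]    # recomputed from the clip under test

idx = first_mismatch(reference_codes, recomputed_codes)
print("clip verified" if idx is None else f"tampering suspected at segment {idx}")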

Scientists believe that their technology will help combat slander and disinformation online.

Earlier, the YouTube platform began fighting deepfakes created with neural networks.
