Deepfake technology makes it possible to produce manipulated videos that show individuals doing or saying things they never did. The tactic is used for different purposes on the Internet, including stalking and defaming people. Given the risks of such clips spreading online, the Witness organization, which advocates for responsible use of synthetic media, released on Wednesday (16) the report of its "Deepfakes and Synthetic Media: Prepare Now" convention, held on July 25 in São Paulo.
The purpose of the report is to promote discussion of manipulated videos from a less US- or Europe-centric perspective. The main victims of the new technology are marginalized groups such as activists, social movements and women. Pornographic deepfakes, for example, became popular over the last year through the manipulation of images of celebrities like Gal Gadot and Emma Watson.
Deepfake technology has potential for promoting citizenship, but it is also used to create fake pornographic videos. Photo: Reproduction / Instagram
In Brazil, deepfake technology is often used in political disputes. However, it can also be applied to defame people or provoke gender-based violence. Examples include fake pornographic videos featuring women and the creation of manipulated nude photos, as with Deepnude.
Since video-manipulation technology is easily accessible, Internet users should be skeptical of videos of dubious origin. Such clips can undermine the credibility of institutions, public figures and social movements.
Because of these risks, citizens need access to tools for verifying the veracity of videos shared online. Security experts' tips for recognizing deepfakes include paying attention to the lighting pattern in the video to spot inconsistencies, and observing the movement of the person's eyes or mouth for imperfections.
According to the Witness report, one solution would be to develop specific tools for combating deepfakes and share them with the population. For example, just as a deepfake is produced by artificial intelligence (AI) programmed to create fake videos, it would be possible to train an AI to detect the manipulation itself. Such an AI could also learn someone's style of speech and movement in order to flag possible inconsistencies in a video.
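To make the idea of "training an AI to detect manipulation" concrete, here is a deliberately tiny sketch in Python. It trains a logistic-regression classifier on invented numeric features per clip (a hypothetical blink-rate score and a lighting-consistency score, both made up for illustration); real deepfake detectors use deep neural networks on the video frames themselves, so this is only a toy showing the general supervised-learning pattern the report alludes to.

```python
# Toy illustration only: the feature values and labels below are invented.
# Real detectors learn from raw video frames with deep neural networks.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, lr=0.1, epochs=2000):
    """Fit logistic-regression weights with plain gradient descent."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of the log-loss w.r.t. the logit
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Probability that a clip is manipulated (label 1)."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Hypothetical per-clip features: [blink_rate, lighting_consistency]
real_clips = [[0.9, 0.8], [0.8, 0.9], [0.85, 0.75]]  # label 0 = authentic
fake_clips = [[0.2, 0.3], [0.1, 0.4], [0.3, 0.2]]    # label 1 = manipulated
w, b = train(real_clips + fake_clips, [0, 0, 0, 1, 1, 1])

suspicious = predict(w, b, [0.15, 0.25])  # deepfake-like features
print(suspicious > 0.5)
```

The same pattern, scaled up to millions of parameters and real labeled footage, is what research detection systems do: collect examples of authentic and manipulated video, extract features, and fit a model that outputs a manipulation probability.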