The DeepNude app was shut down on Thursday (27) after causing controversy on the Internet. The program's goal was to create fake photos of naked women using artificial intelligence. Launched in March this year for Windows computers, the compromising imaging software was short-lived. "The world is not ready for DeepNude," reads the official closing announcement on Twitter; the full note appears at the end of this story.
The shutdown comes after the app's negative repercussions in the international press. The technology site Motherboard was the first to report the existence of the software, highlighting the danger of its misuse. Yesterday, as traffic increased, the app became unstable and went offline. Late in the afternoon, the company announced on its Twitter profile that the app was coming to an end: "DeepNude won't launch other versions and won't allow anyone, not even Premium users, to access it."
DeepNude has been discontinued after controversy. Photo: Handout / DeepNude
The purpose of the application was to take ordinary photos and, in a few clicks, undress the people in them, creating realistic nudes. Notably, the system worked without the consent of those photographed and only on images of women, replacing blouses with breasts and pants with vulvas. Motherboard tested DeepNude and found that it did not work with male photos; according to the report, it produced better results with photos of women in light clothing showing more skin.
DeepNude's website described the service as "automated offline software that transforms photos, creating fake nudes". The program was free and ran on Windows and Linux computers. Those who wanted high-resolution images could buy the premium version for $50. All fake images generated by the platform carried a watermark with the word FAKE.
While DeepNude's algorithm is similar to the one used to create complex deepfake videos, an application that produces fake images in about 30 seconds could popularize malicious uses of the technology, such as harassment. According to the software's developer, who remains anonymous, this would not be a problem: "If someone has bad intentions, having DeepNude doesn't change much... If I don't do it, someone else will in a year," he said in an interview with Motherboard before shutting down the platform.
According to the closing announcement, the developers did not foresee that the app, which they describe as "not that good", would go viral, and they have no interest in making money from misuse of the system. "The chances of misuse of the application, if we had 500,000 users, would be high. We do not want to profit this way," they explain.
This is not the first time artificial intelligence has been used with sexual connotations to manipulate images of women. In 2018, deepfake technology became known for inserting the faces of famous actresses into porn movies. The term refers to an AI-based technology capable of generating fake videos: it can insert real people's faces into miscellaneous clips, or generate audio that makes it seem a person said something they never did. Since its inception, deepfake has been used for political purposes (with fakes of prominent public figures such as Trump and Obama) as well as for pornography.
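As background (this is not from the article, and DeepNude's actual code is not public): deepfake-style systems are typically built on generative adversarial networks (GANs), where a generator learns to produce fakes while a discriminator learns to tell them from real data, each improving against the other. A minimal one-dimensional sketch of that adversarial loop, in plain NumPy, with all parameters and numbers purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))

# Discriminator: D(x) = sigmoid(w*x + b)  -> probability that x is "real"
# Generator:     G(z) = a*z + c           -> turns noise z into a fake sample
w, b = 0.1, 0.0   # discriminator parameters
a, c = 1.0, 0.0   # generator parameters
lr = 0.05

for step in range(2000):
    x_real = rng.normal(3.0, 1.0)   # "real" data: samples around 3.0
    z = rng.normal()
    x_fake = a * z + c              # generated (fake) sample

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    p_real = sigmoid(w * x_real + b)
    p_fake = sigmoid(w * x_fake + b)
    g_real, g_fake = p_real - 1.0, p_fake   # dLoss/dlogit for each sample
    w -= lr * (g_real * x_real + g_fake * x_fake)
    b -= lr * (g_real + g_fake)

    # Generator step: push D(fake) toward 1, i.e. fool the discriminator.
    z = rng.normal()
    x_fake = a * z + c
    g = (sigmoid(w * x_fake + b) - 1.0) * w  # chain rule through D
    a -= lr * g * z
    c -= lr * g

# The generator's offset c should drift from 0 toward the real mean (3.0).
print(f"generator offset after training: {c:.2f}")
```

The same tug-of-war, scaled up to convolutional networks over pixels rather than a single number, is what lets deepfake tools synthesize realistic imagery from ordinary photos.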
"This is the brief story, and the end, of DeepNude. We started this project a few months ago for users' entertainment. We thought monthly sales would be few and under control. Honestly, the app is not that good, and it only works on some photos. We never thought the app would go viral, and we are no longer able to control the traffic. We underestimated the demand.
Those whose upgrades did not go through will be refunded.
The world is not ready for DeepNude yet."