The creators of the controversial DeepNude app, which used artificial intelligence to create fake nude images of women, have shut it down following widespread public outrage and concerns over its potential for abuse.
Key Takeaways
- DeepNude app used AI to create fake nude images of women.
- The app was taken offline after public backlash and concerns over misuse.
- Developers admitted they underestimated the demand and potential for abuse.
- Despite the shutdown, concerns remain about the app's continued availability through other means.
The Rise and Fall of DeepNude
DeepNude, an app that used AI to replace the clothing in photos of women with synthetic nude imagery, was launched only a few months before its shutdown. It quickly gained notoriety for its potential for misuse, prompting a significant public outcry. The developers, who said they had created the app for "entertainment," admitted they had greatly underestimated both the demand and the potential for abuse.
Public Outrage and Ethical Concerns
The app faced severe criticism from various quarters, including anti-revenge porn campaigners and legal experts. Mary Anne Franks, a law professor and president of the Cyber Civil Rights Initiative (CCRI), called the app's intended use "predatory and grotesque." The CCRI and other advocacy groups highlighted the app's potential to facilitate non-consensual pornography and other forms of harassment.
Developer's Response and Shutdown
In response to the backlash, the developers announced on Twitter that they were shutting down the app. They acknowledged that the probability of misuse was too high and stated, "We don't want to make money this way." The developers also offered refunds to those who had purchased the app and urged users not to share it further.
Ongoing Concerns
Despite the shutdown, concerns persist about the app's continued availability. Copies of the software still circulate online, and there are fears they could be used to create non-consensual nude images. Platforms such as Discord have already taken action against servers distributing the app, but the threat remains.
The Broader Implications of Deepfake Technology
The DeepNude app is part of a broader trend of deepfake technology, which uses AI to generate realistic but fabricated images and videos. While deepfakes have legitimate uses, such as visual effects in film or satire, they also pose significant ethical and legal challenges. The potential for misuse in creating non-consensual pornography, spreading misinformation, and enabling other harmful activities is a growing concern.
Conclusion
The shutdown of the DeepNude app is a significant step in addressing the ethical and legal issues surrounding deepfake technology. However, the incident underscores the need for ongoing vigilance and regulation to prevent the misuse of such technologies in the future.
Sources
- 'DeepNude' app to 'undress' women shut down after furor, Phys.org.
- App that can remove women's clothes from images shut down, BBC.
- Discord Just Banned a Server Selling DeepNude, an App That Undresses Photos of Women, VICE.
- DeepNude app that lets users create fake nudes of women is taken down, The Indian Express.
- App which used algorithm to 'undress' women and create fake nudes shut down, The Independent.