
Controversial 'DeepNude' App Taken Down Amidst Online Outrage

Alan Wake - 1 hour ago


A controversial app named 'DeepNude,' which used AI to create fake nude images of women from clothed photos, has been taken down following widespread online outrage. The app, which was available as a free download, faced severe backlash over its potential for misuse and the ethical concerns it raised.

Key Takeaways

  • 'DeepNude' app used AI to generate fake nude images of women.
  • The developers took the app down, initially citing server overload and later admitting the risk of misuse was too high.
  • Developers announced refunds for premium users.
  • Experts warn that deepfake technology poses ongoing risks.

The Rise and Fall of 'DeepNude'

First reported by Vice, 'DeepNude' allowed users to upload any photo of a dressed woman and used an algorithm trained on over 10,000 photos of naked women to generate a fake nude image. The app could be downloaded for free, but users had to pay $50 (£39) to remove a large watermark from the generated image.

Initially, the developers claimed that the site was taken offline because it couldn't handle the amount of traffic. However, they later admitted that the app's potential for misuse was too high, stating, "Despite the safety measures adopted (watermarks), if 500,000 people use it, the probability that people will misuse it is too high."

Ethical Concerns and Public Outrage

The main concerns surrounding 'DeepNude' were its potential use for creating revenge porn and the fact that it operated without the consent of the women in the photos. Katelyn Bowden, founder and CEO of the anti-revenge-porn organization Badass, described the app as "absolutely terrifying," warning that anyone could become a victim of revenge porn without ever having taken a nude photo.

Danielle Citron, a professor at the University of Maryland Carey School of Law, echoed these concerns, stating, "Yes, it isn’t your actual vagina, but... others think that they are seeing you naked. As a deepfake victim said to me—it felt like thousands saw her naked, she felt her body wasn’t her own anymore."

The Future of Deepfake Technology

While 'DeepNude' has been taken down, experts believe the problem of deepfakes is only just beginning. As deepfakes become easier to create and harder to detect, the risks associated with their misuse continue to grow. The developers of 'DeepNude' have stated that they will not release any other versions of the app and will refund customers who paid for the premium version.

How 'DeepNude' Worked

Here's a breakdown of how the controversial app operated:

  1. Users could upload any photo of a dressed woman.
  2. The app contained an algorithm trained on over 10,000 photos of naked women.
  3. The algorithm generated an approximation of what the woman might look like naked by comparing the uploaded image against the nude photos it had been trained on.
  4. The algorithm only worked on women because nude photos of women were easier to find online, but the developer hoped to eventually create a male version.

Conclusion

The removal of 'DeepNude' highlights the ethical and societal challenges posed by deepfake technology. The app may be gone, but the conversation around the responsible use of AI and the protection of individuals' privacy is more relevant than ever.
