# Controversial Deepfake App DeepNude Shuts Down Hours After Exposure

Alan Wake - 1 hour ago


Less than a day after gaining widespread attention, the controversial deepfake app DeepNude, which used AI to create fake nude photos of women, has been shut down. The creators cited the high probability of misuse as the primary reason for the shutdown.

## Key Takeaways

  • DeepNude used AI to create fake nude photos of women.
  • The app was available for Windows and Linux.
  • A free version placed a large watermark on images, while a paid version had a smaller, easily removable watermark.
  • The creators underestimated the interest and potential misuse of the app.
  • The app has been permanently taken down, and further versions will not be released.

## The Rise and Fall of DeepNude

DeepNude, an app that used artificial intelligence to alter a photo so that its subject appeared nude, worked only on images of women. The app was available for both Windows and Linux. A free version placed a large watermark across the output, marking it as fake; a paid version placed a smaller watermark in a corner, which could easily be removed or cropped out.

The app had been on sale for a few months, but it was not until Motherboard drew attention to it that it gained widespread notoriety. The creators of DeepNude admitted that the app was "not that great" at what it did, but it still worked well enough to raise significant concerns about its potential misuse.

## Concerns and Misuse

While digital manipulation of photos is not new, DeepNude made this capability instantaneous and accessible to anyone. This raised alarms about the potential for the app to be used to harass women. Deepfake software has already been used to edit women into pornographic videos without their consent, leaving them with little recourse to protect themselves as these videos spread online.

The creator of the app, known only as Alberto, stated that he believed someone else would have created a similar app if he had not done it first. "The technology is ready (within everyone's reach)," he said. Alberto also mentioned that the DeepNude team would "quit it for sure" if they saw the app being misused.

## The Shutdown

In a tweet, the DeepNude team announced that they had "greatly underestimated" the interest in the project and that the probability of misuse was too high. As a result, DeepNude will no longer be offered for sale, and no further versions will be released. The team also warned against sharing the software online, stating that doing so would violate the app's terms of service. They acknowledged, however, that "surely some copies" would still get out.

The team concluded their message by stating that "the world is not yet ready for DeepNude," implying that such software might find a legitimate use at some future time. As deepfake technology becomes easier to create and harder to detect, however, the central problem remains its potential for misuse. Few protections currently exist against apps like this, leaving individuals vulnerable to having their images manipulated and shared without their consent.
