
Alan Wake - 1 hour ago

San Francisco Files Lawsuit Against 16 Deep Nude Websites


San Francisco has initiated legal action against the operators of 16 websites that offer AI-assisted "undressing" of people in images. The lawsuit alleges violations of US and California law, including statutes against revenge pornography and child pornography, and seeks to shut the services down permanently.

Key Takeaways

  • Jurisdiction: The lawsuit is filed by the city of San Francisco, citing local victims as the basis for jurisdiction.
  • Global Reach: The accused companies are based in various countries, including Estonia, Serbia, the UK, and the US.
  • High Traffic: The websites received over 200 million visits in the first half of the year.
  • Legal Violations: The defendants are accused of violating laws against revenge pornography, child pornography, and unfair competition.
  • Objective: The lawsuit aims to shut down the websites, impose fines, and deter future activities.

Background

The city of San Francisco has filed a lawsuit against the operators of 16 websites that offer AI-assisted undressing of people in images. These services, often referred to as deepnudes or deepfake pornography, are accused of violating multiple laws, including those against revenge pornography and child pornography. The lawsuit also alleges that these websites engage in unfair competition.

Jurisdiction and Scope

At first glance, it may seem unusual for a city attorney's office to bring such a lawsuit, especially when most of the accused companies are based outside the United States. However, San Francisco City Attorney David Chiu explained that the victims of these services, primarily women and girls, are also located in California, which gives his office jurisdiction to act. Chiu also noted that it is not uncommon for lawsuits with far-reaching implications to be initiated at the city level.

Website Traffic and User Activity

In the first six months of this year alone, the websites in question received over 200 million visits. Users upload images of clothed individuals and pay a fee to have them "undressed" by AI. The lawsuit seeks to shut down these services, impose fines on the operators, and deter similar activity in the future. The complaint describes the websites as a significant global problem that leaves victims largely helpless.

Impact on Victims

The complaint highlights several troubling cases, including incidents in schools where images of girls generated by these websites were circulated among students. There are also known cases of attempted blackmail, in which victims were told to pay to prevent the fabricated intimate images from being published. In some instances, perpetrators worked from images readily available on social networks.

International Efforts

Efforts to combat such services are not limited to the United States. The UK, for example, is working on legislation to criminalize both the production and distribution of such images. This international focus underscores the global nature of the problem and the need for coordinated action.

Conclusion

The lawsuit filed by San Francisco aims to address a growing and troubling issue by shutting down 16 websites that offer AI-assisted undressing services. By targeting these operators, the city hopes to impose fines and deter future activities, thereby protecting potential victims from exploitation and harm.
