Fri. Sep 13th, 2024

Deepfake Nudes: San Francisco Sues AI Deepfake Porn Sites: 200 Million Visits Fuel Harmful Trend To Undress Women And Girls

San Francisco Deputy Chief City Attorney Yvonne Meré filed a lawsuit against 16 websites that use artificial intelligence to create deepfake pornography by undressing women and girls in photos without their consent.
This unprecedented legal action aims to shut down these popular sites which have emerged as a harmful trend among teenage boys using nudity apps to manipulate the images of their female classmates.
The 16 sites targeted were visited 200 million times in the first six months of this year. The entities behind these websites are located in California, New Mexico, the United Kingdom, and Estonia. When contacted for comment, representatives for the sites were either unavailable or did not respond.
One site advertises its services by asking, “Do you have someone to undress?” Another states, “Imagine wasting your time taking her on dates,” claiming users can instead use the site “to get her nudes.” Some sites offer initial images for free but then charge for more, accepting cryptocurrency or credit card payments.
The deepfake technology used by these sites relies on AI models trained with real pornography and images depicting child abuse to generate authentic-looking nude photos from clothed images.
City Attorney David Chiu, the office’s top lawyer, emphasized the minimal repercussions for those behind the images. He noted the challenge of identifying the websites responsible once images begin circulating, which makes it difficult for victims to pursue legal action successfully.
“The article was flying around our office and we were like, ‘What can we do about this?'” Chiu recalled in an interview. “No one has tried to hold these companies accountable.”
Sara Eisenberg, head of the legal unit focusing on major social problems, emphasized that the problem cannot be solved simply by educating teenagers about the safe use of technology. Any photo can be manipulated without the subject’s consent, rendering traditional safeguards ineffective.
“You can be as internet and social media savvy as you want, and you can teach your kids all the ways to protect themselves online, but none of that can protect them from someone using these sites to do really awful and harmful things,” Eisenberg said.
The suit seeks an injunction to shut down the websites and permanently prevent them from creating deepfake pornography in the future. It also seeks civil penalties and attorneys’ fees.
The lawsuit alleges that these sites violate state and federal revenge pornography laws, child pornography laws, and the California Unfair Competition Act, which prohibits illegal and unfair trade practices.
Meré took action after reading about the harmful effects of deepfake images in a New York Times article. She immediately contacted Eisenberg, and together they sought Chiu’s support to file the lawsuit.
“Technology was used to create deepfake nudes of everyone from Taylor Swift to average middle school girls with little apparent repercussions,” Chiu said. “The images are sometimes used to extort money from victims or to humiliate and harass them.”
Experts warn that deepfake pornography poses serious risks to victims, affecting their mental health, reputation, and college and job prospects. The problem is exacerbated by the difficulty of tracing the origin of the images, which makes legal recourse harder still.
“This strategy could be seen as a Whac-a-Mole approach, as multiple sites could emerge,” Chiu admitted. However, the lawsuit allows more sites to be added as they are discovered, aiming for wider application as the problem evolves.
San Francisco, a hub for the artificial intelligence industry with major companies like OpenAI and Anthropic based there, is a fitting place for this legal challenge. Chiu acknowledged the AI industry’s positive contributions but emphasized that deepfake pornography is a “dark side” that needs to be addressed.
“Keeping up with the fast-changing industry as a government attorney is daunting,” Chiu said. “But that doesn’t mean we shouldn’t try.”
The lawsuit marks a significant effort to combat the misuse of AI technology in the creation of harmful content and hold accountable those who perpetuate these destructive practices.