San Francisco's city attorney David Chiu is suing to shut down 16 of the most popular websites and apps allowing users to "nudify" or "undress" photos of mostly women and girls who have been increasingly harassed and exploited by bad actors online.
These sites, Chiu's suit claims, are "intentionally" designed to "create fake, nude images of women and girls without their consent," boasting that users can upload any photo to "see anyone naked" by using tech that realistically swaps the faces of real victims onto AI-generated explicit images.
"In California and across the country, there has been a stark increase in the number of women and girls harassed and victimized by AI-generated" non-consensual intimate imagery (NCII) and "this distressing trend shows no sign of abating," Chiu's suit said.
"Given the widespread availability and popularity" of nudify websites, "San Franciscans and Californians face the threat that they or their loved ones may be victimized in this manner," Chiu's suit warned.
In a press conference, Chiu said that this "first-of-its-kind lawsuit" was filed to defend not just Californians but "a shocking number of women and girls across the globe"—from celebrities like Taylor Swift to middle and high school girls. Should the city official win, each nudify site risks fines of $2,500 for every violation of California consumer protection law found.