A.I. Evolution Creates New Form Of Online Sexual Abuse
A new form of “image-based sexual abuse” is on the rise, with American teens using AI “nudification” apps to taunt female students.
New research shows a growing number of high school students across the country are using these apps to generate and share fake nude photos of their classmates, Vox reports. Students in schools from California to Illinois have fallen victim to deepfake nudes shared without their consent.
While revenge porn has been an issue for years, deepfake technology now means that “anybody can just put a face into this app and get an image of somebody — friends, classmates, coworkers, whomever — completely without clothes,” said Britt Paris, an assistant professor of library and information science at Rutgers who has studied deepfakes.
Male students at Issaquah High School in Washington used a nudification app to “strip” photos of girls who attended their homecoming dance last fall. Tenth-grade boys at Westfield High School in New Jersey shared fake X-rated photos of their female classmates around the school. The growing trend has prompted legislation that would impose penalties on those found guilty of sharing fabricated images.
Washington, South Dakota, and Louisiana have already passed laws against generating and sharing fake nudes, with California and other states following close behind. Rep. Joseph Morelle (D-NY) recently reintroduced a bill that would make sharing deepfake nudes a federal crime.
While legislation will help, many are calling for a closer look at the apps behind the growing AI nudification trend. Amy Hasinoff, a communications professor at the University of Colorado Denver, believes the laws would only serve as a “symbolic gesture” unless something is done to combat the apps being used to generate the images.
“I am struggling to imagine a reason why these apps should exist,” Hasinoff said.
Lawmakers are also working to regulate the app stores themselves, barring them from carrying nudification apps that lack clear consent provisions. Apple and Google have already removed several apps offering deepfake nudes from the App Store and Google Play.
Francesca Mani, a 15-year-old Westfield student, was among the victims and shared how traumatic the experience was for her.
“I was in the counselor’s office, emotional and crying,” Mani said. “I couldn’t believe I was one of the victims.”
Even when the images are fake, victims can face “shaming and blaming and stigmatization,” driven by stereotypes that sexualize female victims and cast them as more sexually active than they are, Hasinoff notes.
“These images put these young women at risk of being barred from future employment opportunities and also make them vulnerable to physical violence if they are recognized,” said Yeshi Milner, founder of the nonprofit Data for Black Lives.
To combat deepfake images, nine states have passed or updated laws penalizing those involved, with more expected to follow. A federal bill introduced in 2023 would let victims or their parents sue perpetrators for damages and would impose criminal penalties. While it has yet to pass Congress, the bill has growing bipartisan support.
Some remain skeptical about the laws’ impact as long as AI nudification apps are still available for use.
“Until companies can be held accountable for the types of harms they produce,” Paris said, “I don’t see a whole lot changing.”