Lawmakers brainstorming bills to address AI sexual abuse

Editor’s note: This story describes suicide and sexual abuse.

Deepfakes — fake images and videos made with machine-learning technology that can realistically depict people’s faces — have made headlines in recent years for fooling viewers, and sometimes news outlets, in relatively harmless ways, such as a viral image of Pope Francis sporting a Rihanna-esque puffer coat.

But lawmakers and advocates say photo-generative software represents a serious escalation of “image-based sexual abuse”: the practice of blackmailing or extorting an individual by threatening to leak nude or sexual images of them. 

Those threats have been increasingly made using fake images or videos created by artificial intelligence software that can put a person’s face on bodies that aren’t their own. Vermont — and all but a few states — has few legal protections against that behavior.

In July 2015, a Vermont law took effect to address a kind of abuse often called “revenge pornography,” criminalizing the disclosure of sexually explicit images without the consent of the person depicted. Almost every other state has a law like that, but only four have provisions that deal specifically with deepfakes, according to the Cyber Civil Rights Initiative, and Vermont isn’t one of them. 

One of the two primary sponsors of that 2015 legislation, Rep. Barbara Rachelson, D-Chittenden 6, is working on an amendment to that law to better protect against new uses of technology to harass people online.

Rachelson attended a briefing at the White House in April with other state representatives from around the country to discuss new policies to address the issue. 

In an interview, Rachelson was quick to say the term “revenge porn” only describes one type of this abuse. “One thing that I've learned from going to the White House event was that we really shouldn't necessarily call it ‘revenge pornography’ because in some cases, the motive is different,” she said. “So it's not always revenge.” 

Abuse involving deepfake technology and other forms of digital sexual harassment gained more attention after the 2015 law passed, Rachelson said, and she wants to close some of the gaps in the law next legislative session.

Rachelson said the statute’s language around people leaking compromising photographs creates problems for victims. “What I learned were some of the cases that have gotten filed in Vermont, since the law passed, didn't end up being able to make a guilty conviction because, one, it was not the ex-boyfriend, but his new girlfriend who posted the pictures,” she said, referring to the law’s requirement that the blackmailer be an ex-partner. 

Another case in Vermont involved someone out of state, she said. “We can try to write as many safeguards into it as possible, but Congress also needs to do some legislation in order to protect interstate situations,” said Rachelson. 

A popular argument in discussions about nude picture leaks is that individuals simply shouldn’t take and share nude photos, but both Rachelson and Catherine Ducasse, associate director and victim advocate at HOPE Works, which provides services to sexual assault survivors, said that is the wrong way to approach this issue. 

“It's never a person's fault for their trust being violated,” Ducasse said. “That is only the perpetrator or the abuser's fault who shared the pictures nonconsensually. The onus is 100% on the person who shares those pictures nonconsensually.” 

With the use of AI in image-based harassment, the images sometimes target individuals who never took or shared nude pictures at all. So-called “deepfake pornography” is a product of hyper-realistic AI image generation, and perpetrators are demanding money or real nude photographs in exchange for not sharing the generated photos publicly. 

“It’s the latest scam,” said Rachelson. “Some people are making a living doing this now. And, you know, it's important to think about sensible ways to address it.” 

Rachelson said a Florida lawmaker at that April White House briefing was the victim of deepfake pornography. “It’s being used to sort of politically ruin her, in addition to just sort of shame and humiliate her, but they're not even really photos of her.”

Part of limiting the impact of this issue begins with media literacy, said Rachelson. “It's more about also empowering young people to know who they're talking to online and to help them figure out if they're being scammed or to be thoughtful.”

Lots of victims of image-based sexual harassment struggle with mental health issues throughout and following the experience due to the traumatic and personal nature of these kinds of attacks, Rachelson said. 

She said she has heard of instances in which closeted individuals had their sexual identity exposed unwillingly to the public and to personal circles; adults have had their images posted to sites used by their place of employment; and minors have seen images of themselves sent to family members, teachers and classmates.

Vermont Attorney General Charity Clark told Community News Service in a statement that her office is monitoring these technological trends “with concern.”

“AI must not be used to victimize anyone — particularly vulnerable populations, like children," Clark said.

Clark’s chief of staff Lauren Jandl said their office doesn’t have any data on cases involving AI yet. But “sextortion” — the practice of extorting money or sexual favors from someone by threatening to reveal evidence of their sexual activity — has been on the rise in Vermont, Jandl said, especially cases involving children and social media.

“It seems like that is a national trend,” said Jandl. 

Jandl emphasized that the lack of data doesn’t mean AI-related cyber harassment isn’t an issue in Vermont. “There is always a chance that this could have been reported to local law enforcement, which wouldn't necessarily then come to the Attorney General's office's attention.” 

“It is certainly a very interesting and disturbing issue,” Jandl added. 

Rachelson said there’s a link between cyber harassment and youth suicide attempts. 

“Victims of this … didn't feel like they could talk to their parents or anybody about it,” said Rachelson. “I did hear from our attorney general's office that they're hearing about cases like this, but even high school students, teens or middle schoolers, it could be anybody.” 

Ducasse shared some of the strategies HOPE Works recommends. “When you're the victim of this, it feels like it's out of your control. So really trying to figure out okay, what's out of my control and what’s in my control and what are my options now, so you can regain some control in some parts of your life, just to kind of be able to move forward.” 

She said counseling could be a good idea, and the group suggests finding techniques to ground yourself.

Rachelson described the phenomenon as feeling “so unbelievably hopeless,” but she is optimistic about changing Vermont’s laws on the issue.

“I know that people are going to be interested in being co-sponsors and hearing more,” she said.

If you or someone you care about is a victim of image-based sexual harassment, visit cybercivilrights.org to get help or learn more. You can also visit HOPE Works’ 24/7 confidential free chat-line for survivors at hopeworksvt.org/chatline-security.
