The US Needs Deepfake Porn Laws. These States Are Leading the Way

Last year, WIRED reported that deepfake pornography is only increasing, and researchers estimate that 90 percent of deepfake videos are pornographic, the vast majority of them nonconsensual porn of women. But despite how pervasive the issue is, Kaylee Williams, a researcher at Columbia University who has been tracking nonconsensual deepfake legislation, says she has seen legislators focus more on political deepfakes.

“More states are interested in protecting electoral integrity in that way than they are in dealing with the intimate image question,” she says.

Matthew Bierlein, a Republican state representative in Michigan who cosponsored the state’s package of nonconsensual deepfake bills, says he initially came to the issue after exploring legislation on political deepfakes. “Our plan was to make [political deepfakes] a campaign finance violation if you didn’t put disclaimers on them to notify the public.” Through his work on political deepfakes, Bierlein says, he began working with Democratic representative Penelope Tsernoglou, who helped spearhead the nonconsensual deepfake bills.

At the time, in January, nonconsensual deepfakes of Taylor Swift had just gone viral, and the subject was widely covered in the news. “We thought that the opportunity was the right time to be able to do something,” Bierlein says. And Bierlein says he felt Michigan was in a position to be a regional leader in the Midwest because, unlike some of its neighbors, it has a full-time legislature with well-paid staffers (most states don’t). “We understand that it’s a bigger issue than just a Michigan issue. But a lot of things can start at the state level,” he says. “If we get this done, then maybe Ohio adopts this in their legislative session, maybe Indiana adopts something similar, or Illinois, and that can make enforcement easier.”

But what the penalties for creating and sharing nonconsensual deepfakes are—and who is protected—can vary widely from state to state. “The US landscape is just wildly inconsistent on this issue,” says Williams. “I think there’s been this misconception lately that all these laws are being passed all over the country. I think what people are seeing is that there have been a lot of laws proposed.”

Some states allow for civil and criminal cases to be brought against perpetrators, while others provide for only one of the two. Laws like the one that recently took effect in Mississippi, for instance, focus on minors. Over the past year or so, there has been a spate of instances of middle and high schoolers using generative AI to make explicit images and videos of classmates, particularly girls. Other laws focus on adults, with legislators essentially updating existing laws banning revenge porn.

Unlike laws that focus on nonconsensual deepfakes of minors, on which Williams says there is broad consensus that they are an “inherent moral wrong,” legislation around what is “ethical” when it comes to nonconsensual deepfakes of adults is “squishier.” In many cases, laws and proposed legislation require proving intent: that the goal of the person making and sharing the nonconsensual deepfake was to harm its subject.

