Congress Targets AI Deepfakes: Too Little, Too Late?

Lightspring / shutterstock.com

You’ve got to hand it to Congress. Just when you think they’ve hit rock bottom, they find a new way to dig deeper. This time, it’s a bipartisan effort to tackle the cesspool of websites spreading non-consensual, sexually explicit deepfakes. Yes, folks, they finally found something to agree on, but don’t get your hopes up yet.

In recent months, both Republicans and Democrats have been pushing various bills aimed at holding those who distribute deepfake pornography accountable. These proposed laws would also allow victims to seek financial compensation. Deepfakes are not harmless fun; they’re AI-generated images or videos that place people in fake, often compromising situations. A 2019 study from Deeptrace Labs revealed that a whopping 96% of all deepfake videos were non-consensual pornography. Imagine that. Anyone with a phone can now create a disturbingly realistic explicit video of someone else, even children, without their consent.

Shockingly, there are no federal laws to stop these vile websites. Senator Dick Durbin (D-IL) recently lamented that hundreds of apps can make non-consensual, sexually explicit deepfakes right on your phone. It’s high time Congress took action against this growing crisis.

Advocacy groups have been banging the drum for over a year, demanding that Congress do something about these malicious websites. We’ve seen deepfakes of public figures like Taylor Swift and Rep. Alexandria Ocasio-Cortez (D-NY). Ocasio-Cortez, a victim of deepfakes, has teamed up with Durbin to push the DEFIANCE Act. This legislation aims to stop the spread of non-consensual, sexually explicit deepfakes by providing a federal civil remedy for victims. The act would allow lawsuits against anyone who creates or distributes these deepfakes with malicious intent.

Republicans have their own approach. Rep. Nancy Mace (R-SC) has proposed bills to increase fines for distributing non-consensual pornography from $150,000 to a hefty $500,000. Meanwhile, Sens. Ted Cruz (R-TX) and Amy Klobuchar (D-MN) have joined forces on the bipartisan TAKE IT DOWN Act. This bill would criminalize not only the publication of non-consensual AI deepfakes but also the threat to publish them. It would also force websites and social media platforms to remove such content swiftly to curb its spread.

However, some lawmakers are concerned that tech companies might hide behind Section 230 of the Communications Decency Act. Passed in 1996, this section protects tech giants from being held liable for the content posted by their users. Attorney Carrie Goldberg, known for representing many of Harvey Weinstein’s accusers, argues that this section should be scrapped altogether. She believes it prioritizes the interests of Big Tech over those of victims.

Legislative progress has slowed to a crawl as the election season heats up. Many lawmakers are more focused on keeping their seats than on passing meaningful laws. Sen. Cynthia Lummis (R-WY) quickly blocked the DEFIANCE Act when Durbin brought it to the floor, despite her support for the TAKE IT DOWN Act. Lummis argued that the DEFIANCE Act’s language was too broad, could harm online privacy and innovation, and would ultimately fail to help the victims it aimed to protect.

Yet, amidst the usual political infighting, there is strong bipartisan support for tackling this issue. Victims of non-consensual pornographic deepfakes have waited far too long for federal legislation to hold perpetrators accountable. Congress needs to step up and show victims that they are not forgotten.

So there you have it, folks. Congress is finally trying to do something right, but as always, it’s a slow and painful process. Let’s hope they can put aside their differences and pass meaningful laws to protect innocent people from these crimes. Because if they can’t get this right, what hope do we have for anything else?