We Need A “Swift” Resolution: Taylor Swift Controversy Sparks Federal Debate About Pornographic Deepfakes

Taylor Gilbertson
News & Insights

Sexually explicit, synthetic images of Taylor Swift inundated social media last month, sparking fierce backlash.[1] This is no new phenomenon: the widespread distribution of non-consensual “deepfake” pornographic images has become increasingly common in the era of artificial intelligence (“AI”), and it disproportionately targets women.[2] Deepfakes are AI-generated images, video, audio, and text that depict real people in fabricated situations or circumstances.[3] This most recent incident sparked a national debate about the need for greater protections, as the government faces mounting pressure to criminalize the practice at the federal level.[4]

Currently, there are no federal protections against deepfakes. Most states have enacted some type of “revenge porn” statute,[5] but regulating deepfake pornography is entirely new territory. California is one of the few states that has expanded beyond already-existing laws, pioneering legislation that specifically targets those who share explicit deepfake content.

On September 22, 2019, California introduced Assembly Bill 602 (“AB 602”), which affords residents a civil remedy if they are targeted by these illicit attacks, including economic and noneconomic damages, and up to $150,000 if the act was committed with malice.[6] Although California residents have the right to sue under AB 602 if their images are used for sexually explicit purposes,[7] there remains the lingering question of whether these civil protections are sufficient.[8] A study from cybersecurity company Deeptrace shows that “96% of deepfakes posted online are sexually explicit and 99% of those are women who work in entertainment.”[9] Given California’s strong ties to the entertainment industry, it makes sense that the state would address this issue vigorously, but are state-specific civil protections enough?

The Taylor Swift controversy has reignited calls to address the issue by introducing a federal standard. Even prior to this latest incident, federal lawmakers had been navigating the complex landscape of pornographic deepfakes and searching for ways to quash these increasingly common attacks.[10]

As new state legislation emerges, so does the need for a federal standard to provide uniform guidelines. However, there are disputes among states surrounding how to categorize and define “deepfake.”[11] The differences extend beyond mere definitional disputes: some states require “illicit motive” before a claim against an offender is actionable, which places a higher burden on those seeking relief.[12] In addition, while some states opt for standalone laws, other states, like Illinois, are expanding on their already-existing revenge porn laws.[13] This brings another important question to the forefront of this debate: do we need entirely new laws?

The proposed federal laws are wide-ranging. The Disrupt Explicit Forged Images and Non-Consensual Edits (“DEFIANCE”) Act[14] is the latest in a long line of proposed legislation that would create a federal civil remedy for victims of deepfake attacks.[15] The Act was introduced by a bipartisan group of senators as a bill “[t]o improve the rights to relief for individuals affected by non-consensual activities involving intimate digital forgeries, and for other purposes.”[16] Similarly, the DEEPFAKES Accountability Act would mandate disclosures on deepfake images and criminalize the distribution of deepfakes that lack such disclosures.[17] Additionally, the Preventing Deepfakes of Intimate Images Act seeks to “prohibit the non-consensual disclosure of digitally altered intimate images.”[18] This proposed legislation would not only create a civil avenue for relief, but would also criminalize these attacks in circumstances where the images are sexually explicit.[19] Anyone who violates this proposed law would be subject to a $150,000 fine, or up to 10 years in prison if the individual shares images that facilitate violence.[20]

New legislation targeting deepfake pornography could be further delayed––or halted entirely––by First Amendment restraints. The First Amendment states that “Congress shall make no law . . . abridging the freedom of speech.”[21] Freedom of speech is not limited to words that are merely spoken––it also protects a wide range of alternate speech, including written works, online posts, and video games.[22] The First Amendment is not all-encompassing, however, and there is a wide range of speech that is not protected by the Constitution, including obscenity, defamatory speech, and child pornography.[23]

Legislation aimed at regulating deepfakes could be barred by constitutional restraints unless it targets speech situated within an unprotected category.[24] However, even these categories have limitations. For instance, obscenity was historically defined as content which is “utterly without redeeming social value.”[25] In Miller v. California, the Court defined obscenity as material which: appeals to the prurient interest in sex (judged by contemporary community standards), depicts or describes sexual conduct in a patently offensive way, and lacks “serious literary, artistic, political, or scientific value.”[26] Notably, speech about sex, including adult pornography, is still protected due to its perceived value.[27] Thus, one challenge against AI-generated deepfakes may be to prove that those images lack such value.

Child pornography restrictions also have limitations. In Ashcroft v. Free Speech Coalition, the Supreme Court ruled that virtual, computer-generated child pornography is protected by the First Amendment because no real children were harmed in its creation.[28] Courts have recognized that real children must be depicted in the content in order for it to be deemed constitutionally unprotected speech.[29] If proposed legislation targeting deepfakes is situated within either category (obscenity or child pornography), then these laws could be enacted without running afoul of the Constitution.[30]

Though the wider repercussions of failing to protect against deepfake pornography on a federal level remain unclear, what is far more certain is that, in the era of AI, the distribution of these distorted images will become more common, and anyone is at risk. According to Loyola Law School Professor Rebecca Delfino, however, there are ways to mitigate further harm.

“First, we need a call of action, not dissimilar to what we witnessed in the aftermath of the Taylor Swift controversy, when her fans united with a common goal: to get the explicit images immediately removed from all platforms,” said Professor Delfino. “Likewise, we need a large movement, with dedicated people on the ground calling on Congress to create federal legislation. Finally, we need to empower victims by giving them control over their images. We need a legal framework that transforms victims’ images into property rights and provides both civil and criminal penalties against anyone who tries to violate them.”[31]

[1] Ben Beaumont-Thomas, Taylor Swift Deepfake Pornography Sparks Renewed Calls for US Legislation, The Guardian (Jan. 26, 2024), https://www.theguardian.com/music/2024/jan/26/taylor-swift-deepfake-pornography-sparks-renewed-calls-for-us-legislation.

[2] Id.

[3] Bradley Waldstreicher, Deeply Fake, Deeply Disturbing, Deeply Constitutional: Why the First Amendment Likely Protects the Creation of Pornographic Deepfakes, 42 Cardozo Law Review, https://cardozolawreview.com/deeply-fake-deeply-disturbing-deeply-constitutional-why-the-first-amendment-likely-protects-the-creation-of-pornographic-deepfakes/ (last visited Feb. 25, 2024).

[4] Bill Chappell, Deepfakes Exploiting Taylor Swift Images Exemplify a Scourge with Little Oversight, NPR (Jan. 26, 2024), https://www.npr.org/2024/01/26/1227091070/deepfakes-taylor-swift-images-regulation.

[5] Elliott Davis Jr., These States Have Banned the Type of Deepfakes That Targeted Taylor Swift, US News (Jan. 30, 2024), https://www.usnews.com/news/best-states/articles/2024-01-30/these-states-have-banned-the-type-of-deepfake-porn-that-targeted-taylor-swift.

[6] K.C. Halm, et al., Two New California Laws Tackle Deepfake Videos in Politics and Porn, Davis Wright Tremaine LLP (Feb. 28, 2020), https://www.dwt.com/blogs/media-law-monitor/2020/02/two-new-california-laws-tackle-deepfake-videos-in.

[7] Kari Paul, California Makes ‘Deepfake’ Videos Illegal, but Law May Be Hard to Enforce, The Guardian (Oct. 7, 2019), https://www.theguardian.com/us-news/2019/oct/07/california-makes-deepfake-videos-illegal-but-law-may-be-hard-to-enforce.

[8] Id.

[9] Id.

[10] Geoff Mulvihill, What to Know About How Lawmakers Are Addressing Deepfakes Like the Ones That Victimized Taylor Swift, AP News (Jan. 31, 2024), https://apnews.com/article/deepfake-images-taylor-swift-state-legislation-bffbc274dd178ab054426ee7d691df7e.

[11] Cassandre Coyer, States are Targeting Deepfake Pornography – But Not in a Uniform Way, Law.com (Aug. 10, 2023), https://www.law.com/legaltechnews/2023/08/10/states-are-targeting-deepfake-pornography-but-not-in-a-uniform-way/.

[12] Id.

[13] Id.

[14] The Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, S. 3696, 118th Cong. (2024), https://www.congress.gov/118/bills/s3696/BILLS-118s3696is.pdf.

[15] Id.

[16] Id.

[17] The Deepfakes Accountability Act of 2023, H.R. 5586, 118th Cong. (2023), https://www.congress.gov/118/bills/hr5586/BILLS-118hr5586ih.pdf.

[18] Bill Donahue, The Taylor Swift Deepfakes Were Awful. How Do We Stop the Next One?, Billboard (Feb. 9, 2024), https://www.billboard.com/business/legal/taylor-swift-deepfakes-illegal-stopped-1235593162/.

[19] Id.

[20] Ashley Belanger, Sharing Deepfake Porn Could Lead to Lengthy Prison Time Under Proposed Law, Ars Technica (Jan. 17, 2024), https://arstechnica.com/tech-policy/2024/01/sharing-deepfake-could-lead-to-lengthy-prison-time-under-proposed-law/.

[21] U.S. Const. amend. I.

[22] Waldstreicher, supra note 3.

[23] Congressional Research Service, The First Amendment: Categories of Speech, https://crsreports.congress.gov/product/pdf/IF/IF11072.

[24] Waldstreicher, supra note 3.

[25] Miller v. California, 413 U.S. 15, 24 (1973).

[26] Id.

[27] David L. Hudson Jr., Obscenity and Pornography, Middle Tennessee State University, https://firstamendment.mtsu.edu/article/obscenity-and-pornography (last updated Feb. 18, 2024).

[28] Ashcroft v. Free Speech Coal., 535 U.S. 234, 242 (2002).

[29] New York v. Ferber, 458 U.S. 747, 764 (1982).

[30] Id.

[31] Zoom Interview with Rebecca Delfino, Law Professor, Loyola Law School (Feb. 26, 2024).