Copyright Office cites ‘urgent’ need for digital replicas law as legislation introduced
Lawmakers should take action to pass a law that would address the “speed, precision, and scale of AI-created digital replicas,” the U.S. Copyright Office recommended in a Wednesday report, as a bipartisan group of senators introduced legislation that aims to do just that.
The report from the Copyright Office, which is part of the Library of Congress, explores the history, risks, and current legal environment as it relates to digital replicas, concluding that Congress should “establish a federal right that protects all individuals during their lifetimes from the knowing distribution of unauthorized digital replicas.”
In a statement, Shira Perlmutter, the director of the Copyright Office and register of copyrights, said the office believes “there is an urgent need for effective nationwide protection against the harms that can be caused to reputations and livelihoods.”
“We look forward to working with Congress as they consider our recommendations and evaluate future developments,” Perlmutter said.
One such development also came on Wednesday, as the Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2024 (NO FAKES Act) was introduced officially in the Senate. That bill, which was first released in draft form in October, is sponsored by Sens. Chris Coons, D-Del.; Marsha Blackburn, R-Tenn.; Amy Klobuchar, D-Minn.; and Thom Tillis, R-N.C.
Digital replicas are generally understood to be digitally created or manipulated depictions of a person’s voice or appearance. With the sophistication and wide availability of generative AI tools, the Copyright Office report said, the proliferation of digital replicas has increased. Recent examples range from a fake song imitating the voices of Drake and The Weeknd, titled “Heart on My Sleeve,” that went viral last year to a robocall replicating President Joe Biden’s voice that discouraged people from voting in the New Hampshire Democratic primary. Both incidents were cited in the report.
While digital replicas have some benefits, such as enabling performances by artists who can no longer perform or supporting other creative work, the report said there is a “broad range of actual or potential harms arising from unauthorized digital replicas.” Those harms include sexually explicit content, the report said, citing a study that found 98% of deepfake videos online were pornography and that 99% of the people targeted were women. Unauthorized replicas also pose a danger to the political system by making misinformation harder to identify.
Current federal laws, however, are written too narrowly to fully address digital replicas, according to the report. The Federal Trade Commission Act and the Lanham Act, for example, each offer some protection, but those laws apply only in commercial circumstances or to well-known figures, respectively.
“Based on our analysis of the comments received, independent research, and a review of work being done at other agencies, we believe there is an urgent need for a robust nationwide remedy beyond those that already exist,” the report said.
The Copyright Office’s recommendation comes as lawmakers are considering various pieces of legislation to address the issue. Two of those proposals would more broadly address digital replicas, the report said: the NO FAKES Act and the No Artificial Intelligence Fake Replicas And Unauthorized Duplications (No AI FRAUD) Act, which is also bipartisan.
Under the NO FAKES Act, as introduced Wednesday, companies and individuals would be liable for damages for creating, hosting, or sharing an unapproved digital replica of a person, and online services would have to take such content down upon notice. There would also be First Amendment-protected exclusions for things like documentaries and parody, and the bill would “largely preempt state laws addressing digital replicas,” according to a press release from the lawmakers.
Notably, the Copyright Office recommended against preemption, arguing a non-preemptive law would have “greater clarity.”
“Either full or partial preemption raises the specter of extensive litigation over its scope and the question of which state-level protections remain available,” the report said. “This uncertainty could be minimized by specifying that the federal digital replica law supplements rather than preempts a state’s protections.”
The report was the first installment of a multipart series from the Copyright Office on copyright and AI. Those efforts to explore copyright issues related to AI began in early 2023, and included input from various stakeholders and the general public. A request for public input on those efforts last August garnered more than 10,000 comments, including from artists, publishers, lawyers, technology companies, and sports leagues, the report said.
Forthcoming installments of the series will address whether works created in whole or in part by AI can be copyrighted, the legal implications of training AI models on copyrighted works, licensing, and the allocation of potential liability, the report said.