House lawmakers and industry experts gathered in Los Angeles, Calif., on Feb. 2 to discuss artificial intelligence (AI) and intellectual property, showcasing wide support for the recently introduced No Artificial Intelligence Fake Replicas and Unauthorized Duplications (AI FRAUD) Act.
The No AI FRAUD Act – introduced Jan. 10 by Reps. Madeleine Dean, D-Pa., and Maria Salazar, R-Fla. – aims to protect Americans’ identities from misuse via AI by establishing an intellectual property right over one’s voice and likeness.
“The Academy is grateful for the introduction of the No AI FRAUD Act, supported by many members of this committee,” Harvey Mason Jr., president and CEO of the Recording Academy, testified during a House Judiciary Subcommittee on Courts, Intellectual Property, and the Internet hearing.
“The bill establishes in Federal law that an individual has a personal property right in the use of their image and voice. That’s just common sense, and it is long overdue,” he continued, adding, “The bill also empowers individuals to enforce this right against those who facilitate, create, and spread AI frauds without their permission.”
Mason added, “Importantly, the bill has provisions that balance these new protections with the First Amendment to safeguard speech and innovation. Freedom of expression is essential to the music we create, but freedom of expression must also include the ability to protect your own individual expression from being misappropriated by others.”
He highlighted that “music’s biggest week” – the week of the 66th annual Grammy Awards – began with hundreds of artists and actors urging Congress to pass the No AI FRAUD Act.
“I join with many other creators in the Human Artistry Campaign in support of the No AI FRAUD Act and want to express my deep appreciation to its sponsors,” country music star Lainey Wilson said during the subcommittee hearing.
The legislation specifically creates an intellectual property right that every individual holds over their own likeness and voice; allows individuals to seek monetary damages for harmful, unauthorized uses of their likeness or voice; and guards against sexually exploitative deepfakes and child sexual abuse material.
“How do we balance the protections for creativity and one’s likeness – to give a property right for voice and likeness … and do it ethically?” Rep. Dean, who co-authored the bill, asked during the hearing.
Rep. Dean emphasized that Congress has to “protect against AI-generated composition that steals our humanity.”
“When we introduced this bill, it was not an attempt in any way to chill parody or satire … But I don’t want lawmakers to turn a blind eye to say that this is just too complicated,” she continued, adding, “I want to have conversations on how do we protect First Amendment rights, how do we protect artists, and how do we protect the public?”
Rep. Nathaniel Moran, R-Texas, an original co-sponsor of the No AI FRAUD Act, said he was hopeful that exposure from the Feb. 2 hearing would help move the legislation quickly through the committee.
“But I also recognize that these kinds of hearings are very important to hear constructive feedback, to hear criticisms, to hear pushback,” Rep. Moran said. “We need to fine-tune what we’re trying to propose so that ultimately if we pass something, which I hope we do, it will be beneficial to all and it will be the appropriate language. We don’t want anything under-inclusive and we certainly don’t want anything overly broad at all.”