Sen. Ron Wyden, D-Ore., reintroduced the Algorithmic Accountability Act of 2023 this week to create new protections for people affected by AI systems that are already shaping decisions in housing, credit, education, and other high-impact areas.

The bill – cosponsored by Sen. Cory Booker, D-N.J. – applies to new generative AI systems used for critical decisions, as well as to other AI and automated systems. Rep. Yvette D. Clarke, D-N.Y., reintroduced the same measure in the House on Sept. 21.

“AI is making choices, today, about who gets hired for a job, whether someone can rent an apartment and what school someone can attend,” Sen. Wyden said. “Our bill will pull back the curtain on these systems to require ongoing testing to make sure artificial intelligence that is responsible for critical decisions actually works, and doesn’t amplify bias based on where a person lives, where they go to church or the color of their skin.”

The bill requires companies to conduct impact assessments for effectiveness, bias, and other factors when using artificial intelligence to make critical decisions. It also creates, for the first time, a public repository of these systems at the Federal Trade Commission and adds 75 staff to the commission to enforce the law.

“We know of too many real-world examples of AI systems that have flawed or biased algorithms: automated processes used in hospitals that understate the health needs of Black patients; recruiting and hiring tools that discriminate against women and minority candidates; facial recognition systems with higher error rates among people with darker skin; and more,” said Sen. Booker. “The Algorithmic Accountability Act would require that automated systems be assessed for biases, hold bad actors accountable, and ultimately help to create a safer AI future.”

Sen. Wyden initially introduced the bill in 2019 with Sen. Booker, and Rep. Clarke introduced the same measure in the House. Neither bill progressed past the committee level.

“Americans do not forfeit their civil liberties when they go online. But when corporations with vast resources continue to allow their AI systems to carry biases against vulnerable groups, the reality is that countless have and will continue to face prejudice in digital spaces,” said Rep. Clarke. “No longer can lines of code remain exempt from our anti-discrimination laws. My bill recognizes that every algorithm has an author and every bias has an origin and that, through proper regulation, we can ensure safety, inclusion, and equity are truly priorities in critical decisions affecting Americans’ lives.”

Cate Burgan is a MeriTalk Senior Technology Reporter covering the intersection of government and technology.