Legislation Requires Assessment of Critical Algorithms and New Public Disclosures; Bill Endorsed by AI Experts and Advocates; Bill Will Set the Stage for Future Oversight by Agencies and Lawmakers

Washington, D.C. – Today, U.S. Senator Tammy Baldwin co-sponsored the Algorithmic Accountability Act of 2022, led by Senators Ron Wyden (D-OR) and Cory Booker (D-NJ) and Representative Yvette Clarke (D-NY), a landmark bill to bring new transparency and oversight to the software, algorithms, and other automated systems that are used to make critical decisions about nearly every aspect of Americans’ lives.

“Too often, Big Tech’s algorithms put profits before people, from negatively impacting young people’s mental health, to discriminating against people based on race, ethnicity, or gender, and everything in between,” said Senator Baldwin. “It is long past time for the American public and policymakers to get a look under the hood and see how these algorithms are being used and what next steps need to be taken to protect consumers. I am proud to support the Algorithmic Accountability Act of 2022 to take the first step in giving the folks of Wisconsin and our country transparency and accountability from Big Tech’s harmful algorithms.”

The bill requires companies to conduct impact assessments for bias, effectiveness, and other factors when using automated decision systems to make critical decisions. It also creates, for the first time, a public repository of these systems at the Federal Trade Commission, and adds 75 staff to the commission to enforce the law.

It is also co-sponsored by Senators Brian Schatz (D-HI), Mazie Hirono (D-HI), Ben Ray Luján (D-NM), Bob Casey (D-PA), and Martin Heinrich (D-NM).

The bicameral group updated the 2019 Algorithmic Accountability Act after speaking with dozens of experts, advocacy groups, and other stakeholders about how to improve the bill. The 2022 legislation shares the goals of the earlier bill but includes numerous technical improvements, including clarifying which types of algorithms and companies are covered, ensuring assessments put consumer impacts at the forefront, and providing more detail about how reports should be structured. A full summary of the bill is available here.

The Algorithmic Accountability Act is endorsed by a broad array of experts and civil society organizations: Access Now, Accountable Tech, Aerica Shimizu Banks, Brandie Nonnecke, PhD, Center for Democracy and Technology (CDT), Color of Change, Consumer Reports, Credo AI, EPIC, Fight for the Future, IEEE, JustFix, Montreal AI Ethics Institute, OpenMined, Parity AI and US PIRG.

“Big Tech’s problem of algorithmic bias has gone on for too long and we can no longer allow for these issues to go unregulated,” said Arisha Hatch, Vice President of Color Of Change. “When bias in algorithms goes unchecked, Black people are subjected to discrimination in healthcare, housing, education, and employment — impacting nearly all parts of our lives. In order to reduce the impact of this bias, Big Tech and their operations must proactively detect and address discrimination. Companies conducting their own audits is a first step but prevention will be key. The Algorithmic Accountability Act can effectively protect Black people from automated discrimination, equip the FTC with the resources necessary to enforce these protections and create a more equitable digital space. Color Of Change commends Sens. Wyden and Booker and Rep. Clarke for advancing racial justice equities in tech regulation. We hope Congress will pass this instrumental legislation.”

“Poorly designed algorithms can result in inaccurate outcomes, inconsistent results, serious discriminatory impacts, and other harms,” said Nandita Sampath, Policy Analyst at Consumer Reports. “The Algorithmic Accountability Act is an important foundation to provide researchers and policymakers with the tools to identify who can be impacted by these emerging technologies and how. We look forward to continuing to work with the sponsors of the bill to seek out the most effective ways to mitigate algorithmic harm.”

The full bill text is available here.

A summary of the bill is available here.

A section-by-section of the bill is available here.

An online version of this release is available here.
