Helen Toner, a former OpenAI board member and now director of strategy at Georgetown’s Center for Security and Emerging Technology, has voiced concern that Congress is poorly positioned to craft sound AI policy. She fears the current legislative environment could produce hasty, ill-considered regulations.
At an event in Washington, D.C., Toner pointed to the dysfunction within Congress, stating, “Congress right now — I don’t know if anyone’s noticed — is not super functional, not super good at passing laws, unless there’s a massive crisis.” Her worry is that laws governing AI, a significant and influential technology, will only be enacted in reaction to a major crisis, an approach she considers far from ideal.
Current Efforts and Executive Actions
Despite the legislative gridlock, there have been some efforts to regulate AI at the executive level. In 2023, President Joe Biden signed an executive order introducing consumer protections and requiring AI developers to share safety test results with government agencies. Additionally, the National Institute of Standards and Technology published guidelines for managing AI risks.
However, Congress has yet to pass any comprehensive AI legislation akin to the EU’s recently enacted AI Act. With the 2024 elections on the horizon, significant legislative action on AI seems unlikely.
State-Level Legislation Surge
In the absence of federal regulations, state and local governments have stepped in to fill the gap. According to the Brookings Institution, AI-related bills surged at the state level in 2023: legislators introduced nearly 400 new state-level AI bills, a significant increase over the previous year.
States like California and Colorado have advanced numerous AI bills. California introduced around 30 bills focused on protecting consumers and jobs, while Colorado passed a law requiring AI companies to exercise “reasonable care” to avoid discrimination. Tennessee’s ELVIS Act, signed into law by Governor Bill Lee, prohibits the use of AI to clone musicians’ voices or likenesses without their consent.
Challenges of a Fragmented Regulatory Landscape
The varied state laws create a complex regulatory environment, leading to uncertainty for both industry and consumers. Different definitions and standards for “automated decision making” across states illustrate this challenge. Some laws consider decisions automated even with human involvement, while others do not.
Toner advocates for a federal mandate, even a broad one, to bring some uniformity and clarity. She believes that establishing sensible and minimal regulations now could prevent more severe problems and the need for rushed, ill-conceived responses in the future. “Some of the smarter and more thoughtful actors that I’ve seen in this space are trying to say, OK, what are the pretty light-touch — pretty common-sense — guardrails we can put in place now to make future crises — future big problems — likely less severe,” she said.