California v Colorado: How two states leading AI legislation are approaching it differently

[Image: Open hand facing up, glowing slightly from the palm, with the letters "AI" floating above. © Shutthiphong Chandaeng - iStock-1452604857]

Paige Gross
(Colorado Newsline)

Colorado was the first state in the nation to pass comprehensive AI regulation, in 2024. But less than a year before the law takes effect, lawmakers find themselves trying to clean up the landmark legislation, hoping to refine terms that Colorado businesses say could slow growth and stifle innovation.

“It signals leadership in a proactive stance on AI, risks and the benefits as well,” said Ryan Thompson, a tech and telecom lawyer at Hogan Lovells. “But on the con side, when you have a law that is broadly written, it could very well end up being overly broad and difficult to implement.”

[Image: Concept image of a book titled "The Law" with a gavel and strike plate resting on top. © tussik13 - iStock-908521486]

The law is one example of how state legislators across the country have attempted to ensure privacy, fairness and consumer protection while leaving room for tech companies to grow these technologies on a global scale.

They’re doing so because the federal government has not yet regulated AI. But there has been a big shift toward bipartisan support for some regulation in the last year, said Jeff Le, founder and managing principal of tech policy consultancy 100 Mile Strategies LLC.

“The polling is very clear that people are really worried about artificial intelligence,” Le said.

At least 550 AI bills have been introduced across 45 states and Puerto Rico in 2025, the National Conference of State Legislatures reported. And as legislative bodies study the technology to develop their own laws, many are looking to California and Colorado for inspiration.

The states have made history in the ways they’ve legislated AI with two different approaches. While California initially led the way on state-level AI regulation with multiple bills that increase business accountability, combat discrimination and regulate how businesses use data, Colorado became the first U.S. state to enact comprehensive AI legislation last year.

There are pros and cons to each approach, Thompson said. There’s an advantage to having a cohesive framework, nomenclature and set of obligations for everyone, he said. But narrower laws may create more clarity on specific aspects of the industry.

Colorado’s law, set to go into effect in February, could be a lesson to the rest of the country on how to proceed.

Colorado’s law

As it stands, Colorado’s law — called Consumer Protections for Artificial Intelligence — is the most comprehensive law regulating artificial intelligence in the nation. Other states, like California, Virginia and Connecticut, have come close with similar bills in the last few years, but lawmakers or governors ultimately blocked them.

When Governor Jared Polis signed it into law last year, he said that some parts of the law may need to be amended for more specificity.

[Image: Colorado state flag. © iStock - PromesaArtStudio]

Senate Majority Leader Robert Rodriguez said in a statement to States Newsroom that his committee worked this year with industry and civil groups to develop SB 318, a bill meant to clean up the specific applications and descriptions about AI after hearing concerns from the state’s business community.

But the legislative session ended May 7 without a resolution, and the law will go into effect as written in February. It makes developers of AI responsible for consumer protections around “high-risk” AI systems.

The law requires developers to disclose certain information about AI systems to those who will deploy them, and to outline the potential risks of algorithmic discrimination — the potentially discriminatory outcomes that can result from decision-making algorithms, many of which are currently used to sort and rank job applications, mortgages and health information, among other tasks.

“There’s not necessarily the frustration that this law exists, but rather frustration with some of its provisions,” said Arsen Kourinian, a partner at Mayer Brown who specializes in data privacy and AI law. “Some are vague and it’s difficult to apply them to actual use cases.”

For example, he said, the law includes certain carve-outs for “narrow procedural tasks” completed with AI, but the law does not define that term.

Rodriguez said he plans to continue conversations about amending the law through the summer — “It remains my goal to ensure we have a policy on the books that protects consumers without stifling innovation,” he said.

The balance of protections and innovation Rodriguez describes is the current crux of most AI legislation in the states. While most laws seek to put some sort of protections or guidelines in place to safeguard users against harm, many in the tech industry say they can’t be global competitors or grow the tech industry with protections like these in place.

“There’s a legitimate concern about, hey, if you over regulate the technology, then other countries are going to get ahead of us, and we’re going to be left behind,” Kourinian said. “So I think that’s the one concern with taking a Colorado approach, that there’s the perception that it might be over-regulation.”

California’s approach

That pressure to preserve the tech industry is what ultimately caused California Governor Gavin Newsom to veto a similarly sweeping bill last year.

Senate Bill 1047 would have required safety testing of costly AI models to determine whether they would likely lead to mass death, endanger public infrastructure or enable severe cyberattacks. It made it through both chambers before Newsom said it could be “curtailing the very innovation that fuels advancement in favor of the public good.”

The state is in an interesting position because it’s home to an overwhelming majority of the country’s tech companies, said Teri Olle, Director of Economic Security California, an economic security advocacy group. Olle worked with lawmakers to develop SB 1047.

[Image: California State Capitol Building. © Marcopolo9442 - iStock-179063219]

“You cannot walk down the street with your arms out without hitting somebody who’s … involved in AI, and maybe a billionaire,” Olle said. “We are in it.”

But a large majority of Californians supported the bill, including 64 percent of voters who work in the tech industry. It seemed that people were hungry for some kind of regulations, Olle said, but the fear of stifling innovation was stronger.

Olle said the feeling was that decision-makers were keen for AI regulations, but there was a “not here, not now” attitude.

“California is interestingly left in the limbo where they don’t have a comprehensive AI law,” Kourinian said. “And instead is addressing it on a piecemeal basis or through various other laws that apply in different contexts.”

Since SB 1047 was vetoed, more targeted AI bills have found success this year in Sacramento. Assemblymember Rebecca Bauer-Kahan, a Bay Area Democrat, is the sponsor of two AI-related bills that advanced out of the State Assembly this month: one regulates children’s use of AI chatbots, and another, called the Automated Decisions Safety Act, closely mirrors Colorado’s and other states’ approach to regulating decision-making algorithms. Bauer-Kahan has reintroduced the latter bill twice now, but it is getting more traction this year.

She said this incremental approach is just how California does things — “We tend to take more step-by-step approaches to legislation,” Bauer-Kahan said. In going too broad, you may risk watering down the focus of a law, she said.

The assemblymember, whose background is in regulatory law, said she would favor a federal approach to AI regulation.

“The problem is we don’t have a Congress that is going to do what our communities want, and so in the absence of their action, the states are stepping up,” she said.

Though the tech industry is a huge part of California’s economy, Bauer-Kahan said she believes good regulation will not stop it from thriving, and will in fact build more trust between companies and their users.

Moving forward

As many states close their legislative sessions in the coming weeks, those that have advanced AI laws, whether in sweeping attempts like Colorado’s or in a piece-by-piece approach like California’s, will likely continue to learn from other states’ approaches.

[Image: Hands on a computer keyboard with simulated holographic images floating above, representing aspects of artificial intelligence. © Khanchit Khirisutchalual - iStock-1515913422]

Even in tech-savvy California, legislators are not experts on the tech industry or its applications — “it does require serious inspection,” Le said. And legislators will always struggle to keep up with the quickly changing norms of the tech industry.

Federal guidelines would simplify this process for every state, Le said, but it’s not clear how or when Congress would seriously consider them, especially under a very pro-innovation, low-regulation Trump administration. It’s likely that state legislative bodies will continue to explore what works best for them, and continue learning from states that pass influential laws in the meantime.

“I think the California context is generally not a question of if they’re going to do it,” Le said. “It’s how they’re going to do it.”