Long Saga of Colorado AI Act Appears to Have Come to a Close With Revised Law
By Jason M. Schwent
Ever since its initial passage into law in 2024, the Colorado AI Act has been a lightning rod for controversy and calls for change. Over the ensuing two years, multiple attempts to amend the law were floated and proposed by consumer and industry groups. Implementation of the law itself was delayed several times to allow for such changes, with Governor Jared Polis calling a special session of the legislature last August specifically to address potential changes. All of those attempts appear to have culminated this week in Senate Bill 189 passing both the Colorado House (57-6) and Senate (34-1). The bill next heads to the desk of Governor Jared Polis, where it is expected to be signed into law and to take effect in January 2027.
Now that the law has been amended to address concerns raised by both consumer advocacy and industry leaders, what does the revised Colorado AI Act look like and how does it compare to the original law passed two years ago?
The “New” Colorado AI Act
As with the original AI Act, the revised law is concerned with the use of so-called automated decision-making technology and AI systems (“ADMT”) in connection with high-stakes decisions affecting consumers. The legislation defines ADMT broadly to cover systems that process personal data and generate outputs such as predictions, recommendations, classifications, rankings, or scores that help guide decisions about individuals. The legislation would apply to the use of such ADMT in connection with “consequential decisions”—a term defined as encompassing decisions that can significantly affect a person’s life, including decisions impacting:
- Employment and compensation
- Education access
- Housing eligibility
- Financial and lending services
- Insurance
- Health care services
- Essential government services
If ADMT is used in connection with any of these types of decisions, that use would be covered by the law. This scope is in keeping with the original scope of the law from 2024. Where the differences start to appear is in what must be done, or disclosed, when ADMT is used in connection with a consequential decision.
Key Requirements for AI Developers and Deployers
As with the original 2024 law, the new Colorado AI Act would require disclosures from both developers of ADMT technology and those using ADMT systems. Beginning on January 1, 2027, developers of covered ADMT systems would be required to provide organizations using those systems with detailed technical documentation. That documentation must explain:
- Intended uses of the system
- Categories of training data
- Known limitations
- Instructions for proper human oversight and review
- Material updates or modifications
The legislation also creates direct obligations for businesses and agencies that use ADMT systems in connection with consequential decisions. Deployers must provide clear notice to consumers when they interact with covered ADMT systems. If an automated system contributes to an adverse decision, consumers are entitled to receive a plain-language explanation of the system’s role within 30 days. This differs from the original law, under which businesses using such technology would have been required to describe how the ADMT systems made their decisions (including the information relied upon) and to show that those decisions were not biased. The new law imposes fewer requirements, but it still guarantees consumers some information on the use of ADMT in connection with consequential decisions and a right to appeal such decisions.
Consumer Rights Under the Bill
The new law also establishes several consumer protections designed to increase transparency and human oversight in AI-assisted decisions. Under the new law, consumers would have the right to:
- Access personal data used by the system
- Correct inaccurate personal data
- Request meaningful human review
- Seek reconsideration of adverse decisions influenced by ADMT
These provisions are also similar to those in the original law, though they stop short of the original law’s affirmative requirement to show that the decisions being made are not biased. They nonetheless allow consumers to address concerns about algorithmic bias, inaccurate data, and opaque decision-making processes.
Enforcement and Liability
As with the original law, the Colorado Attorney General would enforce the law under the Colorado Consumer Protection Act. Violations would be treated as deceptive trade practices. However, the bill includes a 60-day “right to cure,” giving companies an opportunity to correct violations before enforcement actions proceed if remediation is possible.
Importantly, the legislation does not create a new private right of action for consumers. Instead, it clarifies how responsibility may be divided between ADMT developers and deployers in existing discrimination lawsuits.
Exemptions and Industry Concerns
The bill contains exemptions for entities already subject to other regulatory regimes, provided they comply with equivalent legal obligations.
Supporters argue the legislation strikes a balance between innovation and consumer protection. As with any compromise, no group got everything they wanted, but the compromise does appear to have support from both consumer advocacy groups and industry and tech leaders. Supporters of the new law describe the bill as a more business-friendly refinement of Colorado’s earlier AI framework. Critics, however, have expressed concerns from multiple directions. Some technology advocates argue the rules remain burdensome for AI developers, while some consumer advocates believe the bill weakens protections compared with earlier proposals.
Broader Significance
Colorado has emerged as one of the leading states attempting to regulate AI governance at the state level. SB26-189 reflects a growing national trend toward requiring transparency, accountability, and human oversight in automated decision-making systems, particularly in areas with significant impacts on employment, housing, lending, and public services. The law may also serve as a template for other state regulations in this space—balancing the need not to stifle innovation while allowing consumers to hold developers and users of this technology responsible for biased development or deployment.
If you or your business have concerns about how this law will impact you or your operations, let the attorneys on the Clark Hill Data Privacy, Protection, and Cybersecurity Team help you.
This publication is intended for general informational purposes only and does not constitute legal advice or a solicitation to provide legal services. The information in this publication is not intended to create, and receipt of it does not constitute, a lawyer-client relationship. Readers should not act upon this information without seeking professional legal counsel. The views and opinions expressed herein represent those of the individual author only and are not necessarily the views of Clark Hill PLC. Although we attempt to ensure that postings on our website are complete, accurate, and up to date, we assume no responsibility for their completeness, accuracy, or timeliness.