Colorado AI Act 2026: Key Deadlines and What Your Business Needs to Do Now
The Colorado AI Act takes effect June 30, 2026. Here's a timeline of deadlines and a step-by-step compliance plan.
The Colorado AI Act (SB 24-205) takes effect on June 30, 2026, making Colorado the first U.S. state to comprehensively regulate artificial intelligence systems. The original effective date was February 1, 2026, but Governor Polis signed SB 25B-004 in August 2025 to delay implementation by five months, giving lawmakers additional time to consider revisions. If your business uses AI tools to make decisions about consumers—from credit approvals to hiring to insurance pricing—you need to understand this law right now.
Unlike other states that have focused primarily on specific sectors, Colorado's law casts a wide net. It applies to businesses of all sizes that deploy or develop "high-risk AI systems" affecting Colorado residents. For a broader view of AI compliance across all states, see our comprehensive AI compliance guide for small businesses. The Attorney General has indicated that enforcement will begin with educational outreach, but penalties for non-compliance can reach $20,000 per violation.
If you're using AI in your business operations, you need a compliance plan now—before the June 30 deadline arrives.
Understanding the Timeline: Key Dates You Cannot Miss
The Colorado AI Act follows a staggered implementation schedule designed to give businesses time to adapt. Here's what you need to know:
June 30, 2026: The law takes effect. The core requirements for deployers and developers of high-risk AI systems become enforceable. Businesses should be conducting impact assessments and implementing risk management policies before this date.
June 30, 2026 - June 30, 2027: Grace period for enforcement. During this first year, the Colorado Attorney General has stated the focus will be on education and guidance rather than penalties. However, this doesn't mean you can ignore compliance. Any violations during this period create a track record that could be used against you later.
July 1, 2027: Full enforcement begins. After this date, the Attorney General can pursue civil penalties for violations, with fines of up to $20,000 per violation if you're not compliant.
Ongoing: Businesses must continually monitor and update their AI systems, conduct new impact assessments when systems change significantly, and maintain documentation for at least three years.
The message is clear: if you haven't started your compliance process yet, you're already behind schedule.
Who Is Actually Affected? Breaking Down Deployers vs. Developers
The Colorado AI Act creates two distinct categories of regulated entities, and you need to know which one applies to your business.
Deployers
A "deployer" is any business that uses a high-risk AI system to make, or be a substantial factor in making, consequential decisions about Colorado consumers. This is where most small and medium-sized businesses fall.
You're likely a deployer if you:
- Use AI-powered hiring tools to screen job applicants
- Employ algorithmic systems to approve or deny credit, loans, or financing
- Utilize AI for insurance underwriting or claims processing
- Deploy AI chatbots that make eligibility determinations
- Use predictive analytics to determine pricing for individual consumers
- Implement AI systems for educational placement or opportunity decisions
Developers
A "developer" is the entity that builds or substantially modifies a high-risk AI system. Unless you're creating AI tools from scratch or heavily customizing existing systems, you're probably not a developer under this law.
Most businesses purchasing off-the-shelf AI software are deployers, not developers. However, if you're building custom AI models or significantly training and modifying existing systems, you may have developer obligations as well.
What Counts as "High-Risk AI"?
Not every AI system falls under this regulation. The law specifically targets "high-risk artificial intelligence systems"—those that make or substantially assist in making consequential decisions.
A consequential decision is one that has a material legal or similarly significant effect on:
- Education, enrollment, or opportunities
- Employment or employment opportunities
- Financial services or credit
- Essential government services
- Healthcare services
- Housing
- Insurance
- Legal services
Common examples in small business contexts include:
- An AI resume screener that ranks candidates and determines who gets interviews (see legal requirements for AI tools at work)
- A credit scoring algorithm that approves or denies loan applications
- An insurance pricing tool that sets premiums based on individual risk factors
- A tenant screening system that recommends approval or rejection of rental applications
However, narrow-purpose tools that simply automate basic tasks without making substantive decisions generally don't qualify. A calendar scheduling AI or a basic chatbot that answers FAQs typically wouldn't be high-risk.
Your Compliance Requirements: What the Law Actually Demands
If you're a deployer of high-risk AI systems, the Colorado AI Act imposes several specific obligations:
1. Conduct Impact Assessments
You must complete an algorithmic impact assessment for each high-risk AI system you deploy. This assessment must document:
- The purpose and intended use of the AI system
- The benefits and potential risks to consumers
- Data categories used in the system
- Measures taken to mitigate algorithmic discrimination risks
- How consumers can appeal or contest decisions
These assessments must be completed before deploying a new system, reviewed at least annually, and updated promptly after any substantial modification.
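For teams that want to track assessments programmatically rather than in scattered documents, the required elements can be captured as a simple record. This is an illustrative sketch only: the class and field names are my own shorthand, not terms defined in the statute.

```python
from dataclasses import dataclass, field

@dataclass
class ImpactAssessment:
    """Illustrative record of the elements a deployer's assessment should document."""
    system_name: str
    purpose: str                                             # purpose and intended use
    benefits: list[str] = field(default_factory=list)        # benefits to consumers
    risks: list[str] = field(default_factory=list)           # potential risks to consumers
    data_categories: list[str] = field(default_factory=list) # data used by the system
    mitigations: list[str] = field(default_factory=list)     # discrimination-risk mitigations
    appeal_process: str = ""                                 # how consumers can contest decisions

    def missing_elements(self) -> list[str]:
        """Return the documented elements that are still empty."""
        checks = {
            "purpose": bool(self.purpose),
            "benefits": bool(self.benefits),
            "risks": bool(self.risks),
            "data_categories": bool(self.data_categories),
            "mitigations": bool(self.mitigations),
            "appeal_process": bool(self.appeal_process),
        }
        return [name for name, done in checks.items() if not done]

# Hypothetical example: an assessment started but not yet complete
assessment = ImpactAssessment(
    system_name="Resume screener",
    purpose="Rank applicants for interview selection",
    data_categories=["employment history", "education"],
)
print(assessment.missing_elements())
```

A check like `missing_elements()` makes it easy to see at a glance which parts of an assessment still need work before a system is deployed.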
2. Implement Risk Management Policies
You need written policies and procedures designed to prevent algorithmic discrimination. These policies should address:
- How you evaluate AI systems for bias before deployment
- Ongoing monitoring processes to detect discrimination
- Procedures for responding to discovered bias
- Training requirements for employees who use the AI systems
This isn't about creating perfect AI systems—it's about demonstrating that you have systematic processes in place to identify and address problems.
3. Provide Consumer Notices
Colorado consumers have the right to know when high-risk AI is being used to make consequential decisions about them. You must provide clear, easily understandable notices that:
- Inform consumers that AI is being used
- Explain the general nature of the AI system
- Describe what purpose it serves in decision-making
These notices should be provided in the same manner you provide privacy notices—typically during the application or interaction process.
4. Offer an Appeal Mechanism
Consumers must have the ability to:
- Opt out of AI-based decision-making and request human review
- Appeal or contest adverse decisions
- Receive a statement of the principal reasons for the decision
This means you need clear procedures for handling consumer requests and conducting human reviews when asked.
5. Maintain Documentation
You must keep records of your impact assessments, risk management policies, and compliance activities for at least three years. During an investigation, the Attorney General can request these documents.
Ready to get compliant? Generate your AI compliance documents in under 2 minutes.
Generate Free AI Policy →

Step-by-Step Compliance Checklist for Your Business
Here's your practical roadmap to Colorado AI Act compliance:
Step 1: Identify Your AI Systems
Create an inventory of every AI tool your business uses. For each one, document:
- What the tool does
- Who provides it (vendor name)
- What business function it supports
- What data it uses
- What decisions it influences
Step 2: Classify Risk Level
For each AI system, determine whether it makes or substantially contributes to consequential decisions in the regulated categories (employment, credit, housing, insurance, etc.).
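The inventory and classification steps above can be sketched as a small script. The category set mirrors the consequential-decision areas named in the law; everything else (the field names and example tools) is illustrative, not a prescribed format.

```python
# Consequential-decision categories regulated under the Colorado AI Act
REGULATED_CATEGORIES = {
    "education", "employment", "financial services", "government services",
    "healthcare", "housing", "insurance", "legal services",
}

# Step 1: inventory -- one entry per AI tool (hypothetical examples)
inventory = [
    {"tool": "ResumeRank", "vendor": "Acme HR", "function": "candidate screening",
     "decisions": {"employment"}},
    {"tool": "ChatFAQ", "vendor": "BotCo", "function": "answer customer FAQs",
     "decisions": set()},
]

# Step 2: classify -- a system is likely high-risk if it makes or substantially
# contributes to decisions in any regulated category
def is_high_risk(entry: dict) -> bool:
    return bool(entry["decisions"] & REGULATED_CATEGORIES)

for entry in inventory:
    label = "HIGH-RISK" if is_high_risk(entry) else "low-risk"
    print(f"{entry['tool']}: {label}")
```

A flat list like this is enough for most small businesses; the point is that classification follows mechanically from the inventory, which is why Step 1 has to come first.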
Step 3: Request Documentation from Vendors
For high-risk systems from third-party vendors, contact them to request:
- Documentation showing their compliance with developer obligations
- Information about how the system works and what data it uses
- Any impact assessments they've already completed
- Information about their bias testing and mitigation efforts
Many established vendors are already preparing this documentation ahead of the law's effective date.
Step 4: Complete Impact Assessments
For each high-risk system, complete your impact assessment. Many vendors are providing templates or partial assessments to help deployers meet this requirement. Document:
- Why you're using the system
- What benefits it provides
- What risks you've identified
- How you're mitigating those risks
Step 5: Create Written Policies
Develop your risk management policy and procedures. This doesn't need to be lengthy, but it should be specific to your business. Include:
- Who's responsible for AI compliance
- How often you'll review AI systems
- Your process for investigating potential bias
- Training requirements for employees
Step 6: Update Your Consumer Notices
Add AI disclosures to your existing notices. This could be in your privacy policy, application forms, or employee handbook—wherever appropriate for the context.
Step 7: Establish Appeal Procedures
Create a clear process for consumers to:
- Request human review of AI decisions
- Submit appeals
- Receive explanations of adverse decisions
Train your staff on how to handle these requests.
Step 8: Document Everything
Keep copies of all assessments, policies, notices, and compliance activities. Create a compliance file that you can produce if the Attorney General investigates.
Step 9: Schedule Regular Reviews
Set calendar reminders to review your AI systems, policies, and assessments at least annually—more frequently if systems change or you identify issues.
Penalties and Enforcement: What Happens If You Don't Comply
The Colorado AI Act is enforced by the state Attorney General under the Colorado Consumer Protection Act. This means serious penalties for violations.
Civil Penalties: Each violation can result in fines up to $20,000. Importantly, these can add up quickly. If you're using an AI system to process hundreds of applications without proper notices, that could theoretically be hundreds of violations.
No Private Right of Action (For Now): Currently, the law doesn't allow consumers to sue you directly. Only the Attorney General can bring enforcement actions. However, this could change, and violations could potentially support other legal claims.
Cure Period: After July 1, 2027, if the Attorney General identifies a violation, they must give you 60 days to cure the problem before imposing penalties—unless the violation was intentional or you've had previous violations.
Injunctive Relief: Beyond fines, the Attorney General can seek court orders requiring you to change your practices or stop using non-compliant AI systems.
The real risk isn't just the financial penalties—it's the reputational damage and business disruption from an enforcement action.
How Colorado's Law Compares to Other AI Regulations
Colorado's AI Act doesn't exist in a vacuum. If you operate in multiple jurisdictions, you need to understand how it fits with other regulations.
NYC Local Law 144: New York City's bias audit law for employment AI tools, which took effect in 2023, has similar goals but narrower scope. It applies only to automated employment decision tools and requires annual bias audits. Colorado's law is broader but doesn't require the same statistical bias testing.
California CPRA: California's privacy law includes AI-related provisions around automated decision-making, but they're less comprehensive than Colorado's requirements. However, if you serve California consumers, you may need to comply with both.
EU AI Act: Colorado's law was heavily influenced by Europe's AI Act, which categorizes AI systems by risk level. However, the EU regulation is more comprehensive and includes stricter requirements for high-risk systems. If you operate internationally, the EU AI Act likely represents the ceiling for compliance requirements.
Federal Landscape: Various federal agencies including the FTC, EEOC, and CFPB have issued guidance on AI use, but there's no comprehensive federal AI law yet. Colorado's Act may serve as a model for federal legislation or other states.
Other States: Several states including Connecticut, Utah, and others are considering or have passed AI-related legislation. The compliance landscape will likely become more complex over the next few years.
Practical Implication: Build your compliance program to exceed Colorado's requirements. This gives you flexibility as other jurisdictions adopt similar laws.
Special Considerations for Small Businesses
While the Colorado AI Act doesn't formally exempt small businesses, there are practical considerations that may affect your risk profile:
Vendor Relationships: Many small businesses use third-party AI tools rather than developing their own. The good news is that vendors bear some compliance burden as developers. However, you still have deployer obligations, so don't assume your vendor's compliance means you're off the hook.
Resource Constraints: Unlike large corporations, you probably don't have a compliance department. The law recognizes this implicitly by requiring "reasonable" measures proportional to the risk. Your impact assessment and policies should reflect your business size and resources.
Enforcement Priorities: While the Attorney General can enforce against businesses of any size, initial enforcement will likely focus on larger deployers or particularly egregious violations. That said, don't count on flying under the radar—especially if you're using AI in sensitive areas like hiring or credit decisions.
Third-Party Support: Consider whether you need external help with compliance. Depending on your AI usage, you might need legal advice, technical auditing support, or compliance documentation tools.
What You Should Do This Week
Stop thinking of Colorado AI Act compliance as a future project. Here are immediate actions you can take:
1. Complete your AI inventory today. You can't comply with a law governing AI systems if you don't know what AI you're using.
2. Contact your software vendors. Email every vendor whose tools touch decision-making processes and ask about their Colorado AI Act compliance.
3. Review at least one high-risk system. If you use AI for hiring, credit decisions, or similar functions, start your impact assessment for your highest-risk system now.
4. Assign responsibility. Designate someone in your organization as the point person for AI compliance.
5. Update your consumer-facing materials. If you're using high-risk AI and haven't disclosed it, draft and publish appropriate notices immediately.
You don't need to achieve perfect compliance in a week, but you should be actively working toward it.
Compliance Documentation Made Practical
The documentation requirements under the Colorado AI Act—impact assessments, risk management policies, consumer notices, and appeal procedures—can seem overwhelming, especially for small businesses without legal departments.
Attestly helps businesses generate the specific compliance documents required under the Colorado AI Act. Our platform walks you through the requirements in plain language and produces professional documentation tailored to your specific AI use cases. Whether you need impact assessments, policy templates, or consumer notices, we make the paperwork part of compliance straightforward so you can focus on running your business.
The Colorado AI Act represents a significant shift in how businesses must approach AI systems. But with clear understanding of the requirements and a systematic approach, compliance is achievable for businesses of any size. The key is starting now—because the clock is already ticking.
Need an AI disclosure policy?
Answer 6 questions about your business and generate your free compliance documents in under 2 minutes. No signup required.
Generate Your Free AI Policy →

Related Guides
How to Update Your Privacy Policy for AI: A Step-by-Step Guide
Your privacy policy probably needs an AI update. Here's exactly what to add and how to word it.
What Is an AI Disclosure Policy? Everything Your Business Needs to Know
Learn what an AI disclosure policy is, why your business needs one, and what it should include to stay compliant.
AI Compliance Requirements in Washington: What Small Businesses Need to Know in 2026
Washington has specific AI legislation affecting businesses. Here's what small business owners need to know to stay compliant.
AI Compliance in West Virginia: What Small Businesses Should Do Now (Even Without a State Law)
West Virginia doesn't have specific AI legislation yet, but compliance still matters. Here's what your business should do now.