Attestly Team · North Dakota

AI Compliance in North Dakota: What Small Businesses Should Do Now (Even Without a State Law)

North Dakota doesn't have specific AI legislation yet, but compliance still matters. Here's what your business should do now.

AI Compliance Requirements for Small Businesses in North Dakota

If you're a North Dakota business owner using AI tools like ChatGPT, AI-powered CRMs, or automated marketing platforms, you're probably wondering what rules you need to follow. The short answer? North Dakota doesn't have specific AI legislation yet—but that doesn't mean you're off the hook.

While the state hasn't passed its own AI laws, federal regulations still apply to your business, and neighboring states' rules might affect you if you serve customers across state lines. South Dakota is in a similar situation, while Minnesota has already enacted privacy provisions with direct AI implications—especially relevant for businesses in the Fargo-Moorhead area. More importantly, getting ahead of compliance now can protect your business from future liability and build customer trust in an increasingly regulated landscape.

Current State of AI Regulation in North Dakota

As of February 2026, North Dakota has not enacted dedicated AI legislation. The state legislature has not introduced comprehensive bills regulating artificial intelligence systems, automated decision-making, or algorithmic transparency.

This places North Dakota among the minority of states without AI-specific rules, even as states like Colorado, California, and Utah have implemented detailed AI governance frameworks. However, the absence of state law doesn't mean North Dakota businesses operate in a regulatory vacuum.

What fills the gap:

Federal agencies—particularly the Federal Trade Commission (FTC), Equal Employment Opportunity Commission (EEOC), and Consumer Financial Protection Bureau (CFPB)—have issued guidance and actively enforce existing laws that apply to AI systems. These regulations cover deceptive practices, discrimination, data privacy, and consumer protection.

North Dakota's agricultural sector has seen significant AI adoption for crop monitoring, predictive analytics, and automated farm equipment. The state's energy industry also uses AI for oil and gas operations. While these applications haven't triggered state-specific regulation yet, they fall under federal oversight and industry-specific rules.

Who Should Care About AI Compliance in North Dakota

Don't assume AI compliance is only for tech giants. If your business uses any of these tools, compliance matters:

Businesses using AI for customer interactions:

  • Chatbots on your website
  • AI-powered customer service systems
  • Automated email or SMS marketing
  • Social media management tools with AI features

Businesses making decisions with AI:

  • AI-assisted hiring or resume screening
  • Automated credit decisions or loan processing
  • Pricing algorithms that adjust based on customer data
  • Inventory management systems that use predictive AI

Professional service providers:

  • Healthcare practices using AI diagnostic tools
  • Legal services using AI research or document review
  • Accounting firms using AI for tax preparation or fraud detection
  • Real estate agencies using AI valuation tools

Industry-specific scenarios:

  • Retailers using AI recommendation engines
  • Financial institutions using algorithmic risk assessment
  • Insurance companies using AI underwriting
  • Agriculture operations using AI crop analysis or automated equipment

If you process customer data, make decisions that affect people's opportunities or prices, or use AI to communicate with customers, you need to understand compliance requirements.

Federal Requirements That Apply to North Dakota Businesses

Even without state legislation, federal law creates real obligations for North Dakota businesses using AI.

FTC Act Section 5: Prohibition on Deceptive Practices

The FTC has made clear that AI tools cannot be used to deceive consumers. This means:

Transparency requirements: If an AI system makes decisions affecting customers, you generally need to disclose that AI is involved. Claiming human review when decisions are fully automated violates FTC rules.

Accuracy obligations: Marketing claims generated by AI must be truthful. If your AI generates product descriptions, customer testimonials, or performance claims, you're liable if they're false or misleading.

"AI washing" prohibitions: Falsely claiming your product uses AI (when it doesn't) or exaggerating AI capabilities is considered deceptive advertising.

Anti-Discrimination Laws

Federal civil rights laws apply regardless of whether decisions are made by humans or algorithms:

Equal Credit Opportunity Act (ECOA): If you use AI for lending, credit, or payment decisions, the system cannot discriminate based on race, color, religion, national origin, sex, marital status, or age.

Fair Housing Act: Real estate businesses using AI for tenant screening, property valuations, or advertising must ensure these systems don't create discriminatory outcomes.

Title VII (Employment): AI hiring tools, resume screeners, or employee monitoring systems cannot have disparate impacts on protected classes. The EEOC actively investigates algorithmic discrimination.

Data Privacy and Security

While North Dakota lacks a comprehensive privacy law, federal regulations require:

Reasonable data security: The FTC enforces data security requirements. If your AI processes customer information, you need appropriate safeguards.

Industry-specific rules: Healthcare businesses must comply with HIPAA when using AI with patient data. Financial institutions face GLBA requirements. Children's data requires COPPA compliance.

Common AI Tools That Trigger Compliance

North Dakota businesses commonly use these AI tools, each creating specific compliance considerations:

ChatGPT and Similar Large Language Models

Compliance concerns:

  • Customer data you input into ChatGPT may be used for training unless you use enterprise versions with data protection agreements
  • AI-generated content you publish must be accurate; you're liable for false claims
  • If you use ChatGPT to draft customer communications, you need to review for potential bias or inappropriate content

Best practices: Use business accounts with data processing agreements, verify all AI-generated claims before publication, and maintain human oversight.

AI-Powered CRM Systems (Salesforce Einstein, HubSpot AI, etc.)

Compliance concerns:

  • These systems often make predictions about customer behavior or sales likelihood
  • Automated lead scoring could inadvertently discriminate
  • Customer data used for AI training needs proper consent and protection

Best practices: Audit AI scoring systems for bias, provide opt-outs for automated decisions affecting customer treatment, and document your AI decision-making logic.

Marketing Automation with AI Features

Compliance concerns:

  • Personalized pricing or offers could create discriminatory patterns
  • Automated content generation might make false claims
  • Email and SMS automation must still comply with CAN-SPAM and TCPA

Best practices: Test campaigns for unintended discrimination, review AI-generated content for accuracy, and maintain required unsubscribe mechanisms.

AI Hiring and HR Tools

Compliance concerns:

  • Resume screening AI has documented bias problems
  • Video interview analysis tools may discriminate based on speech patterns, facial features, or other protected characteristics
  • Employee monitoring AI raises privacy concerns

Best practices: Conduct adverse impact analyses, allow candidates to request human review, and disclose AI use in the hiring process.

AI Image and Video Generators (Midjourney, DALL-E, etc.)

Compliance concerns:

  • Copyright status of AI-generated images remains legally complex
  • Using AI art in commercial contexts carries intellectual property risk
  • AI-generated images of people may violate publicity rights

Best practices: Review terms of service carefully, avoid generating images of real people without consent, and consider human-created alternatives for high-stakes commercial use.

Step-by-Step Compliance Checklist for North Dakota Businesses

Follow these practical steps to build compliant AI practices:

Step 1: Inventory Your AI Tools

Create a list of every AI system your business uses. Include:

  • Name and vendor of the AI tool
  • What business function it serves
  • What data it accesses
  • Whether it makes automated decisions affecting people
  • Who in your organization uses it
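The inventory above is easiest to maintain as a structured record rather than a free-form document. Here is a minimal sketch of one way to model it, using Python dataclasses; the field names and the two example tools are illustrative, not a regulatory requirement:

```python
from dataclasses import dataclass

@dataclass
class AITool:
    """One row in the AI inventory: the five facts listed above."""
    name: str
    vendor: str
    business_function: str
    data_accessed: list[str]
    makes_automated_decisions: bool  # decisions affecting people (hiring, credit, pricing)
    users: list[str]

# Hypothetical example entries
inventory = [
    AITool("ChatGPT Team", "OpenAI", "drafting customer emails",
           ["customer names", "order history"], False, ["marketing"]),
    AITool("Resume Screener", "ExampleHR Inc.", "candidate screening",
           ["applications", "resumes"], True, ["HR"]),
]

# Flag the tools that warrant the most scrutiny in later steps:
# those that make automated decisions affecting people.
needs_review = [t.name for t in inventory if t.makes_automated_decisions]
print(needs_review)
```

Even a spreadsheet with these same columns works; the point is that every tool gets the same five questions answered, so nothing slips through when you move on to vendor review and risk assessment.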

Step 2: Review Vendor Agreements

For each AI tool, examine:

  • Data processing agreements and privacy terms
  • Whether your data is used for training
  • Liability provisions if the AI produces harmful outputs
  • Compliance representations from the vendor

Request Business Associate Agreements for healthcare data, data processing addendums for customer information, and terms that prohibit using your data for model training.

Step 3: Assess Risk for Each AI System

Categorize your AI tools by risk level:

High risk: Systems that make decisions about employment, credit, housing, insurance, or legal matters. These require the most scrutiny.

Medium risk: Customer-facing systems that personalize experiences, pricing, or communications.

Lower risk: Internal productivity tools that don't process sensitive data or make decisions affecting people.
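The three tiers above can be applied mechanically once you know a few facts about each tool. A minimal sketch, assuming three yes/no flags per tool (the flag names are illustrative):

```python
def risk_tier(makes_consequential_decisions: bool,
              customer_facing: bool,
              handles_sensitive_data: bool) -> str:
    """Sort an AI tool into the three tiers described above.

    'Consequential' means decisions about employment, credit, housing,
    insurance, or legal matters -- the high-risk category.
    """
    if makes_consequential_decisions:
        return "high"
    if customer_facing or handles_sensitive_data:
        return "medium"
    return "low"

# A resume screener makes employment decisions -> high risk
print(risk_tier(True, False, True))
# A website chatbot personalizes customer interactions -> medium risk
print(risk_tier(False, True, False))
# An internal meeting-notes summarizer -> lower risk
print(risk_tier(False, False, False))
```

The ordering of the checks matters: a tool that both screens resumes and chats with customers is high risk, because the most serious use governs the tier.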


Step 4: Implement Transparency Measures

For customer-facing AI:

  • Add disclosures explaining that AI is used
  • Create a simple AI policy explaining how you use these tools
  • Provide contact information for customers who have questions or want human review

For employment AI:

  • Notify applicants that AI assists in screening
  • Offer alternative application processes upon request
  • Document your AI hiring systems for potential EEOC inquiries

Step 5: Establish Human Oversight

Create review processes for:

  • AI-generated content before publication
  • High-stakes decisions (hiring, credit, pricing) made by AI
  • Customer complaints about AI interactions
  • Periodic audits of AI system outputs

Assign specific team members responsibility for AI oversight.

Step 6: Test for Bias and Accuracy

Depending on your AI applications:

  • Analyze outcomes by demographic group to identify disparate impacts
  • Test AI-generated content for factual accuracy
  • Review AI recommendations against ground truth data
  • Document your testing methodology and results
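For the demographic-outcome analysis above, a widely used first screen is the four-fifths (80%) rule from the EEOC's Uniform Guidelines: if any group's selection rate is below 80% of the highest group's rate, that is treated as preliminary evidence of adverse impact. It is a rough screen, not a legal safe harbor. A minimal sketch with hypothetical numbers:

```python
def selection_rates(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """outcomes maps group -> (number selected, total applicants)."""
    return {g: sel / total for g, (sel, total) in outcomes.items()}

def four_fifths_check(outcomes: dict[str, tuple[int, int]]) -> dict[str, bool]:
    """Flag groups whose selection rate falls below 80% of the best group's rate."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: (r / best) < 0.8 for g, r in rates.items()}

# Hypothetical AI resume-screening results: group B passes at half group A's rate
results = {"group_a": (40, 100), "group_b": (20, 100)}
flags = four_fifths_check(results)
print(flags)  # group_b is flagged: 0.20 / 0.40 = 0.5, below the 0.8 threshold
```

A flagged group does not prove discrimination on its own, and small sample sizes make the ratio unstable; treat a flag as a trigger for deeper statistical review and vendor questions, and keep the analysis with your documentation.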

Step 7: Create Necessary Policies

Document:

  • An AI acceptable use policy for employees
  • Data handling procedures for AI systems
  • Your process for responding to AI-related complaints
  • How you'll stay current as regulations evolve

Step 8: Train Your Team

Ensure employees understand:

  • Which AI tools they're authorized to use
  • Data they should never input into AI systems
  • How to escalate AI-related concerns
  • Basic compliance requirements

Penalties and Enforcement: What's Actually at Risk

Without North Dakota-specific AI laws, enforcement comes from federal agencies and civil litigation.

FTC Enforcement

The FTC can seek civil penalties of more than $50,000 per violation (the exact amount is adjusted annually for inflation) for deceptive AI practices. Recent enforcement actions show the FTC is serious about AI compliance:

  • Companies making false claims about AI capabilities have faced million-dollar settlements
  • Businesses using AI in deceptive ways (fake reviews, misleading chatbots) have been fined and required to stop using AI

EEOC Actions

Discrimination in AI hiring can result in:

  • Back pay for affected applicants
  • Compensatory and punitive damages
  • Mandatory changes to hiring systems
  • Ongoing monitoring requirements

Early EEOC settlements involving algorithmic discrimination have reached six figures, and class-action exposure can run far higher.

Private Lawsuits

Businesses face the greatest risk from customer and employee lawsuits:

  • Class actions alleging discriminatory AI practices
  • Individual suits for harm caused by AI decisions
  • Intellectual property claims over AI-generated content

Defense costs alone can devastate a small business, even if you ultimately prevail.

Reputational Harm

Beyond formal penalties, compliance failures create:

  • Negative media coverage
  • Customer loss
  • Difficulty recruiting talent
  • Damaged relationships with vendors and partners

How North Dakota Compares to Other States

North Dakota's regulatory approach differs significantly from nearby states and national leaders in AI governance.

Colorado's AI Law

Colorado enacted the first comprehensive state AI law, the Colorado AI Act, scheduled to take effect in 2026. It requires:

  • Algorithmic discrimination impact assessments
  • Consumer notice of, and the right to appeal, adverse AI-driven decisions
  • Developer disclosures about high-risk AI systems
  • Mandatory bias testing and documentation

Impact on North Dakota businesses: If you serve Colorado customers, these rules may apply to you regardless of your physical location.

Montana's Approach

Montana, North Dakota's neighbor, also lacks comprehensive AI legislation, but its Consumer Data Privacy Act (effective October 2024) governs profiling and other data practices that many AI tools rely on.

Minnesota's Privacy Law

Minnesota's Consumer Data Privacy Act took effect in mid-2025 and gives consumers the right to opt out of profiling used for significant decisions. Multi-state businesses in the Fargo-Moorhead area need to comply with Minnesota's rules for their Minnesota customers while monitoring North Dakota developments.

California's Leadership

California has multiple AI-related laws:

  • SB 942, the AI Transparency Act, requiring large generative AI providers to offer disclosure and detection tools for AI-generated content
  • CPRA privacy rights that limit AI profiling
  • AB 2013, requiring generative AI developers to disclose their training data

Practical consideration: Many AI vendors build products to meet California standards, meaning your tools may already incorporate features designed for stricter compliance.

What This Means for North Dakota Businesses

Operating in a less-regulated state creates both opportunities and risks:

Advantages:

  • More flexibility in AI adoption without state-specific red tape
  • Lower immediate compliance costs
  • Ability to innovate without waiting for regulatory approval

Disadvantages:

  • Uncertainty about future requirements
  • Risk that sudden legislation could require expensive retrofitting
  • Potential competitive disadvantage if customers prefer businesses with formal AI governance

What to Do Right Now

North Dakota businesses should take these immediate actions:

1. Don't Wait for State Legislation

Federal requirements already apply. Start building compliant practices now rather than scrambling if and when North Dakota passes its own AI laws.

2. Document Everything

Create records of:

  • What AI tools you use and why
  • Data your AI systems process
  • Testing you've performed for accuracy and bias
  • Decisions made about AI implementation

This documentation proves good-faith compliance efforts if issues arise.

3. Stay Informed

Monitor:

  • Federal AI guidance from FTC, EEOC, and other agencies
  • Legislation in neighboring states
  • Industry-specific regulatory developments
  • Your AI vendors' policy updates

4. Build Scalable Processes

Design compliance procedures that can adapt as requirements change:

  • Modular policies that can be updated
  • Testing frameworks that can be expanded
  • Vendor management processes for new tools
  • Training programs that incorporate new requirements

Understanding how much AI compliance actually costs can help you budget effectively as a small business.

5. Consider Voluntary Compliance

Many North Dakota businesses are adopting best practices from leading states:

  • Conducting bias audits even though not required
  • Providing AI transparency even where not mandated
  • Implementing opt-out rights for automated decisions

This positions you ahead of future requirements and builds customer trust.

6. Get the Right Documentation in Place

Proper compliance documentation serves multiple purposes:

  • Demonstrates good faith if regulators investigate
  • Provides evidence in litigation defense
  • Shows customers you take AI governance seriously
  • Helps employees understand their responsibilities

Creating these documents doesn't require hiring expensive lawyers. Tools like Attestly can generate customized AI compliance policies, vendor questionnaires, and data processing documentation specifically tailored to your North Dakota business in minutes rather than weeks. Getting proper documentation now is one of the most cost-effective risk management steps you can take.

The Bottom Line

North Dakota's lack of AI-specific legislation doesn't mean small businesses can ignore AI compliance. Federal rules already create real obligations, neighboring states' laws may affect your operations, and proactive governance protects your business from future liability.

The businesses that will thrive as AI regulation evolves are those building compliant practices today—before they're forced to by law. Start with the basics: understand what AI tools you use, implement reasonable safeguards for customer data, maintain human oversight of important decisions, and document your good-faith compliance efforts.

AI offers tremendous opportunities for North Dakota businesses to compete more effectively, serve customers better, and operate more efficiently. Thoughtful compliance doesn't limit those opportunities—it ensures they're sustainable as the regulatory landscape develops.

Frequently Asked Questions

Does North Dakota have specific AI laws for small businesses?

No, as of February 2026, North Dakota has not enacted dedicated AI legislation. The state has no comprehensive bills regulating AI systems, automated decision-making, or algorithmic transparency. However, federal regulations from the FTC, EEOC, and industry-specific agencies still apply to all North Dakota businesses using AI.

What should my North Dakota business do right now to prepare for AI compliance?

Start by inventorying every AI tool your business uses and assessing each for risk level. Review vendor contracts for data protection terms, implement transparency measures for customer-facing AI, establish human oversight for high-stakes decisions, and test AI systems for bias and accuracy. Document all of these efforts to prove good-faith compliance.

Do I need an AI disclosure policy in North Dakota?

While North Dakota doesn't mandate AI disclosures, federal FTC guidelines require transparency when AI affects consumer decisions. Creating a voluntary AI disclosure policy protects against federal enforcement, builds customer trust, and positions your business ahead of likely future state regulations. Many North Dakota businesses are adopting best practices from leading states proactively.

What penalties can North Dakota businesses face for AI non-compliance?

Federal FTC civil penalties can exceed $50,000 per violation (the amount is adjusted annually for inflation) for deceptive AI practices. Early EEOC discrimination cases involving AI hiring tools have produced six-figure settlements, with class-action exposure running far higher. Private lawsuits, including class actions over biased AI decisions, are also a growing risk, and defense costs alone can devastate a small business.

Need an AI disclosure policy for your North Dakota business?

Answer 6 questions about your business and generate your free compliance documents in under 2 minutes. No signup required.

Generate Your Free AI Policy →