Attestly Team · South Dakota

AI Compliance in South Dakota: What Small Businesses Should Do Now (Even Without a State Law)

South Dakota doesn't have specific AI legislation yet, but compliance still matters. Here's what your business should do now.

AI Compliance Requirements for Small Businesses in South Dakota

If you're running a small business in South Dakota and using AI tools like ChatGPT, AI-powered marketing platforms, or smart CRM systems, you might be wondering: what are my legal obligations? The short answer is that South Dakota doesn't have specific AI legislation—yet. But that doesn't mean you're off the hook for compliance.

While South Dakota takes a hands-off regulatory approach compared to states like California or Colorado, federal regulations still apply to your business. North Dakota is in a similar position, while nearby Nebraska has enacted a Data Privacy Act with automated decision-making provisions. Plus, if you serve customers in other states, you may need to follow their AI laws too. And make no mistake: AI regulation is coming. Being proactive now can save you significant headaches later.

This guide breaks down everything South Dakota small business owners need to know about AI compliance in 2026.

Current State of AI Regulation in South Dakota

South Dakota has not enacted any AI-specific legislation as of February 2026. The state legislature has not introduced comprehensive bills targeting artificial intelligence transparency, algorithmic accountability, or automated decision-making systems.

This regulatory approach aligns with South Dakota's broader philosophy toward business regulation. The state consistently ranks as one of the most business-friendly in the nation, with minimal state-level privacy laws and a preference for letting federal standards set the baseline.

However, the absence of state law doesn't create a compliance-free zone. Several important factors still apply:

Federal regulations govern AI use across all states. The Federal Trade Commission (FTC) has been actively enforcing existing consumer protection laws against deceptive and unfair AI practices. These federal standards apply equally to South Dakota businesses.

Industry-specific rules matter. If you operate in healthcare, financial services, insurance, or other regulated industries, federal sector-specific laws impose AI compliance requirements regardless of where your business is located.

Interstate commerce considerations apply. If your South Dakota business serves customers in states with AI laws—like California, Colorado, or Connecticut—you may need to comply with those states' requirements for those customers.

Common law liability exists. Even without specific AI statutes, businesses can face lawsuits for negligence, discrimination, breach of contract, or other harms caused by AI systems under traditional legal theories.

The regulatory landscape is also shifting rapidly. Over 30 states introduced AI legislation in 2025, and many legal experts expect South Dakota will eventually follow suit, particularly if neighboring states like Minnesota or Iowa enact comprehensive AI laws.

Who Should Care About AI Compliance in South Dakota

You might think AI compliance only matters for tech companies or large enterprises. That's not the case. If your South Dakota business uses any of these tools or practices, compliance matters to you:

Businesses using AI for customer interactions: If you use chatbots on your website, AI-powered customer service tools, or automated email marketing systems, you're using AI that affects consumers.

Companies making automated decisions: Using AI to screen job applications, approve loans or credit, set prices dynamically, or make tenant screening decisions puts you in a higher-risk category for compliance.

Healthcare and medical practices: Any South Dakota healthcare provider using AI for diagnosis support, treatment recommendations, patient communications, or medical billing faces HIPAA requirements that extend to AI systems.

Financial services and insurance: Banks, credit unions, insurance agencies, and financial advisors using AI tools must comply with federal financial regulations including fair lending laws and the Fair Credit Reporting Act.

Retailers and e-commerce businesses: Online stores using AI for product recommendations, dynamic pricing, inventory management, or fraud detection should have compliance measures in place.

Professional services firms: Accounting firms, law offices, real estate agencies, and consulting businesses increasingly use AI for research, document drafting, client analysis, and administrative tasks—all of which create compliance considerations.

Employers of any size: If you use AI tools to recruit, screen, hire, evaluate, or manage employees, you face potential liability under federal employment discrimination laws.

The bottom line: if you use any technology that makes decisions, predictions, or recommendations without direct human involvement in every step, you're using AI in a way that creates compliance obligations.

Federal Requirements That Apply to South Dakota Businesses

Since South Dakota lacks state-specific AI laws, federal regulations become your primary compliance framework. Here are the key federal requirements:

FTC Act Section 5

The Federal Trade Commission enforces broad prohibitions against "unfair or deceptive acts or practices." The FTC has made clear this applies to AI systems. Your business cannot:

  • Make false claims about AI capabilities or how AI systems work
  • Use AI in ways that cause unjustified consumer harm
  • Deploy AI systems with inadequate security that puts consumer data at risk
  • Fail to provide reasonable transparency about AI use in decision-making

Equal Credit Opportunity Act (ECOA)

If your business extends credit—including financing options, business loans, or payment plans—AI systems used in credit decisions must comply with ECOA. This means:

  • No discrimination based on race, color, religion, national origin, sex, marital status, or age
  • Adverse action notices must be provided when AI denies or modifies credit terms
  • You must be able to provide specific reasons for adverse decisions

Fair Credit Reporting Act (FCRA)

Businesses using AI to make employment, credit, insurance, or housing decisions based on consumer reports must:

  • Obtain proper authorization before pulling reports
  • Provide pre-adverse action notices before making negative decisions
  • Give consumers copies of reports used in decisions
  • Determine whether AI vendors qualify as consumer reporting agencies, which triggers additional FCRA obligations

Americans with Disabilities Act (ADA)

AI tools used in employment or customer service must be accessible. This includes:

  • Ensuring AI chatbots work with screen readers
  • Providing alternative methods for people who can't use AI interfaces
  • Not using AI in ways that systematically disadvantage people with disabilities

Health Insurance Portability and Accountability Act (HIPAA)

Healthcare providers and their business associates in South Dakota must ensure:

  • AI tools processing protected health information have proper business associate agreements
  • Patient data used to train or operate AI systems remains secure and private
  • AI-generated health information maintains the same confidentiality as human-generated records

Industry-Specific Regulations

Depending on your sector, additional federal rules may apply:

  • Financial institutions: Gramm-Leach-Bliley Act requirements for data security
  • Children's businesses: COPPA compliance if AI systems collect data from users under 13
  • Telecommunications: FCC robocall and robotext restrictions for AI-powered communications
  • Advertising: CAN-SPAM Act requirements for AI-generated marketing emails

Common AI Tools That Trigger Compliance Requirements

Understanding which tools create compliance obligations helps you prioritize your efforts. Here are the most common AI applications South Dakota small businesses use:

Generative AI Platforms (ChatGPT, Claude, Gemini)

When you use tools like ChatGPT for customer communications, content creation, or business operations, compliance issues include:

  • Ensuring accuracy of AI-generated information provided to customers
  • Not inputting confidential customer or employee data into public AI systems
  • Understanding that you remain legally responsible for AI-generated content
  • Disclosing AI use when it materially affects customer decisions

AI-Powered CRM Systems (Salesforce Einstein, HubSpot AI)

These tools often include AI features for:

  • Lead scoring and prioritization
  • Predictive analytics about customer behavior
  • Automated email campaigns
  • Sales forecasting

Compliance considerations include ensuring AI scoring doesn't create illegal discrimination and maintaining transparency about automated decision-making in customer relationships.

Marketing and Advertising AI (Jasper, Persado, Seventh Sense)

AI tools that generate ad copy, optimize send times, or personalize content must:

  • Comply with truth-in-advertising standards (you're liable for false AI-generated claims)
  • Follow FTC endorsement and disclosure guidelines
  • Respect do-not-call and do-not-email preferences even in AI-automated campaigns

Hiring and HR AI Tools (HireVue, Pymetrics, Eightfold)

AI used in employment decisions creates significant compliance risk:

  • Must not create disparate impact on protected groups
  • Should be regularly audited for bias
  • May require disclosure to applicants or employees
  • Must comply with EEOC guidelines on employment testing

Chatbots and Virtual Assistants

AI chatbots on your website or customer service channels should:

  • Clearly identify themselves as bots (not pretend to be human when they're not)
  • Escalate to humans for complex or sensitive issues
  • Protect any personal information collected during conversations
  • Provide accurate information (you're liable for the bot's statements)

Pricing and Revenue Optimization Tools

Dynamic pricing AI must avoid:

  • Price discrimination based on protected characteristics
  • Deceptive pricing practices
  • Collusion with competitors (even unintentional algorithmic collusion)

Security and Fraud Detection Systems

AI monitoring for fraud or security threats should:

  • Maintain appropriate accuracy and avoid excessive false positives
  • Include human review for high-stakes decisions
  • Comply with consumer notification requirements when fraud is detected

Step-by-Step Compliance Checklist for South Dakota Businesses

Here's a practical compliance roadmap for small businesses in South Dakota using AI tools:

Step 1: Create an AI Inventory

Document every AI tool and system your business uses:

  • What AI applications are you using? (List specific tools and platforms)
  • What decisions or actions does each AI system make?
  • What data does each AI system access or process?
  • Who are the vendors, and where is data stored or processed?
  • Which systems affect customers, employees, or other stakeholders?

This inventory forms the foundation of your compliance program.
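One way to keep this inventory practical is to treat each AI system as a structured record. The sketch below is illustrative only; the tool names, vendors, and fields are assumptions, not a required format, and the same structure works equally well as a spreadsheet:

```python
# A minimal sketch of the Step 1 AI inventory as structured records.
# All tool and vendor names here are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class AISystem:
    name: str                 # specific tool or platform
    purpose: str              # what decisions or actions it makes
    data_accessed: list[str]  # categories of data it processes
    vendor: str               # who provides it, where data is stored
    affects: list[str] = field(default_factory=list)  # customers, employees, etc.

inventory = [
    AISystem(
        name="Website chatbot",
        purpose="Answers customer questions, collects contact info",
        data_accessed=["names", "emails", "chat transcripts"],
        vendor="Example Chat Co. (US-hosted)",
        affects=["customers"],
    ),
    AISystem(
        name="Resume screener",
        purpose="Ranks job applicants before human review",
        data_accessed=["resumes", "application answers"],
        vendor="Example HR AI Inc.",
        affects=["job applicants"],
    ),
]

# Systems that touch people outside the business deserve first attention.
external = [s.name for s in inventory if s.affects]
print(external)  # ['Website chatbot', 'Resume screener']
```

Whatever format you choose, the point is that every system has the same five answers recorded in one place.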

Step 2: Assess Your Risk Level

Evaluate each AI system for compliance risk:

  • High risk: AI making consequential decisions about people (hiring, credit, medical, housing)
  • Medium risk: AI directly interacting with customers or generating public-facing content
  • Lower risk: Internal AI tools for productivity or analysis that don't affect external parties

Focus your compliance efforts on high-risk systems first.
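The tiering above can be applied mechanically to your inventory. This is a rough triage sketch, not a legal standard; the list of consequential decision areas is an illustrative assumption:

```python
# A simple triage rule mirroring the risk tiers in Step 2.
# The CONSEQUENTIAL categories are illustrative, not legal definitions.
CONSEQUENTIAL = {"hiring", "credit", "medical", "housing", "insurance"}

def risk_tier(decision_area: str, customer_facing: bool) -> str:
    """Rough triage: consequential decisions about people rank highest,
    then anything customer-facing, then internal-only tools."""
    if decision_area.lower() in CONSEQUENTIAL:
        return "high"
    if customer_facing:
        return "medium"
    return "lower"

print(risk_tier("hiring", customer_facing=False))    # high
print(risk_tier("marketing", customer_facing=True))  # medium
print(risk_tier("internal analytics", False))        # lower
```

A resume screener lands in the high tier even though applicants never see it directly; what matters is the consequence of the decision, not the visibility of the tool.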

Step 3: Review Vendor Contracts and Privacy Policies

For each AI tool, verify:

  • Do vendor contracts include data protection and indemnification clauses?
  • Are privacy policies updated to disclose AI use?
  • Do you have business associate agreements for HIPAA-covered AI?
  • Who owns data processed by AI systems?
  • Can you audit vendor AI systems for bias or accuracy?

Step 4: Implement Transparency Measures

Even without legal mandates, transparency builds trust:

  • Disclose AI use in contexts where customers would reasonably want to know
  • Update your privacy policy to explain how AI systems collect and use data
  • Consider adding an AI use disclosure page to your website
  • Train staff to explain AI-assisted decisions when customers ask

Ready to get compliant? Generate your South Dakota AI compliance documents in under 2 minutes.

Generate Free AI Policy →

Step 5: Establish Human Oversight

High-stakes AI decisions should include human review:

  • Implement "human in the loop" processes for employment, credit, and similar decisions
  • Train staff to recognize when AI recommendations may be biased or incorrect
  • Create escalation procedures when AI systems produce questionable results
  • Document human review of AI decisions

Step 6: Test for Bias and Accuracy

Regularly evaluate AI systems:

  • Test whether AI outcomes differ across demographic groups
  • Verify accuracy of AI-generated information before relying on it
  • Monitor for algorithmic drift (AI performance degrading over time)
  • Document testing results and corrective actions taken
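One widely used screening heuristic for the first bullet is the EEOC's "four-fifths rule": if any group's selection rate falls below 80% of the highest group's rate, the tool warrants closer review. The sketch below uses made-up numbers to show the calculation; it is a first-pass check, not a substitute for a proper bias audit:

```python
# Sketch of a four-fifths-rule check on an AI screening tool.
# Selection counts are illustrative, not real data.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants who passed the AI screen."""
    return selected / applicants

def four_fifths_check(rates: dict[str, float]) -> dict[str, bool]:
    """Flag groups whose selection rate is below 80% of the highest
    group's rate -- a common indicator of potential disparate impact."""
    benchmark = max(rates.values())
    return {group: rate / benchmark >= 0.8 for group, rate in rates.items()}

rates = {
    "group_a": selection_rate(60, 100),  # 0.60
    "group_b": selection_rate(42, 100),  # 0.42
}
# group_b's rate (0.42) is 70% of group_a's (0.60), below the
# four-fifths threshold, so the tool deserves a closer look.
print(four_fifths_check(rates))  # {'group_a': True, 'group_b': False}
```

Running a check like this quarterly, and keeping the results, is exactly the kind of documentation that demonstrates good faith if a question ever arises.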

Step 7: Secure AI Systems and Data

Implement security measures appropriate to the sensitivity of data:

  • Limit employee access to AI systems containing sensitive data
  • Use encryption for data processed by AI tools
  • Vet AI vendors' security practices
  • Have incident response plans for AI-related data breaches

Step 8: Train Your Team

Employees using AI tools should understand:

  • Which AI systems the business uses and for what purposes
  • Basic compliance requirements relevant to their roles
  • When to escalate AI-related questions or concerns
  • How to identify potential bias or errors in AI outputs

Step 9: Document Your Compliance Program

Create written policies covering:

  • Acceptable use policies for AI tools
  • Data protection standards for AI systems
  • Procedures for human review of AI decisions
  • Vendor management requirements for AI providers

Documentation demonstrates good faith compliance efforts if questions arise.

Step 10: Monitor Regulatory Changes

AI regulation changes rapidly:

  • Subscribe to updates from the FTC and relevant industry regulators
  • Monitor whether South Dakota introduces AI legislation
  • Track AI laws in states where you have customers
  • Review and update your compliance program at least annually

Penalties and Enforcement

While South Dakota lacks state-level AI penalties, federal enforcement is real and growing.

FTC Enforcement

The FTC has levied significant penalties for AI-related violations:

  • Companies have paid millions in settlements for algorithmic discrimination
  • The FTC has required businesses to destroy AI models built on improperly collected data
  • Penalties can include both monetary fines and operational restrictions

Recent FTC cases have targeted companies for deceptive AI claims, biased algorithms, and inadequate data security in AI systems.

Employment Discrimination Claims

The Equal Employment Opportunity Commission (EEOC) has made AI in hiring a priority:

  • Businesses can face discrimination lawsuits if AI hiring tools disadvantage protected groups
  • EEOC investigations can be costly even if they don't result in penalties
  • Remedies may include back pay, policy changes, and monitoring requirements

Financial Services Penalties

Banks, lenders, and other financial institutions face enforcement from multiple agencies:

  • Office of the Comptroller of the Currency (OCC)
  • Consumer Financial Protection Bureau (CFPB)
  • Federal Reserve
  • State banking regulators

Penalties for discriminatory AI in lending can reach millions of dollars.

Private Lawsuits

Even without specific AI statutes, businesses face litigation risk:

  • Class action lawsuits for biased AI decisions
  • Individual lawsuits for negligence or breach of contract
  • Shareholder derivative suits if AI problems harm company value

Reputational Harm

Beyond formal penalties, AI compliance failures create:

  • Negative media coverage
  • Customer loss and boycotts
  • Difficulty recruiting talent
  • Damaged relationships with business partners

For small businesses, reputational damage can be more devastating than regulatory fines.

How South Dakota Compares to Other States

Understanding the regulatory landscape in other states helps you anticipate where South Dakota might head and prepare for multi-state compliance if you expand.

States with Comprehensive AI Laws

Colorado enacted the most comprehensive AI law in 2024, requiring:

  • Algorithmic discrimination impact assessments
  • Transparency about automated decision-making
  • Consumer rights to opt out of certain AI profiling
  • Developer disclosure obligations

California has multiple AI-related laws covering:

  • Automated decision-making technology disclosures
  • Bot identification requirements
  • AI-related employment laws
  • Sector-specific AI regulations

Connecticut, Illinois, and New York have significant AI employment laws requiring audits, disclosures, and bias testing for AI hiring tools.

Vermont and Utah have enacted narrower AI laws focusing on specific sectors or applications.

The South Dakota Difference

South Dakota's approach differs in several ways:

No state-level requirements: Businesses face no South Dakota-specific AI disclosure, testing, or documentation mandates.

Lower compliance costs: Without state requirements, South Dakota businesses avoid the expense of state-specific compliance programs (at least for now).

Federal baseline applies: South Dakota businesses still must meet the same federal standards as businesses in states with AI laws.

Competitive considerations: Some businesses promote robust AI governance as a competitive advantage even without legal mandates.

Future uncertainty: The lack of legislation could change quickly if neighboring states enact laws or federal legislation passes.

Regional Context

Looking at the region:

  • Minnesota has considered AI legislation focusing on automated decision systems
  • Iowa has discussed AI governance frameworks but not passed legislation
  • Nebraska similarly has no AI-specific laws
  • Wyoming takes a minimalist regulatory approach like South Dakota

South Dakota's approach aligns with regional patterns, but the regulatory tide is shifting nationally.

What South Dakota Businesses Should Do Right Now

Even without state mandates, proactive AI compliance makes business sense. Here's what to do today:

Immediate Actions (This Week)

Our complete AI compliance guide for small businesses covers the fundamentals every business owner should know.

Conduct an AI audit: Spend a few hours identifying every AI tool your business uses. Include obvious tools like ChatGPT and less obvious ones like AI features in your email platform or accounting software.

Review customer-facing AI: Check anywhere AI interacts with customers—chatbots, automated emails, recommendation engines. Ensure these systems work properly and don't make false claims.

Update your privacy policy: If you use AI tools that process customer data, your privacy policy should mention this. Most businesses can add a simple paragraph about AI use.

Short-Term Actions (This Month)

Assess vendor compliance: Contact vendors of AI tools you use. Ask about their data security, bias testing, and compliance measures. Document their responses.

Implement disclosure practices: Decide where meaningful AI disclosure makes sense for your business. This might be in your terms of service, privacy policy, or customer communications.

Train key staff: Ensure employees using AI tools understand basic compliance principles. This doesn't require extensive training—a one-hour session covering the basics is a good start.

Review employment practices: If you use AI in hiring, promotion, or performance evaluation, verify you can demonstrate these tools don't discriminate.

Medium-Term Actions (Next Quarter)

Develop written AI policies: Create simple, practical policies governing AI use in your business. These don't need to be complex—clear one-page guidelines often work better than lengthy documents.

Establish oversight processes: Implement human review for high-stakes AI decisions. Document these reviews.

Test AI accuracy: Periodically verify that AI tools are producing accurate, unbiased results. This can be as simple as spot-checking AI outputs against human judgment.

Plan for scaling: If you expect to grow, serve customers in other states, or expand AI use, develop a compliance roadmap that can scale with your business.

Ongoing Actions

Monitor regulatory developments: Set up Google Alerts or subscribe to newsletters covering AI regulation. Spend 15 minutes monthly reviewing updates.

Review vendor relationships annually: At least once per year, reassess your AI vendors' compliance practices and contract terms.

Update your compliance program: As your AI use evolves, update your policies, training, and oversight processes accordingly.

Document everything: Keep records of your compliance efforts, including policies, training, testing results, and vendor communications. This documentation protects you if questions arise.

Building a Sustainable Approach

The key to AI compliance for small businesses is building sustainable practices into your operations rather than treating compliance as a one-time project.

Start with high-risk AI applications and expand your compliance efforts as resources allow. Even basic measures like transparency, human oversight of important decisions, and vendor due diligence provide meaningful protection.

Don't let perfect be the enemy of good. A simple, implemented compliance program beats an elaborate plan that never gets executed.

Getting Help with AI Compliance Documentation

Creating compliance documentation doesn't have to be overwhelming or expensive. While South Dakota may not require specific AI compliance documents today, having policies, disclosures, and procedures in place protects your business from federal liability and prepares you for future regulations.

Attestly helps small businesses generate customized AI compliance documents tailored to their specific situation—including their location, industry, and the AI tools they use. In just a few minutes, you can create professional AI use policies, privacy policy updates, vendor assessment templates, and other compliance documents written in plain English for your business.

Whether South Dakota enacts AI legislation or not, having clear documentation of your AI governance practices demonstrates your commitment to responsible AI use—to customers, employees, regulators, and business partners.

The AI regulatory landscape will continue evolving rapidly. The businesses that thrive will be those that view compliance not as a burden but as a foundation for building trust and using AI responsibly. Start building that foundation today.

Frequently Asked Questions

Does South Dakota have specific AI laws for small businesses?

No, South Dakota has not enacted any AI-specific legislation as of February 2026. The state consistently ranks as one of the most business-friendly in the nation with minimal state-level privacy laws. However, federal regulations from the FTC, EEOC, and industry-specific agencies still apply to all South Dakota businesses using AI.

What should my South Dakota business do right now about AI compliance?

Start by conducting an AI audit to identify every tool your business uses, then review customer-facing AI for accuracy and disclosure needs. Update your privacy policy to mention AI use, assess vendor compliance, implement disclosure practices, and train key staff on basic compliance principles. Focus your efforts on high-risk AI applications first.

Can South Dakota businesses be affected by other states' AI laws?

Yes. If your South Dakota business serves customers in states with AI laws like California, Colorado, or Connecticut, you may need to comply with those states' requirements for those customers. Interstate commerce considerations apply, and many AI vendors build products to meet the strictest state standards, meaning your tools may already incorporate compliance features.

What penalties can South Dakota businesses face for AI non-compliance?

While South Dakota lacks state-level AI penalties, federal enforcement is real. The FTC has levied multi-million dollar settlements for AI-related violations. EEOC complaints about AI hiring discrimination can result in back pay, damages, and mandatory system changes. Private lawsuits including class actions for biased algorithmic decisions are also a growing risk.

Need an AI disclosure policy for your South Dakota business?

Answer 6 questions about your business and generate your free compliance documents in under 2 minutes. No signup required.

Generate Your Free AI Policy →