AI Compliance Requirements in New York: What Small Businesses Need to Know in 2026
New York has specific AI legislation affecting businesses. Here's what small business owners need to know to stay compliant.
If you're a small business owner in New York using AI tools—whether that's ChatGPT for customer service, AI-powered recruiting software, or marketing automation with machine learning—you need to understand your compliance obligations. New York has become one of the most proactive states in regulating artificial intelligence, particularly when it comes to employment decisions.
While AI regulation might sound intimidating, the reality is more manageable than you might think. If you're wondering whether your business even needs an AI disclosure policy, the answer for most New York employers using AI in hiring is a definitive yes. This guide breaks down what New York businesses actually need to do to stay compliant with current AI laws, which tools trigger these requirements, and how to implement practical compliance measures without hiring a legal team.
The Current State of AI Regulation in New York
New York has established itself as a leader in AI regulation, primarily through NYC Local Law 144, which took effect on January 1, 2023, with enforcement beginning July 5, 2023. This law specifically targets automated employment decision tools (AEDTs)—AI systems used to screen candidates or make employment-related decisions.
Local Law 144 was the first law of its kind in the United States to require bias audits for AI hiring tools, setting a precedent that other jurisdictions have since followed. The law applies throughout New York City's five boroughs and is enforced by the NYC Department of Consumer and Worker Protection (DCWP).
Beyond Local Law 144, New York State has been actively considering broader AI legislation. As of February 2026, several state-level proposals are working through the legislature, including bills that would:
- Expand bias audit requirements beyond employment to housing, credit, and insurance decisions
- Require impact assessments for high-risk AI systems
- Establish transparency requirements for AI use in government services
- Create consumer rights around automated decision-making
While these state-level proposals aren't yet law, they signal where regulation is heading. For small business owners, this means that compliance requirements may expand in the coming years, making it smart to establish good AI governance practices now.
Who Should Care: Does This Apply to Your Business?
The most immediate compliance obligation—Local Law 144—applies specifically to employers and employment agencies operating in New York City who use AI tools in their hiring or promotion processes.
You need to comply with Local Law 144 if:
- Your business is located in New York City OR you're hiring employees who will work in NYC
- You use any automated tool that relies on machine learning, statistical modeling, data analytics, or AI to screen candidates or employees
- This tool is used to "substantially assist or replace discretionary decision-making" in hiring or promotion
The law applies regardless of your company size. Whether you're a three-person startup or a 50-employee company, if you meet these criteria, compliance is mandatory.
You should pay attention even if you're outside NYC if:
- You operate elsewhere in New York State (state-level AI laws may soon apply)
- You do business with NYC-based companies (they may require vendors to demonstrate AI compliance)
- You use AI tools for other business decisions (future regulations may expand to other domains)
Importantly, you don't get a pass just because a third-party vendor provides your AI tool. The law holds employers responsible for the tools they use, even if they didn't develop them internally.
Specific Requirements Under NYC Local Law 144
If Local Law 144 applies to your business, you have three core obligations: conducting bias audits, providing notices, and maintaining specific documentation.
Bias Audit Requirement
Before using an AEDT, you must have it audited by an independent auditor to test for bias based on race, ethnicity, and sex. This audit must:
- Be conducted within one year before the tool's use
- Calculate selection rates and impact ratios for different demographic groups
- Follow specific statistical methodologies outlined in the law
- Be performed by a genuinely independent auditor (not your AI vendor, unless they use a separate independent entity)
The audit results must be publicly posted on your website, including the distribution date of the AEDT, the selection rates for each category, and the impact ratios. If your company doesn't have a website, you must provide this information to candidates or employees upon request.
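The selection-rate and impact-ratio arithmetic at the heart of a bias audit can be sketched in a few lines of Python. This is a simplified illustration, not the full DCWP methodology (which also covers intersectional categories and score-based tools); the function name and the sample data are hypothetical.

```python
from collections import Counter

def bias_audit_metrics(records):
    """Compute selection rates and impact ratios per category.

    records: list of (category, selected) tuples, where `selected`
    is True if the candidate was advanced by the AEDT.
    """
    totals = Counter(cat for cat, _ in records)
    selected = Counter(cat for cat, sel in records if sel)
    # Selection rate: fraction of each category's candidates who were selected.
    rates = {cat: selected[cat] / totals[cat] for cat in totals}
    # Impact ratio: each category's rate divided by the highest rate.
    top = max(rates.values())
    ratios = {cat: rate / top for cat, rate in rates.items()}
    return rates, ratios

# Synthetic example: 100 candidates per category, different selection rates.
records = ([("A", True)] * 40 + [("A", False)] * 60 +
           [("B", True)] * 20 + [("B", False)] * 80)
rates, ratios = bias_audit_metrics(records)
print(rates)   # category A: 0.4, category B: 0.2
print(ratios)  # category A: 1.0, category B: 0.5
```

An impact ratio well below 1.0 for a category (0.5 for category B above) is the kind of disparity an auditor would flag; auditors follow the "four-fifths" framing familiar from EEOC adverse-impact analysis when interpreting these numbers.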
Notice and Disclosure Requirements
You must notify candidates and employees in New York City at least 10 business days before using an AEDT. The notice must state:
- That an AEDT will be used in the hiring or promotion process
- The job qualifications and characteristics the tool will assess
- The types of data collected for the AEDT, the source of that data, and the employer's data retention policy
This notice must be in clear, plain language—no hiding behind legal jargon. For candidates, you can provide notice in the job posting itself. For current employees being considered for promotion, you need a separate notification process.
Alternative Selection Process
Perhaps surprisingly, the law requires your notice to include instructions for requesting an alternative selection process or a reasonable accommodation. DCWP guidance clarifies that you aren't strictly obligated to grant every alternative-process request, but you do need a clear mechanism for individuals to make one—and offering human review is a practical way to reduce risk.
Record-Keeping
You must maintain records demonstrating compliance, including:
- Copies of bias audit reports
- Evidence of notices provided to candidates and employees
- Distribution dates and historical versions of the AEDT
These records must be kept for at least three years and made available to the DCWP upon request.
Common AI Tools That Trigger Compliance
Understanding which tools actually fall under the AEDT definition is crucial. Many small businesses use AI features without realizing they're subject to regulation.
Clearly Covered Tools
AI-powered recruiting platforms like HireVue, Pymetrics, or Modern Hire that use algorithms to score or rank candidates based on video interviews, assessments, or resume parsing definitely qualify as AEDTs.
Resume screening software that automatically filters applications using machine learning to identify qualified candidates triggers compliance requirements.
Chatbots that conduct initial screening by asking qualifying questions and automatically advancing or rejecting candidates based on responses are covered.
Gray Area Tools
LinkedIn Recruiter and similar platforms that use AI to match jobs with candidates may or may not trigger requirements, depending on exactly how you use them. If you're just using search filters you manually set, probably not. If the platform's algorithm is automatically scoring or ranking candidates, potentially yes.
Applicant tracking systems (ATS) with AI features require careful evaluation. Basic ATS functions like organizing applications don't trigger compliance, but "smart ranking" or "best match" features that use AI likely do.
Assessment tools depend on their sophistication. Simple skills tests with predetermined right answers aren't AEDTs, but tools using AI to interpret and score responses generally are.
Definitely NOT Covered
General-purpose AI tools like ChatGPT that you use to draft job descriptions or get management advice aren't AEDTs unless you're directly feeding candidate information into them and using the output to make selection decisions.
Marketing automation tools with AI features (like email campaign optimization) don't fall under Local Law 144, which specifically targets employment decisions.
Customer relationship management (CRM) AI features that predict customer behavior or suggest next actions aren't employment-related and therefore aren't covered.
The key question to ask: "Is this tool using AI to evaluate, score, rank, or make recommendations about candidates or employees in a way that substantially influences hiring or promotion decisions?" If yes, compliance applies.
Ready to get compliant? Generate your New York AI compliance documents in under 2 minutes.
Generate Free AI Policy →

Step-by-Step Compliance Checklist for New York Businesses
Getting compliant doesn't require a massive legal undertaking. Here's a practical roadmap:
Step 1: Inventory Your AI Tools (Week 1)
Document every tool in your hiring process that might use AI, machine learning, or automation. Contact your vendors directly and ask: "Does this product use AI to evaluate, score, or rank candidates? Would it be considered an AEDT under NYC Local Law 144?"
Many vendors have published compliance statements—check their websites or request documentation.
Step 2: Determine Which Tools Are AEDTs (Week 1-2)
For each tool, evaluate whether it substantially assists or replaces human decision-making. If you're uncertain, err on the side of caution and treat it as an AEDT—over-compliance is safer than under-compliance.
Step 3: Obtain Bias Audits (Weeks 2-8)
For any AEDT, you need a qualified independent auditor to conduct a bias audit. Options include:
- Request audit documentation from your vendor (many large AI hiring platforms now provide audits to clients)
- Hire an independent auditor yourself (firms specializing in AI bias testing have emerged to serve this market)
- If your vendor won't provide an audit, consider whether you should switch to a compliant alternative
Audits typically cost between $5,000 and $25,000 for small businesses, though costs vary widely based on tool complexity.
Step 4: Publish Audit Results (Week 8-9)
Create a dedicated page on your website posting the bias audit results. The page should include:
- Date the AEDT was distributed (put into use)
- Selection rates by race/ethnicity and sex
- Impact ratios
- The date the audit was conducted
Make this page easily discoverable—many companies create a URL like "yourcompany.com/aedt-disclosure" and link to it from job postings.
Step 5: Update Your Hiring Process (Week 9-10)
Revise your job postings and internal procedures to include required notices. Create template language like:
"We use an automated employment decision tool (AEDT) to assist in our hiring process. This tool evaluates [specific qualifications, such as technical skills and experience]. You can request an alternative selection process or reasonable accommodation by contacting [email]. For information about data collection and retention, see [link]. To view our AEDT bias audit, visit [link]."
Step 6: Establish an Alternative Process (Week 10)
Define what your alternative selection process looks like. This might mean:
- Offering manual resume review by a human recruiter
- Providing reasonable accommodations for assessments
- Having a clear point of contact for accommodation requests
Document this process and train your hiring team on it.
Step 7: Implement Record-Keeping (Week 11-12)
Set up a system to maintain compliance documentation for three years:
- Audit reports
- Dates notices were provided
- Copies of notice language used
- Versions of tools used
- Any accommodation requests and how they were handled
A simple shared drive folder with clear naming conventions often suffices for small businesses.
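One way to start that folder structure is with a short script; the layout below is a hypothetical naming convention (not mandated by the law), sketched with Python's standard library:

```python
from pathlib import Path

# Hypothetical folder names for Local Law 144 record-keeping;
# adapt the names to your own conventions.
RECORD_FOLDERS = [
    "bias-audits",             # audit reports, one subfolder per year
    "candidate-notices",       # dated copies of notice language provided
    "tool-versions",           # AEDT version history and distribution dates
    "accommodation-requests",  # requests received and how they were handled
]

def create_record_folders(root: str) -> None:
    """Create the compliance folder tree under `root` (safe to re-run)."""
    for name in RECORD_FOLDERS:
        Path(root, "aedt-compliance", name).mkdir(parents=True, exist_ok=True)

create_record_folders(".")
```

Whatever layout you choose, the point is consistency: three years from now, anyone on your team should be able to find the audit report and notice language that were in effect on a given date.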
Step 8: Train Your Team (Ongoing)
Make sure everyone involved in hiring understands:
- Which tools are AEDTs
- That notices must be provided before use
- How to respond to alternative process requests
- Where compliance documentation is stored
Penalties and Enforcement
The NYC Department of Consumer and Worker Protection enforces Local Law 144, and violations can be costly.
Civil penalties for non-compliance include:
- Up to $500 for a first violation (per AEDT)
- Up to $1,500 for subsequent violations (per AEDT)
While these might sound modest, penalties apply per violation, and each use of a non-compliant tool could potentially be counted separately. A small business conducting 100 screenings with a non-compliant AEDT could theoretically face nearly $150,000 in penalties ($500 for the first violation plus up to $1,500 for each of the 99 that follow).
Enforcement approach: The DCWP has taken a relatively measured approach, focusing first on education and compliance assistance rather than aggressive penalty assessment. However, they respond to complaints, meaning a dissatisfied candidate could trigger an investigation.
Beyond regulatory penalties, non-compliance creates litigation risk. Candidates who believe they were discriminated against by a non-compliant AI tool may have stronger grounds for employment discrimination claims. Even if you ultimately prevail, the legal costs of defending such claims can be substantial.
Reputational risk also matters. In an era when job seekers research employers extensively, public disclosure of non-compliance or bias in hiring tools can damage your employer brand and make it harder to attract talent.
The bottom line: compliance is significantly cheaper than the potential cost of penalties, litigation, and reputation damage.
How New York Compares to Other States
New York's AI regulation, particularly NYC Local Law 144, is among the most specific and stringent in the country, but the state isn't alone in regulating AI.
California has no bias-audit mandate comparable to Local Law 144, but the California Consumer Privacy Act (CCPA) creates transparency requirements around automated decision-making, and the state's civil rights regulator has adopted rules addressing automated-decision systems in employment.
Neighboring New Jersey and Connecticut are also developing their own AI compliance frameworks, making the entire Northeast region a hotspot for AI regulation.
Illinois regulates AI through its Biometric Information Privacy Act (BIPA), which restricts collection of biometric data (like facial scans or voice analysis) often used in AI hiring tools. Many AI hiring platforms have faced expensive BIPA litigation.
Colorado passed the Colorado AI Act (SB 24-205), which takes effect in 2026 and establishes broader requirements for "high-risk AI systems," including impact assessments and consumer rights. This goes beyond hiring to include housing, credit, and other consequential decisions.
Maryland enacted laws requiring employers to notify job applicants about the use of facial recognition technology in interviews.
At the federal level, there's currently no comprehensive AI employment law, though the Equal Employment Opportunity Commission (EEOC) has issued guidance stating that employers can be liable for discrimination if their AI tools produce biased outcomes, regardless of whether those tools comply with local laws like NYC's.
New York's approach is notable for its specificity and enforceability. Rather than general principles or guidance, Local Law 144 creates concrete, auditable requirements with clear enforcement mechanisms. This makes compliance clearer but also less flexible than principle-based approaches in some other jurisdictions.
For multi-state employers, this creates complexity—you may need to comply with NYC's bias audit requirements, California's data privacy rules, and Illinois's biometric restrictions all for the same hiring tool. If you're concerned about how much AI compliance will cost, the good news is that a well-designed compliance program can satisfy multiple state requirements simultaneously. The general trend is toward more regulation, not less, making robust AI governance increasingly important.
What to Do Right Now
If you're a New York small business using or considering AI tools, here are your immediate action items:
This week:
- Audit your current tools. List every platform involved in hiring or promotion and identify which ones use AI features.
- Contact your vendors. Email every platform provider asking about Local Law 144 compliance and requesting bias audit documentation if applicable.
- Review your job postings. Check whether you're currently providing required notices. If not, pause using any AEDT until you're compliant.
This month:
- Obtain necessary audits. For any AEDT without current audit documentation, either get one from your vendor or engage an independent auditor.
- Publish audit results. Create a webpage with audit information and link to it from your careers page.
- Update templates. Revise job posting templates, offer letters, and internal promotion processes to include required notices.
This quarter:
- Establish governance. Designate someone (even if it's you as the owner) as responsible for AI compliance. Create a simple checklist for evaluating new tools before adoption.
- Document everything. Set up your three-year record-keeping system and populate it with current compliance documentation.
- Train your team. Brief everyone involved in hiring on compliance requirements and procedures.
Ongoing:
- Stay informed. Monitor for new New York State legislation that may expand AI requirements beyond employment. Subscribe to updates from industry associations or legal resources focused on AI regulation.
- Review annually. Bias audits expire after one year. Set calendar reminders to ensure audits stay current.
- Evaluate new tools carefully. Before adopting any new HR tech, recruiting platform, or assessment tool, determine if it's an AEDT and what compliance steps are needed.
The key is not to let perfect be the enemy of good. You don't need a sophisticated AI governance program rivaling Fortune 500 companies. You need practical documentation, clear processes, and consistent execution.
Frequently Asked Questions
Does NYC Local Law 144 apply to small businesses?
Yes. The law applies regardless of company size—any NYC employer using an AEDT in hiring or promotion must comply, whether you have three employees or fifty.

How much does a bias audit cost for Local Law 144 compliance?
Typically between $5,000 and $25,000 for small businesses, though costs vary widely with tool complexity. Many large AI hiring platforms now provide audit documentation to clients at no extra charge.

What are the penalties for violating NYC Local Law 144?
Up to $500 for a first violation and up to $1,500 for each subsequent violation, assessed per AEDT—and each use of a non-compliant tool could potentially count separately.

Does Local Law 144 apply if I use a third-party AI hiring tool?
Yes. The law holds employers responsible for the tools they use, even if a vendor developed them.

Do I need to comply with Local Law 144 if my business is outside NYC but I hire NYC workers?
Yes. The law covers hiring for positions that will work in New York City, regardless of where your business is located.
Simplifying Compliance with the Right Tools
AI compliance doesn't have to be overwhelming. While New York's requirements are specific, they're also clear—and that clarity makes compliance achievable for businesses of any size.
The hardest part is often just getting started: understanding what applies to you, gathering the right information, and creating documentation that satisfies legal requirements without requiring a law degree to produce.
That's exactly why Attestly exists. We help New York small businesses generate customized AI compliance documents—from bias audit summaries and notice templates to data retention policies and employee notifications—in minutes rather than weeks. Our platform is built specifically for companies that need to comply with laws like NYC Local Law 144 but don't have in-house legal teams or unlimited budgets.
Whether you're just starting to use AI tools or realizing you need to get compliant with tools you've been using, having the right documentation is the foundation of defensible AI governance. Visit attestly.io to generate your customized New York AI compliance documents today.
Need an AI disclosure policy for your New York business?
Answer 6 questions about your business and generate your free compliance documents in under 2 minutes. No signup required.
Generate Your Free AI Policy →

Related Guides
AI Compliance in Vermont: What Small Businesses Should Do Now (Even Without a State Law)
Vermont doesn't have specific AI legislation yet, but compliance still matters. Here's what your business should do now.
AI Compliance in Pennsylvania: How Privacy Laws Affect Your Business's AI Use
Pennsylvania's privacy laws have implications for AI use. Learn how they affect your business and what steps to take.
How to Update Your Privacy Policy for AI: A Step-by-Step Guide
Your privacy policy probably needs an AI update. Here's exactly what to add and how to word it.
What Is an AI Disclosure Policy? Everything Your Business Needs to Know
Learn what an AI disclosure policy is, why your business needs one, and what it should include to stay compliant.