Navigating California's New AI Employment Rules: A Practical Compliance Roadmap for California Businesses

Nathan Klein, Partner

November 13, 2025

California's AI Hiring Rules Went Live October 1: Is Your Business Compliant?

If your business uses technology to screen resumes, assess candidates, or make employment decisions, you're now operating under new legal requirements. On October 1, 2025, the California Civil Rights Council's comprehensive regulations on artificial intelligence in employment took effect, fundamentally changing how California employers can leverage AI-powered hiring tools.

These regulations aren't minor administrative updates—they represent one of the most comprehensive state-level efforts to regulate the potential discriminatory effects of AI's application in the workplace.

Research from the California Civil Rights Department shows that automated decision systems are "increasingly used in employment settings to facilitate a wide range of decisions related to job applicants or employees, including with respect to recruitment, hiring, and promotion." However, as the Civil Rights Council noted when finalizing these rules, while "these tools can bring myriad benefits, they can also exacerbate existing biases and contribute to discriminatory outcomes."

The challenge for California employers is clear: most businesses have adopted some form of AI in their hiring processes, yet many remain unaware of their new compliance obligations. The regulations apply to all employers in California with five or more employees who use artificial intelligence, machine-learning, algorithms, statistics, or other data processing techniques in employment decision-making.

Understanding California's New AI Regulatory Framework

The California Civil Rights Council finalized these regulations in June 2025, clarifying how the Fair Employment and Housing Act (FEHA) applies to artificial intelligence and automated decision systems. Rather than creating entirely new anti-discrimination laws, the regulations amend the existing FEHA framework to explicitly address AI tools as potential sources of discrimination.

As the Civil Rights Department explained in its official announcement: "These rules help address forms of discrimination through the use of AI, and preserve protections that have long been codified in our laws as new technologies pose novel challenges."

The regulations aim to accomplish several critical objectives:

  • Make clear that using an automated decision system may violate California law if it harms applicants or employees based on protected characteristics such as gender, race, age, disability, national origin, or religion
  • Ensure employers maintain employment records, including automated decision data, for a minimum of four years
  • Affirm that certain AI assessments, including tests or puzzle games that elicit information about disabilities, may constitute unlawful medical inquiries
  • Define key terms such as "automated-decision system," "artificial intelligence," and "proxy" within the employment context

What Exactly Qualifies as an "Automated Decision System"?

One of the most significant aspects of these regulations is their expansive definition of an automated decision system. The regulations define an ADS as "a computational process that makes a decision or facilitates human decision making regarding an employment benefit." An ADS may be derived from or use artificial intelligence, machine learning, algorithms, statistics, or other data processing techniques.

Critically, the rules apply even if the tool doesn't make the final decision but merely influences it. This means many more employers are covered than initially anticipated.

Examples of covered technologies include:

Resume and Application Screening
  • Software that automatically ranks, filters, or scores applicants
  • Systems that parse resumes and identify keywords or qualifications
  • Tools that recommend candidates for interview based on application data
Interview and Assessment Tools
  • Video interview analytics that assess facial expressions, tone, or word choice
  • AI-driven chatbots or virtual recruiters that conduct initial screenings
  • Personality or cognitive assessments using algorithms
  • Voice analysis systems claiming to measure engagement or competence
  • Puzzle or game-based assessments that score candidates
Recruitment and Advertising
  • Platforms that direct job advertisements to specific demographics
  • Systems that identify and target potential candidates
  • Tools that predict which candidates are most likely to accept offers
Internal Employment Decisions
  • Productivity or behavioral scoring systems
  • Algorithms that recommend employees for promotion
  • Systems that determine compensation increases
  • Scheduling software that uses predictive algorithms

Don't assume these rules apply only to sophisticated AI platforms. Under the regulations, even relatively simple computational processes that facilitate human decision-making fall within the definition.

Your Critical Compliance Obligations

Now that the regulations are in effect, every California employer using AI tools in employment must address several specific requirements:

1. Prohibition on Discriminatory Outcomes

Employers may not use an automated decision system that discriminates against applicants or employees based on any category protected by FEHA. This includes both direct discrimination (explicitly considering protected characteristics) and disparate impact (neutral policies that disproportionately affect protected groups).

2. Extended Recordkeeping Requirements

The regulations require employers to "maintain employment records, including automated decision data, for a minimum of four years." This doubles the prior two-year retention standard under FEHA and includes:

  • Descriptions of each ADS and its intended purpose
  • Documentation of bias testing methodology and results
  • Data inputs and outputs from the ADS
  • Vendor contracts and technical specifications
  • Any complaints or concerns raised about the system
  • Records of accommodations provided related to ADS use

If you've been using AI tools without maintaining these records, you're already out of compliance. The four-year retention period generally runs from the date a record is created or the personnel action occurs, whichever is later.

3. Expanded Definition of Employer "Agent"

For the first time, the regulations formally define an employer's "agent" as anyone acting on behalf of an employer—directly or indirectly—"to exercise a function traditionally exercised by the employer." The regulations specifically identify applicant recruitment, screening, hiring, promotion, or decisions regarding pay, benefits, or leave as examples of such functions.

4. Accommodation Requirements for ADS Tools

Employers must provide reasonable accommodations when automated decision systems create barriers for individuals with disabilities or conflict with religious practices. Examples include:

  • Modifying video interview requirements for candidates whose disabilities affect facial expressions or speech patterns
  • Providing alternative assessment formats for candidates with cognitive disabilities
  • Adjusting scheduling algorithms that conflict with religious observances
  • Waiving certain AI-driven tests when they pose disability-related barriers

Your accommodation policy must now explicitly address ADS-related barriers, and your teams need training on identifying and resolving these issues.

Liability and Enforcement: What's at Stake?

The regulations make clear that both employers and their agents can be held liable for discrimination resulting from automated decision systems. This creates several significant risks:

Direct Liability for Employers
Regardless of whether you developed the AI tool in-house or purchased it from a vendor, you remain fully responsible for any discriminatory outcomes. The fact that a sophisticated vendor created the algorithm provides no defense.

Vendor Liability as "Agents"
Third-party vendors who provide ADS tools to California employers may themselves qualify as "agents" and face direct liability under FEHA. This should motivate vendors to provide better compliance documentation and support.

Disparate Impact Claims
The regulations explicitly apply disparate impact theory to AI tools. Even if your system contains no explicit bias, if it produces substantially different outcomes for protected groups, you may face claims unless you can prove the system is job-related and consistent with business necessity.
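
As one concrete illustration of how "substantially different outcomes" are often screened for in practice, compliance reviews commonly use the federal EEOC's four-fifths guideline, which compares selection rates across groups. The sketch below uses entirely hypothetical numbers and is illustrative only; the four-fifths ratio is a longstanding federal rule of thumb, not a threshold drawn from the California regulations themselves, and falling below it does not by itself establish liability.

```python
# Hypothetical sketch of a four-fifths (80%) adverse-impact screen.
# All figures are made up for illustration; they are not legal thresholds.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants that the screening tool advanced."""
    return selected / applicants

def impact_ratio(group_rate: float, reference_rate: float) -> float:
    """Ratio of a group's selection rate to the highest group's rate."""
    return group_rate / reference_rate

# Hypothetical resume-screening outcomes for two applicant groups
group_a = selection_rate(selected=60, applicants=100)  # 0.60
group_b = selection_rate(selected=30, applicants=100)  # 0.30

ratio = impact_ratio(group_b, group_a)
flagged = ratio < 0.80  # below the four-fifths guideline

print(f"impact ratio: {ratio:.2f}, flagged for review: {flagged}")
# → impact ratio: 0.50, flagged for review: True
```

In this hypothetical, the tool advanced the second group at only half the rate of the first, well below the 80% guideline, which would ordinarily prompt a closer look at whether the tool is job-related and consistent with business necessity.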

Class Action Potential
Because AI tools often affect large numbers of applicants or employees simultaneously, any discriminatory pattern could expose employers to class action claims with potentially massive damages.

Available Damages
Successful claims under FEHA can result in back pay, front pay, emotional distress damages, punitive damages, and attorney's fees. For systemic discrimination affecting dozens or hundreds of applicants, these damages quickly accumulate.

Your Next Steps: Don't Wait Any Longer

The October 1 effective date has passed, which means every day of non-compliance increases your legal risk. Employment discrimination claims can be brought not just by individual applicants but also as class actions representing potentially hundreds of affected individuals.

At Tyler Law, LLP, our employment law practice focuses on helping California businesses navigate complex compliance challenges like these new AI regulations. We understand that most employers adopted AI tools to improve efficiency and effectiveness—not to create discrimination. Our goal is to help you maintain those benefits while ensuring full legal compliance.

Give Us a Call

Riverside County: (951) 600-2733

Orange County: (714) 978-2060

Northwest Arkansas: (479) 377-2059
