
AB 2930 and Regulating AI Decision-Making in the Workplace

As Artificial Intelligence (AI) continues to be deployed and used in the workplace, California employers should take note of a bill that could impact how they utilize AI in decision-making processes.  

Introduced on February 15, 2024, by California Assemblymember Rebecca Bauer-Kahan, AB 2930 seeks to regulate the use of artificial intelligence in order to prevent “algorithmic discrimination.” AB 2930 targets the use of AI when employers "make a consequential decision," where “consequential decision” is defined as a "decision or judgment that has a legal, material, or similarly significant effect on an individual’s life." Specific to employment, AB 2930 focuses on decisions regarding pay, promotion, hiring, termination, and automated task allocation systems.

AB 2930 would apply to employers with more than 25 employees. For employers with fewer than 25 employees, AB 2930 will most likely not apply unless the deployed automated system impacts more than 999 people. However, employers should note that AB 2930 may be incorporated into the Unruh Act and FEHA, among other statutes, so in practice, any employer with over 5 employees may be subject to its regulations if it is passed into law.

Employer’s Compliance with AB 2930

If AB 2930 passes, businesses and employers that currently use an automated system in their operations should be prepared to perform an impact assessment before January 1, 2026, to ensure compliance. The impact assessment should include, at a minimum, the following:

  • A statement of the automated system's purpose and its intended benefits, uses, and deployment contexts;
  • A description of the automated system's outputs and how the decisions generated from the system will be used in making the consequential decision;
  • A summary of the categories of information collected from natural persons and processed by the automated system;
  • A statement that the employer's use of the automated system is consistent with the automated system developer's statement;
  • An analysis of the potential adverse impacts the automated system has on the basis of sex, race, color, ethnicity, religion, age, national origin, limited English proficiency, disability, veteran status, or genetic information;
  • Safeguards that are in place to address any reasonably foreseeable risks of algorithmic discrimination;
  • How the automated system will be used or monitored when used to make a consequential decision; and
  • A description of how the automated system will be evaluated for validity or relevance. 

Moving forward, employers may need to establish a governance program with at least one employee who oversees and ensures compliance while also assessing the ongoing risks of algorithmic discrimination associated with using an automated system. In addition, employers will need to notify potential job candidates or employees that an automated system will be used for a consequential decision that impacts them. If a consequential decision is based solely on an automated system, an employer must, "if technically feasible," accommodate an individual's request to opt out of the automated system. Therefore, employers should have language prepared to send to individuals, but should also keep alternative selection processes or accommodations at the ready as fully automated decision-making systems become more prevalent in everyday operations.

Consequences for AB 2930 Violations

Employers should also be aware of AB 2930 violations and legal claims arising from the bill. If an employee has a good faith belief that the automated system is perpetuating algorithmic discrimination, the employer will need to promptly conduct a complete assessment of compliance issues. If, after notification of the issue, an employer is unable to address it, the employee may bring an action against the employer for violating the bill. Employers will also need to stay alert, as they will have seven days to respond to a request by the Civil Rights Department, formerly known as the DFEH, to provide an impact assessment. If the deadline to respond passes without disclosure, the Civil Rights Department may recover up to $10,000 per violation in an administrative enforcement action for failure to disclose an assessment.

In addition, AB 2930 ensures business and employer compliance by allowing public attorneys, such as the Attorney General and the Civil Rights Department, to first provide a 45-day written notice to an automated system deployer (an employer) of any violations, along with the opportunity to cure any alleged violations. Once the opportunity to cure has passed, a civil action may be brought, with a penalty of up to $25,000 per violation.

Support and Opposition to AB 2930

Support and opposition for AB 2930 have been articulated in the past few months, and both sides share concerns about certain definitions and clarifications that the bill needs. Various workers', consumer rights, and civil rights organizations, including Consumer Reports, Legal Aid at Work, and the California Nurses Association, have voiced their support for AB 2930. In their April 3, 2024, letter to Assemblymember Bauer-Kahan, the organizations raise several points, including revising definitions such as the “controlling factor” requirement for automated system deployers. Another recommended key change was expanding the bill’s explanation provisions to ensure that workers and consumers have a meaningful understanding of how and why consequential decisions are being made through an automated system.

On the other hand, an April 9, 2024 letter from Ms. Ronak Daylami, a policy advocate on behalf of many professional associations, such as the California Association of Realtors, California Chamber of Commerce, and the Civil Justice Association of California, raised concerns that definitions, including "automated decision tool" and "consequential decision," were overly broad and vague, which could risk the bill reaching beyond its intended uses. The letter also voiced a concern that the phrase “technically feasible” was not fully defined and did not specify what reasonable accommodations could be deployed.

The fate of AB 2930 is also uncertain, given that its standard is at odds with proposed federal guidelines, including, but not limited to, those of the Equal Employment Opportunity Commission and its regulations regarding Title I of the Americans with Disabilities Act.

With the legislative session ending on August 31, 2024, we will continue to track the bill’s final amendments and provide updates.

“Usually, you don’t have industries saying, ‘Regulate me’, but various communities don’t trust AI, and what this effort is trying to do is build trust in these AI systems, which I think is really beneficial for industry,” Bauer-Kahan said.

Tags

ab 2930, unruh act, feha, ai, artificial intelligence, employment law