Senate Approves Sweeping AI Legislation

Issues & Policies

The state Senate passed sweeping legislation April 24 regulating artificial intelligence, criminalizing deceptive synthetic media in elections and the nonconsensual dissemination of synthetic intimate images, and creating a number of AI-related workforce development programs.

SB 2, which if enacted would be the first statute in the country regulating AI applications developed and deployed by private industry, was approved on a party-line vote, with all Republicans opposed.

Sen. James Maroney (D-Milford), the bill’s chief proponent, led floor debate on both the amendment and underlying bill.

The bill that passed the Senate made a series of changes to the originally proposed underlying regulatory framework.

Reporting Requirements

In general, sections 1 through 7 establish a number of reporting requirements for developers and deployers who utilize “High-Risk [AI]”—defined as “any [AI] system that, when deployed, makes, or is a substantial factor in making, a consequential decision.”

In turn, “consequential decision,” a definition which has changed multiple times up to this point, now means “any decision that has a material or similarly significant effect on the provision or denial to any consumer of, or the cost or terms of, (A) any criminal case assessment, any sentencing or plea agreement analysis or any pardon, parole, probation or release decision, (B) any education enrollment or opportunity, (C) any employment or employment opportunity, (D) any financial or lending service, (E) any essential government service, (F) any healthcare service, or (G) any housing, insurance, or legal service.”

When developers "develop" and deployers "deploy" these high-risk systems, the bill imposes a series of reporting obligations to the state and to the entity or consumer utilizing the system.

Should the developer or deployer fail to use "reasonable care" to protect consumers from "any known or reasonably foreseeable risk of algorithmic discrimination," that entity (1) must notify the respective parties utilizing the system; and (2) may be subject to an enforcement action by the Attorney General, with 60 days to cure such discrimination.

Rebuttable Presumption

In any action brought by the Attorney General, the developer and deployer both have a rebuttable presumption that reasonable care was used if they comply with the reporting, disclosure, and notification requirements under the bill.

For example, a deployer would satisfy the presumption if it (1) creates and maintains impact assessments; (2) establishes a risk management program that is an "iterative process that is planned, implemented and regularly and systematically reviewed and updated" over the lifecycle of the AI system; (3) notifies the consumer that they are interacting with a high-risk system; (4) if it is a controller under the Connecticut Data Privacy Act, informs the consumer of their right to opt out of having their personal data processed in the AI system; and (5) provides an opportunity for a consumer to appeal an adverse decision and, if technically feasible, allows that appeal to include human review.

Small Business Concerns

The latest version of the bill fails to address concerns of small and medium-sized employers who simply buy high-risk AI systems “off-the-shelf” and utilize such systems, without modification, to streamline key business functions.

For example, take an employer who purchases hiring software from Company A (a developer under the bill) and uses that software to screen applicants for an open position.

That employer, with the assistance of AI hiring software, will eventually make a "consequential decision" (i.e., a decision to hire or not hire someone).

Under the new exception that attempts to address this issue, an employer making that hiring decision would not need to comply with any of the reporting and notification requirements, nor be subject to enforcement actions, if it “(a) employs less than 50 full-time equivalent employees; and (b) does not use such deployer’s own data to train such high-risk [AI] system.” “Training” is not defined.

Disclosure Mandate

Because the hiring software in the example above would need to be fine-tuned with the employer’s “own data,” it presumably would not fall under the exception.

Small employers will have a difficult time conceptually and operationally understanding the bill’s terms and requirements.

Complicating matters further are the reporting requirements that apply to a small employer that does not meet the above exemption when it inevitably makes a hiring decision.

Under new requirements in Section 3, that small employer must provide each rejected applicant (1) a statement disclosing the principal reason for the decision, including (a) the degree to which AI contributed to the decision; (b) the data that was processed in making the decision; and (c) the source of that data; and (2) an opportunity to correct any incorrect personal data that the system processed in making the decision.

Further, the employer must also provide each rejected applicant an opportunity to appeal the decision. If technically feasible, the appeal must allow for human review.

Competitiveness, Economic Concerns

Because Connecticut would be the first state in the country to regulate private artificial intelligence systems, many employers, as well as the Governor's office, have raised concerns that businesses, especially start-up AI companies, considering relocating to or remaining in Connecticut would be dissuaded from doing so if other Northeastern states do not adopt the same reporting and notification requirements.

So far, Colorado and California have raised similar bills, but no state in the Northeast has proposed similar legislation, nor is any remotely close to doing so.

Should Connecticut move forward with the regulations contemplated in SB 2, businesses and economic development leaders have argued that such regulations should take effect only if a critical mass of surrounding states adopt similar rules.

No provision in the bill addresses this issue.

The bill awaits action in the state House.

For more information, contact CBIA’s Wyatt Bosworth (860.244.1155) | @WyattBosworthCT.

