Conn. Privacy Law Amendments Pass Legislature 'Under the Radar'
Amendments to Connecticut’s privacy law cleared the legislature this week as part of a different bill covering other subjects. Changes to the Connecticut Data Privacy Act would take effect July 1, 2026, if Gov. Ned Lamont (D) signs the bill.
While some privacy observers were still waiting Wednesday to see the fate of Sen. James Maroney's (D) amendments bill, SB-1356, the legislature had already approved SB-1356's language as a last-minute amendment to SB-1295. The Senate passed the amended SB-1295 33-2 on Tuesday, and the House followed 127-15 at 12:16 a.m. Wednesday.
“I'm guessing this has flown under the radar,” given the unexpected change in legislative vehicle, noted Keir Lamont, Future of Privacy Forum senior director, on LinkedIn Wednesday.
The privacy amendments bill makes a variety of changes, including refining the law’s data-minimization requirements and narrowing the Gramm-Leach-Bliley Act exemption from an entity-level carveout to a data-level one.
It also would lower applicability thresholds to cover businesses that process the data of at least 35,000 consumers, down from 100,000 under the current law. Another change adds neural and financial account data to the definition of sensitive data (see 2505140067). Additionally, the bill’s protections for kids would cover those younger than 18, up from 16 previously.
“One notable sub-trend here,” said Lamont in his post, is “Connecticut, Colorado, and Oregon have all introduced bills this year which would have incorporated elements of the Maryland approach to data minimization, but each state ultimately backed off from Maryland's approach in favor of other mechanisms to protect data.”
Commenting on Lamont’s post, Vermont Rep. Monique Priestley (D) said the so-called “notable sub-trend” represents “the need to back off of data-minimization efforts due to incredibly intense lobbying pressure as all eyes turned to the states.”
Under the Connecticut measure's amended data-minimization language, controllers must “limit the collection of personal data to what is reasonably necessary and proportionate in relation to the purposes for which such data are processed, as disclosed to the consumer … unless the controller obtains the consumer's consent, not process the consumer's personal data for any material new purpose that is neither reasonably necessary to, nor compatible with, the purposes that were disclosed to the consumer.”
In assessing whether a new purpose is compatible, an entity must account for “the consumer's reasonable expectation regarding such personal data at the time such personal data were collected based on the purposes that were disclosed to the consumer.”
Also, it must consider “the relationship that such new purpose bears to the purposes that were disclosed to the consumer [and] the impact that processing such personal data for such new purpose might have on the consumer.”
In addition, it must weigh “the relationship between the consumer and the controller and the context in which the personal data were collected, and … the existence of additional safeguards, including, but not limited to, encryption or pseudonymization, in processing such personal data for such new purpose.”
The amendments to Connecticut's privacy law "avoid Maryland-style data minimization provisions that were included in early versions of SB 1356 and instead incorporate Colorado-style factors for determining whether secondary processing activities are compatible with originally specified purposes or require additional consent," FPF's Lamont emailed us Thursday.
Maroney’s separate bill regulating AI (SB-2) didn’t come up for a House vote and died on the final day of the legislative session (see 2506040051).
The bill amending Connecticut’s privacy law also adds several AI rights similar to those in the Minnesota privacy law, noted Kara Williams, law fellow at the Electronic Privacy Information Center, on a panel Thursday at the Consumer Federation of America’s Consumer Assembly. These include giving consumers the ability to “know if AI was used, and be entitled to some sort of explanation as to why a certain decision was made," she said.
Those AI rights were a strength of the 2024 Minnesota privacy law, though it’s deficient in other ways, said Matt Scherer, who leads the workers’ rights project at the Center for Democracy and Technology.
Overall, 2025 hasn’t been a good year for regulating AI, Scherer said. “On the thing that I'm most passionate about, which is … people's lives increasingly being determined by hidden algorithms that decide whether they get a job, a house, health care or coverage … the venture capitalists have, frankly, been running the table. They've been flooding every state with lobbyists.” They even killed an effort to amend Colorado's AI law that would have weakened it, because they didn’t think it weakened the law enough, the CDT official said.
Governors “have been the problem” in Connecticut, Colorado and California, added Scherer. “I think that they genuinely believe that AI is the future, and they've been seduced by this notion that if you do anything to rein it in, that you are harming innovation and that you're going to harm your state's economy going forward,” he said. “They don't talk to labor and consumer groups about any of this. … They only are listening to the companies.”
Connecticut's privacy statute "was a good law and a good start, but it definitely had weaknesses," as shown in annual reports by the state's attorney general, Justin Brookman, Consumer Reports director of technology policy, said in an email. "We support a lot of the new improvements [but] had been hoping to see more robust data minimization requirements," he said. "But it's good to [see] continued efforts to incrementally improve the bill."