Trade Groups Unsatisfied

Calif. Privacy Agency Bent to Industry on ADMT, Consumer and Worker Groups Say

Consumer and labor groups condemned May 9 revisions to California Privacy Protection Agency (CPPA) draft rules on automated decision-making technology (ADMT) and other California Consumer Privacy Act (CCPA) issues. The changes “represent significant concessions by the Agency and its board to a campaign of industry pressure,” civil society groups said in comments filed with the agency this week.


But while business groups welcomed many of those same changes, trade associations told the agency this week that they’re still not satisfied.

For example, the Software & Information Industry Association said that while it “recognizes that the Agency has made significant improvements in both clarity and workability since past drafts,” some parts of the rules need additional clarity “or remain likely to have unintended consequences.”

CPPA Board Chairperson Jennifer Urban last month described the latest round of revisions as paring the draft “to the bone” (see 2505010048). The agency had a Monday deadline for comments (see 2505150064). Privacy Daily obtained filings from several commenters this week.

Consumer advocates urged the CPPA to reverse course. “Californians voted for a privacy law that promised to put in place regulations that would give them meaningful rights to control how their personal information was used, including in automated and algorithmic systems,” said joint comments by the American Civil Liberties Union, Privacy Rights Clearinghouse, the Electronic Frontier Foundation and three other civil society groups. “With the most recent draft regulations, the Agency is poised to deprive Californians of the benefit of one of the most important provisions of the state’s privacy law.”

Various labor groups joined consumer advocates and several professors in expressing disenchantment in a separate joint filing. “We are deeply disappointed at the substantial weakening of the proposed regulations -- and at the lack of responsiveness to our coalition of labor and civil society groups, which represent hundreds of thousands of workers and consumers.” The agency failed to adopt any of their recommendations but instead “conceded more and more to concerns of the business community and the tech sector,” they wrote. “This is a profound lost opportunity, especially for workers.”

While CPPA staff said at the agency’s last meeting that the changes trimmed certain estimated business compliance costs by nearly two-thirds, SIIA said the $3.4 billion figure in the CPPA’s standardized regulatory impact assessment “is likely well below the true cost that will be incurred.” That’s “because the analysis underestimates the number of businesses affected and does not consider the regulations’ continuing effects on California businesses’ operating costs and productivity.”

EPIC, CDT Seek U-Turn

The CPPA should “reinstate the previous proposed provisions that offered consumers stronger protections from the harms caused by unchecked data collection and automated decisionmaking technologies and … resist industry pressure to weaken the proposed regulations,” the Electronic Privacy Information Center (EPIC) urged.

“The initial proposed regulations were a promising start to providing more consumer privacy protections and transparency and accountability mechanisms through risk assessments,” said EPIC, which also signed the two other comment filings by consumer and labor groups. “However, under significant pressure from industry lobbyists and [Democratic Gov.] Gavin Newsom, every iteration of the proposed regulations has been weakened in terms of consumer protection, transparency, and accountability.”

The Center for Democracy and Technology also urged the CPPA to undo recent changes to the draft rules. “Only ten percent of California businesses subject to the CCPA would have been subject to the version of the proposed rules as updated on May 1,” it commented. “The revisions made to the proposed rules since then would further reduce the number of California businesses that would be subject to the ADMT requirements. Considering the increasing role that ADMTs play in causing privacy harms that the CCPA was intended to prevent, this move to minimize industry’s transparency obligations would undercut the purpose of the ADMT regulations and the CCPA itself.”

Narrowing the definition of ADMT, removing criminal-justice decisions from what counts as a significant decision and dropping a proposed ban on processing personal information when risks outweigh benefits were particularly damaging changes, said the joint filing from the six consumer advocacy groups.

For example, the automated-decision definition now “explicitly carves out ADMTs where a human has even glancing involvement in making the decision,” they said. “Under this new narrower standard, many more consumers will be denied the notice and opt-out protections they need and deserve.”

Proposed rules won’t provide enough transparency for workers, said the separate filing that included labor groups. “Workers need to know which data collection and ADMT systems are being used in the workplace, and they need to know when one of those systems has actually been used to make a significant decision about them. Without the latter use-notice, a fast food worker, for example, won’t know that an algorithm was used to fire them -- and without that knowledge, they won’t be able to exercise their right to access data about that decision.”

Not Enough for Industry

Advertisers appreciate the CPPA’s changes, but “the definition of ADMT is still significantly broad and is not cabined to the use of automated processing for significant decisions,” complained the Association of National Advertisers, Interactive Advertising Bureau and three other ad associations. “In addition, the proposed rules would create costly new assessment, opt-out, and rights request processing requirements,” the advertisers commented. “These requirements would disrupt automated processing functions that benefit consumers, stifling the economy, slowing innovation, and burdening both consumers and businesses alike.”

Concerning proposed risk assessment rules, the advertisers said “prescriptive terms surrounding stakeholder involvement in an assessment, timelines for required assessment updates, executive accountability for assessments, and disclosure of assessments to the Agency contain rigid obligations that would be supremely challenging for small and mid-sized businesses to implement.” Meanwhile, an overly broad proposed definition of sensitive location “risks imposing undue risk assessment burdens on businesses of all sizes that engage in benign marketing practices -- such as offering a discount at a coffee shop in or near a hospital waiting room or a college cafeteria.”

Among other concerns, the advertisers said it would be hard for smaller companies to comply with a proposed requirement that they display on websites when an opt-out preference signal has been honored. This is voluntary today. “Given the scope and breadth of the regulatory package, the Agency should afford businesses more than a year to comply with new mandates before they become enforceable,” they added.

“It is not clear what risk to privacy is posed by the training of ADMT,” SIIA commented. “The ability of ADMT to deliver faster, fairer and more inclusive outcomes to consumers depends on developers’ ability to alter algorithms to incorporate both new information and wider datasets representative of all consumers. Restricting developers from tweaking algorithms at scale would be incredibly burdensome; it would have a disproportionate impact on smaller California firms, and also, inevitably, harm consumers’ use of and experience with AI tools.”

Among other suggestions, SIIA said the agency should rethink its definition of sensitive locations, which was new to the latest draft. It should “comport with the CCPA and avoid unintentionally capturing a range of low-risk locations,” the industry group said. “The definition … remains overbroad, constitutionally problematic, and most of all unnecessary.”

Similarly, the Computer & Communications Industry Association (CCIA) asked the CPPA to further refine how it defines ADMT. “While CCIA appreciates the removal of subjective language from this definition, the standards for ‘human involvement’ still create uncertainty.” CCIA agreed with SIIA on replacing “relevant” with “necessary,” adding that it “should suffice for businesses to have protocols in place listing the factors that their human reviewers consider when reviewing technological outputs.”

Also, CCIA recommended “allowing a risk assessment that satisfies another jurisdiction’s requirements to be substituted for the list of requirements in” the proposed CPPA rules, “even if they do not meet every requirement listed in” the section. And the tech group pushed back against various notice requirements for using ADMT. “Forcing businesses to disclose how they use ADMT to perform the specified functions risks undermining the security of consumers and businesses, and requirements to make such disclosures should be minimized.”

The Business Software Alliance (BSA) agreed that the latest CPPA draft is better, but more changes are needed. Proposed ADMT rules should “(1) address practical concerns with treating allocation of work as a significant decision; (2) address issues with implementing pre-use notices, opt-outs, and access requests; and (3) harmonize them with other legislative and regulatory efforts.”

“Although BSA supports the use of risk assessments to identify and mitigate potential privacy risks, California will be an outlier in requiring businesses to proactively provide risk assessment information to the CPPA,” it added.

Meanwhile, the telecom industry warned the CPPA not to fragment the U.S. framework for cybersecurity. “We remain deeply concerned that the proliferation of state-specific cybersecurity mandates may lead to a fragmented regulatory landscape -- one that risks undermining a coherent, unified national cybersecurity strategy,” commented USTelecom. The CPPA should “adopt a risk-based framework rooted in widely accepted and operationalized industry standards,” it said.