Legislator Seeks to Add Teen Rules, Universal Opt-Out to Va. Privacy Law
The second U.S. state privacy law could be updated this year. Virginia’s legislative session opened Wednesday with a bill by Del. Michelle Maldonado (D) that would add protections for teens, include support for universal opt-out mechanisms and revise other parts of the 2021 Virginia Consumer Data Protection Act. Maldonado's measure would also add an AI section called the "Artificial Intelligence Training Data Transparency Act," which includes a private right of action.
The changes would take effect in 2026. A possible hurdle for HB-2250 could be politics -- Maldonado is a Democrat in a state with a Republican governor. However, privacy experts said Thursday this might not be a barrier to at least some parts of the measure.
Maldonado’s bill would amend the existing Virginia privacy law to provide that controllers may not process sensitive data of adolescents, defined as those aged 13-15, for targeted advertising, sale of personal data or profiling that produces legal or similarly significant effects, unless the processing is "reasonably necessary" to provide the service.
Also, the controller may not process adolescents' data for any purpose it didn't originally disclose or for longer than is reasonably necessary to provide the service. Controllers wouldn’t be allowed to collect precise geolocation data from adolescents unless it’s reasonably necessary to provide the service. In that case, a business would have to send a signal indicating it is gathering location data "for the entire duration of such collection."
Virginia’s privacy law would require support for global opt-out signals if amended by HB-2250. "A consumer may authorize a third party, acting on the consumer's behalf, to opt out of the processing of the consumer's personal data … Such authorization may be made using technology that indicates the consumer's intent to opt out, including a browser setting, browser extension, global device setting, or other user-selected universal opt-out mechanism."
However, the opt-out mechanism users select should not "unfairly disadvantage a controller," the bill says. Moreover, it must require "the consumer to make an affirmative, freely given, and unambiguous choice to opt out of any processing of a consumer's personal data." It should be easy to use, follow state and federal laws and rules and "allow the controller to accurately determine whether the consumer has made a legitimate request to opt out."
If the universal opt-out signal conflicts with the consumer's existing privacy setting in a loyalty program, "the controller shall comply with the consumer's opt-out but may notify the consumer of the conflict and provide the consumer with the choice to confirm his preferred privacy setting," the bill adds. "If a controller charges a fee for the use of a user-selected universal opt-out mechanism," it must inform the consumer.
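The bill doesn't name a specific technology, but the most widely deployed signal of this kind today is the Global Privacy Control, which browsers and extensions transmit as the HTTP request header `Sec-GPC: 1`. As a minimal sketch of what honoring such a signal could look like on the controller's side (the header name is from the GPC proposal; the function name and logic here are illustrative, not anything specified in HB-2250):

```python
def gpc_opt_out_requested(headers: dict[str, str]) -> bool:
    """Return True if the request carries a Global Privacy Control opt-out signal.

    The GPC proposal expresses the consumer's choice as the request
    header `Sec-GPC: 1`. HTTP header names are case-insensitive, so
    the lookup normalizes them before comparing.
    """
    normalized = {name.lower(): value.strip() for name, value in headers.items()}
    return normalized.get("sec-gpc") == "1"


# A request from a browser with the signal enabled vs. one without it.
print(gpc_opt_out_requested({"Sec-GPC": "1", "User-Agent": "ExampleBrowser"}))
print(gpc_opt_out_requested({"User-Agent": "ExampleBrowser"}))
```

A controller that detects the signal would then suppress targeted advertising and data-sale processing for that consumer, subject to the conflict-resolution rule quoted above.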
Among other changes, the bill would broaden the definition of "sale of personal data" from exchanges for monetary consideration to those for monetary "or other valuable" consideration. Sensitive data would include physical health conditions, not just diagnoses. And whereas the law previously included a blanket carve-out for photos and audio or video recordings in the definition of biometric data, HB-2250 adds: "unless such data is generated for the purpose of uniquely identifying a natural person."
Regarding the Virginia law’s 30-day right to cure, Maldonado proposed clarifying what the attorney general should consider when deciding whether to grant such an opportunity. The enforcer should consider the company's history of previous violations, the organization's size and complexity, the "nature and extent of the ... processing activities," the "substantial likelihood of injury to the public," the "safety of persons or property after the alleged violation," if "the alleged violation was likely caused by human or technical error," and the "demonstrated good faith of the controller or processor" in trying to comply, the bill says.
The new AI section would require disclosures about data used to train generative AI systems, within 72 hours of a system's availability or a significant update. It would exclude systems used solely for security or integrity. Also, developers would have to keep detailed records of datasets used for training AI, in line with National Institute of Standards and Technology standards. "A developer shall provide a clearly designated and publicly available mechanism for the submission of" requests to verify or delete AI training data, and respond to those requests within 30 days.
In an email Thursday, Kelley Drye’s Alysa Hutnik wrote it's not surprising “that the legislature is looking to amend Virginia’s privacy law to bring it in line with other state laws on tightening the requirements for targeted ads and sensitive data, and adding more rigorous protections for teen data.” Hutnik added, “I could see those changes likely not facing that much resistance.”
“But the proposed language on generative AI would be significant and add a lot more process and substantive consumer rights requirements that would stand out among states, and is not dependent on whether the AI software poses a high risk,” said Hutnik. “I imagine that portion, particularly as to the risk threshold and restrictions around what data can be used to train and rights to opt out of such training, as well as the specificity of the disclosure obligations, are going to get a hard look.”
Consumer Reports largely supports Maldonado’s proposed changes, emailed CR Policy Analyst Matt Schwartz. As to the bill’s chances, Schwartz noted that the edits “are largely in line with what has passed in other states using the ‘Connecticut model’ -- including in some states controlled by Republicans. For example, Texas, Nebraska, and Montana have all passed privacy laws with the type of universal opt-out controls contemplated here.”