Privacy Daily is a service of Warren Communications News.

Panelists See Enforcement of Surveillance Pricing and Other AI Discrimination

To prevent AI discrimination, regulators will soon tackle surveillance pricing, also known as behavioral data mining, said Wayne Stacey, executive director at the Berkeley Center for Law & Technology, during a panel at a Troutman Amin law conference Monday.

“Personalized pricing has always built in some pretty hideous things about personal discrimination, but you couldn't prove it,” he said. “But when you let [an AI tool] determine who's going to pay more based on whatever available information is out there,” like ZIP code, credit report, race or gender, that builds in protected class information.

Some laws state, in a general way, "you can't discriminate based on these protected classes,” but that "doesn't port nicely over to AI,” Stacey said. In addition, global regulators "are having trouble" with personalized pricing because, in capitalist economies, people are reluctant to regulate it. "But you also want to stop discrimination.”

Enforcement won't aim only at large companies, he said. "With these powerful AI engines that are coming out to help small companies with pricing, this is going to hit everybody that's dealing with online sales.”

Chris Kindt, senior vice president of digital marketing at Qualfon, a business-services firm, said this is why you can’t take humans out of the equation. “Too many companies are thinking, AI is a solution that replaces people," but humans must monitor safeguards and "constantly keep an eye on and understand the loop" that AI is creating, he said. AI “echo chambers" allow the technology "to kind of go in these weird directions that you need people to” help get things back on track.

Kindt said this is an issue with algorithmic discrimination, too. Companies see AI as a quick fix to deploy now, rather than focusing on what it could eventually accomplish with human nurturing.

Stacey agreed. “You can eliminate some of what people would call bias by putting in hard guardrails," but "a company has to be able to spot its own biases to eliminate them,” he said. AI will be "as good as what it's trained on, and it takes humans in the loop to be smart about it.”