23andMe Bankruptcy Will Likely Drive Regulation and Enforcement, Say Privacy Experts
The 23andMe bankruptcy will likely lead to more privacy regulation and enforcement, given the significant public awareness of the event and its possible implications for people’s sensitive genetic information, privacy experts said Thursday at the Osano Privacy Pro Survival Summit. Meanwhile, the rise of alluring but potentially dangerous AI technology could add to the challenges privacy pros face, said Noelle Russell, founder of the AI Leadership Institute.
Negative privacy implications of the 23andMe bankruptcy have been “all over the media,” while attorneys general in many states have issued calls for people to delete their 23andMe data, said France Bélanger, Virginia Tech professor. Bélanger co-hosts the university’s Voices for Privacy podcast with Donna Wertalik, another professor who spoke on the same Osano panel.
The 23andMe incident “really made people think, and a lot of people didn't realize that it wasn't just their data,” said Bélanger. Even if customers ask to delete their 23andMe data, such a request doesn’t cover relatives in their extended families, she said. The heightened awareness from media coverage and states' calls will “domino across people, and then across states,” eventually driving regulatory change, she added.
While 23andMe users can delete their data, it may have “already been sent out and sold to … many lists,” said Wertalik. “Yes, you can delete it from that company, and they'll say [they] deleted it. Well what about all the times they've sold your name over and over, and your profile and your family's profile?” However, she said privacy quagmires like this must keep rearing their heads “for someone to really, really do something.”
The 23andMe case may fuel what was already an increasing amount of privacy enforcement, said Matt Simpson, Osano senior product manager. The aggressiveness of the California Privacy Protection Agency’s March enforcement action against Honda (see 2503120037) surprised Simpson. The 23andMe case “just adds gasoline to the fire,” he said.
23andMe said earlier this week that it will appoint a “customer data representative” to assess the company’s handling of user data in its bankruptcy sale (see 2504090048).
Data privacy awareness is on the rise across many generations, Wertalik said. Bélanger cited rising concerns about data use and control over data collection and ownership, as well as increasing distrust of companies’ privacy policies and how they use new technology.
At the same time, a spike in the number of state privacy laws in the past few years is leading to an increase in subject rights requests from consumers exercising their privacy rights, noted Bélanger. “We’re going to see this accelerating,” she said. “If consumers get more educated, if we have more laws, this is going to be really growing this year, next year and the following year.”
Consumers still need more education about data privacy, “and it needs to start in the schools,” said Wertalik. “It's well beyond the birds and the bees. It is the dangers of online tracking.” Although she agreed consumers may be increasingly concerned about this issue, “I'm more concerned about the people [who] aren't aware of it and giving all that data and information.”
Another growing privacy concern is the rise of AI, a technology that relies heavily on personal and other data. AI is transforming every industry, said Russell, who previously worked at Amazon and Microsoft. “The one thing that can sink all the ships on the rise to the top is not taking care [of] … security.”
When considering how to use AI, organizations should “move fast with intention” rather than “move fast and break things,” she told privacy pros in attendance. “We have a chance to ask questions we wish we would have asked 10 years ago of the systems that are monolithic today -- and I mean social media platforms.” It’s important to ask what’s the worst that could happen, added Russell: A “baby tiger” like AI is cute now but potentially dangerous later.