Femtech Poses Major Privacy Threats in US, EU and UK, Lawyers Say
Femtech offers groundbreaking innovations in women's health but also poses serious privacy threats, data protection lawyers said. Even the EU, with its General Data Protection Regulation and AI Act, and the U.K., with its version of the GDPR, may not always provide adequate protection for the highly sensitive personal data that femtech apps collect and use, they added.
Privacy Daily provides accurate coverage of newsworthy developments in data protection legislation, regulation, litigation, and enforcement for privacy professionals responsible for ensuring effective organizational data privacy compliance.
Femtech consists of technology products and apps designed specifically for women’s health, Joseph Lyon of the Lyon Firm noted in a blog post Saturday. The platforms "collect vast amounts of sensitive data, including menstrual cycles, fertility status, sexual activity, mood patterns, and pregnancy information."
The sensitive data relates to reproductive health, making it highly valuable to advertisers but also deeply personal to users, the firm noted.
Key privacy risks from femtech include unauthorized data sharing, third-party tracking, weak cybersecurity, legal exposure and opaque privacy policies that fail to clarify how the data is used, Lyon said. Many users assume their data is confidential, but lawsuits and investigations have shown that femtech companies often sell the data without meaningful consent, he said.
Lyon noted several illustrative lawsuits in the U.S. involving femtech. For example, Flo Health, a period- and fertility-tracking app that shared sensitive data with third-party analytics and marketing companies, recently settled after an FTC investigation (see 2508010048).
In the same case, a federal jury in August found Meta liable for breaching the California Invasion of Privacy Act by intentionally eavesdropping on Flo users and receiving sensitive data about their menstrual cycles and reproductive health (see 2508040041).
Meanwhile, a probe of the Premom ovulation tracker app, which allegedly transferred personal health data to third parties without users' consent, resulted in an FTC order that the company clean up its act, Lyon wrote.
Glow, another fertility and pregnancy-tracking platform, settled a class-action lawsuit over claims that its platform was riddled with vulnerabilities, potentially exposing personal details about users’ reproductive health, Lyon added.
AI offers exciting opportunities for women's health but also carries risks, Bristows life-sciences regulatory attorney Ellie Handy posted July 31. AI could exacerbate health inequalities through biased training data and create privacy problems through invasive data mining and sharing, she said. The EU AI Act aims to mitigate some of these risks, but its provisions "can be onerous" because they add a layer of complexity for businesses that develop medical devices.
"All of this also takes place against a backdrop of applicable horizontal legislation, such as the GDPR and EU Data Act, which present a significant compliance burden in their own right," Handy noted.
Regulatory gaps around femtech in the U.K. must be closed, argued Erin Thomson, trainee solicitor in Morton Fraser MacRoberts' healthcare sector, in an op-ed July 29.
One potential gap is that many femtech apps are "operating outside the boundaries of formal healthcare oversight," she said. "The lack of tailored regulation could damage data privacy, clinical accuracy and safety," Thomson added.
Under the U.K. GDPR, stricter requirements apply when processing "special category data" of the kind that femtech apps generally collect, she noted. In many cases, manufacturers must obtain user consent before processing such data, she said. "Given the scale and sensitivity involved, regulators may need to go further to address risks unique to" femtech to ensure full compliance with the U.K. GDPR.
"It's time to update your threat model," Robert Stribley blogged Aug. 4 on Medium.
Organizations use threat models to highlight weaknesses in their systems' security where bad actors might try to cause harm, noted Stribley, who writes about designing for privacy. But threat models can apply to people, too, especially members of at-risk communities, such as women in the U.S., he said.
After the Supreme Court overturned Roe v. Wade in 2022, privacy advocates recommended that women and trans men delete their period trackers out of concern that apps like Flo and Clue might surrender sensitive data to law enforcement, Stribley said. Many apps also began asking users to indicate which state they lived in, raising concerns about local authorities using their data, he added.