Amid rising regulatory scrutiny over AI-based therapy, Texas Attorney General Ken Paxton (R) opened a probe into Meta, Character.AI and other chatbot platforms “for potentially engaging in deceptive trade practices and misleadingly marketing themselves as mental health tools,” the AG’s office said Monday.
Meta asked a federal court Monday to overturn the verdict or, alternatively, order a new trial in a case alleging that the company shared sensitive health information with third parties without user consent. The social media company argued that "the evidence at trial does not fit plaintiffs’ legal claim."
A federal court granted a preliminary injunction Tuesday blocking the U.S. Department of Health and Human Services (HHS) from using certain states' Medicaid data for immigration enforcement purposes. The order from the U.S. District Court for the Northern District of California comes after a multistate coalition, led by California, sued HHS for providing individuals' health data to the Department of Homeland Security (DHS) and its Immigration and Customs Enforcement (ICE) agency (see 2507010060).
While there were significant implications for compliance when a federal court in June vacated most of the Health Insurance Portability and Accountability Act's Privacy Rule to Support Reproductive Health Care Privacy (see 2506200057), HIPAA-regulated entities shouldn't rush to remove all reproductive health-related policies, said Robinson + Cole lawyers in a Monday blog post.
Privacy Daily is providing readers with the top stories from last week, in case you missed them. All articles can be found by searching the title or clicking on the hyperlinked reference number.
People are increasingly using general-purpose AI chatbots like ChatGPT for emotional and mental health support, but many don’t realize that regulations like the Health Insurance Portability and Accountability Act (HIPAA) fail to cover these sensitive conversations, a Duke University paper published last month found. Industry self-regulation seems unlikely to solve the issue, which may disproportionately affect vulnerable populations, said Pardis Emami-Naeini, a computer science professor at Duke and one of the report’s authors.
As privacy litigation under older laws has exploded, some have called for amending the decades-old statutes at the center of many of these lawsuits so that they aren't applied to modern technologies. The California Invasion of Privacy Act (CIPA) in particular has drawn more scrutiny as litigation has increased (see 2503030050).
A common misconception is that all health and medical data is subject to the Health Insurance Portability and Accountability Act (HIPAA), and the consequences of getting that wrong can be severe.
New York Assemblymember Alex Bores (D) expects his AI safety bill, the Responsible AI Safety and Education (RAISE) Act, could be revised through the chapter-amendment process, the legislator told Privacy Daily on Wednesday. “We’re open to amendments that … strengthen or clarify the bill.”