Privacy Daily is a service of Warren Communications News.

Lawyers Recommend Clarity, Precision in Health Care AI Contracts

Clearly defined data rights are a critical aspect of commercial transactions, especially when health care agreements include AI technology, Foley & Lardner attorneys blogged Wednesday.



"Whether you are contracting with a health system, integrating into a digital health platform, or partnering with an enterprise vendor, your data strategy needs to be reflected clearly and precisely in your contracts," said Foley lawyers Aaron Maguregui and Jennifer Hennessy. However, when data rights are ambiguous, "you may find yourself locked out of the very assets you need to grow -- or worse, liable for regulatory violations you never anticipated."

The highest risk comes from training rights, revocation and retention terms, and shared liability, the blog said.

Imprecise language can be legally problematic, especially around training rights, because laws such as the Health Insurance Portability and Accountability Act (HIPAA) regulate protected health information (PHI) more strictly, Maguregui and Hennessy said. Contracts need clarity about "whether the data is identifiable or de-identified, what de-identification method is being used, and whether the outputs of that training are limited to the specific client's model or can be used across your entire platform."

Additionally, "too many AI contracts are silent on what happens to data, models, and outputs after a contract is terminated." This "creates risk on both sides," said the lawyers. It's therefore crucial to "define whether rights to use data or model outputs survive termination, and under what conditions."

"If you want to retain the ability to use data or trained models post-termination, that should be an express, bargained-for right," the lawyers said. "If not, you must be prepared to unwind that access, destroy any retained data, and potentially retrain models from scratch."

Liability allocation is often overlooked in health care contracting, despite risk on both sides, Maguregui and Hennessy added. One way to address this is requiring clients to provide proof of "appropriate authorizations or consents under applicable laws," while ensuring uses of client data are "clearly articulated in the agreement." Moreover, indemnity provisions in contracts can help manage third-party liability, they said.

Data terms "are a core part of your business model, your compliance posture, and your defensibility in the market," the lawyers said. "For AI vendors, the goal should be to build trust through transparency."