Attorney General Bonta Opposes Trump Administration’s Proposed Slashing of Healthcare AI Transparency and Bias Protections

Proposed rule would eliminate one of the most significant guardrails currently in place on a federal level for the use of AI in healthcare 

OAKLAND — California Attorney General Rob Bonta today announced that he sent a letter to the U.S. Department of Health and Human Services (HHS) opposing a proposed rule that would roll back regulations that help ensure technology used by healthcare providers is safe, effective, and deployed without reinforcing unjust racial bias. The proposed rule at issue — entitled “Health Data, Technology, and Interoperability: ASTP/ONC Deregulatory Actions To Unleash Prosperity” — would remove certification criteria requiring that model cards accompany health products that use artificial intelligence (AI). Model cards function like nutrition labels, providing critical information to providers and regulators, such as potential risks to patients and how AI models are developed and tested. 

“New and emerging AI tools are used by many healthcare providers to make life-changing decisions, such as which patients to refer to specialists, which diseases to screen a patient for, or whether a reaction to an infection might be deadly. So, when AI gets it wrong in healthcare, the consequences can be deadly,” said Attorney General Bonta. “I oppose the Trump Administration’s proposed rollback of regulations that require clarity about how AI tools used in healthcare were developed and tested. Delivering safe, effective, and equitable access to healthcare services must be at the forefront of any attempt to integrate AI and healthcare.”

In response to the growing use of automated decision-making tools trained on electronic health records, the Biden Administration introduced the model card requirement. The Biden-era rule, previously supported by Attorney General Bonta, requires healthcare software developers seeking certification of their products to be more transparent about the data used to train their algorithms and whether those algorithms have been tested to ensure their outcomes are fair and unbiased. This is important because if algorithms are trained on a narrow or limited dataset, they can inadvertently learn and perpetuate biases present in that data. For example, a 2019 study found that a widely used algorithm that helped hospitals identify high-risk patients was racially biased.

In his letter today, Attorney General Bonta warns that HHS’s proposed rule would eliminate one of the most significant guardrails currently in place on a federal level for the use of AI in healthcare and urges the federal administration to reverse course. AI systems are novel and complex, and their inner workings are often not understood even by developers and entities that use AI, resulting in situations where AI tools have generated false information or biased and discriminatory results.

The proposed rule also fails to consider the significant burden that removing the model card requirement would place on healthcare providers. For example, healthcare providers' compliance with both federal and state laws becomes much more difficult without model cards. The Affordable Care Act prohibits providers from discriminating based on a patient's protected status. And last year, Attorney General Bonta issued an AI advisory about the application of California law to AI in healthcare, providing guidance specific to healthcare entities about their obligations under California law. Removing the model card requirement eliminates a critical tool that providers rely on to ensure they are delivering nondiscriminatory healthcare in compliance with these laws.

Legal Disclaimer:

EIN Presswire provides this news content "as is" without warranty of any kind. We do not accept any responsibility or liability for the accuracy, content, images, videos, licenses, completeness, legality, or reliability of the information contained in this article. If you have any complaints or copyright issues related to this article, kindly contact the author above.
