STAT+: New federal rules demand transparency into AI models used in health decisions

New federal rules will force software vendors to disclose how AI tools used in health care are trained, developed, and tested.

Dec 13, 2023 - 18:00

Federal health technology regulators on Wednesday finalized new rules to force software vendors to disclose how artificial intelligence tools are trained, developed, and tested — a move to protect patients against biased and harmful decisions about their care.

The rules are aimed at placing guardrails around a new generation of AI models gaining rapid adoption in hospitals and clinics around the country. These tools are meant to help predict health risks and emergent medical problems, but little is publicly known about their effectiveness, reliability, or fairness.

Starting in 2025, electronic health record vendors who develop or supply these tools, which increasingly use a type of AI known as machine learning, will be required to disclose more technical information to clinical users about their performance and testing, as well as the steps taken to manage potential risks.

