Insights from CODEX Director Sumant Ranji, MD, SFHM: Analyzing This Week's Federal Healthcare AI Announcements

A series of announcements from the federal government this week provided clarity on the administration’s priorities for artificial intelligence in health care.
- The Food and Drug Administration released new guidance on its regulatory approach to Clinical Decision Support Software (CDS) and Low Risk Devices for General Wellness.
- The Assistant Secretary for Technology Policy (ASTP) and Office of the National Coordinator for Health Information Technology (ASTP/ONC) released the proposed fifth iteration of the Health Data, Technology, and Interoperability rule (HTI-5).
- Just before the holidays, the Department of Health and Human Services (HHS) released a Request for Information on “what HHS can do to accelerate the adoption and use of AI as part of clinical care.”
Taken together, these announcements, as well as statements from FDA Commissioner Marty Makary, indicate that the administration has a strongly pro-AI stance, seeking to minimize regulatory and other barriers to AI implementation throughout the healthcare system.
How this approach will affect the use of AI for diagnosis—especially the rapidly growing use of generative AI—will emerge as developers, health system leaders, and patients digest the new guidance, but some likely short-term developments can be predicted based on recent trends in AI diagnostics.
First, the FDA significantly relaxed its oversight of CDS by narrowing the criteria under which a CDS is considered a “medical device,” and thus subject to rigorous FDA approval and oversight. In particular, CDS that “support” clinical decision-making (but do not replace the clinician’s judgment) will generally not be subject to medical device regulations. These include diagnostic applications that provide a prioritized differential diagnosis list based on patient-specific information. CDS that provide specific diagnostic or treatment recommendations will also not be considered medical devices if there is “only one clinically appropriate recommendation” for a given situation.
The new proposed guidance also lessens the requirements for transparency in how CDS generate their recommendations and clarifies that CDS can utilize certain types of clinical summaries (such as hospital discharge summaries) to produce recommendations without being considered medical devices. The HTI-5 rule also eliminates the requirement for CDS to be certified by ASTP/ONC; although certification was non-binding, its removal will likely speed development of new AI-based CDS. And although the FDA guidance continues to define CDS used in time-sensitive situations as medical devices, its examples of non-device CDS include tools that recommend appropriate antibiotics for an infection or intravenous fluids for electrolyte imbalances, common situations in hospitalized patients that can certainly be urgent.
These changes should open the door to wider implementation of generative AI for diagnostic decision support, including integration of diagnostic CDS into electronic health records. Even CDS that provide relatively specific recommendations for acutely ill patients may not be considered medical devices, which significantly broadens the scope of how generative AI may be implemented at the bedside.
There are potential concerns with the administration’s approach that will require scrutiny over time. The FDA guidance continues to emphasize the risk of automation bias (overreliance on technology instead of human judgment), but more widespread CDS implementation could increase that risk, along with deskilling of healthcare providers if a new generation of clinicians becomes overly reliant on decision support for routine clinical tasks. As generative AI is more widely used, the role of the “human in the loop” will become even more important, especially as AI-based CDS are implemented across clinical specialties that as yet have little direct experience with AI. Radiologists and pathologists who already use AI routinely have consistently found that although the technology can improve diagnosis, optimal diagnostic accuracy is achieved by human clinicians working with AI support and mutual cross-checking. AI-based diagnostic decision support could also lead to overdiagnosis and overuse of diagnostic testing, which carries physical, psychological, and financial risks for patients.
Another piece of news on AI in healthcare this week may provide insight into how the FDA will regulate AI-based clinical applications. Utah announced a pilot program with the AI company Doctronic, whose LLM-based chatbot provides diagnostic guidance with linkage to telehealth clinicians; the pilot will allow Doctronic to refill prescriptions for patients with chronic diseases. On the surface, an AI program that performs specific clinical tasks in place of human clinicians would seem to be a “medical device”; however, the FDA has not yet sought to regulate Doctronic. The FDA’s stance on the Utah AI prescribing pilot program will likely signal its overall approach to the expansion of AI applications for diagnosis and treatment.
Let us know your thoughts and join the conversation on LinkedIn.

Sumant Ranji, MD (Moderator)
Inaugural Director, UCSF CODEX
Director of Quality Improvement and Patient Safety, Division of Hospital Medicine at the Zuckerberg San Francisco General Hospital (ZSFG)
Dr. Ranji is a renowned leader in patient safety, quality improvement, and medical education. He first joined UCSF as a fellow in hospital medicine and clinical research, then went on to become a faculty member in the Division of Hospital Medicine at UCSF Health, where he held multiple leadership positions. As director of CODEX, Sumant leads the center’s strategic planning and the execution of its work to champion diagnostic excellence research and build awareness of and engagement within the field.
Learn more about Sumant here.
About UCSF CODEX (Coordinating Center for Diagnostic Excellence)
Every person deserves access to an accurate and timely diagnosis. At CODEX, we never stop working toward making that a reality. We serve as a national coordinating entity, engaging the diagnostic excellence community to promote novel findings, catalyze action, and advance the field. Our mission is to lead change in the field of diagnostic excellence by facilitating activities that result in measurable improvement in diagnostic quality, safety, and equity.
Contact
Mika Rivera, Director of Communications
Follow Us
Bluesky: @ucsfcodex.bsky.social
LinkedIn: https://www.linkedin.com/company/ucsfcodex
Newsletter: https://ucsf.co1.qualtrics.com/jfe/form/SV_9zU2iVMliYFqBvg