AI Clinical Decision Support: FDA Non-Device CDS Guidance, 21st Century Cures Act, and HIPAA Compliance

Clinical decision support (CDS) tools that incorporate AI have become a critical element of modern healthcare — but they also sit at the intersection of FDA medical device regulation, ONC health IT interoperability requirements, and HIPAA privacy mandates. The 21st Century Cures Act created a CDS exclusion from FDA device regulation for certain software tools, but the criteria for that exclusion are specific and frequently misunderstood. Healthcare organizations deploying AI-powered CDS must understand when their tools require FDA clearance, when they qualify for the Cures Act exclusion, and how to maintain HIPAA compliance for CDS-related PHI data flows.

700+
FDA-authorized AI/ML-based Software as a Medical Device (SaMD) products by 2024

The FDA had authorized more than 700 AI/ML-based medical devices through 510(k) clearance, De Novo classification, or PMA approval by 2024, according to the agency's published list of AI/ML-enabled medical devices. The majority are radiology tools, but cardiology, pathology, and general CDS applications are growing rapidly. Not every CDS tool requires FDA clearance — the 21st Century Cures Act created a statutory exclusion for qualifying CDS software — but organizations must rigorously assess each tool against the exclusion criteria.

Memorial Hermann Health System — CDS Algorithm Bias Investigation

ONC Investigation into Sepsis CDS Algorithm Disparate Impact
Context: ONC and CMS increased scrutiny of CDS algorithms following the 2019 Obermeyer et al. study in Science
Issue: A widely used commercial risk-scoring algorithm assigned lower risk scores to Black patients than to white patients with the same burden of illness
Impact: Only 17.7% of the patients the algorithm flagged for extra care were Black; the study estimated that correcting the bias would raise that share to 46.5%
Mechanism: The algorithm used healthcare cost as a proxy for health need, reflecting historical underinvestment in care for Black patients
Response: Optum, the algorithm's developer, worked to update the algorithm; HHS has since issued AI equity guidance
CDS AI Risk: AI CDS tools that embed historical data patterns may perpetuate or amplify care disparities

FDA Regulation of AI Clinical Decision Support

The FDA regulates software that meets the definition of a medical device under 21 U.S.C. §321(h). The 21st Century Cures Act (2016) created a statutory exclusion from device regulation for CDS software meeting all four criteria of FD&C Act section 520(o)(1)(E) (codified at 21 U.S.C. §360j(o)(1)(E)):

1. The software does not acquire, process, or analyze a medical image or a signal from an in vitro diagnostic device or a signal acquisition system;
2. It is intended to display, analyze, or print medical information about a patient or other medical information;
3. It is intended to support or provide recommendations to a healthcare professional about prevention, diagnosis, or treatment of a disease or condition; and
4. It is intended to enable the healthcare professional to independently review the basis for the recommendations, so that the clinician does not rely primarily on the software.

The "Independently Review" Criterion: FDA's September 2022 final guidance, Clinical Decision Support Software, clarifies that the fourth criterion — the clinician's ability to independently review the basis for a recommendation — is the one most frequently at issue. Black-box AI models that generate recommendations without transparent reasoning do NOT qualify for the Cures Act CDS exclusion and are regulated as medical devices requiring FDA clearance.
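
The four-factor Cures Act test lends itself to a documented, repeatable assessment. A minimal Python sketch — the class and field names are illustrative, the criteria are paraphrased, and this is a record-keeping aid, not a legal determination:

```python
from dataclasses import dataclass

@dataclass
class CuresActAssessment:
    """Documents the four-factor CDS exclusion test (FD&C Act §520(o)(1)(E)).

    Field names paraphrase the statutory criteria; this is a
    record-keeping aid, not a legal determination.
    """
    not_acquiring_signals: bool         # (1) does not acquire/process medical images or signals
    displays_medical_info: bool         # (2) displays or analyzes medical information
    supports_hcp_recommendations: bool  # (3) supports recommendations to a clinician
    independently_reviewable: bool      # (4) clinician can independently review the basis

    def qualifies_for_exclusion(self) -> bool:
        """All four criteria must be met; failing any one means the
        tool is a regulated medical device requiring FDA clearance."""
        return all([
            self.not_acquiring_signals,
            self.displays_medical_info,
            self.supports_hcp_recommendations,
            self.independently_reviewable,
        ])

# A black-box model with no reviewable basis fails criterion four:
black_box = CuresActAssessment(True, True, True, independently_reviewable=False)
print(black_box.qualifies_for_exclusion())  # False -> regulated SaMD
```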

ONC and the 21st Century Cures Act CDS Framework

The Office of the National Coordinator for Health Information Technology (ONC) administers the health IT provisions of the 21st Century Cures Act, including the information blocking rules that interact with CDS. Key ONC requirements include: certified EHRs must expose standards-based FHIR APIs (the §170.315(g)(10) certification criterion); health IT developers, providers, and health information networks must not engage in information blocking (45 CFR Part 171); and under the HTI-1 final rule, certified health IT offering predictive decision support interventions (DSIs) must disclose source attributes — including information about training data and fairness evaluation — to users.

HIPAA Compliance for CDS PHI Data Flows

AI CDS tools necessarily process PHI to generate recommendations. Key HIPAA requirements: CDS vendors that access PHI are business associates and must execute BAAs (45 CFR §164.502(e)); the minimum necessary standard (§164.502(b)) limits which data elements a CDS tool may query; the Security Rule's audit controls (§164.312(b)) require logging of CDS-triggered PHI access; and the Breach Notification Rule applies to CDS-related PHI exposures.

Compliance Checklist

1. FDA Device Classification Analysis
Conduct a formal FDA device classification analysis for every CDS AI tool before deployment. Use the four-factor 21st Century Cures Act test. Document the analysis. If the tool does not meet all four criteria — particularly the 'independently reviewable basis' requirement — the tool is a regulated medical device requiring FDA clearance before clinical deployment.

2. CDS Hooks FHIR Integration
Implement CDS tools using CDS Hooks via FHIR R4 APIs to ensure standards-based integration with certified EHRs. CDS Hooks lets CDS services receive EHR context (patient, encounter, order) and return recommendation cards. This architecture satisfies ONC interoperability requirements and supports the 21st Century Cures Act's independently-reviewable-basis requirement when the CDS card displays the underlying clinical logic.

3. Algorithm Bias Assessment
Conduct regular bias assessments on AI CDS tools, especially those used for risk stratification. Audit recommendations by race, ethnicity, sex, age, and insurance status. The 2019 Obermeyer et al. study in Science found a widely-deployed commercial algorithm systematically underestimated illness severity for Black patients. HHS has issued equity guidance requiring healthcare organizations to address algorithmic bias in CDS.

4. BAA with CDS Vendors
All AI CDS vendors that access, store, or transmit PHI in generating recommendations are business associates under HIPAA. Execute BAAs before deploying any CDS tool that processes real patient data. The BAA must cover the specific PHI data flows used by the CDS system and include AI-specific provisions for de-identification, model training data use restrictions, and breach notification.

5. Clinician Override and Audit Trail
Implement clinician override mechanisms for all AI CDS recommendations. Document overrides with reason codes. HIPAA audit logging requirements apply to CDS-triggered PHI access. The audit trail must show which recommendations were made, which PHI was accessed to generate them, which clinician received the recommendation, and whether the recommendation was followed or overridden.

6. ONC Information Blocking Compliance
Ensure CDS tool vendor contracts do not create information blocking arrangements. CDS vendors that restrict patient data access, charge excessive fees for data portability, or create EHR lock-in may violate ONC's 45 CFR Part 171 information blocking prohibition. Penalties for information blocking by health IT developers can reach $1,000,000 per violation.
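
The override-and-audit-trail requirement in item 5 can be sketched as a structured log entry. A minimal Python example — the schema and field names are assumptions for illustration, not a mandated HIPAA format:

```python
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class CDSAuditEvent:
    """One audit-log entry per CDS recommendation (illustrative schema)."""
    recommendation_id: str
    patient_id: str        # pseudonymized identifier
    phi_accessed: list     # FHIR resource types read to generate the recommendation
    clinician_id: str      # who received the recommendation
    recommendation: str
    action: str = "pending"        # "accepted" | "overridden"
    override_reason: str = ""      # reason code required when overridden
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

event = CDSAuditEvent(
    recommendation_id="rec-001",
    patient_id="pt-8832",
    phi_accessed=["MedicationRequest", "AllergyIntolerance"],
    clinician_id="dr-114",
    recommendation="Potential warfarin-NSAID interaction",
)
event.action = "overridden"
event.override_reason = "PT/INR monitored; benefit outweighs risk"
print(json.dumps(asdict(event), indent=2))
```

The entry captures all four elements the checklist names: what was recommended, which PHI was accessed, who received it, and whether it was followed or overridden.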

Frequently Asked Questions

When does an AI CDS tool require FDA clearance?
An AI CDS tool requires FDA clearance when it does not meet all four criteria of the 21st Century Cures Act CDS exclusion (FD&C Act §520(o)(1)(E), codified at 21 U.S.C. §360j(o)(1)(E)). The most common failure point is criterion four: the tool must present its underlying clinical basis in a way that allows an HCP to independently review it. Black-box AI models that generate recommendations without explanations do not qualify for the CDS exclusion and are regulated as Software as a Medical Device (SaMD) requiring 510(k) clearance or De Novo classification.
What is the CDS Hooks standard and why does it matter?
CDS Hooks is an HL7 standard that defines how CDS services integrate with EHR workflows. At key workflow moments (opening a patient chart, ordering a medication, completing an encounter), the EHR sends a 'hook' to the CDS service with relevant patient context via FHIR R4. The CDS service returns recommendation cards displayed in the EHR. CDS Hooks matters because: (1) it is the ONC-promoted standard for FHIR-based CDS integration; (2) it supports the 'independently reviewable' requirement by surfacing the recommendation basis in the EHR; (3) it maintains a standards-based audit trail.
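
The card mechanism can be illustrated with a small sketch. Assuming an order-select hook and a hypothetical drug-interaction service (the summary text, rule ID, and URL below are illustrative, not from a real product), a response conforming to the CDS Hooks card shape might look like:

```python
# Minimal CDS Hooks response a service might return for an "order-select"
# hook, per the HL7 CDS Hooks specification. Content is illustrative.
def build_card_response():
    return {
        "cards": [{
            "summary": "Potential warfarin-NSAID interaction",
            "indicator": "warning",  # "info" | "warning" | "critical"
            # The detail field surfaces the clinical basis, supporting the
            # Cures Act independently-reviewable-basis criterion:
            "detail": ("Concurrent ibuprofen increases bleeding risk with "
                       "warfarin. Basis: drug-drug interaction rule DDI-042 "
                       "applied to the patient's active MedicationRequest list."),
            "source": {
                "label": "Example Drug Interaction Service",  # hypothetical
                "url": "https://example.org/ddi/DDI-042",     # hypothetical
            },
        }]
    }

response = build_card_response()
print(response["cards"][0]["indicator"])  # warning
```

Because the `detail` and `source` fields travel with the card into the EHR, the clinician sees the recommendation's basis at the point of decision.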
How do I assess my AI CDS tool for racial or ethnic bias?
Bias assessment for AI CDS involves: (1) obtaining model training data demographic breakdowns from the vendor; (2) running retrospective analyses of recommendations by patient race, ethnicity, sex, and age; (3) comparing recommendation rates and outcomes for equivalent clinical presentations across demographic groups; (4) checking whether proxy variables (insurance type, zip code, prior utilization) correlate with race/ethnicity and may introduce bias; (5) reviewing vendor bias testing documentation. HHS's 2024 guidance on AI equity in healthcare recommends annual bias audits for high-stakes CDS tools.
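
Step (3) — comparing recommendation rates across demographic groups for equivalent clinical presentations — can be sketched in a few lines of Python. The data here is synthetic and the group labels are illustrative:

```python
from collections import defaultdict

def flag_rates_by_group(records):
    """Compute the CDS flag rate per demographic group.

    `records` is an iterable of (group, flagged) pairs drawn from
    patients with comparable clinical severity; comparing rates across
    groups surfaces potential disparate-impact signals.
    """
    flagged = defaultdict(int)
    total = defaultdict(int)
    for group, was_flagged in records:
        total[group] += 1
        flagged[group] += int(was_flagged)
    return {g: flagged[g] / total[g] for g in total}

# Synthetic cohorts with equivalent severity but unequal flag rates:
records = ([("A", True)] * 60 + [("A", False)] * 40
           + [("B", True)] * 35 + [("B", False)] * 65)
rates = flag_rates_by_group(records)
print(rates)  # {'A': 0.6, 'B': 0.35} -> a gap warranting investigation
```

A real audit would also stratify by clinical severity and test the gap for statistical significance before drawing conclusions.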
What BAA provisions are specific to CDS AI vendors?
AI CDS-specific BAA provisions include: (1) restrictions on using PHI to train or retrain AI models without separate authorization; (2) data minimization requirements specifying which PHI fields the CDS tool may access; (3) model performance monitoring obligations and notification requirements if accuracy degrades below specified thresholds; (4) explainability commitments — the vendor must maintain documentation of the clinical logic underlying recommendations; (5) FDA regulatory status representations — vendor must confirm whether the tool is FDA-cleared and notify of any FDA enforcement actions.
Does HIPAA's minimum necessary standard apply to AI CDS tools?
Yes. Under 45 CFR §164.502(b), covered entities must make reasonable efforts to limit PHI use to the minimum necessary to accomplish the intended purpose. For CDS tools, this means: (1) the CDS system should query only the data fields needed for the specific recommendation (e.g., a drug interaction checker needs medication lists and allergies — not social history or financial data); (2) CDS vendor APIs should be configured to return minimum necessary data; (3) audit logs should reflect that the minimum necessary standard was applied. CDS tools that pull entire patient records when only specific data elements are needed may violate the minimum necessary standard.
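
One way to operationalize minimum necessary scoping is a per-purpose allowlist of FHIR resource types. A sketch, where the purpose keys and the scope mapping are illustrative assumptions and the resource names are standard FHIR R4:

```python
# Minimum-necessary scoping: each CDS purpose maps to only the FHIR R4
# resource types it needs. The purpose keys and mappings are illustrative.
MINIMUM_NECESSARY_SCOPE = {
    "drug-interaction-check": ["MedicationRequest", "AllergyIntolerance"],
    "sepsis-risk-score": ["Observation", "Condition", "Encounter"],
}

def scoped_query_params(purpose, patient_id):
    """Build FHIR search parameters limited to the purpose's allowlist.

    Raises KeyError on an unknown purpose rather than defaulting to a
    full-record pull, which could violate the minimum necessary standard.
    """
    resource_types = MINIMUM_NECESSARY_SCOPE[purpose]
    return [{"resourceType": rt, "params": {"patient": patient_id}}
            for rt in resource_types]

queries = scoped_query_params("drug-interaction-check", "pt-8832")
print([q["resourceType"] for q in queries])
# ['MedicationRequest', 'AllergyIntolerance'] -- no social history or financial data
```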

FDA-Compliant AI Clinical Decision Support

Claire's CDS integration framework includes FDA regulatory classification analysis, CDS Hooks FHIR integration, bias monitoring, BAA-compliant data flows, and clinician audit trails — ensuring your AI CDS tools meet FDA, ONC, and HIPAA requirements.