
Can an AI algorithm get CE-certified as a medical device?

RegAid Team · 6 min read

Yes. An AI algorithm qualifies as a medical device under EU MDR 2017/745 if it is intended for a medical purpose — diagnosis, monitoring, treatment decisions — and does not achieve its principal intended action by pharmacological, immunological, or metabolic means. Classification follows MDR Annex VIII Rule 11, and from August 2, 2027 the EU AI Act 2024/1689 adds a parallel layer of obligations for high-risk AI systems (Article 113(c) sets this later application date for AI embedded in regulated products).

Does the AI Act replace MDR for AI devices?

No. The two regulations apply in parallel. AI Act Article 6(1) classifies an AI system as high-risk when it is a safety component of, or is itself, a product covered by the EU harmonisation legislation listed in Annex I — which includes MDR 2017/745 and IVDR 2017/746 — and that product must undergo third-party conformity assessment. This means an AI-based medical device must satisfy both:

  • MDR: Classification, conformity assessment, technical documentation, clinical evaluation, QMS, post-market surveillance
  • AI Act: Risk management for AI, data governance, transparency, human oversight, accuracy/robustness requirements

The Notified Body assesses both sets of requirements through a single, integrated conformity assessment procedure (AI Act Article 43(3)).

How is an AI device classified under MDR?

Rule 11 of MDR Annex VIII governs software classification. The class depends on the clinical significance of the information the software provides:

  • Class III: software that drives or influences clinical decisions for diagnosis or therapy where the decision may cause death or an irreversible deterioration of a person's state of health
  • Class IIb: software that drives or influences clinical decisions for diagnosis or therapy of serious conditions
  • Class IIa: software intended for monitoring physiological processes, or providing information for other clinical purposes
  • Class I: all other software not covered above

MDCG 2019-11 Rev.2 provides detailed guidance on qualifying and classifying software as a medical device.

Most AI diagnostic tools land in Class IIa or IIb. An AI that independently recommends cancer treatment without clinician review would be Class III.
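
To make the Rule 11 decision tree concrete, here is a minimal Python sketch. It paraphrases the rule's logic as we read it; the function, its inputs, and their granularity are our own illustration rather than an official decision tool, and a real classification must follow the full rule text and MDCG 2019-11.

```python
from enum import Enum

class MdrClass(Enum):
    I = "Class I"
    IIA = "Class IIa"
    IIB = "Class IIb"
    III = "Class III"

def classify_rule_11(informs_diagnosis_or_therapy: bool,
                     decision_may_cause_death_or_irreversible_harm: bool,
                     decision_may_cause_serious_harm_or_surgery: bool,
                     monitors_physiological_processes: bool,
                     monitors_vital_params_with_dangerous_variation: bool) -> MdrClass:
    # First limb: software providing information used for diagnostic or
    # therapeutic decisions defaults to Class IIa, escalated by impact.
    if informs_diagnosis_or_therapy:
        if decision_may_cause_death_or_irreversible_harm:
            return MdrClass.III
        if decision_may_cause_serious_harm_or_surgery:
            return MdrClass.IIB
        return MdrClass.IIA
    # Second limb: monitoring physiological processes.
    if monitors_physiological_processes:
        if monitors_vital_params_with_dangerous_variation:
            return MdrClass.IIB
        return MdrClass.IIA
    # Everything else falls to Class I.
    return MdrClass.I

# An AI that flags suspected malignancy on CT scans informs a therapy
# decision whose failure could be fatal:
print(classify_rule_11(True, True, True, False, False).value)  # Class III
```

Note the default: software informing diagnostic or therapeutic decisions starts at Class IIa and is escalated by the severity of the decision's consequences, which is why so little medical software lands in Class I.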

What are the AI Act obligations?

The AI Act's requirements for high-risk AI systems are set out in Chapter III, Section 2; under Rule 11, virtually all AI medical devices require Notified Body involvement and therefore qualify as high-risk. The key obligations are:

Risk management system (Article 9)

  • Continuous, iterative risk identification and mitigation throughout the AI system lifecycle
  • Must address risks from intended use and reasonably foreseeable misuse

Data governance (Article 10)

  • Training, validation, and testing datasets must be relevant, sufficiently representative and, to the best extent possible, free of errors and complete
  • Examination for possible biases that could negatively affect health or safety (a minimal representativeness check is sketched below)
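
As a flavour of what that examination can involve, here is a minimal, hypothetical sketch that compares subgroup shares in a training set against the intended patient population. The attribute names, reference shares, and the 50% tolerance threshold are all our own assumptions, not figures from the Act.

```python
from collections import Counter

def subgroup_shares(records: list[dict], attribute: str) -> dict[str, float]:
    """Share of each subgroup (e.g. sex, age band) in the dataset."""
    counts = Counter(r[attribute] for r in records)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def flag_underrepresented(shares: dict[str, float],
                          reference: dict[str, float],
                          tolerance: float = 0.5) -> list[str]:
    """Flag subgroups whose dataset share falls below a fraction
    (tolerance) of their share in the intended patient population."""
    return [g for g, ref in reference.items()
            if shares.get(g, 0.0) < tolerance * ref]

training = [{"sex": "F"}, {"sex": "M"}, {"sex": "M"}, {"sex": "M"}, {"sex": "M"}]
population = {"F": 0.5, "M": 0.5}  # assumed intended-use population
print(flag_underrepresented(subgroup_shares(training, "sex"), population))
# -> ['F']  (females are 20% of the data vs 50% of the population)
```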

Technical documentation (Article 11)

  • Detailed description of the AI system including its intended purpose, design, development process, and validation methodology
  • This extends the MDR Annex II technical file

Record-keeping and logging (Articles 12, 19)

  • Automatic recording of events during operation, forming an audit trail (a minimal record format is sketched below)
  • Traceability throughout the AI system lifecycle
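
What such logging can look like in practice: a minimal, hypothetical sketch of an append-only audit trail for an inference service. The schema and field names are our own; the Act requires traceability, not this exact format.

```python
import datetime
import hashlib
import json

def log_inference(logfile, model_version: str, input_blob: bytes,
                  output: dict, operator_id: str) -> None:
    record = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "model_version": model_version,  # which weights produced this output
        "input_sha256": hashlib.sha256(input_blob).hexdigest(),  # traceable without storing raw data
        "output": output,
        "operator_id": operator_id,      # who was overseeing (ties into Article 14)
    }
    logfile.write(json.dumps(record) + "\n")  # append-only JSONL audit trail

with open("audit.jsonl", "a") as f:
    log_inference(f, "cxr-classifier-1.3.0", b"<dicom bytes>",
                  {"finding": "pneumothorax", "score": 0.91}, "clinician-042")
```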

Transparency and human oversight (Articles 13, 14)

  • Instructions for use must enable the user to interpret the AI system's output
  • Design must allow effective human oversight, including the ability to override or disregard the AI output (a minimal override flow is sketched below)
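
A minimal sketch of what the override requirement can look like in an application's decision flow, under our own assumptions about the data model; the Act prescribes the capability, not this design.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    ai_suggestion: str
    ai_confidence: float
    human_action: str   # "accepted" or "overridden"
    final_value: str

def resolve(ai_suggestion: str, ai_confidence: float,
            clinician_input: str | None) -> Decision:
    """The clinician's entry always wins; the AI output is advisory."""
    if clinician_input is None or clinician_input == ai_suggestion:
        return Decision(ai_suggestion, ai_confidence, "accepted", ai_suggestion)
    return Decision(ai_suggestion, ai_confidence, "overridden", clinician_input)

# The record keeps both the AI's suggestion and the human's final call:
print(resolve("malignant", 0.87, "benign"))
```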

Accuracy, robustness, and cybersecurity (Article 15)

  • Declared levels of accuracy and robustness with testing evidence (a minimal confidence-interval calculation is sketched below)
  • Resilience against adversarial attacks and data corruption
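
One common way to substantiate a declared accuracy level is to report test-set performance with a confidence interval. A minimal sketch with made-up counts; the Wilson interval is our choice of method, not one mandated by the Act.

```python
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

tp, fn = 183, 17                     # illustrative test-set counts
sens = tp / (tp + fn)
lo, hi = wilson_ci(tp, tp + fn)
print(f"sensitivity {sens:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```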

Timeline: when do AI Act obligations apply?

  • August 1, 2024: AI Act entered into force
  • February 2, 2025: Prohibited AI practices apply
  • August 2, 2025: General-purpose AI model obligations apply
  • August 2, 2026: The Act applies generally; high-risk obligations apply to the Annex III use cases
  • August 2, 2027: Article 6(1) and its high-risk obligations apply to AI embedded in regulated products, including AI medical devices (Article 113(c))

If you are placing a new AI medical device on the EU market after August 2, 2027, you must comply with both MDR and the AI Act's high-risk requirements from day one.

Devices already on the market before that date must comply only if their design undergoes significant changes afterwards (AI Act Article 111(2)).

What does the conformity assessment look like?

For an AI-based Class IIa or higher device, the conformity assessment combines MDR and AI Act requirements:

  1. Classify the device under MDR Rule 11 and confirm it is a high-risk AI system under AI Act Article 6(1)
  2. Implement a QMS covering both MDR Article 10(9) and AI Act Article 17 (AI quality management system)
  3. Prepare technical documentation per MDR Annex II extended with AI Act Annex IV requirements (data governance, training methodology, performance metrics, bias analysis)
  4. Conduct clinical evaluation per MDR Article 61 and MDCG 2020-1 (clinical evaluation of software)
  5. Submit to a Notified Body designated under both MDR and the AI Act — the NB conducts a single assessment covering both regulatory frameworks
  6. Register the device in EUDAMED under MDR (the EU AI database established by AI Act Article 71 covers the Annex III high-risk systems, not AI classified as high-risk via Article 6(1))
  7. Affix CE marking — one CE mark covers both MDR and AI Act conformity

What about continuous learning algorithms?

An AI system that continues to learn after deployment (an adaptive or continuously learning model) poses additional challenges. Under MDR, any change that could affect the device's safety or performance triggers a new conformity assessment, or at minimum an assessment of whether the change is significant (MDCG 2020-3).

The AI Act addresses this through the requirement for ongoing monitoring and the obligation to update the risk management system when the AI system's behavior changes (Article 9(2)).

In practice, most manufacturers lock the algorithm at deployment and ship retrained versions through controlled updates rather than deploying truly adaptive models. The AI Act supports this pattern: changes pre-determined at the initial conformity assessment and documented in the technical file do not count as substantial modifications (Article 43(4)), which is where the predetermined change control plan fits in.
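
A minimal sketch of what such a gate can look like: a retrained candidate may replace the locked model only if it meets bounds fixed in the technical documentation at the initial conformity assessment. The metric names and thresholds here are our own illustrative assumptions.

```python
PRESPECIFIED_BOUNDS = {    # fixed in the technical documentation at assessment
    "sensitivity": 0.90,   # candidate must not fall below these floors
    "specificity": 0.85,
}

def within_pccp(candidate_metrics: dict[str, float]) -> bool:
    """True only if every pre-specified bound is met."""
    return all(candidate_metrics.get(metric, 0.0) >= floor
               for metric, floor in PRESPECIFIED_BOUNDS.items())

candidate = {"sensitivity": 0.93, "specificity": 0.88}
if within_pccp(candidate):
    print("deploy via controlled update")  # within the pre-determined envelope
else:
    print("out of scope: triggers substantial-modification review")
```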

Key takeaways

  • An AI algorithm is a medical device under MDR if it has a medical intended purpose — classify under Rule 11
  • From August 2, 2027, AI medical devices must comply with both MDR and the EU AI Act's high-risk requirements (Article 113(c))
  • The Notified Body conducts a single conformity assessment covering both frameworks
  • The AI Act adds data governance, transparency, human oversight, and robustness requirements on top of MDR
  • Most AI diagnostic tools are Class IIa or IIb; Class III if they independently drive life-critical decisions
  • Continuously learning algorithms require a predetermined change control plan and ongoing monitoring

How RegAid helps

RegAid covers the full text of EU MDR 2017/745, the EU AI Act 2024/1689, all MDCG guidance documents, and IMDRF SaMD frameworks. Ask "What are the AI Act requirements for a Class IIb AI diagnostic device?" and get a structured answer with citations to both MDR and AI Act articles side by side.