From August 2, 2027, any AI system embedded in a medical device regulated under EU MDR 2017/745 or IVDR 2017/746 must comply with the full high-risk obligations of the EU AI Act 2024/1689 (AI Act Article 113(c)). These obligations apply in parallel with MDR, not as a replacement: manufacturers must satisfy both regulatory frameworks through a single conformity assessment.
## Why AI medical devices are automatically high-risk
AI Act Article 6(1) classifies an AI system as high-risk when two cumulative conditions are met: the AI system is a safety component of a product (or is itself a product) covered by the EU harmonisation legislation listed in Annex I, and that product must undergo a third-party conformity assessment under that legislation. Both MDR 2017/745 and IVDR 2017/746 appear in Annex I.
This means any software as a medical device (SaMD) in risk class IIa, IIb, or III is automatically a high-risk AI system under the AI Act, because these classes require Notified Body involvement. No separate classification analysis is needed. Self-certified Class I devices generally fall outside Article 6(1), although Class I devices that do require Notified Body involvement (sterile, measuring, or reusable surgical instruments) may still qualify.
| MDR Class | AI Act Classification | Rationale |
|---|---|---|
| III | High-risk (automatic) | Annex I harmonisation legislation plus third-party conformity assessment |
| IIb | High-risk (automatic) | Same |
| IIa | High-risk (automatic) | Same |
| I | Generally not high-risk | Self-certified; possible exception where a Notified Body is involved (Is, Im, Ir) |
## What the AI Act adds beyond MDR
MDR already requires a quality management system, clinical evaluation, and post-market surveillance. The AI Act adds specific requirements for the AI component that go beyond what MDR mandates:
Risk management for AI (AI Act Article 9): An AI-specific risk management process, documented and maintained throughout the AI system lifecycle. This must be integrated with your existing ISO 14971 risk management file, but addresses AI-specific risks such as bias, distributional shift, and failure modes under edge cases.
Data governance (AI Act Article 10): Training, validation, and testing datasets must meet quality criteria. You must document data collection methods, address possible biases, and ensure datasets are relevant, representative, and complete for the intended purpose.
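As a practical starting point, many teams capture these Article 10 items in a machine-readable record kept alongside each dataset version. The sketch below is purely illustrative: the class name, fields, and example values are our own assumptions, not a format prescribed by the AI Act or any harmonised standard.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetRecord:
    """Illustrative documentation record for one dataset version (Article 10 items)."""
    name: str
    version: str
    intended_purpose: str       # clinical task the data supports
    collection_method: str      # e.g. retrospective multi-site extract
    sources: list[str]          # hospitals, registries, public datasets
    split: dict[str, int]       # training / validation / test counts
    known_biases: list[str]     # documented gaps or imbalances
    mitigations: list[str] = field(default_factory=list)

# Hypothetical example record for an imaging dataset
record = DatasetRecord(
    name="chest-xray-pneumonia",
    version="2.3.0",
    intended_purpose="Pneumonia triage support in adult chest X-rays",
    collection_method="Retrospective extract, 3 EU hospital sites, 2019-2023",
    sources=["site-A PACS", "site-B PACS", "site-C PACS"],
    split={"training": 42000, "validation": 6000, "test": 6000},
    known_biases=["Under-representation of patients under 30"],
    mitigations=["Targeted additional sampling planned for v2.4"],
)
```

Versioning these records with the datasets themselves makes it straightforward to show a Notified Body how collection methods and bias mitigations evolved over the system lifecycle.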
Transparency (AI Act Article 13): Users (deployers) must be able to interpret the AI system's output and understand its limitations. Instructions for use must specify the level of accuracy, robustness, and cybersecurity the system achieves.
Human oversight (AI Act Article 14): The system must be designed so that a human can effectively oversee its operation, understand its capabilities and limitations, and intervene or override when necessary.
Accuracy, robustness, cybersecurity (AI Act Article 15): The system must achieve appropriate levels of accuracy, be resilient against errors and inconsistencies, and be protected against unauthorized manipulation.
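In practice, an "appropriate level of accuracy" is usually operationalised as pre-declared metrics with acceptance thresholds, checked against a locked test set before release. A minimal sketch of that check (the threshold values and confusion counts are placeholders for illustration, not regulatory requirements):

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Compute sensitivity (true positive rate) and specificity (true negative rate)."""
    return tp / (tp + fn), tn / (tn + fp)

# Pre-declared acceptance criteria (illustrative placeholder values)
MIN_SENSITIVITY = 0.90
MIN_SPECIFICITY = 0.85

# Hypothetical confusion-matrix counts from the locked test set
sens, spec = sensitivity_specificity(tp=184, fn=16, tn=930, fp=70)
passed = sens >= MIN_SENSITIVITY and spec >= MIN_SPECIFICITY
print(f"sensitivity={sens:.3f} specificity={spec:.3f} passed={passed}")
```

Declaring the thresholds before testing, and recording the pass/fail outcome, gives you the documented evidence of achieved accuracy that the instructions for use must then state.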
## Technical documentation: dual requirements
Your technical file is assessed against both MDR and AI Act requirements. In practice, this means your existing MDR Annex II/III technical documentation must be supplemented with the AI Act Annex IV technical documentation.
| MDR Technical File | AI Act Annex IV Addition |
|---|---|
| Device description and intended purpose | Detailed description of AI system elements, development process |
| Risk management (ISO 14971) | AI-specific risk management with bias and robustness analysis |
| Clinical evaluation (MDR Annex XIV) | Data governance documentation, training/validation datasets |
| Software lifecycle (IEC 62304) | Algorithm design, training methodology, performance metrics |
| Verification and validation | AI-specific testing including edge cases, adversarial inputs |
| Post-market surveillance plan | AI performance monitoring, drift detection plan |
## Conformity assessment: one procedure, two frameworks
AI Act Article 43(3) specifies that for high-risk AI systems covered by the harmonisation legislation in Annex I, Section A, which includes MDR and IVDR, conformity assessment follows the procedure already established under that legislation. Your Notified Body assesses both MDR and AI Act compliance in a single process.
This does not mean less work. It means your Notified Body will evaluate the additional AI Act requirements during the same audit. Expect longer review cycles and additional questions on data governance, algorithm validation, and bias mitigation.
## Key deadlines
| Date | Requirement |
|---|---|
| August 2, 2027 | High-risk AI obligations apply to AI medical devices placed on the EU market (AI Act Article 113(c)) |
| After August 2, 2027 | AI medical devices already on the market must comply once their AI system undergoes a significant change in design (Article 111(2)) |
| Ongoing | Post-market monitoring must include AI performance tracking, drift detection, and incident reporting |
Manufacturers placing new AI-based SaMD on the EU market after August 2, 2027 must demonstrate compliance with both MDR and the AI Act from day one. There is no transition period for new products.
## How to prepare now
- Map your AI components: Identify every AI/ML element in your device and document its role in clinical decision-making
- Extend your risk management: Add AI-specific risk analysis (bias, distributional shift, adversarial robustness) to your ISO 14971 file
- Document your data pipeline: Record data sources, preprocessing, labeling, splits, and quality criteria for all training and validation data
- Define performance metrics: Establish accuracy, sensitivity, specificity, and robustness benchmarks with clear acceptance criteria
- Plan for monitoring: Build a post-market AI performance monitoring system that detects model drift and degradation over time
- Engage your Notified Body early: Discuss AI Act readiness with your NB now, since audit timelines are already stretched
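The monitoring step above is often implemented with a simple distribution-shift statistic computed on model output scores, such as the population stability index (PSI); a common rule of thumb flags PSI above roughly 0.2 for investigation. This is one possible approach, not a mandated method. A minimal sketch, assuming scores in [0, 1] (bin count and threshold are illustrative):

```python
import math

def psi(expected: list[float], observed: list[float], bins: int = 10) -> float:
    """Population stability index between a reference and a live score sample.

    Bins are taken over [0, 1] model output scores; a small epsilon avoids log(0).
    """
    eps = 1e-6
    def hist(xs: list[float]) -> list[float]:
        counts = [0] * bins
        for x in xs:
            counts[min(int(x * bins), bins - 1)] += 1
        total = len(xs)
        return [c / total + eps for c in counts]
    e, o = hist(expected), hist(observed)
    return sum((oi - ei) * math.log(oi / ei) for ei, oi in zip(e, o))

# Identical distributions give a PSI near zero
reference = [i / 100 for i in range(100)]
assert psi(reference, reference) < 1e-6

# A shifted live distribution drives the PSI well above the alert threshold
shifted = [min(x + 0.3, 0.999) for x in reference]
print(f"PSI after shift: {psi(reference, shifted):.3f}")
```

Running this check on a rolling window of production scores, and logging breaches as post-market surveillance signals, ties drift detection directly into the incident-reporting duties described above.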
## Key takeaways
- All AI-based SaMD in MDR classes IIa through III are automatically high-risk under the EU AI Act
- The AI Act applies in parallel with MDR, not as a replacement
- New obligations cover data governance, transparency, human oversight, and AI-specific risk management
- Technical documentation must satisfy both MDR Annex II/III and AI Act Annex IV
- One conformity assessment covers both frameworks, conducted by your Notified Body
- New devices must comply from August 2, 2027; devices already on the market must comply once their AI system is significantly modified
- Start preparing now because Notified Body capacity is already constrained
## How RegAid helps
RegAid covers the full text of the EU AI Act 2024/1689 alongside MDR 2017/745 and all MDCG guidance on software and AI. Ask "What data governance requirements apply to my AI diagnostic under the AI Act?" and get a cited answer linking Article 10 requirements to your MDR technical documentation obligations.
