February 24, 2026
As digital health matures, software is no longer peripheral to medical technology; it is the medical technology. From AI-driven diagnostic tools to clinical decision support apps, software is increasingly influencing how clinicians diagnose, monitor and treat patients.
Yet under the UK’s current regulatory framework, many innovative software-based medical devices, or Software as a Medical Device (SaMD) products, can still be placed on the Great Britain market as Class I devices, while the same products would typically fall into Class IIa or higher under the EU regime.
This divergence between the Medicines and Healthcare products Regulatory Agency (MHRA) framework and the European Union Medical Device Regulation (EU MDR) creates a significant regulatory anomaly: one with commercial advantages, but also potential patient safety implications.
The UK currently operates under the Medical Devices Regulations 2002, which are derived from older EU Directives. Under this system, many standalone software products may qualify as Class I, meaning manufacturers can self-declare conformity without mandatory third-party assessment.
In contrast, the EU MDR introduced Rule 11 (Annex VIII), specifically targeting software. Under Rule 11:
- Software intended to provide information used to take decisions with diagnostic or therapeutic purposes is at least Class IIa.
- It rises to Class III where those decisions may cause death or an irreversible deterioration of a person’s state of health, and to Class IIb where they may cause a serious deterioration of health or a surgical intervention.
- Software intended to monitor physiological processes is Class IIa, or Class IIb where it monitors vital physiological parameters whose variations could result in immediate danger to the patient.
- All other software is Class I.
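The Rule 11 logic above is essentially a decision tree, and can be sketched as one. The field names below are illustrative assumptions, not terms from the regulation itself; this is a simplified model for intuition, not a substitute for reading Annex VIII.

```python
from dataclasses import dataclass

@dataclass
class SamdProfile:
    """Hypothetical description of a software device (illustrative fields)."""
    informs_clinical_decisions: bool         # info used for diagnosis or therapy
    may_cause_death_or_irreversible_harm: bool
    may_cause_serious_harm_or_surgery: bool
    monitors_physiology: bool
    monitors_vital_parameters: bool          # variations could be immediately dangerous

def classify_rule_11(device: SamdProfile) -> str:
    # Software driving diagnostic or therapeutic decisions: at least Class IIa,
    # escalating with the severity of the potential consequences.
    if device.informs_clinical_decisions:
        if device.may_cause_death_or_irreversible_harm:
            return "Class III"
        if device.may_cause_serious_harm_or_surgery:
            return "Class IIb"
        return "Class IIa"
    # Software monitoring physiological processes: IIa, or IIb for vital parameters.
    if device.monitors_physiology:
        return "Class IIb" if device.monitors_vital_parameters else "Class IIa"
    # All other software: Class I.
    return "Class I"
```

On this model, a triage tool whose missed findings could require surgical intervention would land in Class IIb, while a simple wellness logger that neither informs decisions nor monitors physiology falls to Class I — which is precisely why so few SaMD products remain Class I under the MDR.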
The difference is not academic: it fundamentally changes the level of pre-market scrutiny. When the MDR came into force, many SaMD products were ‘up-classified’ to Class IIa. Very few SaMD products now fall into Class I.
The problem can be partly ascribed to the wording of the MDD rules. SaMD products were perhaps not envisaged at the last update of the MDD (2007); consequently, the language does not adequately address standalone software.
1. AI-Assisted Radiology Triage Tool
Imagine a cloud-based algorithm that analyses chest X-rays and flags potential lung nodules for radiologist review.
Risk consideration:
If algorithm performance is suboptimal or biased, the absence of mandatory independent review could increase the likelihood of diagnostic delay or misinterpretation before issues are detected through post-market surveillance.
2. Clinical Decision Support App for Therapy Guidance
Consider a mobile app used in primary care that recommends physiotherapy based on patient-entered symptoms and history.
Potential patient safety impact:
Inaccurate algorithms or outdated clinical logic could lead to under-treatment or over-treatment, placing the patient at risk. The EU MDR requires third-party conformity assessment; UK Class I self-declaration does not.

Classification determines the depth of pre-market scrutiny, the evidence a manufacturer must generate, and who reviews it.
For Class I devices in the UK, the manufacturer self-declares conformity, compiles its own technical documentation, and registers the device with the MHRA; no independent body examines the product before it reaches the market.
For Class IIa in the EU, a notified body must assess the manufacturer’s quality management system and technical documentation before the CE mark can be affixed.
The difference in oversight is substantial.
The Strategic and Ethical Tension
From a commercial standpoint, the UK pathway can offer faster time to market, lower conformity assessment costs, and no dependence on constrained notified body capacity.
For early-stage digital health companies, this can be attractive.
However, it raises broader questions: is self-declaration proportionate for software that influences clinical decisions, and does the divergence create an incentive to launch in Great Britain first precisely because scrutiny is lighter?
It is important to emphasise that Class I does not mean unsafe. Many manufacturers voluntarily apply high standards regardless of classification. But the absence of mandatory independent review reduces a layer of systemic safeguard.
What This Means for Digital Health Developers
Even where UK classification permits Class I self-declaration, forward-thinking manufacturers should consider:
✔ Designing evidence generation to meet EU MDR Class IIa standards
✔ Building robust clinical validation strategies early
✔ Implementing strong post-market performance monitoring
✔ Avoiding “regulatory arbitrage” positioning
Products influencing clinical decisions should be engineered to withstand higher scrutiny, because eventually they may face it.
Looking Ahead
The MHRA has signalled ongoing reform of UK medical device regulation, including software and AI oversight. It is likely that alignment pressures, both international and domestic, will narrow this gap over time.
Until then, the UK–EU classification divergence remains a defining feature of the regulatory landscape for digital health.
For innovators, it presents opportunity. For regulators and healthcare providers, it raises important questions about proportionality and patient protection.
The real challenge lies in ensuring that speed to market does not outpace safeguards, particularly when algorithms, not humans, are shaping clinical decisions.

