Discover how Europe’s new 2025 framework will evaluate AI, SaMD and digital medical devices, with harmonised HTA, evidence standards and pathways to reimbursement.
Artificial intelligence is now at the centre of modern medical innovation — powering diagnostics, clinical decision support, risk prediction, digital therapeutics, and a new generation of Software as a Medical Device (SaMD). But while AI medical devices and digital medical technologies are advancing at extraordinary speed, the systems used to evaluate, validate, and reimburse them in Europe have not kept pace.
Today, innovators face a patchwork of national rules, inconsistent HTA requirements, variable evidence standards, and unclear pathways for reimbursement of AI and SaMD. This fragmentation slows down patient access, creates huge uncertainty for developers, and makes it difficult for European health systems to safely adopt high-value digital tools.
In 2025, that changes.
Europe is launching a new unified framework for evaluating AI-driven medical devices, digital medical devices (DMDs), and SaMD, aligned directly with the upcoming EU HTA Regulation (2025). Developed through the European Taskforce for the Evaluation of Digital Medical Devices, co-led by the French Ministry of Health’s DNS, EUnetHTA, and EIT Health, this framework introduces the first coordinated European approach to AI and SaMD assessment.
The new 2025 framework integrates advanced scientific thinking from recent SaMD research, AI performance validation studies, and real-world evidence (RWE) methodologies, including:
- algorithmic transparency and explainability
- model drift monitoring and lifecycle evaluation
- clinical utility and comparative effectiveness
- human-factors and safety engineering
- data quality and reproducibility standards
- economic value demonstration for reimbursement
Its purpose is clear:
to create a harmonised, evidence-based system that accelerates safe adoption of AI medical devices and provides innovators with a predictable, science-driven path to European reimbursement.
For the first time, AI technologies, SaMD, and digital therapeutics will be evaluated under a pan-European scientific framework — one that reflects how these technologies actually operate, evolve, and influence clinical outcomes.
Why Europe Needs a New Evaluation Framework for Digital Health

Across the scientific literature, one theme appears repeatedly: traditional health technology assessment (HTA) models — designed for pharmaceuticals and hardware devices — cannot keep pace with AI, SaMD, and digital medical devices (DMDs). Digital technologies operate through rapid iteration, continuous learning, behavioural engagement, and real-world adaptation. Static, single-timepoint evaluation models simply do not reflect how these systems function in clinical practice.
Recent research in digital health evaluation science highlights several critical limitations. Studies show that DMDs and AI medical devices require:
• New clinical endpoints beyond traditional RCT measures
Outcomes must capture behavioural change, patient engagement, adherence, continuous monitoring, and real-world utilisation, which are essential for digital therapeutics and algorithmic care pathways.
• Continuous, lifecycle-based evaluation — not single approvals
AI systems evolve over time. Evidence shows that digital devices must be assessed through dynamic evaluation, post-market performance monitoring, model drift detection, and long-term real-world evidence (RWE).
• AI-specific validation methodologies
Scientific papers emphasise the need for algorithmic transparency, bias assessment, generalisation testing, explainability, training-data quality review, and continuous model recalibration.
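To make the bias-assessment idea concrete, here is a minimal sketch of subgroup performance testing for a binary classifier. All data, group labels, and thresholds are illustrative placeholders, not part of any official guidance: the point is simply that per-subgroup metrics can reveal performance gaps that an aggregate score hides.

```python
# Minimal sketch of subgroup (bias) testing for a binary classifier.
# All records and group names below are illustrative placeholders.

def sensitivity(y_true, y_pred):
    """True-positive rate: TP / (TP + FN)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp / (tp + fn) if (tp + fn) else 0.0

def subgroup_gap(records):
    """records: list of (group, y_true, y_pred) tuples.
    Returns per-group sensitivity and the largest absolute gap
    between any two groups -- a simple fairness screen."""
    groups = {}
    for g, t, p in records:
        groups.setdefault(g, ([], []))
        groups[g][0].append(t)
        groups[g][1].append(p)
    sens = {g: sensitivity(t, p) for g, (t, p) in groups.items()}
    vals = list(sens.values())
    gap = max(vals) - min(vals) if vals else 0.0
    return sens, gap
```

In practice a validation dossier would report several such metrics (specificity, calibration, predictive values) per clinically relevant subgroup; this sketch shows only the structure of the check.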
• Broader socio-economic assessment for reimbursement
Digital medical devices produce value not only in clinical outcomes but also in:
- workflow efficiency
- resource optimisation
- clinical throughput
- productivity gains
- earlier detection and intervention
- reduced emergency utilisation
Evidence shows these metrics must be part of formal HTA and reimbursement decisions.
• Taxonomies tailored to digital modalities
Leading authors argue that HTA must differentiate between:
- AI diagnostic algorithms
- Software as a Medical Device (SaMD)
- digital therapeutics (DTx)
- clinical decision support systems (CDSS)
- remote monitoring and sensor-based platforms
- predictive analytics models
Each category requires distinct evidence, validation strategies, and economic logic.
Scientific Evidence Supporting Reform
The need for a new framework has been repeatedly documented in the academic literature.
Key studies — Tarricone et al. (2024), Gomes et al. (2022), Wilkinson et al. (2024), Alber et al. (2025) — demonstrate:
- systemic fragmentation across Member States
- inconsistent evidence standards for SaMD and AI
- variable approaches to RWE, safety, clinical utility, and reimbursement
- the absence of harmonised digital health HTA criteria
- the need for a pan-European scientific framework aligned with EU policy
These publications converge on one message: Europe cannot achieve safe, scalable digital health adoption without a continent-wide evidence model.
The EU Taskforce: Converting Science into Policy
For the first time, the European Taskforce for the Evaluation of Digital Medical Devices, led by DNS (France), EUnetHTA, and EIT Health, is attempting to translate this scientific consensus into policy, guidance, and a unified evaluation framework.
This work aligns directly with the EU HTA Regulation (2025) and represents Europe’s most serious effort to modernise how AI, SaMD, and digital medical devices are assessed for clinical value and reimbursement.
1. Who Is Leading the EU Taskforce for Digital Medical Device Evaluation?
The new European Taskforce for the Evaluation of Digital Medical Devices — the most ambitious digital health harmonisation initiative ever launched in Europe — is chaired by the French Ministerial Delegation for Digital Health (DNS). It is co-chaired by EUnetHTA, coordinated by EIT Health, and supported scientifically by the University of Luxembourg’s Digital Medicine Unit.
This consortium brings together Europe’s leading regulators, HTA agencies, AI experts, and SaMD researchers, including contributors from:
- France — DNS, Haute Autorité de Santé (HAS), Inserm, Université Paris Cité, Inria
- Germany — BfArM (DiGA and DiPA evaluation units)
- Luxembourg — University of Luxembourg, digital evidence labs
- Belgium — mHealthBelgium framework teams
- Austria — AI-driven diagnostics and HTA contributors
- Denmark, Finland, Italy, Spain — national digital HTA units and evidence-assessment bodies
The taskforce reflects a shared commitment to building a pan-European scientific framework for evaluating AI medical devices, SaMD, digital therapeutics (DTx), remote monitoring tools, predictive algorithms, and hybrid digital-clinical technologies.
2. The Three Work Packages: Building Europe’s Unified Framework for AI, SaMD and Digital Medical Devices
WP1 — Creating a European Taxonomy for Digital Medical Devices (DMDs)
A unified taxonomy is foundational for consistent evidence generation and reimbursement.
WP1 is developing a function-based, risk-based, and AI-specific classification system that will apply across all EU Member States. This includes:
- categorising AI medical devices, SaMD, DTx, decision-support algorithms, monitoring platforms, and hybrid models
- identifying algorithmic complexity, data dependency, and training data requirements
- mapping interoperability layers, cybersecurity expectations, and lifecycle characteristics
- defining evidence requirements by risk class, intended use, and degree of autonomy
This work aligns closely with recent Nature Digital Medicine proposals for DMD classification grids and emerging scientific taxonomies for SaMD evaluation.
WP2 — Harmonising Clinical and Methodological Evidence Requirements Across Europe
WP2 is gathering methodological and clinical evidence expectations from all 27 Member States — the first systematic mapping of digital HTA requirements in EU history.
Key methodological domains include:
- AI algorithm validation (performance metrics, generalisation, calibration, bias testing)
- explainability and transparency requirements for ML/AI models
- RWE and real-world performance monitoring
- usability, human-factors engineering and patient engagement
- data quality, provenance, reproducibility and auditability
- clinical-behavioural hybrid endpoints specific to digital therapeutics and SaMD
- continuous monitoring, drift detection, and post-market algorithm surveillance
This reflects the growing scientific consensus that digital medical devices require dynamic, lifecycle-based evaluation, integrating both clinical outcomes and behavioural/interaction-based endpoints.
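Drift detection, one of the lifecycle-monitoring domains listed above, can be sketched with a Population Stability Index (PSI), a common rule-of-thumb statistic for comparing a model's live input or score distribution against its validation-time reference. The binning scheme and the conventional "PSI > 0.2 indicates significant drift" threshold used here are industry heuristics, not taskforce or regulatory requirements.

```python
import math

def psi(expected, actual, bins=4):
    """Population Stability Index between a reference score sample
    (expected) and a live sample (actual). Bin edges are taken from
    the reference distribution's quantiles; bin shares are smoothed
    to avoid log(0). PSI near 0 means stable; larger values suggest
    the live population has shifted away from the reference."""
    exp_sorted = sorted(expected)
    edges = [exp_sorted[int(len(exp_sorted) * i / bins)]
             for i in range(1, bins)]

    def bucket(sample):
        counts = [0] * bins
        for x in sample:
            counts[sum(1 for e in edges if x >= e)] += 1
        n = len(sample)
        return [(c + 0.5) / (n + 0.5 * bins) for c in counts]

    e, a = bucket(expected), bucket(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))
```

A post-market surveillance pipeline would compute this routinely on incoming data and trigger recalibration review when the index crosses an agreed threshold.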
WP3 — Integrating Socio-Economic Value into European Digital HTA
Traditional cost-utility analysis cannot capture the full value of digital health technologies. WP3 is developing a European socio-economic evaluation framework specifically tailored to AI, SaMD, and DMDs.
This framework accounts for:
- reductions in hospital, emergency, and GP utilisation
- earlier diagnosis and accelerated intervention
- workforce optimisation and productivity gains
- algorithm-driven triage efficiencies
- improved remote monitoring and chronic-disease management
- impact on caregivers, equity, and health-system resilience
The methodological approach mirrors the economic models proposed by Wilkinson, Santos, Gensorowsky, and leading HEOR groups who argue that digital health requires multi-dimensional economic assessment, not solely QALYs or ICERs.
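The multi-dimensional logic can be illustrated with a toy net-value calculation. Every figure and dimension below is a made-up placeholder, not a taskforce-endorsed parameter or tariff; the sketch only shows how avoided-cost dimensions beyond QALYs might be aggregated against a device's annual cost.

```python
# Illustrative multi-dimensional value sketch for a digital medical
# device. All inputs are hypothetical placeholders.

def annual_net_value(avoided_ed_visits, cost_per_ed_visit,
                     clinician_hours_saved, cost_per_hour,
                     device_annual_cost):
    """Net annual value = summed avoided-cost dimensions
    (emergency utilisation, workforce time) minus device cost."""
    avoided_costs = (avoided_ed_visits * cost_per_ed_visit
                     + clinician_hours_saved * cost_per_hour)
    return avoided_costs - device_annual_cost
```

A real socio-economic dossier would add many more dimensions (equity, caregiver burden, system resilience) and model uncertainty around each input; the structure, not the numbers, is the point here.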
3. Scientific Oversight: The External Advisory Group (EAG)

The Taskforce is supported by an independent External Advisory Group, coordinated by Bocconi University, a leading centre for digital health economics and HTA methodology.
The EAG brings together:
- academic experts in AI validation, digital HTA, and SaMD evaluation
- clinicians specialising in digital-enabled care pathways
- HTA methodologists and health economists
- regulators and industry representatives with significant SaMD experience
Its purpose is to ensure that:
- all methodologies are scientifically robust and future-proof
- the framework is tested using real AI/SaMD case studies
- standards align with EU HTA Regulation 2025
- evaluation criteria reflect the latest evidence-generation science
The first formal meeting of the EAG takes place at Bocconi University, Milan, on 15 November, marking the beginning of a coordinated European validation phase.
4. Expected Impact: Europe’s New Model for Digital HTA
The EU Taskforce will deliver its consolidated scientific and policy recommendations in Q1 2025, including:
- a unified EU taxonomy for digital medical devices
- aligned clinical, technical, and methodological evidence standards
- a European socio-economic evaluation matrix for digital health
- guidance for EU HTA Regulation 2025 and digital-specific assessments
- templates for cross-border evidence recognition, bilateral agreements, and joint reviews
If adopted, this would represent the first harmonised, real-world-ready HTA model for AI and digital medical devices in Europe — reducing fragmentation between:
- PECAN / RIHN 2.0 (France)
- DiGA / DiPA (Germany)
- mHealthBelgium
- Spain’s regional digital pilots
- Italy’s regional HTA structures
- Nordic digital HTA agencies
For innovators, this means predictable evidence requirements, faster reimbursement, and EU-wide scalability of digital health solutions.
For patients and health systems, it means safer AI deployment, greater transparency, and faster access to clinically meaningful digital tools.
References
Peer-Reviewed Scientific Literature
1. Tarricone et al. (2024). Harmonising Assessment of Digital Medical Devices. npj Digital Medicine.
Keywords: AI medical devices, SaMD evaluation, EU HTA, digital health regulation
Official journal link:
https://www.nature.com/npjdigitmed/
2. Gomes, Murray & Raftery (2022). Methodological Challenges in Evaluating Digital Health Interventions. Pharmacoeconomics.
Keywords: digital HEOR, economic evaluation, reimbursement barriers
Official journal link:
https://link.springer.com/journal/40273
3. Santos et al. (2025). Economic Evaluation of Digital Health Technologies. BMJ Open.
Keywords: digital cost-effectiveness, real-world evidence, socio-economic value
BMJ Open homepage (BMJ Open does not provide pre-release URLs):
https://bmjopen.bmj.com/
4. Wilkinson et al. (2024). Framework for Economic Evaluation of Digital Health Interventions. Oxford Open Digital Health.
Keywords: digital HTA, modelling frameworks, hybrid endpoints
Journal link:
https://academic.oup.com/oodh
5. Alber et al. (2025). State of Economic Evaluation in DiGA. Journal of Medical Internet Research (JMIR).
Keywords: Germany DiGA reimbursement, mHealth economics, HTA methods
JMIR homepage:
https://www.jmir.org/
6. Gensorowsky et al. (Year). Value-Based Pricing of Digital Health Applications (DiGA). Health Economics Review.
Keywords: value-based pricing, DiGA policy, digital reimbursement
Journal link:
https://healtheconomicsreview.biomedcentral.com/
7. Freitag et al. (2024). Economic Modelling of mHealth Tools Under the German DiGA Framework. npj Digital Medicine.
Keywords: digital therapeutics, cost modelling, DiGA validation
npj Digital Medicine issue list:
https://www.nature.com/npjdigitmed/articles
European Policy, Digital Health, and HTA Frameworks
8. EDiHTA Project (2025). European Digital Health Technology Assessment Programme.
Keywords: EU digital HTA framework, AI medical device policy, EU 2025 harmonisation
Official project page:
https://eupha.org/digital-health
9. European Digital Medicine Conference (Luxembourg).
Keywords: European AI policy, digital medicine research, EU HTA taskforce
Official event site:
https://digitalmedicineconference.uni.lu/
10. G_NIUS – France’s National Digital Health Innovation Portal.
Keywords: PECAN, RIHN 2.0, French digital health reimbursement, regulatory pathways
Official French homepage:
https://gnius.esante.gouv.fr/fr
Official English homepage:
https://gnius.esante.gouv.fr/en/international-digital-health-systems
11. EUnetHTA – European Network for Health Technology Assessment.
Keywords: EU HTA Regulation 2025, joint clinical assessments, digital HTA
Official portal:
https://www.eunethta.eu/
12. BfArM – Federal Institute for Drugs and Medical Devices (Germany) – DiGA and DiPA Frameworks.
Keywords: digital therapeutics, DiGA Fast Track, Germany digital HTA
Official DiGA portal:
https://www.bfarm.de/EN/MedicalDevices/DiGA/_node.html
13. Haute Autorité de Santé (HAS) – France Digital Medical Device Evaluation Guidance.
Keywords: PECAN, RIHN, AI medical device evaluation, French HTA
Official HAS website:
https://www.has-sante.fr/
Digital Health Frameworks, Standards, and Scientific Bodies
14. IMDRF – Software as a Medical Device (SaMD) Working Group.
Keywords: SaMD definition, AI lifecycle evaluation, risk frameworks
Official documentation:
https://www.imdrf.org/working-groups/samd
15. European Commission – EU HTA Regulation (2021/2282) Documentation.
Keywords: EU HTA 2025, joint clinical assessments, digital medical devices
Official regulation:
https://health.ec.europa.eu/health-technology-assessment_en
16. European Health Data Space (EHDS) – Digital Health and Interoperability Policy.
Keywords: data governance, real-world evidence, AI training datasets
Official site:
https://health.ec.europa.eu/ehealth-digital-health-and-care/european-health-data-space_en
FAQ
1. What are digital medical devices?
Digital medical devices include AI medical devices, Software as a Medical Device (SaMD), digital therapeutics (DTx), clinical decision-support algorithms, remote monitoring platforms, and predictive analytics tools. They deliver a medical function through software and require dedicated clinical, technical, and socio-economic evaluation.
2. Why does Europe need a new 2025 framework for evaluating digital medical devices?
Europe currently uses fragmented national HTA systems, which results in inconsistent evidence requirements and slow reimbursement. The new 2025 EU framework introduces harmonised evaluation criteria for AI, SaMD, and digital medical devices to improve safety, transparency, scalability, and patient access.
3. How will AI medical devices and SaMD be evaluated under the new EU HTA Regulation?
The 2025 framework incorporates AI-specific validation, including algorithm transparency, bias testing, drift monitoring, real-world evidence collection, behavioural endpoints, and data-quality standards. SaMD and digital tools will be evaluated continuously across their lifecycle rather than via a single approval.
4. Will the new framework change how digital medical devices are reimbursed in Europe?
Yes. The new system introduces a pan-European socio-economic evaluation matrix that considers workflow efficiency, earlier intervention, workforce optimisation, patient engagement, and long-term outcomes. This enables more consistent and predictable reimbursement pathways for AI and digital medical devices across Member States.
5. Who is leading the EU Taskforce for the evaluation of digital medical devices?
The taskforce is chaired by the French Ministerial Delegation for Digital Health (DNS), co-chaired by EUnetHTA, coordinated by EIT Health, and supported scientifically by the University of Luxembourg. Contributors include HTA bodies from France, Germany (BfArM), Belgium, Italy, Spain, and the Nordic countries.
6. What evidence will digital medical devices need under the new 2025 framework?
Developers will need to provide:
- clinical evidence and comparative effectiveness
- algorithm validation and transparency documentation
- real-world evidence (RWE)
- usability and human-factors data
- socio-economic impact analysis
- interoperability and cybersecurity documentation
These requirements apply across AI, SaMD, DTx, and monitoring technologies.
7. When will the new EU digital medical device framework be implemented?
The consolidated taskforce recommendations will be delivered in Q1 2025, alongside alignment with the EU HTA Regulation. Member States are expected to begin integrating the framework throughout 2025–2026.
FAQ (Part 2)
1. How does the new EU framework classify different types of digital medical devices?
The 2025 framework introduces a European taxonomy for digital medical devices, distinguishing AI algorithms, SaMD, digital therapeutics, remote monitoring tools, decision-support systems, and hybrid models. Classifications are based on risk level, intended use, autonomy, data dependency, and clinical impact.
2. What role does real-world evidence (RWE) play in evaluating digital medical devices?
Under the new framework, real-world evidence becomes essential. Digital medical devices must demonstrate safety, performance, engagement, and long-term clinical utility through ongoing data collection, post-market monitoring, and algorithm drift analysis.
3. How will AI transparency and explainability be reviewed in the EU evaluation process?
AI medical devices must provide documentation on explainability, training datasets, model behaviour, bias prevention, generalisability, and performance over time. These transparency requirements help regulators understand how predictions are generated and ensure patient safety.
4. Will national systems like DiGA, PECAN, RIHN 2.0 and mHealthBelgium be replaced?
No — national systems will remain. However, the new EU 2025 framework aims to harmonise core evidence requirements, reducing duplication and creating pathways for mutual recognition. In practice, this can accelerate reimbursement across France, Germany, Belgium, Spain, Italy, and the Nordics.
5. How does the framework support faster reimbursement for digital medical devices?
By defining standardised evidence, clinical endpoints, socio-economic metrics, and AI-specific validation rules, the framework reduces uncertainty for both HTA bodies and payers. This creates more predictable reimbursement timelines, fewer appeals, and smoother cross-border adoption.
6. What makes digital medical devices different from traditional medical devices in HTA?
Digital medical devices evolve continuously after market entry. AI models retrain, algorithms adapt, and patient engagement affects outcomes. The framework recognises these dynamics and introduces lifecycle evaluation, continuous monitoring, and hybrid clinical-behavioural endpoints that traditional devices do not require.
7. How does interoperability affect the evaluation of digital medical devices?
Interoperability is now a core requirement. Digital devices must demonstrate compliance with EU data standards, API protocols, cybersecurity layers, data protection rules, and EHR integration requirements. Interoperability affects safety, clinical utility, and socio-economic value.
8. How does the EU ensure that digital medical device evaluation remains future-proof?
The framework integrates continuous revision cycles, AI drift monitoring, real-world performance updates, stakeholder feedback loops, and scientific oversight through an External Advisory Group coordinated by leading academic institutions. This helps adapt evaluation rules to rapid technological evolution.
Glossary of Key Terms
AI Medical Device
A software-based or algorithm-driven medical device that uses artificial intelligence or machine learning to support diagnosis, prediction, monitoring, triage, or treatment decisions. AI medical devices require continuous validation, transparency, bias testing, and drift monitoring.
Software as a Medical Device (SaMD)
Software that performs a medical function without being part of a hardware device. Includes digital diagnostics, decision-support algorithms, and digital therapeutics. SaMD has specific regulatory and evidence requirements under EU MDR, IMDRF, FDA, MHRA, and now the EU’s 2025 framework.
Digital Medical Device (DMD)
A broad category covering AI systems, SaMD, digital therapeutics (DTx), clinical decision-support systems, mobile health tools, and remote monitoring platforms. Digital medical devices require hybrid clinical-behavioural evidence, socio-economic assessment, interoperability checks, and real-world evidence (RWE).
Digital Therapeutics (DTx)
Evidence-based digital interventions that produce therapeutic outcomes through software. Used in chronic disease management, mental health, rehabilitation and behaviour change. Must demonstrate engagement, adherence, and real-world clinical effectiveness.
Clinical Decision Support System (CDSS)
Algorithmic or rule-based tools that support clinicians with diagnosis, risk prediction, treatment selection, or triage. CDSS must demonstrate accuracy, transparency, usability, and clinical utility.
Real-World Evidence (RWE)
Evidence generated from routine clinical use, including patient-reported data, EHRs, sensors, and remote monitoring. Essential for evaluating digital medical devices, as AI models evolve after launch and require continuous monitoring for safety and performance.
Algorithmic Drift (Model Drift)
A change in AI model performance over time due to new data, shifting populations, or context variation. The 2025 EU framework requires routine drift monitoring and lifecycle evaluation.
Interoperability
The ability of a digital medical device or SaMD to exchange, interpret, and use data across EHR systems, APIs, clinical platforms, and national data infrastructures. Critical for safety, data quality, workflow integration, and socio-economic value assessment.
EU HTA Regulation (2021/2282)
A regulation that becomes fully applicable in 2025, harmonising how Member States evaluate the clinical effectiveness of medical technologies. The new digital medical device framework aligns directly with this regulation.
Health Technology Assessment (HTA)
A multidisciplinary evaluation of clinical effectiveness, safety, cost-effectiveness, socio-economic value, and system impact. Digital medical devices need HTA methods adapted for AI, SaMD, behavioural endpoints, and real-world evidence.
Socio-Economic Evaluation (SEE)
Assessment of the broader value of digital medical devices, including efficiency gains, productivity improvements, avoided costs, workforce optimisation, and earlier intervention. Traditional QALY models are insufficient for digital tools, which is why SEE is central to the 2025 framework.
PECAN / RIHN 2.0 (France)
France’s evolving digital medical device and diagnostic reimbursement pathways, managed by HAS. The new EU framework aims to harmonise evidence expectations with PECAN and RIHN.
DiGA / DiPA (Germany)
Germany’s fast-track pathways for reimbursing digital health apps and digital care applications. Frequently used as a reference model for evidence generation and economic evaluation.
mHealthBelgium
Belgium’s structured validation and reimbursement model for mobile health applications (M1–M3). The new EU framework supports cross-recognition of evidence used in mHealthBelgium.