
Regulatory Landscape for Medical AI: FDA, CE Marking, and Ukrainian MHSU

Article #6 in Medical ML for Ukrainian Doctors Series

By Oleh Ivchenko | Researcher, ONPU | Stabilarity Hub | February 8, 2026


📋 Key Questions Addressed

  1. How do FDA, EU, and Ukrainian regulatory frameworks differ in their approach to AI-enabled medical devices?
  2. What are the pathways for market authorization of medical AI software in each jurisdiction?
  3. How can Ukrainian developers prepare for both domestic approval and international market access?

Context: Why Regulatory Understanding Matters

For ScanLab and any medical AI initiative targeting Ukrainian healthcare, regulatory compliance isn't optional; it's existential. Understanding the regulatory landscape determines:

  • Market access: Which markets can you legally enter?
  • Development priorities: What documentation and validation is required from day one?
  • Timeline and cost: Regulatory pathways vary from weeks to years and from thousands to millions in investment
  • Trust building: Physician adoption correlates strongly with regulatory approval status

As we noted in Article #5 on data requirements, medical AI development must build regulatory-ready documentation from the start. This article maps the three key regulatory environments relevant to Ukrainian medical AI developers.


1. United States: FDA Framework for AI/ML Medical Devices

The Regulatory Structure

The U.S. Food and Drug Administration (FDA) has been regulating medical devices since 1976 and has emerged as the global leader in AI-specific medical device regulation. The FDA defines Software as a Medical Device (SaMD) as software intended for medical purposes that performs those purposes without being part of a hardware medical device.

📊 Key Statistics

  • As of late 2025, the FDA has authorized over 1,200 AI/ML-enabled medical devices
  • Approximately 100 new approvals annually
  • Global SaMD market valued at $18.5 billion

Classification and Pathways

The FDA employs a risk-based classification system:

| Class | Risk Level | Examples | Pathway | Timeline |
|---|---|---|---|---|
| Class I | Low | Fitness trackers, simple displays | Generally exempt | N/A |
| Class II | Moderate | AI triage tools, ECG analysis | 510(k) Premarket Notification | ~90 days |
| Class III | High | Autonomous diagnostic AI, mammography AI | Premarket Approval (PMA) | ~180 days |
| Novel | Moderate (new) | First-of-kind AI tools | De Novo | ~60 days |

📌 Key Finding: A 2015-2020 study found that of 222 FDA-approved AI/ML devices, 204 (92%) used the 510(k) pathway, 15 used De Novo, and only 3 required full PMA.

The 2024-2025 AI/ML Action Plan

In December 2024, the FDA finalized groundbreaking guidance introducing the Total Product Life Cycle (TPLC) approach specifically designed for AI/ML devices:

Predetermined Change Control Plans (PCCP)

  • Manufacturers can pre-specify allowed modifications to AI algorithms
  • Updates within the approved PCCP scope don’t require re-approval
  • Addresses the fundamental tension between AI's adaptive nature and regulatory requirements for "locked" algorithms (a minimal change-scope check is sketched below)
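
To make the PCCP idea concrete, here is a minimal sketch of a change-scope check. It assumes a hypothetical PCCP record with an allowed-modification list, a performance guardrail, and a locked intended use; all field names and thresholds are illustrative, not an FDA-defined schema.

```python
# Hypothetical sketch of a PCCP scope check; field names and thresholds are
# illustrative assumptions, not an FDA-defined schema.
from dataclasses import dataclass

@dataclass
class PCCPScope:
    allowed_modifications: set      # e.g. {"retraining", "threshold_tuning"}
    max_auc_drop: float             # performance guardrail from the approved plan
    locked_intended_use: str        # intended use must not change under the PCCP

@dataclass
class ProposedUpdate:
    modification_type: str
    auc_delta: float                # new AUC minus baseline AUC on the locked test set
    intended_use: str

def within_pccp(scope: PCCPScope, update: ProposedUpdate) -> bool:
    """True if the update can ship under the approved PCCP; otherwise a new submission is needed."""
    return (
        update.modification_type in scope.allowed_modifications
        and update.auc_delta >= -scope.max_auc_drop
        and update.intended_use == scope.locked_intended_use
    )

scope = PCCPScope({"retraining", "threshold_tuning"}, max_auc_drop=0.02,
                  locked_intended_use="chest X-ray triage")
update = ProposedUpdate("retraining", auc_delta=-0.01, intended_use="chest X-ray triage")
print(within_pccp(scope, update))   # True: retraining stays within the pre-specified scope
```

The point is that the boundaries of permissible change are fixed and testable before deployment; anything outside them triggers a new regulatory submission.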

Good Machine Learning Practice (GMLP): 10 Principles (selected below)

  1. Representative and unbiased training datasets
  2. Robust cybersecurity practices
  3. Transparent communication with end-users about algorithm updates
  4. Documented data management and retraining protocols

Transparency Gaps

Despite progress, studies reveal significant transparency issues:

| Metric | Finding |
|---|---|
| Study design reported | 53.3% of devices |
| Training data size disclosed | 46.7% of devices |
| Demographic data included | 4.5% of devices |
| Randomized trial evidence | 1.6% of devices |
| Reported adverse events | 5.2% of devices |

⚠️ Critical Concern: One study found that 9.4% of approved AI/ML devices have been recalled, though one-third were subsequently re-approved.

Key FDA Definitions

| Term | Definition |
|---|---|
| SaMD | Software as a Medical Device: performs medical functions independently of hardware |
| SiMD | Software in a Medical Device: operates as part of a hardware device |
| CDS | Clinical Decision Support: displays information that supports, but does not direct, clinical judgment |

2. European Union: MDR and AI Act Dual Regulation

The Regulatory Duality Challenge

European medical AI developers face a unique challenge: compliance with two overlapping regulatory frameworks:

  1. Medical Device Regulation (EU MDR 2017/745) – sector-specific medical device law
  2. EU AI Act (Regulation 2024/1689) – horizontal AI legislation

"Companies risk being caught between overlapping regulatory frameworks." – MedTech Europe

EU AI Act Risk Classification

```mermaid
graph TD
A[AI System Assessment] --> B{Medical Device?}
B -->|Yes - Requires Notified Body| C[HIGH-RISK AI]
B -->|No| D{General Purpose?}
D -->|Yes| E[GPAI Requirements]
D -->|No| F{Manipulation Risk?}
F -->|Yes| G[PROHIBITED AI]
F -->|No| H[LOW-RISK AI]

C --> I[Full Compliance Obligations]
G --> J[Banned Outright]
E --> K[Transparency Requirements]
H --> L[Minimal Obligations]

style C fill:#dc3545,color:#fff
style G fill:#000,color:#fff
style E fill:#ffc107,color:#000
style H fill:#28a745,color:#fff
```

| Category | Examples | Requirements |
|---|---|---|
| Prohibited AI | Social credit scoring, manipulative AI | Banned outright |
| High-Risk AI | Medical devices requiring a notified body | Full compliance obligations |
| General-Purpose AI | Foundation models, LLMs | Transparency requirements |
| Low-Risk AI | Simple chatbots, spam filters | Minimal obligations |

🔑 Critical Point: Any medical device with AI that requires notified body involvement under MDR/IVDR is automatically classified as high-risk under the AI Act.
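
The decision flow in the diagram above can be written down directly. The sketch below follows the article's simplified four-way split only; the actual AI Act classification involves Annex III use cases and exemptions that are not modeled here.

```python
# Sketch of the simplified decision flow shown above; real AI Act classification
# has more branches (Annex III use cases, exemptions) than this illustration.
def classify_ai_system(needs_notified_body: bool,
                       is_general_purpose: bool,
                       has_manipulation_risk: bool) -> str:
    if needs_notified_body:        # medical device requiring a notified body under MDR/IVDR
        return "HIGH-RISK AI"      # full compliance obligations
    if is_general_purpose:
        return "GPAI"              # transparency requirements
    if has_manipulation_risk:
        return "PROHIBITED AI"     # banned outright
    return "LOW-RISK AI"           # minimal obligations

# A Class IIa AI triage tool assessed by a notified body under MDR:
print(classify_ai_system(needs_notified_body=True,
                         is_general_purpose=False,
                         has_manipulation_risk=False))   # HIGH-RISK AI
```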

Compliance Timeline

```mermaid
timeline
    title EU AI Act Compliance Milestones
    February 2025 : Prohibited AI practices enforced
    August 2025 : General-purpose AI obligations apply
    August 2026 : High-risk AI obligations (most systems)
    August 2027 : Full obligations for medical devices under MDR/IVDR
```

High-Risk AI Requirements (Beyond MDR)

  1. Risk Management System (Art. 9): AI-specific risk assessment aligned with MDR
  2. Data Governance (Art. 10): Requirements for training, validation, and testing datasets including:
    • Assessment of data availability, quantity, and suitability
    • Bias examination for health and safety impacts
    • Consideration of geographic, contextual, and demographic settings
  3. Technical Documentation (Art. 11): Comprehensive documentation of design, training, and validation
  4. Transparency (Art. 13): Clear disclosure of accuracy levels and limitations
  5. Human Oversight (Art. 14): Ensuring appropriate human control over AI outputs
  6. Quality Management System (Art. 17): AI QMS integrating with ISO 13485

CE Marking Pathway

```mermaid
graph TD
A[1. Classification<br/>MDR Annex VIII] --> B[2. Technical Documentation]
B --> C[3. Quality Management System]
C --> D[4. Conformity Assessment]
D --> E[5. EU Declaration of Conformity]
E --> F[6. CE Mark Affixation]
```


3. Ukraine: MHSU and EU Integration Path

Current Regulatory Framework

  • Ministry of Health of Ukraine (MHSU/MOH): Central executive body for technical regulation
  • State Service of Ukraine on Medicines and Drugs Control (SSMD): Market supervision authority
  • Technical Regulations: Based on Decrees No. 753, 754, 755 (October 2013)
⚡ Critical Limitation: Ukrainian regulations are currently aligned with older EU Directives (93/42/EEC, 98/79/EC, 90/385/EEC), not the current EU MDR. Harmonization with MDR is expected within approximately 2 years.

2025 Reform: State Regulatory Authority (SRA)

A major reform announced in 2025 establishes the State Regulatory Authority (SRA), a single regulator that will:

  • Grant market access for medicines, medical devices, substances of human origin, and cosmetics
  • Replace the current fragmented regulatory structure
  • Align more closely with EU institutional practices

Medical Device Classification in Ukraine

| Class | Assessment Route | Notes |
|---|---|---|
| Class I (non-sterile, no measuring) | Self-declaration | Notify SSMD only |
| Class I (sterile or measuring) | Notified body examination | QMS inspection required |
| Class IIa/IIb | QMS examination | Manufacturing site inspection |
| Class III | Design + QMS examination | Full technical file review |
| IVD (general) | Self-declaration | Lists A/B require notified body |

EC Certificate Recognition (Simplified Pathway)

Ukraine offers a simplified pathway for devices already CE-marked in the EU:

  • EC Certificate from EU notified body with mutual recognition agreement
  • Authorized representative in Ukraine (minimum 5-year commitment)
  • Ukrainian language labeling and instructions for use
  • Declaration of conformity for Ukrainian market

AI-Specific Considerations

🇺🇦 Current Status

Ukraine has no specific AI regulation for medical devices. AI-enabled devices are assessed under general medical device rules.

Practical Implications for ScanLab:

  1. Develop to EU AI Act standards from the outset (future-proofing)
  2. Maintain documentation that satisfies FDA requirements (optional but valuable)
  3. Leverage EC Certificate recognition pathway for faster Ukrainian market access
  4. Prepare for eventual MDR alignment in Ukrainian regulations

4. Comparative Analysis

Pathway Comparison

| Aspect | FDA (USA) | EU (MDR + AI Act) | Ukraine (MHSU) |
|---|---|---|---|
| Primary legislation | FD&C Act, 21 CFR | MDR 2017/745, AI Act 2024/1689 | Decrees 753-755 (2013) |
| AI-specific rules | ✅ Yes (TPLC, PCCP) | ✅ Yes (AI Act high-risk) | ❌ No |
| Adaptive algorithm support | ✅ Yes (PCCP) | 🔄 Emerging | ❌ No |
| Notified body required | No (FDA reviews directly) | Yes (Class IIa+) | Yes (Class IIa+) |
| Timeline (Class II equiv.) | ~90 days | 6-18 months | 3-6 months |
| Documentation burden | High | Very high (dual framework) | Moderate |

Market Access Strategy for Ukrainian Developers

```mermaid
graph LR
A1[ISO 13485 QMS] --> A2[MDR Documentation]
A2 --> A3[AI Act Annex IV]
A3 --> A4[GMLP Compliance]
B1[Notified Body Selection] --> B2[Conformity Assessment]
B2 --> B3[EU Market Entry]
C1[Authorized Representative] --> C2[EC Certificate Recognition]
```


5. Practical Implications for ScanLab

Immediate Actions

  1. Establish ISO 13485 QMS: Foundation for all regulatory pathways
  2. Document AI development per GMLP: Training data, validation, testing, all from project inception (see the sketch after this list)
  3. Design for transparency: Build explainability features that satisfy both clinical and regulatory needs
  4. Plan for adaptive algorithms: Implement change control processes compatible with FDA PCCP concept
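
As a concrete starting point for item 2, the sketch below shows one way a release record covering training data provenance, demographics, and validation results could be captured from day one. All field names and example values are assumptions for illustration, not a GMLP-mandated format.

```python
# Illustrative sketch only: capturing the facts GMLP and reviewers ask about
# (training data provenance, demographics, validation results). Field names
# and example values are assumptions, not a mandated format.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class TrainingDataRecord:
    source_sites: list
    n_images: int
    label_protocol: str        # how ground truth was established
    demographics: dict         # e.g. fraction female, fraction over 65

@dataclass
class ModelReleaseRecord:
    model_version: str
    intended_use: str
    training_data: TrainingDataRecord
    validation_auc: float
    validation_dataset: str    # ideally an external, held-out site
    known_limitations: list = field(default_factory=list)

record = ModelReleaseRecord(
    model_version="0.3.1",
    intended_use="chest X-ray triage (prioritization, not autonomous diagnosis)",
    training_data=TrainingDataRecord(
        source_sites=["Hospital A", "Hospital B"],
        n_images=48_000,
        label_protocol="two radiologists, disagreements adjudicated",
        demographics={"female": 0.52, "age_over_65": 0.31},
    ),
    validation_auc=0.91,
    validation_dataset="external site C, 5,200 studies",
    known_limitations=["portable X-rays underrepresented in training data"],
)
print(json.dumps(asdict(record), indent=2))   # version this alongside the model artifact
```

Keeping records like this under version control next to the model makes later FDA, MDR, or AI Act submissions largely a matter of assembling documents that already exist.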

Documentation Requirements Matrix

| Document | FDA | EU MDR | EU AI Act | Ukraine |
|---|---|---|---|---|
| Technical file | ✅ | ✅ | ✅ | ✅ |
| Risk management | ✅ | ✅ | ✅ | ✅ |
| Clinical evaluation | ✅ | ✅ | – | ✅ |
| Training data documentation | ✅ | – | ✅ | – |
| Bias assessment | ✅ | – | ✅ | – |
| Human oversight design | – | – | ✅ | – |
| Post-market surveillance | ✅ | ✅ | ✅ | ✅ |
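
Rendered as data, the matrix above can drive a simple gap check during submission planning. The required-document sets below transcribe the table; the helper function and identifiers around them are illustrative assumptions.

```python
# The requirement sets transcribe the matrix above; structure and names are
# illustrative assumptions for planning purposes only.
REQUIRED = {
    "FDA":       {"technical_file", "risk_management", "clinical_evaluation",
                  "training_data_documentation", "bias_assessment", "post_market_surveillance"},
    "EU_MDR":    {"technical_file", "risk_management", "clinical_evaluation",
                  "post_market_surveillance"},
    "EU_AI_Act": {"technical_file", "risk_management", "training_data_documentation",
                  "bias_assessment", "human_oversight_design", "post_market_surveillance"},
    "Ukraine":   {"technical_file", "risk_management", "clinical_evaluation",
                  "post_market_surveillance"},
}

def missing_documents(jurisdiction: str, prepared: set) -> set:
    """Documents still needed before submitting in the given jurisdiction."""
    return REQUIRED[jurisdiction] - prepared

prepared = {"technical_file", "risk_management", "clinical_evaluation"}
print(sorted(missing_documents("EU_AI_Act", prepared)))
# ['bias_assessment', 'human_oversight_design', 'post_market_surveillance',
#  'training_data_documentation']
```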

Regulatory Timeline Considerations for ScanLab

| Market | Estimated Timeline | Route |
|---|---|---|
| 🇺🇦 Ukrainian market | 6-12 months | via CE recognition |
| 🇪🇺 EU market | 12-24 months | full CE marking + AI Act |
| 🇺🇸 US market | 12-18 months | 510(k) pathway |


6. Open Questions for Future Research

  1. How will Ukraine’s planned SRA handle AI-specific medical device requirements?
  2. What will be the timeline for Ukrainian MDR harmonization?
  3. How are notified bodies preparing for joint MDR/AI Act assessments?
  4. What predicate devices exist for AI-based X-ray analysis under FDA 510(k)?
  5. How should developers handle the gap between AI Act requirements and current Ukrainian regulation?

Key Insights Summary

| Jurisdiction | Key Takeaway |
|---|---|
| 🇺🇸 FDA | Most mature AI-specific framework; PCCP enables adaptive algorithms; 510(k) pathway accessible |
| 🇪🇺 EU | Dual regulation challenge; AI Act adds significant burden on top of MDR; August 2027 deadline for medical devices |
| 🇺🇦 Ukraine | Currently based on older EU Directives; SRA reform underway; EC recognition provides efficient market access; no AI-specific rules yet |

🎯 For Ukrainian Medical AI Developers

The optimal strategy is develop once, deploy globally:

  1. Build to the highest common denominator (EU AI Act + MDR)
  2. Use this documentation base for FDA and Ukrainian submissions
  3. Leverage EC Certificate recognition for Ukrainian market speed
  4. Plan for eventual Ukrainian MDR harmonization

Questions Answered

✅ How do FDA, EU, and Ukrainian frameworks differ?

FDA leads in AI-specific guidance with TPLC/PCCP; EU creates dual regulatory burden with MDR + AI Act; Ukraine relies on older Directive-based rules with EC recognition pathway.

✅ What are the market authorization pathways?

FDA: 510(k)/De Novo/PMA; EU: CE marking via notified body; Ukraine: Direct assessment or EC Certificate recognition.

✅ How can Ukrainian developers prepare for international access?

Develop to EU AI Act + MDR standards; this documentation base supports all three markets with minimal adaptation.


Next in Series: Article #7 – US Experience: FDA-Approved AI Devices

Series: Medical ML for Ukrainian Doctors | Stabilarity Hub Research Initiative


Author: Oleh Ivchenko | ONPU Researcher | Stabilarity Hub
