Regulatory Landscape for Medical AI: FDA, CE Marking, and Ukrainian MHSU

Posted on February 8, 2026 (updated February 26, 2026) by Admin

📚 Academic Citation: Ivchenko, O. (2026). Regulatory Landscape for Medical AI: FDA, CE Marking, and Ukrainian MHSU. Medical ML Diagnosis Series. Odesa National Polytechnic University.
DOI: 10.5281/zenodo.14672187

Abstract

Navigating the regulatory landscape for medical AI requires understanding three distinct frameworks: the FDA’s mature Software as Medical Device (SaMD) pathway with over 1,200 approved AI/ML devices, the EU’s dual MDR/AI Act compliance burden, and Ukraine’s transitional system awaiting MDR harmonization. This analysis maps pathways for ScanLab and similar Ukrainian medical AI initiatives, identifying the optimal “develop once, deploy globally” strategy anchored on EU AI Act compliance. We examine the FDA’s innovative Predetermined Change Control Plans (PCCP) enabling adaptive algorithms, the EU’s August 2027 deadline for medical device AI compliance, and Ukraine’s EC Certificate recognition pathway for accelerated market access.

Keywords: Medical AI regulation, FDA SaMD, EU MDR, AI Act, MHSU, CE marking, Ukrainian healthcare


Context: Why Regulatory Understanding Matters

For ScanLab and any medical AI initiative targeting Ukrainian healthcare, regulatory compliance isn’t optional—it’s existential. Understanding the regulatory landscape determines:

  • Market access: Which markets can you legally enter?
  • Development priorities: What documentation and validation is required from day one?
  • Timeline and cost: Regulatory pathways vary from weeks to years and from thousands to millions in investment
  • Trust building: Physician adoption correlates strongly with regulatory approval status

1. United States: FDA Framework for AI/ML Medical Devices

The Regulatory Structure

The U.S. Food and Drug Administration (FDA) has regulated medical devices since 1976 and has emerged as the global leader in AI-specific medical device regulation. Following the IMDRF definition, the FDA treats Software as a Medical Device (SaMD) as software intended for one or more medical purposes that performs those purposes without being part of a hardware medical device.

```mermaid
pie title FDA AI/ML Device Approvals by Pathway (2015-2025)
    "510(k) Premarket" : 92
    "De Novo" : 6.8
    "PMA (Full Approval)" : 1.2
```

As of late 2025, the FDA has authorized over 1,200 AI/ML-enabled medical devices, with approximately 100 new approvals annually. The global SaMD market is valued at $18.5 billion.

The 2024-2025 AI/ML Action Plan

In December 2024, the FDA finalized groundbreaking guidance introducing the Total Product Life Cycle (TPLC) approach specifically designed for AI/ML devices. This includes:

```mermaid
flowchart TD
    A[Traditional Medical Device] -->|Static Algorithm| B[One-time Approval]
    C[AI/ML Medical Device] -->|Adaptive Algorithm| D[PCCP Framework]
    D --> E[Pre-specified Changes]
    D --> F[Continuous Learning]
    E --> G[No Re-approval Needed]
    F --> H[Monitored Updates]
```

Predetermined Change Control Plans (PCCP): Manufacturers can pre-specify allowed modifications to AI algorithms. Updates within the approved PCCP scope don’t require re-approval, addressing the fundamental tension between AI’s adaptive nature and regulatory requirements for “locked” algorithms.
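The gatekeeping logic of a PCCP can be sketched in a few lines: an update ships without re-submission only if every changed attribute falls inside the pre-approved envelope. The field names and thresholds below are illustrative assumptions, not terms from FDA guidance:

```python
# Illustrative sketch of a PCCP-style update gate. Field names and
# thresholds are hypothetical, not taken from FDA guidance.

# Pre-approved modification envelope, as it might be specified in a PCCP.
PCCP_SCOPE = {
    "retraining_data_source": {"same_modality", "same_population"},
    "max_sensitivity_drop": 0.02,   # tolerated regression vs. cleared baseline
    "architecture_change": False,   # architecture changes always need review
}

def update_within_pccp(update: dict, baseline_sensitivity: float) -> bool:
    """Return True if a proposed model update stays inside the PCCP scope."""
    if update["architecture_change"] and not PCCP_SCOPE["architecture_change"]:
        return False
    if update["data_source"] not in PCCP_SCOPE["retraining_data_source"]:
        return False
    drop = baseline_sensitivity - update["new_sensitivity"]
    return drop <= PCCP_SCOPE["max_sensitivity_drop"]
```

In this sketch, retraining on a pre-approved data source with a 0.01 sensitivity drop ships without re-approval, while any architecture change falls outside the envelope and triggers a new submission.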


2. European Union: MDR and AI Act Double Regulation

The Regulatory Duality Challenge

European medical AI developers face a unique challenge: compliance with two overlapping regulatory frameworks:

  1. Medical Device Regulation (EU MDR 2017/745) – sector-specific medical device law
  2. EU AI Act (Regulation 2024/1689) – horizontal AI legislation
```mermaid
graph TD
    A[AI Medical Device] --> B{Requires Notified Body?}
    B -->|Yes - Class IIa+| C[HIGH-RISK under AI Act]
    B -->|No - Class I| D[May be Lower Risk]
    C --> E[Full MDR Compliance]
    C --> F[Full AI Act Compliance]
    E --> G[Conformity Assessment]
    F --> G
    G --> H[CE Marking]
```

Critical Point: Any medical device with AI that requires notified body involvement under MDR/IVDR is automatically classified as high-risk under the AI Act.
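This classification rule reduces to a short decision function. The MDR class labels are real, but the mapping below is a simplified sketch of the linkage, not legal analysis:

```python
# Simplified sketch of the MDR-class -> AI Act risk linkage described above.
# Real classification requires legal analysis; this only encodes the rule of
# thumb: notified-body involvement under MDR => high-risk under the AI Act.

# Class I devices with measuring, sterile, or reusable-surgical functions
# (Im/Is/Ir) also require a notified body.
NOTIFIED_BODY_CLASSES = {"IIa", "IIb", "III", "Im", "Is", "Ir"}

def ai_act_risk(mdr_class: str, has_ai: bool = True) -> str:
    if not has_ai:
        return "outside AI Act scope"
    if mdr_class in NOTIFIED_BODY_CLASSES:
        return "high-risk"
    return "case-by-case (likely lower risk)"
```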

Compliance Timeline

```mermaid
gantt
    title EU AI Act Compliance Milestones
    dateFormat  YYYY-MM
    section Obligations
    Prohibited practices    :2025-02, 2025-08
    General-purpose AI      :2025-08, 2026-02
    High-risk systems       :2026-08, 2027-02
    Medical devices (MDR)   :2027-08, 2028-02
```

3. Ukraine: MHSU and EU Integration Path

Current Regulatory Framework

Ukraine’s medical device regulation is administered by:

  • Ministry of Health of Ukraine (MHSU/MOH): Central executive body for technical regulation
  • State Service of Ukraine on Medicines and Drugs Control (SSMD): Market supervision authority
  • Technical Regulations: Based on Decrees No. 753, 754, 755 (October 2013)

Critical Limitation: Ukrainian regulations are currently aligned with older EU Directives (93/42/EEC, 98/79/EC, 90/385/EEC), not the current EU MDR. Harmonization with MDR is expected within approximately 2 years.

```mermaid
flowchart LR
    subgraph Ukraine[Ukrainian Pathway]
    UA1[EC Certificate] --> UA2[Recognition]
    UA2 --> UA3[Ukrainian Market]
    end
    subgraph EU[EU Pathway]
    EU1[MDR Compliance] --> EU2[Notified Body]
    EU2 --> EU3[CE Marking]
    EU3 --> UA1
    end
    subgraph Direct[Direct Registration]
    D1[Local Testing] --> D2[SSMD Review]
    D2 --> D3[National Approval]
    end
```

EC Certificate Recognition (Simplified Pathway)

Ukraine offers a simplified pathway for devices already CE-marked in the EU:

  • EC Certificate from EU notified body with mutual recognition agreement
  • Authorized representative in Ukraine (minimum 5-year commitment)
  • Ukrainian language labeling and instructions for use
  • Declaration of conformity for Ukrainian market

4. Comparative Analysis

```mermaid
graph LR
    subgraph FDA[FDA - USA]
    F1[510k ~90 days]
    F2[De Novo ~150 days]
    F3[PMA ~180 days]
    end
    subgraph EU[EU MDR + AI Act]
    E1[Notified Body 6-18mo]
    E2[AI Act Compliance]
    end
    subgraph UA[Ukraine MHSU]
    U1[Direct 3-6mo]
    U2[EC Recognition 1-3mo]
    end
```

Key comparison points:

  • FDA: Most mature AI-specific framework; PCCP enables adaptive algorithms; 510(k) pathway accessible
  • EU: Dual regulation challenge; AI Act adds significant burden on top of MDR; August 2027 deadline
  • Ukraine: Based on older EU Directives; SRA reform underway; EC recognition provides efficient access

5. Strategic Recommendations for Ukrainian Developers

The optimal strategy for Ukrainian medical AI developers is “develop once, deploy globally”:

```mermaid
flowchart TD
    A[Development Phase] --> B[EU AI Act + MDR Standards]
    B --> C{Market Entry}
    C -->|Primary| D[Ukraine via EC Recognition]
    C -->|Secondary| E[EU Direct]
    C -->|Tertiary| F[FDA 510k]
    D --> G[6-12 months]
    E --> H[12-24 months]
    F --> I[12-18 months]
```

Immediate Actions

  1. Establish ISO 13485 QMS: Foundation for all regulatory pathways
  2. Document AI development per GMLP: Training data, validation, testing—all from project inception
  3. Design for transparency: Build explainability features that satisfy both clinical and regulatory needs
  4. Plan for adaptive algorithms: Implement change control processes compatible with FDA PCCP concept
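Action 2 above is easiest to enforce when provenance is captured mechanically from day one. A minimal sketch, assuming a simple in-house schema (all field names are hypothetical, not from GMLP guidance):

```python
# Hypothetical sketch of "document from project inception": every training
# dataset gets an immutable provenance record the moment it enters the project.
from dataclasses import dataclass, field, asdict
from datetime import date
import hashlib
import json

@dataclass(frozen=True)
class DatasetRecord:
    name: str
    source: str          # e.g. hospital PACS export, public benchmark
    collected: str       # collection period, as documented
    n_samples: int
    known_biases: tuple  # notes on demographic or site skew
    registered: str = field(default_factory=lambda: date.today().isoformat())

    def fingerprint(self) -> str:
        """Stable short hash of the record for audit trails."""
        payload = json.dumps(asdict(self), sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()[:16]
```

Because the record is frozen and hashed, any later change to the dataset description produces a new fingerprint, giving auditors a tamper-evident trail.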

6. Navigating Regulatory Uncertainty

Medical AI developers face unique challenges in regulatory environments that are themselves evolving. The FDA’s AI/ML regulatory framework continues to mature with new guidance documents, while the EU AI Act’s implementing regulations are still being finalized. Ukrainian regulations will likely undergo significant changes as EU integration progresses.

Managing Regulatory Risk:

  • Conservative Interpretation: When regulations are ambiguous, interpret conservatively to avoid future compliance gaps
  • Regulatory Intelligence: Monitor guidance documents, Q&A publications, and industry consortium interpretations
  • Pre-submission Meetings: Utilize FDA pre-sub meetings and EU notified body consultations to validate approach
  • Documentation Depth: Maintain documentation beyond current requirements in anticipation of stricter future rules

The Role of Standards

Harmonized standards provide practical implementation guidance that regulations intentionally lack:

  • IEC 62304: Software lifecycle processes—essential for any SaMD
  • ISO 14971: Risk management—increasingly important for AI-specific risks
  • IEC 82304-1: Health software requirements—covers standalone AI applications
  • ISO/IEC 23053: Framework for AI systems using machine learning (emerging)

Compliance with these standards creates a “presumption of conformity” with regulatory requirements, simplifying the approval process.

7. Post-Market Surveillance for AI

Unlike traditional medical devices, AI systems may drift in performance over time due to distribution shift in input data. All three regulatory frameworks increasingly emphasize post-market surveillance:

  • Performance Monitoring: Continuous tracking of sensitivity, specificity, and calibration
  • Adverse Event Reporting: Mandatory reporting thresholds for AI failures affecting patient outcomes
  • Real-World Evidence: Systematic collection of deployment performance data
  • Update Protocols: Procedures for model updates that maintain regulatory compliance
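The performance-monitoring element above can be sketched as a rolling comparison against the cleared baseline; the window size and alert threshold below are placeholders a real surveillance plan would have to justify:

```python
# Sketch of post-market performance monitoring: compare a rolling window of
# confirmed outcomes against the sensitivity claimed at approval.
# Window size and tolerance are illustrative, not regulatory values.
from collections import deque

class PerformanceMonitor:
    def __init__(self, baseline_sensitivity: float, window: int = 500,
                 tolerance: float = 0.05):
        self.baseline = baseline_sensitivity
        self.tolerance = tolerance
        self.outcomes = deque(maxlen=window)  # prediction on truly positive cases

    def record(self, predicted_positive: bool, truly_positive: bool) -> None:
        if truly_positive:  # sensitivity only counts truly positive cases
            self.outcomes.append(predicted_positive)

    def sensitivity(self) -> float:
        return sum(self.outcomes) / len(self.outcomes) if self.outcomes else 1.0

    def drifted(self) -> bool:
        """True once enough cases accrue and sensitivity falls below tolerance."""
        return (len(self.outcomes) >= 50
                and self.sensitivity() < self.baseline - self.tolerance)
```

A drift alert would then feed the adverse-event and update protocols listed above rather than silently retraining the model.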

8. Clinical Evidence Requirements Across Jurisdictions

Each regulatory jurisdiction has distinct expectations for clinical evidence supporting AI/ML medical devices. Understanding these requirements early in development prevents costly delays during the approval process.

FDA Clinical Evidence Framework

The FDA’s approach to clinical evidence has evolved significantly for AI/ML devices. While traditional medical devices often require prospective clinical trials, AI-enabled devices may leverage retrospective validation studies under certain conditions:

  • Analytical Validation: Technical performance testing demonstrating algorithm accuracy on characterized datasets
  • Clinical Validation: Evidence that the device’s output has clinical meaning and utility in intended use population
  • Standalone vs. Adjunctive Claims: Standalone diagnostic claims require more rigorous evidence than decision-support tools

A 2024 FDA analysis revealed that only 1.6% of approved AI/ML devices submitted randomized clinical trial data. Most relied on retrospective studies using previously collected imaging and clinical data. This creates opportunities for Ukrainian developers with access to existing clinical databases.
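Analytical validation ultimately reduces to confusion-matrix arithmetic on a characterized test set. A minimal sketch (a real submission would add confidence intervals and pre-registered acceptance criteria):

```python
# Point estimates from a retrospective test set, given confusion-matrix counts.

def validation_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),           # recall on diseased cases
        "specificity": tn / (tn + fp),           # recall on healthy cases
        "ppv": tp / (tp + fp),                   # precision at test-set prevalence
        "prevalence": (tp + fn) / (tp + fp + tn + fn),
    }
```

Note that PPV depends on the test set's prevalence, so a claim made from a retrospective, enriched dataset may not transfer to the intended-use population; regulators in all three jurisdictions probe exactly this gap.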

EU MDR Clinical Evaluation

The EU MDR imposes stringent clinical evaluation requirements through several mechanisms:

  • Clinical Evaluation Report (CER): Comprehensive analysis of clinical data demonstrating safety and performance
  • Post-Market Clinical Follow-up (PMCF): Ongoing evidence collection throughout device lifecycle
  • Common Specifications: For high-risk devices, specific performance thresholds may be mandated
  • MEDDEV 2.7/1 Rev 4: Guidance document specifying clinical evaluation methodology

The MDR explicitly limits reliance on equivalence claims—a significant change from the previous MDD regime. AI developers can rarely claim equivalence to existing devices due to the unique characteristics of their algorithms, necessitating device-specific clinical evidence.

Ukrainian Evidence Requirements

Ukraine’s current framework follows the older EU Directive approach, with clinical evaluation requirements that are generally less stringent than MDR. However, developers should anticipate alignment with MDR standards as Ukraine progresses toward EU integration:

  • Current State: Clinical evaluation following MDD-era guidance documents
  • EC Recognition Path: EU clinical evidence accepted without additional local studies
  • Direct Registration: May require Ukrainian clinical data for novel devices
  • Future State: Full MDR alignment expected, including PMCF obligations

9. Cybersecurity and Data Privacy Considerations

Medical AI devices increasingly face cybersecurity scrutiny from regulators. The intersection of AI, medical devices, and patient data creates unique vulnerability surfaces that all three jurisdictions now address:

FDA Cybersecurity Requirements

The FDA’s 2023 cybersecurity guidance (final guidance on “Cybersecurity in Medical Devices”) applies to all networked medical devices, including AI systems:

  • Threat Modeling: Systematic identification of potential attack vectors
  • Security Risk Assessment: Integration with traditional device risk management
  • Software Bill of Materials (SBOM): Required documentation of all software components
  • Vulnerability Disclosure: Coordinated disclosure policies for identified vulnerabilities
  • Patch Management: Processes for deploying security updates without full re-submission

EU Cybersecurity Integration

The EU addresses cybersecurity through multiple overlapping frameworks:

  • MDR Annex I: General safety requirements including software security
  • GDPR: Data protection requirements for patient health information
  • NIS2 Directive: Network and information security for essential services
  • Cyber Resilience Act: Upcoming product security legislation affecting connected devices

Medical AI developers must navigate this regulatory stack, ensuring compliance with device regulations, data protection requirements, and emerging cybersecurity mandates simultaneously.

Ukrainian Data Protection

Ukraine’s Law on Personal Data Protection provides baseline requirements for health data processing. Key considerations for medical AI include:

  • Consent requirements for AI processing of patient data
  • Data minimization principles
  • Cross-border data transfer restrictions
  • Patient rights to explanation of AI-based decisions

As Ukraine aligns with EU standards, GDPR-equivalent requirements are anticipated, making early compliance with EU data protection standards a prudent development choice.


10. Documentation Best Practices

Regulatory success depends heavily on documentation quality. The following elements should be established from project inception:

Technical Documentation

  • Software Architecture: Complete system design including AI component integration
  • Algorithm Description: Model architecture, training methodology, feature engineering
  • Training Data: Dataset characteristics, curation process, bias assessment
  • Validation Protocol: Test datasets, performance metrics, acceptance criteria
  • Version Control: Complete history of algorithm changes with rationale

Quality Management

  • ISO 13485 QMS: Foundation for all regulatory pathways
  • Design Controls: Traceability from user needs through verification
  • Risk Management: ISO 14971-compliant process addressing AI-specific risks
  • CAPA System: Corrective and preventive action procedures

AI-Specific Documentation

  • Model Cards: Standardized documentation of model characteristics and limitations
  • Data Cards: Comprehensive dataset documentation including provenance
  • Fairness Assessment: Evaluation of performance across demographic subgroups
  • Explainability Methods: Techniques used to interpret model decisions
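A model card can be machine-readable rather than a static document. The sketch below loosely follows the published model-cards idea, but the exact schema is an assumption, not a regulatory template:

```python
# Sketch of a machine-readable model card covering the documentation
# elements listed above. Field names are an illustrative assumption.
from dataclasses import dataclass, asdict
import json

@dataclass
class ModelCard:
    model_name: str
    version: str
    intended_use: str
    out_of_scope_uses: list
    training_data_summary: str
    subgroup_performance: dict   # e.g. {"female": 0.93, "male": 0.95}
    known_limitations: list

    def worst_subgroup(self) -> tuple:
        """Flag the demographic subgroup with the lowest reported score."""
        return min(self.subgroup_performance.items(), key=lambda kv: kv[1])

    def to_json(self) -> str:
        return json.dumps(asdict(self), indent=2)
```

Keeping the card structured means the fairness assessment (worst-performing subgroup) can be checked automatically in CI rather than rediscovered during an audit.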

11. Practical Implementation Roadmap

Based on the regulatory landscape analysis, Ukrainian medical AI developers should follow a phased implementation approach that balances speed to market with long-term global scalability:

Phase 1: Foundation (Months 1-6)

  • Establish ISO 13485-compliant Quality Management System
  • Define intended use and target patient population precisely
  • Conduct preliminary risk assessment identifying AI-specific hazards
  • Begin documentation of training data provenance and characteristics
  • Engage regulatory consultant familiar with all three jurisdictions

Phase 2: Development (Months 6-18)

  • Develop algorithm following Good Machine Learning Practice (GMLP) principles
  • Implement comprehensive version control and change documentation
  • Conduct internal validation on representative datasets
  • Prepare preliminary clinical evidence package
  • Complete cybersecurity threat modeling and security testing

Phase 3: Regulatory Submission (Months 18-30)

  • Submit to EU notified body for MDR conformity assessment
  • Simultaneously prepare FDA pre-submission package for US pathway advice
  • Upon CE marking, apply for Ukrainian EC Certificate recognition
  • Establish authorized representative in Ukraine if based elsewhere
  • Prepare market-specific labeling and instructions for use

Phase 4: Deployment and Monitoring (Ongoing)

  • Implement real-world performance monitoring systems
  • Establish adverse event reporting procedures for each jurisdiction
  • Execute Post-Market Clinical Follow-up (PMCF) plan
  • Monitor for distribution shift requiring model updates
  • Maintain regulatory intelligence on evolving requirements

This roadmap assumes development of a Class IIa equivalent device. Higher-risk classifications require extended timelines and additional clinical evidence. Lower-risk devices may proceed more rapidly, particularly through Ukrainian direct registration for domestic-only deployment.


12. Cost-Benefit Considerations

Regulatory compliance represents a significant investment that must be factored into business planning. Approximate cost ranges for each pathway:

  • FDA 510(k): $50,000-$300,000 (dependent on predicate device availability and clinical evidence needs)
  • EU MDR CE Marking: €150,000-€500,000 (including notified body fees, clinical evaluation, and quality system certification)
  • Ukrainian Direct Registration: $20,000-$100,000 (significantly lower but limited to domestic market)
  • Ukrainian EC Recognition: Incremental $10,000-$30,000 on top of EU CE marking costs

The “develop once, deploy globally” strategy, while requiring higher upfront investment, delivers superior unit economics when targeting multiple markets. A device developed to EU AI Act + MDR standards can access Ukrainian, EU, and US markets with incremental regulatory effort, maximizing return on development investment.

Timeline considerations are equally important. The EC recognition pathway offers Ukrainian market access within 6-12 months of EU approval, compared to 12-24 months for FDA 510(k). For revenue-constrained startups, this accelerated cash flow may be decisive.
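The trade-off can be put in rough numbers using the midpoints of the ranges quoted above (the EUR-to-USD conversion and the aggregation are illustrative assumptions, not quotes):

```python
# Back-of-envelope comparison using the cost ranges quoted above (midpoints).
# Purely illustrative arithmetic; real budgeting needs pathway-specific quotes.

PATHWAYS = {  # (low, high), USD-equivalent
    "fda_510k": (50_000, 300_000),
    "eu_mdr_ce": (160_000, 540_000),        # EUR range converted at ~1.08
    "ua_direct": (20_000, 100_000),
    "ua_ec_recognition": (10_000, 30_000),  # incremental on top of CE marking
}

def midpoint(pathway: str) -> float:
    low, high = PATHWAYS[pathway]
    return (low + high) / 2

# "Develop once, deploy globally": CE marking first, then incremental steps.
global_strategy = (midpoint("eu_mdr_ce") + midpoint("ua_ec_recognition")
                   + midpoint("fda_510k"))
```

Even at midpoint costs, the EU-first route reaches three markets for roughly the price of two independent registrations, which is the unit-economics argument made above.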


13. Future Regulatory Trends

The medical AI regulatory landscape continues to evolve rapidly. Several trends will shape requirements over the coming years:

Algorithmic Transparency: Regulators increasingly require explainability for AI decisions. The EU AI Act’s transparency requirements for high-risk systems will likely influence global standards. Developers should invest in interpretability techniques now.

Real-World Evidence: Post-market performance data will become more important than premarket clinical trials for AI validation. Systems for continuous performance monitoring should be designed into products from inception.

International Harmonization: The International Medical Device Regulators Forum (IMDRF) is developing harmonized approaches to SaMD and AI regulation. Alignment with IMDRF guidance positions devices for smoother multi-jurisdictional approval.

Adaptive Regulation: Sandbox programs and expedited pathways for innovative AI devices are emerging. Ukraine’s EU integration presents opportunities for participation in European regulatory innovation initiatives.


Continuous Oversight: The FDA’s PCCP framework explicitly addresses post-market updates, while the EU AI Act requires ongoing monitoring throughout the AI system’s lifecycle. Ukrainian regulations currently lack specific AI post-market requirements but are expected to align with EU approaches.


Key Insights Summary

🎯 For Ukrainian Medical AI Developers

The optimal strategy is develop once, deploy globally:

  1. Build to the highest common denominator (EU AI Act + MDR)
  2. Use this documentation base for FDA and Ukrainian submissions
  3. Leverage EC Certificate recognition for Ukrainian market speed
  4. Plan for eventual Ukrainian MDR harmonization

Questions Answered

✅ How do FDA, EU, and Ukrainian frameworks differ?
FDA leads in AI-specific guidance with TPLC/PCCP; EU creates dual regulatory burden with MDR + AI Act; Ukraine relies on older Directive-based rules with EC recognition pathway.

✅ What are market authorization pathways?
FDA: 510(k)/De Novo/PMA; EU: CE marking via notified body; Ukraine: Direct assessment or EC Certificate recognition.

✅ How can Ukrainian developers prepare for international access?
Develop to EU AI Act + MDR standards; this documentation base supports all three markets with minimal adaptation.


Next in Series: Article #7 – US Experience: FDA-Approved AI Devices

Series: Medical ML for Ukrainian Doctors | Stabilarity Hub Research Initiative


Author: Oleh Ivchenko | ONPU Researcher | Stabilarity Hub
