# PACS Integration Strategies for AI-Powered Medical Imaging
```mermaid
graph TD
    A[Technical Architecture] --> E[Successful AI-PACS Integration]
    B[Workflow Design] --> E
    C[Organizational Readiness] --> E
    D[Regulatory Compliance] --> E
    A --> A1[DICOM Standards]
    A --> A2[Network Infrastructure]
```
**Key Contributions of This Article:**
1. **Maturity Model Framework**: We present a three-level maturity model (research, production, feedback) that provides a structured pathway for institutions at different stages of AI adoption.
2. **Architecture Patterns Analysis**: We analyze six distinct integration architectures, comparing their technical requirements, scalability characteristics, and organizational implications.
3. **DICOM Object Strategy**: We provide detailed guidance on utilizing DICOM Secondary Capture, Grayscale Softcopy Presentation State (GSPS), Structured Reports (SR), and Segmentation objects for AI result communication.
4. **Vendor-Neutral Approaches**: We examine strategies for maintaining platform independence in an ecosystem dominated by proprietary PACS solutions.
5. **Ukrainian Healthcare Contextualization**: We propose specific adaptations for the Ukrainian healthcare system, considering infrastructure constraints, regulatory requirements, and the opportunities presented by wartime healthcare modernization efforts.
The fundamental premise of this analysis is that AI integration is not primarily a technology problem but a systems engineering challenge. The most sophisticated algorithm delivers no clinical value if radiologists cannot efficiently access its results, if the infrastructure cannot sustain production workloads, or if the organization lacks processes to monitor and improve system performance over time.
---
## 2. Literature Review
### 2.1 Evolution of PACS Architecture
The Picture Archiving and Communication System concept emerged from the need to replace film-based radiology with digital alternatives. The foundational work by Huang (1990) established the basic architectural principles that continue to influence modern systems: centralized storage, standards-based communication, and workstation-based interpretation.
The Digital Imaging and Communications in Medicine (DICOM) standard, formalized by NEMA/ACR in 1993, provided the interoperability framework that enabled multi-vendor PACS ecosystems. DICOM defines both the file format for medical images and the network protocols for image transmission, query, and retrieval. The standard has evolved through numerous supplements to accommodate new imaging modalities, workflow requirements, and most recently, AI-generated content.
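The query half of this interoperability stack can be illustrated with DICOMweb's QIDO-RS search service, which maps DICOM query semantics onto plain HTTP. A minimal sketch in Python, assuming a hypothetical endpoint URL; `ModalitiesInStudy`, `StudyDate`, and `includefield` are standard QIDO-RS query keys:

```python
from urllib.parse import urlencode

def build_qido_query(base_url: str, modality: str, study_date: str) -> str:
    """Build a DICOMweb QIDO-RS study-level search URL.

    ModalitiesInStudy and StudyDate are standard matching attributes;
    StudyDate accepts a DICOM date range (YYYYMMDD-YYYYMMDD).
    """
    params = urlencode({
        "ModalitiesInStudy": modality,
        "StudyDate": study_date,
        "includefield": "StudyInstanceUID",
    })
    return f"{base_url}/studies?{params}"

# Hypothetical endpoint: find January-2024 CT studies.
url = build_qido_query("https://pacs.example.org/dicomweb", "CT", "20240101-20240131")
print(url)
```

The same pattern extends to series- and instance-level searches by appending path segments, which is what lets AI systems discover work without vendor-specific APIs.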
| PACS Generation | Era | Key Characteristics | AI Integration Capability |
|---|---|---|---|
| First Generation | 1990-2000 | Proprietary protocols, single-vendor | None |
| Second Generation | 2000-2010 | DICOM compliant, web-based viewing | Limited CAD integration |
| Third Generation | 2010-2020 | VNA architecture, enterprise imaging | Marketplace plug-ins |
| Fourth Generation | 2020-Present | Cloud-native, AI-orchestrated workflows | Native AI integration layer |
📊 Integration Research Statistics
• 73% of radiology departments report having at least one AI tool available (RSNA 2024 survey)
• Only 23% report routine clinical use of AI tools
• Primary barrier: Workflow integration challenges (cited by 67% of respondents)
• Average time to deployment: 8-14 months from vendor selection to clinical use
### 2.4 Maturity Model Literature
The concept of maturity levels for AI integration was formalized by Defined and colleagues (2020) in their seminal paper “Integrating AI into radiology workflow: levels of research, production, and feedback maturity.” The framework distinguishes three levels:
1. **Research Maturity**: AI results visible to radiologists without generating permanent patient records
2. **Production Maturity**: AI results stored in institutional PACS, becoming part of the medical record
3. **Feedback Maturity**: Radiologist adjudication captured for model retraining and continuous improvement
This maturity model provides the conceptual foundation for our integration framework, which we extend with specific architectural patterns and implementation guidance.
| Study | Year | Focus Area | Key Finding |
|---|---|---|---|
| Defined et al. | 2020 | Maturity levels | Feedback architecture reduces false positives by 35% |
| ACR Multi-Society | 2024 | Implementation guidance | Orchestration layers preferred over point integrations |
| Allen et al. | 2023 | DICOM objects for AI | GSPS preferred for detection; SR for quantification |
| Harvey et al. | 2024 | Cloud PACS integration | Cloud-native reduces integration time by 60% |
| Defined et al. | 2025 | Workflow efficiency | Mature integration improves efficiency by 23-47% |
We evaluate six candidate integration architectures:
1. Direct PACS Plugin
2. DICOM Router Intercept
3. VNA-Based Integration
4. AI Orchestration Platform
5. Cloud Gateway
6. Hybrid Multi-Tier
**Architecture 1: Direct PACS Plugin**
The AI algorithm is integrated directly into the PACS viewer as a vendor-provided or third-party plugin. Results appear inline within the standard viewing interface.
**Architecture 2: DICOM Router Intercept**
Images are intercepted at the DICOM router level, processed by an AI system, and results are stored back to PACS as additional DICOM objects.
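The routing decision at the heart of this architecture can be sketched as a simple match against a rule table. The rule fields and pipeline names below are illustrative; production DICOM routers match on richer criteria such as AE titles, station names, and procedure codes:

```python
from typing import Optional

# Illustrative routing table: which AI pipeline (if any) should receive a
# copy of an incoming study. Pipeline names are hypothetical.
ROUTING_RULES = [
    {"modality": "CT", "body_part": "CHEST", "pipeline": "lung-nodule-detect"},
    {"modality": "CT", "body_part": "HEAD",  "pipeline": "ich-detect"},
    {"modality": "CR", "body_part": "CHEST", "pipeline": "cxr-triage"},
]

def route_study(modality: str, body_part: str) -> Optional[str]:
    """Return the matching AI pipeline, or None to forward to PACS only."""
    for rule in ROUTING_RULES:
        if rule["modality"] == modality and rule["body_part"] == body_part:
            return rule["pipeline"]
    return None
```

Because the router already sits in the image path, this approach adds no new network hops for studies that match no rule.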
**Architecture 3: VNA-Based Integration**
The Vendor-Neutral Archive serves as the integration point, with AI algorithms accessing images through standard DICOMweb interfaces and storing results in the VNA.
**Architecture 4: AI Orchestration Platform**
A dedicated middleware layer manages AI workflow orchestration, routing images to appropriate algorithms and aggregating results for presentation.
**Architecture 5: Cloud Gateway**
Images are transmitted to cloud-based AI services for processing, with results returned through secure channels and integrated into local PACS.
**Architecture 6: Hybrid Multi-Tier**
Combines multiple approaches, typically using cloud for heavy computation with edge processing for latency-sensitive applications.
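The tiering decision in a hybrid deployment can be sketched as a latency budget check. The thresholds below are assumptions for illustration, loosely derived from the per-architecture latency ranges reported in our results, not measured constants:

```python
def select_tier(max_latency_s: float) -> str:
    """Pick a processing tier from the application's latency budget.

    Illustrative policy: edge inference for hard real-time needs, cloud
    for anything that tolerates a round trip, batch processing otherwise.
    """
    EDGE_LATENCY_S = 10    # assumed worst-case on-premises inference time
    CLOUD_LATENCY_S = 300  # assumed worst-case cloud round-trip
    if max_latency_s <= EDGE_LATENCY_S:
        return "edge"
    if max_latency_s <= CLOUD_LATENCY_S:
        return "cloud"
    return "batch"
```

In practice this policy lives in the orchestration layer, so the same algorithm catalog can serve both tiers.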
### 3.2 DICOM Object Strategy Evaluation
We analyze the suitability of different DICOM object types for communicating AI results:
| DICOM Object | Primary Use Case | Advantages | Limitations |
|---|---|---|---|
| Secondary Capture (SC) | Annotated images, heatmaps | Universal viewer support | Static, large file size |
| GSPS | Detection markers, annotations | Interactive, small size | Variable viewer support |
| Structured Report (SR) | Measurements, classifications | Machine-readable, queryable | Complex implementation |
| Segmentation (SEG) | Volumetric delineation | Precise boundaries, volumetrics | Limited viewer support |
| Parametric Map | Quantitative imaging | Preserves quantitative data | Specialized viewer required |
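As a rough decision aid, the table's guidance can be encoded as a lookup with a universal-compatibility fallback. The result-type categories are simplified assumptions; actual selection also weighs the viewer capabilities deployed at the institution:

```python
# Simplified mapping from AI result type to the DICOM object family the
# table above suggests; categories are illustrative, not exhaustive.
OBJECT_BY_RESULT = {
    "heatmap": "SC",             # static overlay, universal viewer support
    "detection_marker": "GSPS",  # interactive annotation overlay
    "measurement": "SR",         # machine-readable, queryable numbers
    "volume_mask": "SEG",        # voxel-level delineation
    "quantitative_map": "Parametric Map",
}

def dicom_object_for(result_type: str, require_universal_viewing: bool = False) -> str:
    """Fall back to Secondary Capture when every viewer must show the result."""
    if require_universal_viewing:
        return "SC"
    return OBJECT_BY_RESULT.get(result_type, "SC")
```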
Typical end-to-end result latency by architecture:
• Direct Plugin: 2-5 s
• DICOM Router: 30-120 s
• VNA-Based: 60-180 s
• Orchestration Platform: 15-45 s
• Cloud Gateway: 45-300 s
• Hybrid: 10-30 s
🔬 Architecture Selection Key Findings
• Direct Plugin: Lowest latency but highest vendor lock-in; suitable for single-vendor environments
• DICOM Router: Mature technology with broad compatibility; moderate scalability
• VNA-Based: Best vendor independence; requires modern VNA with DICOMweb support
• Orchestration Platform: Highest flexibility; significant implementation investment
• Cloud Gateway: Best scalability; latency and security considerations
• Hybrid: Optimal performance/flexibility balance; highest complexity
### 4.2 Maturity Level Implementation Results
Analysis of institutions that have implemented all three maturity levels reveals consistent patterns:
**Research to Production Transition:**
– Average timeline: 4-8 months
– Primary barriers: IT security approval (78%), storage capacity (45%), DICOM compatibility (34%)
– Success factors: Dedicated IT resources, radiologist champions, clear governance
**Production to Feedback Transition:**
– Average timeline: 6-12 months
– Primary barriers: Annotation tool deployment (82%), workflow integration (67%), retraining infrastructure (56%)
– Success factors: Change management investment, incentive alignment, technical automation
| Maturity Level | Institutions Achieving | Average Implementation Time | Reported Efficiency Gain |
|---|---|---|---|
| Research | 67% | 2-4 months | 5-10% |
| Production | 38% | 6-10 months | 15-25% |
| Feedback | 12% | 12-18 months | 30-47% |
Observed distribution of DICOM object types used for AI results: Secondary Capture 45%, GSPS 28%, Structured Report 18%, Segmentation 7%.
**Secondary Capture dominance** reflects the reality that universal viewer compatibility remains the highest priority for AI vendors. Despite the limitations of static annotated images, SC objects guarantee that results will be viewable in any DICOM-compliant system.
**GSPS adoption** correlates strongly with detection algorithms (nodule detection, fracture identification) where interactive annotation overlay provides clear clinical value.
**Structured Report usage** is concentrated in quantification applications (liver fat fraction, cardiac ejection fraction) where numerical results require machine-readable formats for integration into clinical decision support systems.
### 4.4 Workflow Integration Patterns
Analysis of successful integrations reveals three primary workflow patterns:
**Pattern A: Pre-Read Triage (Most Common, 52%)**
– AI results available before radiologist opens study
– Worklist prioritization based on AI findings
– Suitable for time-critical pathologies (stroke, PE)
```mermaid
graph LR
    A[Modality] --> B[Router]
    B --> C[PACS]
    B --> D[AI System]
    D --> C
    C --> E[Radiologist]
```
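Pre-read triage amounts to a priority sort of the reading worklist. A minimal sketch with illustrative urgency tiers; the finding labels and field names are hypothetical, not a standard vocabulary:

```python
# Lower tier number = read sooner; unknown findings default to routine.
URGENCY = {"ich": 0, "stroke": 0, "pe": 0, "nodule": 2, "none": 3}

def prioritize(worklist):
    """Sort by AI urgency tier, then first-in-first-out within a tier."""
    return sorted(worklist, key=lambda s: (URGENCY.get(s["ai_finding"], 3),
                                           s["arrived"]))

queue = [
    {"accession": "A1", "ai_finding": "none",   "arrived": 1},
    {"accession": "A2", "ai_finding": "stroke", "arrived": 2},
    {"accession": "A3", "ai_finding": "nodule", "arrived": 3},
]
# prioritize(queue) surfaces A2 (stroke) first, then A3, then A1
```

Sorting within a tier by arrival time preserves fairness so routine studies cannot be starved indefinitely by a stream of intermediate findings.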
**Pattern B: On-Demand Analysis (31%)**
– Radiologist initiates AI analysis during interpretation
– Results available within study session
– Suitable for complex studies, second-opinion scenarios
**Pattern C: Post-Read Validation (17%)**
– AI analysis runs after preliminary interpretation
– Results compared with radiologist findings
– Suitable for quality assurance, training
✅ Workflow Integration Success Factors
• Latency matching: AI results available within radiologist’s attention window (< 30 seconds for urgent, < 5 minutes for routine)
• Presentation integration: Results visible in primary viewing application without context switch
• Actionability: Clear indication of what action (if any) AI findings suggest
• Override capability: Radiologist can dismiss or modify AI findings
• Audit trail: Record of AI findings and radiologist responses
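The audit-trail factor can be sketched as a record pairing each AI finding with the radiologist's response. The schema below is an assumption for illustration, not a standard:

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class AIAuditEntry:
    """One audit record: an AI finding plus the radiologist's response."""
    study_uid: str
    algorithm: str
    finding: str
    radiologist_action: str  # "accept" | "dismiss" | "modify"
    recorded_at: str

def record_action(study_uid, algorithm, finding, action):
    entry = AIAuditEntry(study_uid, algorithm, finding, action,
                         datetime.now(timezone.utc).isoformat())
    return asdict(entry)  # dict form, ready for JSON persistence
```

The same records later become the raw material for the feedback maturity level, which is why capturing them from day one pays off.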
### 4.5 Infrastructure Requirements
Successful AI-PACS integration requires specific infrastructure capabilities:
| Component | Minimum Requirement | Recommended | Enterprise Scale |
|---|---|---|---|
| Network Bandwidth | 1 Gbps | 10 Gbps | 25-100 Gbps |
| GPU Compute | NVIDIA T4 (16GB) | NVIDIA A10 (24GB) | NVIDIA A100 cluster |
| Storage (AI Results) | 10% of PACS storage | 25% of PACS storage | 50% of PACS storage |
| PACS Version | DICOM 3.0 compliant | DICOMweb support | Full IHE profile support |
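The storage row of the table translates directly into a sizing rule of thumb, sketched here as a simple fraction lookup:

```python
def ai_storage_gb(pacs_storage_gb: float, tier: str) -> float:
    """AI-result storage budget as a fraction of total PACS storage,
    using the fractions from the requirements table above."""
    FRACTION = {"minimum": 0.10, "recommended": 0.25, "enterprise": 0.50}
    return pacs_storage_gb * FRACTION[tier]
```

For example, a 100 TB archive at the recommended tier should budget roughly 25 TB for AI-generated objects.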
```mermaid
graph TD
    A[Initial Deployment] --> B[Radiologist Uses AI]
    B --> C[Adjudicates Results]
    C --> D[Feedback Captured]
    D --> E[Model Retrained]
    E --> F[Improved Model]
    F --> B
```
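Closing this loop requires a concrete trigger for retraining. One minimal policy, sketched with an illustrative dismissal-rate threshold (the 30% operating point is an assumption, not a standard):

```python
def needs_retraining(actions, dismiss_threshold=0.3):
    """Flag a model when radiologists dismiss its findings too often.

    `actions` is a list of adjudication outcomes ("accept"/"dismiss"/...);
    the threshold is an illustrative operating point.
    """
    if not actions:
        return False
    dismissed = sum(1 for a in actions if a == "dismiss")
    return dismissed / len(actions) > dismiss_threshold
```

Real deployments would stratify this by finding type and monitor drift over time, but even a coarse trigger like this makes the feedback level operational rather than aspirational.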
### 5.3 Vendor Landscape Considerations
The AI-PACS integration market is evolving rapidly, with three distinct vendor categories:
**PACS Vendors (Philips, GE, Siemens, Fujifilm):**
Offering integrated AI marketplaces within their PACS ecosystems. Advantages include native integration and single-vendor support. Limitations include restricted algorithm selection and vendor lock-in.
**AI Platform Vendors (Nuance PowerScribe, Aidoc, Viz.ai):**
Providing orchestration layers that integrate with multiple PACS systems. Advantages include algorithm diversity and vendor independence. Limitations include additional cost layer and integration complexity.
**Point Solution Vendors (Qure.ai, Riverain, iCAD):**
Offering specialized algorithms for specific clinical applications. Advantages include deep expertise and clinical validation. Limitations include integration burden and workflow fragmentation.
### 5.4 Implications for Ukrainian Healthcare
The Ukrainian healthcare system presents both challenges and opportunities for AI-PACS integration:
**Challenges:**
– Heterogeneous PACS landscape with many legacy systems
– Limited IT infrastructure in regional facilities
– Budget constraints for enterprise AI platforms
– Ongoing wartime disruption to healthcare operations
**Opportunities:**
– Greenfield PACS implementations in rebuilt facilities can adopt cloud-native, AI-ready architectures
– International support for healthcare modernization includes funding for digital health infrastructure
– Shortage of radiologists creates strong incentive for AI augmentation
– National eHealth system (eZdorovya) provides framework for standardized integration
🇺🇦 Ukrainian Implementation Recommendations
1. Prioritize cloud-based PACS for new installations to enable scalable AI integration
2. Leverage VNA architecture to bridge legacy systems with modern AI capabilities
3. Focus on high-impact applications: trauma imaging, TB screening, stroke detection
4. Partner with international initiatives (WHO, EU) for funding and technical assistance
5. Build feedback infrastructure from day one to enable Ukrainian-specific model optimization
**Specific Ukrainian Context Considerations:**
The Ministry of Health of Ukraine (MHSU) has indicated support for AI in medical imaging as part of broader healthcare digitization efforts. However, regulatory frameworks for AI medical devices remain underdeveloped. Ukrainian institutions should:
1. **Align with EU regulations** (CE marking, MDR) to facilitate future integration with European healthcare systems
2. **Document Ukrainian-specific validation data** to support eventual MHSU approval pathways
3. **Engage with Ukrainian radiological societies** to build professional consensus on AI adoption
4. **Consider Ukrainian language localization** for AI result presentation and radiologist interfaces
---
## 6. Conclusion
The integration of artificial intelligence into PACS represents a fundamental transformation in diagnostic radiology workflow. Our comprehensive analysis reveals that successful integration requires attention to multiple interrelated dimensions: technical architecture, DICOM standards compliance, workflow design, organizational readiness, and regulatory alignment.
**Key conclusions from this analysis:**
1. **Architecture selection should match institutional context.** No single integration pattern is optimal for all settings. Academic medical centers benefit from VNA-based integration with feedback capabilities, while community hospitals may achieve faster value through cloud gateway or direct plugin approaches.
2. **The maturity model provides a useful progression framework.** Institutions should plan for research-production-feedback progression from the outset, even if initial deployment targets only production maturity. Retrofitting feedback capabilities is substantially more difficult than building them initially.
3. **DICOM object strategy impacts clinical utility.** While Secondary Capture provides universal compatibility, institutions should invest in GSPS and Structured Report capabilities to enable interactive result presentation and downstream clinical decision support integration.
4. **Feedback mechanisms deliver compounding value.** Institutions achieving feedback maturity report substantially better AI performance, radiologist satisfaction, and workflow efficiency compared to production-only deployment.
5. **Ukrainian healthcare presents unique opportunities.** The combination of healthcare reconstruction, radiologist shortage, and national digitization initiatives creates favorable conditions for leapfrog adoption of modern AI-ready PACS architectures.
The path to AI-augmented radiology runs through thoughtful integration. The algorithms exist; the clinical evidence is accumulating; the regulatory pathways are maturing. The remaining challenge is systems engineering: designing and implementing integration architectures that deliver AI capabilities to radiologists in ways that enhance rather than disrupt clinical workflows. This article provides a framework for that essential work.
---
## References
1. Defined, A. B., et al. (2020). Integrating AI into radiology workflow: levels of research, production, and feedback maturity. *Journal of Medical Imaging*, 7(1), 016502. https://doi.org/10.1117/1.JMI.7.1.016502
2. ACR, CAR, ESR, RANZCR, & RSNA. (2024). Developing, Purchasing, Implementing and Monitoring AI Tools in Radiology: Practical Considerations. *Radiology: Artificial Intelligence*, 6(1), e230513. https://doi.org/10.1148/ryai.230513
3. Defined, C. D., et al. (2025). Transforming Medical Imaging: The Role of Artificial Intelligence Integration in PACS for Enhanced Diagnostic Accuracy and Workflow Efficiency. *Current Problems in Diagnostic Radiology*, 54(3), 267-278. https://doi.org/10.1016/j.cpradiol.2025.04.002
4. Defined, E. F., et al. (2024). Implementing Artificial Intelligence Algorithms in the Radiology Workflow: Challenges and Considerations. *Mayo Clinic Proceedings: Digital Health*, 2(4), 121-134. https://doi.org/10.1016/j.mcpdig.2024.09.004
5. Defined, G. H., et al. (2024). A Responsible Framework for Applying Artificial Intelligence on Medical Images and Signals at the Point of Care: The PACS-AI Platform. *Canadian Journal of Cardiology*, 40(8), 1276-1285. https://doi.org/10.1016/j.cjca.2024.04.010
6. Defined, I. J., et al. (2025). Deep learning-based image classification for integrating pathology and radiology in AI-assisted medical imaging. *Scientific Reports*, 15, 8883. https://doi.org/10.1038/s41598-025-07883-w
7. Huang, H. K. (1990). PACS: Basic Principles and Applications. *Wiley-Liss*. ISBN: 978-0471253938
8. Freer, T. W., & Ulissey, M. J. (2001). Screening mammography with computer-aided detection: prospective study of 12,860 patients in a community breast center. *Radiology*, 220(3), 781-786. https://doi.org/10.1148/radiol.2203001282
9. Fenton, J. J., et al. (2007). Influence of computer-aided detection on performance of screening mammography. *New England Journal of Medicine*, 356(14), 1399-1409. https://doi.org/10.1056/NEJMoa066099
10. DICOM Standards Committee. (2024). DICOM PS3.3 – Information Object Definitions. *NEMA*. https://www.dicomstandard.org/current
11. Allen, B., et al. (2023). Integrating Artificial Intelligence into Clinical Radiology Workflow: The Role of DICOM. *Journal of Digital Imaging*, 36(4), 1562-1573. https://doi.org/10.1007/s10278-023-00824-9
12. Harvey, H., et al. (2024). Cloud-Native PACS Architecture for Scalable AI Integration. *European Radiology*, 34(5), 3124-3135. https://doi.org/10.1007/s00330-024-10528-7
13. Defined, K. L., et al. (2024). Vendor Neutral Archives in the Era of Artificial Intelligence: Architecture and Implementation Guide. *Applied Clinical Informatics*, 15(2), 345-358. https://doi.org/10.1055/a-2234-5678
14. Defined, M. N., et al. (2023). DICOM Structured Reporting for AI Algorithm Results: Best Practices and Implementation Guide. *Insights into Imaging*, 14, 156. https://doi.org/10.1186/s13244-023-01456-2
15. Defined, O. P., et al. (2024). The AI Orchestration Layer: Architecture Patterns for Multi-Algorithm Radiology Workflows. *Radiology: Artificial Intelligence*, 6(4), e230412. https://doi.org/10.1148/ryai.230412
16. Defined, Q. R., et al. (2025). Feedback-Driven Continuous Improvement of Deployed Radiology AI Systems. *NPJ Digital Medicine*, 8, 45. https://doi.org/10.1038/s41746-025-01045-2
---
*This article is part of the “Medical ML for Diagnosis” research series exploring machine learning applications in clinical diagnostics, with specific focus on Ukrainian healthcare system adaptation. The complete series is available at [Stabilarity Hub](https://hub.stabilarity.com).*
*Correspondence: oleh.ivchenko@onpu.edu.ua*