Survival as a Strategy: Ukraine’s AI Trajectory in War and Peace
DOI: 10.5281/zenodo.18896813
The development and deployment of artificial intelligence is already visible across many spheres of human activity. And, strange as it may seem, Ukraine’s success with advanced technologies, particularly in the military sphere, is a logical and predictable consequence of its need to survive a grinding war against a powerful adversary. While Ukrainian AI adoption in other areas lags somewhat behind the standards of developed Western countries, in the military sphere it is genuinely at the forefront: systems already in practical battlefield use are part of what enables the country to survive. A substantial exchange of experience is underway between Western armies and Ukraine. Ukraine contributes its front-line experience; in the civilian sphere, it adopts cutting-edge technologies developed by Western partners and allies.
This is not a story of technological triumph. It is an account of constraint, adaptation, and the particular clarity that comes from operating at the edge of what is necessary.
Military AI: Ukraine at the Forefront
The clearest demonstration of what wartime necessity produces is the DELTA battlefield management system. Developed through a partnership between the Ukrainian military and the NGO Army SOS, DELTA integrates intelligence from drones, satellites, human observers, and electronic intercepts into a single operational picture. As of 2024, the platform reportedly serves over 300,000 users across Ukrainian armed forces — a scale of ISR (Intelligence, Surveillance, Reconnaissance) fusion that would take peacetime procurement cycles a decade to reach (RUSI, 2023).
The compression of the OODA loop — Observe, Orient, Decide, Act — has been among the most documented operational changes. Where target acquisition cycles for certain artillery target classes once took 72 hours or more, systems integrating AI-assisted analysis have reduced this to approximately 20 minutes for high-priority targets in well-instrumented zones (ISW, 2023). That figure should be understood carefully: it does not apply universally, and the conditions enabling it — persistent drone coverage, reliable data links, trained operators — are themselves fragile. But the directional shift is real.
FPV (First-Person View) drone development offers another case study. The Brave1 defense tech cluster, established by the Ukrainian government in 2023, has coordinated over 200 companies developing drone guidance, electronic warfare countermeasures, and AI-assisted targeting modules. The progression from manually piloted FPV drones to semi-autonomous guidance has been driven not by research programs but by attrition: when operators become casualties, the pressure to reduce human-in-the-loop requirements intensifies. This is a grim kind of innovation incentive, and it should be named as such.
Palantir Gotham’s integration into Ukrainian command structures, confirmed through public reporting and the company’s own communications, has extended decision-support tools to operational levels. The system’s value lies less in any single analytical output than in the speed with which commanders can synthesize heterogeneous data sources — a capability that NATO militaries have been developing for years in controlled exercises but which Ukraine has refined under active operational pressure.
Electronic warfare deserves particular mention. Ukraine’s AI-adapted EW systems have evolved faster than adversarial doctrine updates — a dynamic that RAND analysts noted in their 2024 assessment of the conflict’s technological dimensions. The ability to retrain signal classification models rapidly, incorporating new jamming signatures observed in the field, has produced an adaptive cycle that conventional procurement cannot match. This is perhaps the clearest illustration of what front-line experience — as opposed to laboratory testing — actually means.
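The adaptive cycle described above — fold newly observed jamming signatures into a classifier without a full retraining pipeline — can be illustrated with a toy incremental-learning loop. This is a minimal sketch, not any actual Ukrainian or NATO system: the features, labels, and signature "drift" are entirely synthetic, and the model choice is an assumption for illustration only.

```python
# Toy analogue of an adaptive EW retraining cycle: each batch of newly
# labelled intercepts updates the signal classifier in place via
# partial_fit, rather than triggering a full retraining pipeline.
# All data is synthetic; nothing here reflects real EW signatures.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
CLASSES = np.array([0, 1])  # 0 = known-benign emission, 1 = jamming


def make_batch(n, jam_shift):
    """Synthetic 8-dim feature vectors; `jam_shift` mimics an adversary
    gradually moving its jamming signature closer to benign traffic."""
    benign = rng.normal(0.0, 1.0, size=(n, 8))
    jam = rng.normal(jam_shift, 1.0, size=(n, 8))
    X = np.vstack([benign, jam])
    y = np.array([0] * n + [1] * n)
    return X, y


clf = SGDClassifier(random_state=0)

# Initial fit on the first observed signatures.
X0, y0 = make_batch(200, jam_shift=2.0)
clf.partial_fit(X0, y0, classes=CLASSES)

# Field loop: as the signature drifts, each new labelled batch
# updates the deployed model incrementally.
for shift in (2.0, 1.5, 1.0):
    Xn, yn = make_batch(200, jam_shift=shift)
    clf.partial_fit(Xn, yn)
    acc = clf.score(*make_batch(200, jam_shift=shift))
    print(f"shift={shift:.1f} accuracy={acc:.2f}")
```

The point of the sketch is the loop structure, not the model: the update step is cheap enough to run as often as new field observations arrive, which is the property conventional procurement-cycle retraining cannot match.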

```mermaid
flowchart LR
    subgraph UA["🇺🇦 Ukraine Contributes"]
        A1[Battle-tested empirical data]
        A2[Rapid operator feedback loops]
        A3[Doctrine stress-testing]
        A4["Real EW signatures & countermeasures"]
        A5["FPV & drone AI datasets"]
    end
    subgraph WP["🌍 Western Partners Contribute"]
        B1[Compute infrastructure]
        B2["Platforms — Palantir, Azure, AWS"]
        B3["Funding — USAI, EDIDP"]
        B4["Satellite ISR — Maxar, Planet Labs"]
        B5[ML research capacity]
    end
    UA <-->|Knowledge Exchange| WP
```
The West–Ukraine Knowledge Exchange
There is a temptation to frame the West–Ukraine relationship as one-directional: advanced democracies supplying technology and funding to a nation under siege. This framing is incomplete.
What Western militaries are receiving in return is something procurement budgets cannot easily purchase: empirical performance data at scale, under adversarial conditions, with real consequences for failure. NATO’s lessons-learned processes — historically slow, shaped by exercise environments — have been enriched by operator feedback from Ukrainian commanders in ways that are only beginning to be formally institutionalized (NATO JAPCC, 2024).
The Brave1 cluster represents a potential export mechanism for Ukrainian AI warfare methodology. Originally conceived as a domestic coordination body, it has attracted interest from allied defense ministries as a model for rapid, decentralized defense-tech development. Whether this translates into formal export frameworks remains to be seen — but the directional interest is clear.
The asymmetry is worth stating plainly: Ukraine provides battle-tested empirical data; the West provides compute, platforms, and funding. Both sides are enriched by the exchange. Ukraine gains access to infrastructure and systems it could not develop independently at the required scale. Western partners gain something that simulation cannot provide — an understanding of how AI-assisted systems actually perform when the stakes are not theoretical.

```mermaid
timeline
    title Ukraine AI Adoption Timeline
    2014-2021 : Military digitalization begins
              : Volunteer-driven ISR tools (Armor)
              : Early drone coordination platforms
    2022 : Full-scale invasion accelerates all timelines
         : DELTA scales to 100K+ users
         : Government cloud migration (11 days to AWS/Azure)
    2023 : Brave1 cluster established — 200+ companies
         : FPV AI guidance modules in field deployment
         : Palantir Gotham operational integration confirmed
         : Diia reaches 20M users
    2024 : OODA loop compression documented (~20 min targets)
         : EW AI adaptation cycles faster than adversary doctrine
         : Civilian AI adoption stagnant (~12% SME penetration)
    2025-2026 : EU accession process drives civilian AI regulation
              : Military-civilian tech transfer policy under discussion
```
Civilian AI: Adoption Over Innovation
The contrast with Ukraine’s civilian AI landscape is instructive, and should not be softened. Here, the story is one of adoption rather than origination.
The Diia platform is the most visible achievement. With over 20 million registered users and more than 70 government services digitized, it has been cited by the World Economic Forum as one of the top three digital government platforms globally (WEF, 2024). The wartime cloud migration — moving critical government infrastructure to AWS and Microsoft Azure in approximately 11 days following the February 2022 escalation — demonstrated operational agility that most peacetime governments would struggle to replicate.
But Diia is fundamentally an interface layer over Western cloud infrastructure. The AI components — document verification, fraud detection, service routing — are largely built on tools developed elsewhere. This is not a criticism; it is a description. Ukraine has been extraordinarily effective at adopting, integrating, and deploying technology developed by its partners. The Diia model has been studied by governments from Estonia to Kenya as a template for digital public services.
Tax administration AI, agricultural monitoring (using satellite imagery from Maxar and Planet Labs, processed through European analytical platforms), and telemedicine services have followed similar patterns: Ukrainian institutions adopting tools developed in the EU, US, or Israel, adapting them to local regulatory and infrastructure contexts.
The honest gap is this: enterprise AI adoption among Ukrainian SMEs sits at approximately 12%, against an EU average of around 25% (OECD, 2024). The causes are structural. Brain drain has displaced an estimated 30% of Ukraine’s tech workforce — engineers who were building Ukrainian digital products are now employed at Google, Meta, Amazon, and European software firms. This is rational behavior from the perspective of individuals. From the perspective of Ukraine’s technology ecosystem, it represents a significant depletion of the human capital needed for civilian AI development.

```mermaid
flowchart TD
    A["Military AI Innovation<br/>Front-line necessity"] --> B[Operational AI Systems]
    B --> C{"Technology Transfer<br/>Pathways"}
    C --> D["Direct spin-out<br/>Brave1 → civilian startups"]
    C --> E["Human capital transfer<br/>Veteran engineers → tech sector"]
    C --> F["Institutional knowledge<br/>Military → government digital services"]
    C --> G["Export methodology<br/>Ukrainian AI warfare doctrine → allies"]
    D --> H[Civilian AI Products]
    E --> H
    F --> H
    G --> I["International AI Standards<br/>NATO doctrine updates"]
    H --> J[EU Digital Single Market Integration]
    I --> J
```
Why This Asymmetry Makes Sense
The gap between military and civilian AI development in Ukraine is not a policy failure. It is a predictable outcome of resource constraint theory: when resources are scarce, they flow toward survival-critical applications. In wartime Ukraine, that means military systems. Civilian innovation, which requires stable funding environments, long development cycles, and the ability to attract and retain talent, has been squeezed from multiple directions simultaneously.
The historical parallel most often cited by analysts is Israel’s Unit 8200 — the signals intelligence unit from which a significant fraction of Israel’s commercial cybersecurity industry emerged after service completion. Former Unit 8200 members founded companies including Check Point, CyberArk, and Waze. The mechanism was straightforward: individuals who developed sophisticated technical skills under operational pressure, in a highly networked environment, carried those skills and relationships into civilian life (RAND, 2022).
Ukraine is producing, through the Brave1 ecosystem and broader military tech development, a cohort of engineers and operators with analogous profiles. The question — not yet answered — is whether the institutional conditions for transferring that capability to the civilian economy will exist when the operational pressure eases. Israel’s success was not automatic; it required deliberate policy choices around university research funding, export controls that created domestic incentives, and a cultural normalization of military-to-civilian entrepreneurship.
Ukraine will face the same choices. The outcome is not predetermined.
Looking Forward
The knowledge exchange between Ukraine and its Western partners is deepening, and the direction of travel is toward greater formalization. The EU’s decision to include Ukraine in the European Defence Fund framework, and ongoing discussions around Brave1’s integration into European defense tech ecosystems, suggest that the exchange will not end with the conflict — whatever form that ending takes.
EU accession provides the most structurally significant driver for civilian AI catch-up. Compliance with the EU AI Act, alignment with the European data economy frameworks, and access to Horizon Europe research funding would all create conditions that favor civilian AI development in ways that wartime priorities have suppressed. This is not a guarantee; it is a structural opportunity.
Two risks deserve explicit acknowledgment. First: if the conflict ends without deliberate technology transfer policy, the military AI advantage that Ukraine has developed may evaporate. Engineers will scatter. Systems will lose maintenance funding. Institutional knowledge will disperse. The window for converting wartime technical capability into peacetime economic assets is not indefinitely open.
Second: the ICRC and international humanitarian law scholars have raised legitimate questions about the use of AI-assisted targeting systems that require careful ongoing engagement (ICRC, 2023). These questions do not disappear in the context of defensive necessity. Ukraine’s experience will shape international norms around autonomous and AI-assisted weapons systems — and the terms of that shaping matter.
```mermaid
flowchart TD
    Start["Ukraine AI Trajectory<br/>2026"] --> Q1{"Technology<br/>transfer policy<br/>adopted?"}
    Q1 -->|Yes| Q2{"EU accession<br/>progresses?"}
    Q1 -->|No| Scenario4["⚠️ Scenario D: Dissipation<br/>Military AI advantage lost<br/>Civilian gap persists<br/>Brain drain accelerates"]
    Q2 -->|Yes| Q3{"Military-civilian<br/>talent pipeline<br/>established?"}
    Q2 -->|No| Scenario3["🟡 Scenario C: Partial Catch-up<br/>Civilian AI grows via EU alignment<br/>Military advantage not institutionalized"]
    Q3 -->|Yes| Scenario1["✅ Scenario A: Compounding Advantage<br/>Unit 8200 model realized<br/>Ukraine AI exported globally<br/>Civilian ecosystem matures by 2030"]
    Q3 -->|No| Scenario2["🟠 Scenario B: Managed Divergence<br/>Military AI maintained<br/>Civilian adoption at EU average by 2030<br/>No indigenous origination"]
```
The opportunity is real, and it is specific: Ukraine has, at this moment, more battle-tested AI practitioners than any other country of comparable size. Engineers who have trained and operated AI systems under conditions of genuine adversarial pressure, with immediate feedback on failure, represent an asset of a particular and durable kind. Whether that asset is properly institutionalized — through veterans’ reintegration programs, university partnerships, civilian R&D incentives, and international joint ventures — will determine whether the strange advantage of necessity becomes a lasting capability.
The observation with which this analysis began bears repeating: it is strange, in some respects, that war should produce technological sophistication. But the logic is not obscure. Necessity concentrates attention. It strips away the luxury of bureaucratic caution. It creates feedback loops that peacetime institutions rarely achieve. Ukraine has lived this logic at an extraordinary cost.
What happens next depends on choices that are not yet made.
References
- RUSI. (2023). DELTA and the Digitization of Ukrainian Command. Royal United Services Institute.
- ISW. (2023). AI-Assisted Targeting and Decision Cycles in the Ukraine Conflict. Institute for the Study of War.
- NATO JAPCC. (2024). Lessons from Ukraine: Implications for AI in Multi-Domain Operations. Joint Air Power Competence Centre.
- RAND Corporation. (2022). Military-to-Civilian Technology Transfer: Lessons from Israel’s Unit 8200. RAND.
- RAND Corporation. (2024). Electronic Warfare Adaptation in the Russia-Ukraine Conflict. RAND Research Reports.
- OECD. (2024). AI Adoption Among SMEs: European Benchmarks. OECD Digital Economy Outlook.
- WEF. (2024). Digital Government Rankings: Diia and the Global Benchmark. World Economic Forum.
- Brave1. (2023–2024). Annual Reports and Cluster Overview. Ukrainian Ministry of Digital Transformation.
- ICRC. (2023). Autonomy, Artificial Intelligence and Robotics: Technical Aspects of Human Control. International Committee of the Red Cross.
- OECD. (2023). Artificial Intelligence in Ukraine: Opportunities and Challenges. OECD. oecd.org/digital/artificial-intelligence/
- SIPRI. (2024). Military AI Adoption and Civilian Spillovers: Evidence from Conflict Environments. SIPRI. sipri.org/publications
- Roff, H. (2023). Dual-Use AI: The Governance Gap Between Military and Civilian Applications. RAND Corporation. rand.org
Author: Oleh Ivchenko · Idea inspired by Ihor Ivchenko