We have GDPR. We have HDS, France's certified health data hosting scheme. We have the EHDS currently being rolled out. And yet, European hospitals have never been more dependent on foreign infrastructures.
This paradox is not an accident. It is the result of a sustained confusion between regulatory compliance and real sovereignty. Ticking legal boxes is not enough to define who decides, who learns, who extracts value from health data.
Illusion: our data is protected by law, so we control it.
Reality: it is hosted in Europe, but processed, aggregated and monetised by platforms whose AI models are owned, patented and commercialised outside our borders.
Consequence: a hospital that believes it is sovereign because it signed an HDS contract is often, in practice, a fuel supplier for an artificial intelligence it will have to buy back at great cost in five years.
The argument of this article is simple: health data sovereignty is not a legal question. It is an architectural one.
Medical AI models need fuel. That fuel is structured clinical data: hospital discharge reports, prescriptions, lab results, imaging, care pathways. European hospitals produce billions of data points every day.
The problem is not that this data exists. The problem is knowing who trains on it, who owns the resulting models, and who sets the access price.
Illusion: Big Tech offers medical AI tools to help caregivers.
Reality: they offer to sell back, as a service, the intelligence built from the data hospitals entrusted to them.
Consequence: a hospital that integrates a third-party AI into its care protocols gradually transfers part of its clinical decision-making to an actor whose commercial priorities are not aligned with public health.
The model is structurally unfavourable to data producers. A cloud platform integrates data from 200 European hospitals. It trains a model to predict post-operative complications. It patents that model. It sells it back to those same 200 hospitals as a subscription.
The hospital contributed 100% of the raw value. It captures 0%.
This is not a hypothesis. It describes the current operating model of the leading digital health platforms deployed in Europe, whose headquarters, training servers and R&D teams are located outside the European Union.
Technological lock-in in healthcare is often framed as a financial issue: migration costs, contractual dependency, exit penalties. These aspects are real, but they obscure a far deeper problem.
Illusion: changing software is difficult, but possible if you invest the resources.
Reality: when an institution's structured data is trapped in a proprietary format and the algorithms that exploit it belong to a third party, the physician is no longer in control of their tool.
Consequence: the tool gradually dictates protocols, alerts and care priorities. The caregiver becomes the operator of an intelligence they do not understand, do not control, and cannot challenge.
The answer to this problem is not narrowly technical. It is architectural, and it begins at the moment a caregiver enters a data point.
If data is structured at the source, in open standard formats (HL7 FHIR, SNOMED CT, LOINC), it remains portable, exploitable and, above all, interpretable by any algorithm the institution chooses tomorrow.
If it is captured in a proprietary, opaque format dependent on a third-party AI model, it becomes captive data. And captive data means a captive caregiver.
The tool must serve the hand. Not the other way around. This is not a slogan. It is an architectural choice that must be made before deployment, not after.
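What "structured at the source" means in practice can be shown with a minimal sketch: a lab result captured as a FHIR R4 Observation, coded with LOINC and UCUM units so that any downstream system, including a model the hospital adopts later, can interpret it. The patient identifier and helper function here are illustrative, not part of any particular EHR.

```python
# Minimal sketch: capture a lab result as a portable FHIR R4 Observation
# at the point of entry, using open vocabularies (LOINC, UCUM) rather
# than a proprietary format. Identifiers are illustrative.

def make_observation(patient_id: str, loinc_code: str,
                     display: str, value: float, unit: str) -> dict:
    """Build a FHIR R4 Observation resource as a plain dict."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {
            "coding": [{
                "system": "http://loinc.org",   # open standard, not vendor-specific
                "code": loinc_code,
                "display": display,
            }]
        },
        "subject": {"reference": f"Patient/{patient_id}"},
        "valueQuantity": {
            "value": value,
            "unit": unit,
            "system": "http://unitsofmeasure.org",  # UCUM units
            "code": unit,
        },
    }

# Example: serum creatinine (LOINC 2160-0), readable by any FHIR consumer.
obs = make_observation("123", "2160-0",
                       "Creatinine [Mass/volume] in Serum or Plasma",
                       1.1, "mg/dL")
```

A record shaped this way can be handed to any FHIR-capable system tomorrow; the same result locked in a vendor schema cannot.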
European health data is, objectively, among the richest in the world. Universal social protection systems, longitudinal medical records spanning decades, dense epidemiological cohorts, exhaustive prescription data made possible by mandatory reimbursements. This is a massive competitive advantage for medical research.
But the value added from processing this data (AI models, patents, scientific publications, precision health services) does not stay in Europe. It accumulates where the processing infrastructures are located.
Illusion: our data stays in Europe, so our healthcare system benefits from it.
Reality: the value is not in the raw data. It lies in the ability to process it, aggregate it, and extract predictive models from it. That capacity is progressively concentrating outside the European Union.
Consequence: Europe finances global medical R&D through its data, then buys it back in the form of services. This is a silent value drain, invisible in hospital budgets, but massive at the scale of a national healthcare system.
The response to this drain is not protectionist. It is technical and organisational.
It rests on three non-negotiable conditions.
First, algorithms must travel to the data, not the other way around. The Blockchain Swarm Learning model, where AIs train locally on each institution's data without ever moving it, is the only architecture that guarantees both regulatory compliance and economic sovereignty.
Second, the value generated by exploiting data must be partially redistributed to the institutions that produce it. A hospital that contributes to training a sepsis prediction model should receive a share of the value created, not merely benefit from it for free as an end user.
Third, interoperability standards (FHIR R4, the Ségur du Numérique frameworks) must be treated not as regulatory constraints, but as sovereignty levers. An interoperable system is a system that cannot be held captive.
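The first condition, algorithms travelling to the data, can be sketched in a few lines. This is a toy illustration of the federated principle only, assuming a trivial one-feature linear model and made-up hospital datasets: each institution runs a training step on its own records, and only the resulting weights, never the records, cross institutional boundaries to be averaged.

```python
# Toy sketch of "algorithms travel to the data": each hospital updates a
# shared model locally; only model weights are exchanged and averaged.
# The datasets and the one-parameter model are illustrative, not clinical.

def local_update(weights: list[float],
                 local_data: list[tuple[float, float]],
                 lr: float = 0.05) -> list[float]:
    """One gradient step of a 1-feature linear model, run inside the hospital."""
    w, b = weights
    n = len(local_data)
    grad_w = sum(2 * (w * x + b - y) * x for x, y in local_data) / n
    grad_b = sum(2 * (w * x + b - y) for x, y in local_data) / n
    return [w - lr * grad_w, b - lr * grad_b]

def aggregate(updates: list[list[float]]) -> list[float]:
    """Averaging step: only weights, never patient records, are pooled."""
    return [sum(ws) / len(updates) for ws in zip(*updates)]

# Simulated rounds: three hospitals, data never leaves its silo.
hospital_data = [
    [(1.0, 2.1), (2.0, 3.9)],
    [(1.5, 3.0), (2.5, 5.2)],
    [(0.5, 1.2), (3.0, 6.1)],
]
global_weights = [0.0, 0.0]
for _ in range(200):  # training rounds
    updates = [local_update(global_weights, d) for d in hospital_data]
    global_weights = aggregate(updates)
```

In a real swarm-learning deployment the aggregation step is decentralised and audited (in the blockchain variant, via a distributed ledger), but the sovereignty property is the same: the data stays put.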
The question every CIO and hospital director must ask their providers is not "are you HDS certified?" but "if we leave tomorrow, exactly what do we get back, in what format, and within what timeframe?"
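That exit question can even be turned into a concrete probe. The FHIR Bulk Data Access specification defines a standard asynchronous "$export" kick-off request that an interoperable system should honour; the base URL below is illustrative, and this sketch only builds the request rather than calling a live server.

```python
# Sketch: the standard FHIR Bulk Data "$export" kick-off request.
# If a provider cannot answer this (or an equivalent), the question
# "what do we get back, in what format?" has its answer.
from urllib.request import Request

def bulk_export_request(base_url: str, since: str = "") -> Request:
    """Build the asynchronous kick-off request for a system-level export."""
    url = f"{base_url}/$export"
    if since:
        url += f"?_since={since}"
    return Request(url, headers={
        "Accept": "application/fhir+json",  # resources in open FHIR JSON
        "Prefer": "respond-async",          # server replies 202 + status URL
    })

# Illustrative endpoint; a real test would target the provider's FHIR base.
req = bulk_export_request("https://ehr.example-hospital.org/fhir")
```

A provider confident in its reversibility can answer with a format, a mechanism, and a timeframe; evasiveness here is itself a diagnostic.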
And for policymakers, the question is different but equally direct: is the health data produced in public hospitals currently funding medical patents held by American or Chinese companies?
Health data sovereignty is not a compliance topic for lawyers. It is a survival issue for our healthcare system. Either we keep building on the sand of infrastructures we do not control, or we choose to build our own roads. In healthcare, the choice of architecture is a political choice. It is time to decide whether tomorrow's hospital will be the customer of a monopoly or the owner of its own intelligence.
Does GDPR compliance already guarantee this sovereignty? No. GDPR regulates the processing and protection of personal data. It does not prevent a foreign company from owning the AI models trained on that data, nor from commercialising the services derived from their exploitation. GDPR compliance is necessary, but it does not protect against economic value capture.
What is Blockchain Swarm Learning? It is an architecture in which AI algorithms travel between institutions to train locally, without data ever leaving hospital servers. Only the model weights (the results of the learning process) are aggregated via the blockchain. This makes it possible to build collective intelligence without data transfer, simultaneously resolving the HDS compliance issue and the sovereignty issue.
Can hospitals be remunerated for the value their data creates? Yes, if the architecture provides for it contractually and technically from the outset. Models exist, notably through redistribution mechanisms indexed to the use of data in research or medical value-creation projects. This is a fundamental break from traditional hosting models, where hospitals cede the value of their data without any return.
What does the EHDS change? The European Health Data Space (EHDS) is the European regulatory framework designed to enable health data sharing between member states for research and care. It creates a legal basis for patient data portability and secondary access to data for research. But as with GDPR, the EHDS regulates flows without guaranteeing that the value created stays in Europe. Reclaiming the processing infrastructure remains a challenge distinct from regulation.
Where does Galeon fit in? Galeon currently deploys its EHR in 19 French hospitals, including two university hospital centres, with over 3 million patient records and 10,000 active caregivers. Its Blockchain Swarm Learning architecture was designed from day one so that data never leaves the institution's servers.
Want to go deeper on the regulatory challenges around health data hosting? Read our full guide on choosing an HDS provider in 2026.
Sources:
Assurance Maladie: SNDS Overview (care pathways of 70M+ people over 20 years of history)
French Ministry of Health: National Strategy for Secondary Use of Health Data 2025-2028, September 2024
Cour des comptes: IT Security in Healthcare Institutions, January 2025 (hospital digital underinvestment: 1.7% of operating budget vs. 9% in banking)
DSIH: Digital Sovereignty, February 2026 (France's Health Data Platform to migrate to SecNumCloud by end of 2026)
SREN Act No. 2024-449 of 21 May 2024: digital sovereignty framework for health data, SecNumCloud requirement
Fortune Business Insights: AI in Healthcare Market, 2025 (global market estimated at ~$39bn in 2025, projected ~$56bn in 2026)
CNIL: recommendation of SecNumCloud for large health databases (risks of a European cloud certification allowing foreign authorities access to sensitive data)
French Senate: "AI and Health" Report, 2023-2024 (dependency on foreign infrastructures for medical AI training; the Microsoft Azure / SNDS case)
European Commission: AI Act (EU Regulation 2024/1689, in force since 1 August 2024), high-risk AI in healthcare, traceability and human oversight requirements