Health Data and GDPR: Concrete Obligations for Healthcare Institutions in 2026

GDPR, AI Act, HDS: in 2026, health data compliance is no longer optional. Galeon builds it in by design.

The Essentials in 30 Seconds

| Question | Short Answer | Key Takeaway |
| --- | --- | --- |
| Does GDPR apply to hospital health data? | Yes, in full | Health data is a special category of sensitive data (art. 9 GDPR), subject to the strictest regulatory regime. |
| What is a DPIA and when is it mandatory? | Data Protection Impact Assessment, mandatory before any high-risk processing | In 2026, any medical AI deployment impacting diagnosis requires a dynamic DPIA, reassessed with every model update. |
| Is public cloud compliant for hosting health data? | Not without HDS certification | Only an HDS-certified (Health Data Hosting) provider can legally store patient data in France. |
| What is the AI Act and how does it affect hospitals? | European regulation in force since 2024, progressively applied through 2026 | Medical AI systems impacting diagnosis or clinical decisions are classified as "high risk": they are subject to enhanced requirements in documentation, human oversight, and compliance. |
| What is the HL7 FHIR standard and why does it matter? | Health data interoperability format | Without FHIR structuring, guaranteeing the accuracy and non-discrimination required by GDPR becomes a near-insurmountable challenge. |
| Is Galeon HDS certified? | Yes | Galeon holds both HDS and ISO 27001 certifications, the two prerequisites for hosting patient data in France. |
| What is data sovereignty in a hospital context? | A hospital's ability to control where and how its data is processed | Galeon's BSL® architecture keeps data on the institution's own servers: the algorithms move, not the data. |
| What does a hospital risk in case of GDPR non-compliance? | Fine of up to 4% of global annual turnover, CNIL sanctions | The CNIL regularly emphasises the importance of securing and tracking access to health data. |

Introduction

The time for experimentation is over. Since 2024, the European AI Act has turned the deployment of medical artificial intelligence from a loosely supervised research exercise into a fully regulated activity. Hospitals that were piloting AI solutions in "sandbox mode" now face an inescapable legal reality: AI systems impacting diagnosis or clinical decisions are classified as high risk, and that status demands transparency, traceability, and data sovereignty.

GDPR, which came into force in 2018, did not wait for this acceleration to set its requirements. But in 2026, the convergence of GDPR, the AI Act (whose implementation timeline extends from 2024 to 2027 depending on the category of system), the NIS2 directive, and the HDS v2.0 framework creates a compliance landscape that IT Directors can no longer manage in a fragmented way. Unstructured health data, hosted without certification and processed by an opaque AI, is both a medical risk and a major legal liability.

Galeon deploys its intelligent EHR across 19 hospitals, covering more than 3 million patient records and 10,000 active healthcare professionals. This deployment is built on an architecture that integrates compliance from the ground up, following the privacy by design principle enshrined in article 25 of GDPR. This article explains what that means in practice for CIOs and CEOs of hospital institutions in 2026.

Why Is AI Trained on Unstructured Data a Legal and Medical Risk?

What Is "Unstructured" Health Data and Why Does It Create Problems?

Unstructured data includes free-text clinical notes, DICOM images (the standard format for digital medical imaging) without normalised metadata, and lab results entered in open fields without SNOMED (the international clinical terminology system) or LOINC (the coding system for biological and clinical examinations) coding. This type of data remains the norm in the majority of legacy French hospital EHR systems.
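The gap can be made concrete with a small illustrative sketch (the clinical values are invented): the same haemoglobin result, once as the free text a legacy system stores, once as a minimal HL7 FHIR Observation carrying a real LOINC code (718-7, haemoglobin mass/volume in blood) and a UCUM unit. Only the second form gives an AI pipeline a value, a unit, and a terminology it can rely on.

```python
import json

# Unstructured: a free-text entry — no coding, no unit a machine can trust
free_text_result = "Hb 13.2 patient stable, cf previous results"

# Structured: a minimal HL7 FHIR R4 Observation, simplified for illustration.
# LOINC 718-7 = Hemoglobin [Mass/volume] in Blood; unit coded with UCUM.
fhir_observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "718-7",
            "display": "Hemoglobin [Mass/volume] in Blood",
        }]
    },
    "valueQuantity": {
        "value": 13.2,
        "unit": "g/dL",
        "system": "http://unitsofmeasure.org",
        "code": "g/dL",
    },
}

# The structured form can be validated, queried, and aggregated;
# the free-text form requires error-prone extraction first.
serialised = json.dumps(fhir_observation, indent=2)
```

A real deployment would use a full FHIR server and validation profiles; the point here is only the difference in machine-readability between the two representations.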

The problem is twofold. On one hand, an AI trained on heterogeneous data produces non-reproducible and potentially biased results, which directly conflicts with GDPR's requirements for accuracy and data minimisation (art. 5). On the other hand, the AI Act requires that high-risk AI systems be trained on data that is "relevant, representative, and as free from errors as possible".

The vast majority of French hospital institutions have not yet reached a sufficient level of FHIR maturity to facilitate a compliant AI deployment. This structural gap creates direct tension with the requirements of the 2026 regulatory framework.

Without FHIR structuring, guaranteeing the accuracy and non-discrimination required by GDPR becomes a near-insurmountable challenge. This is the reality facing most hospital CIOs in 2026.

The Dynamic DPIA: A New Obligation for Every AI Deployment

A Data Protection Impact Assessment (DPIA) is mandatory whenever processing is likely to generate a high risk for the individuals concerned – which is the case for medical AI systems impacting diagnosis or clinical decision-making. In 2026, the CNIL specifies that this DPIA must be dynamic: it must be updated with every substantial modification to the AI model, not only at initial deployment.

In practice, a hospital DPIA must cover: the precise purpose of the processing, the applicable legal basis (commonly public interest or preventive medicine, art. 9.2.h and j GDPR), technical security measures, data retention periods, and patient rights. Any subcontracting to a software vendor additionally requires a compliant DPA (Data Processing Agreement).

Why Is HDS Certification the Only Viable Option for Hosting Patient Data in 2026?

What Is HDS Certification and What Does It Actually Cover?

HDS certification (Health Data Hosting), issued by bodies accredited by COFRAC (the French Accreditation Committee, the national body responsible for accrediting certification organisations), is mandatory for any entity hosting personal health data on behalf of a healthcare professional. It is built on ISO 27001 – which is a prerequisite – and adds sector-specific requirements: access management, operational traceability, business continuity planning, and data localisation within the EEA (European Economic Area, comprising the 27 EU member states as well as Norway, Iceland, and Liechtenstein).

In 2026, the HDS v2.0 framework explicitly incorporates strengthened requirements on infrastructure resilience and access auditing, aligned with the NIS2 obligations (the European directive on the security of network and information systems, which came into force in 2023 and reinforces cybersecurity obligations for essential entities, including hospitals) for essential service operators – a category that now includes hospitals of significant size.

Galeon holds both HDS and ISO 27001 certifications. These two certifications represent the minimum baseline of trust for any partnership with a French hospital institution.

US Public Cloud: A Major Risk for Health Data Sovereignty

Entrusting patient data to a non-HDS-certified hosting provider exposes the institution to a clear violation of the French Public Health Code (art. L.1111-8) and GDPR. The CNIL regularly emphasises the importance of securing health data and ensuring full traceability of access.

Beyond certification, the question of applicable jurisdiction is structurally significant. The US CLOUD Act of 2018 authorises American federal authorities to compel US-based cloud providers to disclose data, including data hosted in Europe. This extraterritoriality represents a major risk to health data sovereignty and must be explicitly factored into risk analysis and architectural decisions.

The question is not only where data is hosted: it is under which jurisdiction it can be compelled.

Galeon's Blockchain Swarm Learning® architecture addresses this requirement by design: data remains on the hospital's own servers, and only the algorithms travel to enable decentralised training.
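The principle can be illustrated with a generic federated-averaging sketch. This is not Galeon's BSL® implementation (which adds blockchain coordination on top); the toy hospital datasets, model, and update loop are invented for illustration. Each site computes a training step on its own records, and only model parameters are exchanged:

```python
# Generic federated-averaging sketch of "the algorithms move, not the data".
# Raw records stay at each site; only model weights are shared and averaged.

def local_update(weights: list[float],
                 local_data: list[tuple[float, float]],
                 lr: float = 0.01) -> list[float]:
    """One gradient step of a 1-feature linear model, computed on-site."""
    w, b = weights
    n = len(local_data)
    grad_w = sum(2 * (w * x + b - y) * x for x, y in local_data) / n
    grad_b = sum(2 * (w * x + b - y) for x, y in local_data) / n
    return [w - lr * grad_w, b - lr * grad_b]

def aggregate(updates: list[list[float]]) -> list[float]:
    """Average parameters across sites — only weights were transmitted."""
    return [sum(u[i] for u in updates) / len(updates) for i in range(2)]

# Each hospital keeps its own (toy) data locally; roughly y = 2x.
hospital_a = [(1.0, 2.0), (2.0, 4.1)]
hospital_b = [(3.0, 6.2), (4.0, 7.9)]

weights = [0.0, 0.0]
for _ in range(100):
    local_updates = [local_update(weights, hospital_a),
                     local_update(weights, hospital_b)]
    weights = aggregate(local_updates)  # coordinator sees parameters, never patients
```

The design choice this illustrates is the one described above: the jurisdiction question largely disappears when patient records never leave the institution's servers in the first place.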

How Can Hospitals Guarantee Transparency and Explainability of Medical AI Under the AI Act?

What Does the AI Act Concretely Require of Hospitals Deploying AI?

The AI Act, which entered into force in August 2024, follows a progressive timeline: provisions on high-risk systems apply in full from August 2026 for new systems placed on the market, with transitional arrangements for existing systems. AI used in clinical decisions – diagnosis, triage, assisted prescribing – falls under the "high risk" category whenever it influences a clinical decision (Annex III of the AI Act).

This classification triggers a set of obligations: registration in the EU database for high-risk AI systems, complete technical documentation, audit logs retained for a minimum of 10 years, and a mandatory human oversight requirement. The AI Act does not require AI to be infallible; it requires a qualified human to remain able to understand, challenge, and correct its decisions. These obligations fall within a framework of risk management, data governance, logging, and ongoing compliance.
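The audit-log obligation can be made concrete with a generic hash-chaining sketch, the basic technique behind tamper-evident (blockchain-style) logs. The event fields and function names below are hypothetical, not the AI Act's prescribed format or Galeon's implementation: each entry embeds the hash of the previous one, so any retroactive modification breaks the chain and is detectable on audit.

```python
import hashlib
import json
from datetime import datetime, timezone

# Sketch of a tamper-evident audit log via SHA-256 hash chaining.
# Illustrative only: real systems add signatures, distribution, retention.

def append_entry(log: list[dict], event: dict) -> None:
    """Append an event, chaining it to the hash of the previous entry."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "event": event,
        "prev_hash": prev_hash,
    }
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    log.append(entry)

def verify_chain(log: list[dict]) -> bool:
    """Recompute every hash; any altered entry invalidates the chain."""
    prev = "0" * 64
    for entry in log:
        if entry["prev_hash"] != prev:
            return False
        body = {k: v for k, v in entry.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

audit_log: list[dict] = []
append_entry(audit_log, {"actor": "model_trainer", "action": "start_training", "model": "v2.1"})
append_entry(audit_log, {"actor": "clinician_42", "action": "override_ai_suggestion"})
```

The human-oversight requirement shows up here too: an entry like `override_ai_suggestion` is exactly the kind of traceable evidence that a qualified human can challenge and correct the system's output.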

Algorithmic Transparency: The Challenge of Complex Architectures

The explainability requirement is particularly demanding for technical teams. The best-performing deep learning models in medical diagnosis – pathology detection in imaging, complication prediction – are precisely those whose explainability is hardest to produce.

The AI Act does not prohibit these complex architectures, but it requires reinforced human oversight and documentation demonstrating that decisions can be understood and challenged by a clinician. In practice, deploying a "black box" medical AI without an explainability layer makes compliance with the AI Act extremely complex and risky, both legally and operationally.

CIOs have a responsibility to qualify their AI vendors against these criteria. The questions to ask are straightforward: can the algorithm be audited by an independent third party? Does it produce explanations that a clinician can read and understand? Have the biases in its training dataset been documented?

Galeon's BSL® traces every action during AI training across the inter-hospital blockchain network. This native traceability directly meets the AI Act's documentation requirements, without any additional technical layer.

Comparison: Traditional Approach vs. Galeon's Approach Against GDPR/AI Act Obligations in 2026

| Compliance Criterion | Traditional EHR (standard approach) | Galeon EHR (BSL® approach) |
| --- | --- | --- |
| Data structuring | Heterogeneous data, proprietary formats, low interoperability | Native HL7 FHIR structuring, data readable and usable from the point of entry |
| HDS-certified hosting | Varies by vendor, often outsourced without location guarantees | HDS and ISO 27001 certified, data hosted on the institution's own servers |
| Data location and jurisdiction | Frequently in shared clouds, sometimes subject to non-European jurisdictions | Data maintained on hospital servers, decentralised architecture by design |
| Dynamic DPIA | Rarely updated after initial deployment | Blockchain traceability of AI training facilitating DPIA updates |
| AI Act compliance (high risk) | Explainability often absent, fragmented audit logs, human oversight not formalised | Training traced on blockchain, native technical documentation, human oversight integrated into protocol* |
| Patient rights (access, rectification, erasure) | Manual processes, long delays, risk of omission across multiple systems | Patient consent managed at EHR level, revocation possible at any time* |
| Subcontractor management (DPA) | Multiple vendors, DPAs often incomplete or generic | Documented contractual chain, controller and processor roles clearly defined |
| Access traceability (NIS2/HDS v2.0) | Logs often centralised at vendor level, not audited by the hospital | Native traceability, accessible to the institution, compliant with NIS2 requirements |
| Breach notification (72h GDPR) | Depends on vendor responsiveness, deadline rarely met | Detection facilitated by blockchain monitoring, integrated notification procedure* |
| Cost of compliance | High: multiple parallel projects (DPA, DPIA, HDS, AI Act) | Rationalised: compliance built into the architecture, not added as an afterthought |

What Are the Real Limits and Challenges of GDPR Compliance for Hospitals?

GDPR compliance in hospital settings cannot simply be declared: it is built over time, and several significant obstacles remain.

The operational burden falls on under-resourced teams. Hospital DPOs (Data Protection Officers) manage hundreds of different processing activities on average. Updating processing registers, conducting dynamic DPIAs, and managing patient rights represent a workload that is difficult to absorb without dedicated tooling.

Migrating historical data is an underestimated undertaking. Restructuring data produced over 10 or 20 years in proprietary formats into FHIR is a multi-year project. During the transition, institutions operate in a hybrid environment where a portion of data remains difficult to use within a compliant framework.

The tension between AI performance and explainability remains unresolved. The best-performing diagnostic AI models are often the least explainable. The AI Act requires human oversight and documentation without prescribing the technical method. Vendors and hospitals must find their own solutions, which represents a significant R&D effort.

Training healthcare professionals is the weakest link. A compliant EHR is not enough if users circumvent procedures: sharing data via unsecured messaging, accessing records outside protocol in emergency situations. Real-world compliance depends as much on culture as on technology.

The coexistence of multiple frameworks creates complexity. GDPR, AI Act, NIS2, HDS, PGSSI-S, identity vigilance frameworks: each standard has its own timeline, its own requirements, and its own oversight bodies. The absence of a unified framework forces CIOs to orchestrate multi-framework compliance, often without dedicated resources.

FAQ – Health Data and GDPR: Frequently Asked Questions from CIOs

Can a hospital use an AI tool provided by a US vendor to process patient data?

Not without strict contractual and technical safeguards. Since the invalidation of the Privacy Shield, any transfer of data to the United States requires Standard Contractual Clauses (SCCs) validated by the CNIL and an analysis of US laws that could compel the vendor to disclose data (CLOUD Act). In practice, the extraterritoriality of US law represents a documented legal risk that few institutions have the means to fully manage, which makes sovereign-hosting solutions structurally safer.

What is the difference between HDS and ISO 27001?

ISO 27001 is an international standard for information security management, required as a prerequisite for HDS certification. HDS certification is specific to the French healthcare sector and adds obligations on patient data management, business continuity, and the rights of data subjects. Obtaining ISO 27001 is not sufficient to legally host health data in France: the two certifications are distinct and complementary.

How often should a DPIA be updated for a medical AI system?

The CNIL recommends reviewing the DPIA at every substantial change to the processing: major algorithm update, change in the scope of data processed, new clinical use case, change of hosting provider. For medical AI systems in active production that are regularly retrained, a minimum annual review is expected, and more frequent updates if the model evolves significantly.

Does the right to erasure apply to medical data?

Yes, but with important exceptions. Health data may be retained beyond an erasure request if its processing is necessary for archiving purposes in the public interest, for scientific research purposes, or for the performance of a public interest mission. The legal retention period for patient records in France is 20 years from the date of the last stay. The right to erasure applies to processing that goes beyond these legitimate purposes.

What constitutes "high-risk" processing under the AI Act and how does a hospital qualify it?

The AI Act classifies as "high risk" AI systems used in sensitive domains listed in its Annex III, including medical devices and clinical decision support systems. In practice, an AI that influences a diagnosis, a triage decision, or a treatment decision falls into this category. The qualification must be carried out by the institution together with its DPO and its vendor, based on the guidelines published by the European Commission.

What happens if a hospital suffers a data breach and does not notify the CNIL within 72 hours?

GDPR requires notification to the CNIL within 72 hours of becoming aware of the breach (art. 33). Failure to meet this deadline constitutes a separate violation, which may result in an independent administrative penalty. The CNIL has issued fines against healthcare institutions for failure to notify or late notification: the healthcare sector is among the most closely monitored in France.

Conclusion – Ethical AI as the Foundation of Medical Trust

In 2026, GDPR compliance in hospital institutions is no longer an IT project: it is a condition of practice. The convergence of GDPR, the AI Act, and the HDS v2.0 framework defines a demanding landscape in which every medical AI deployment must be justifiable, traceable, and auditable. Hospitals that have invested in structured data, sovereign hosting, and transparent architecture do not suffer under this regulation: they draw a competitive advantage from it and build a stronger relationship of trust with their patients.

The most reliable medical AI will be the one built on the most rigorously governed data. Hospital data sovereignty is not a barrier to innovation: it is its foundation. Without FHIR structuring, without HDS certification, without algorithmic traceability, compliance with the 2026 regulatory framework becomes a permanent and costly undertaking. Galeon has been building this foundation since 2016, across 19 hospitals, so that the medicine of tomorrow rests on data worthy of trust.

Would you like to know more about our Smart EHR?

Book a demo

They trust us

Centre Hospitalier Intercommunal Toulon La Seyne-sur-Mer · Centre Hospitalier Sud Francilien (CHSF) · GHNE (Groupement Hospitalier Nord Essonne) · CHU de Rouen · CHU Caen Normandie