Between Safeguards and Stalemates: Regulatory Coherence in the Secondary Use of Health Data under the EHDSR, DGA, and GDPR
Research Questions:
- Does the EHDSR apply uniformly across health data categories?
- Under what legal and procedural conditions is secondary use permitted?
- Do the instruments, collectively, achieve a functional balance, or does one objective dominate?

Abbreviations:
- GDPR: General Data Protection Regulation (Regulation (EU) 2016/679)
- DGA: Data Governance Act (Regulation (EU) 2022/868)
- EHDSR: European Health Data Space Regulation (Regulation (EU) 2025/327)
1. Introduction
The metaphor of data as “the new oil” has become central to regulatory discourse in the digital economy, underscoring data’s dual role as both a driver of innovation and a source of regulatory tension.i Much like oil during the industrial revolution, data today fuels transformation across sectors including healthcare, finance, and manufacturing, enabling new forms of production, value creation, and competitive advantage.ii However, data’s intangible, non-rivalrous nature presents new tensions around access, control, and redistribution. As such, policymakers are tasked with enabling data sharing and reuse while protecting privacy, intellectual property, and market fairness. Within this context, data access governance has become a key, yet contested, pillar of the EU’s digital strategy.
This governance model departs sharply from the early cyber-libertarian ethos, exemplified by figures like John Perry Barlow and John Gilmore, who viewed information as inherently resistant to regulation.iii In contrast, the EU has embraced a more interventionist approach. The European Data Strategy promotes a unified internal market for data, especially non-personal data, guided by principles of data sovereignty, legal certainty, and responsible innovation.iv By positioning data as a “fifth freedom” alongside goods, services, capital, and people, the EU advances a vision of data governance that is both rights-based and innovation-enabling.v
This vision is acutely tested in the healthcare sector. The integration of digital technologies has transformed medical research, treatment delivery, and public health systems. Big data and algorithmic tools now enable “beyond-the-pill” solutions, combining treatments with diagnostics, wearables, and predictive analytics.vi These advances rely on large datasets: electronic health records, genomic data, imaging, and biometric inputs, which are often highly sensitive and subject to strict regulation.vii
Within this new paradigm, data functions like a raw material, refined through AI, machine learning, and analytics into insights that power diagnostics, predictive models, and personalized treatments.viii However, this datafication of healthcare raises profound legal and ethical challenges. Much of the underlying data is sensitive, and often protected: databases may benefit from sui generis rights, genetic sequences can be patented, and algorithms or analytic methods may be shielded as trade secrets.ix Meanwhile, patients
remain at an informational disadvantage, lacking both access to and understanding of the data that increasingly drives their care.x
Against this backdrop, the EU has introduced regulatory instruments governing the use and reuse of health data. Together, these aim to foster responsible innovation while upholding fundamental rights.xi This essay analyzes whether the interplay among these instruments forms a coherent and effective legal framework for the secondary use of health data in the EU. It asks whether they succeed in enabling socially beneficial data-driven innovation while maintaining robust safeguards, or whether their interaction reveals gaps, overlaps, or tensions.
2. EU Legislative Approach to the Governance of Non-Personal Data in the Health Sector
The EU has established a comprehensive legal framework to regulate data use across multiple sectors, particularly in support of emerging technologies such as AI and data-driven healthcare innovation. Although the GDPR, DGA, and EHDSR are all regulations intended to harmonize data governance across Member States, these instruments simultaneously preserve national discretion in areas where cultural, legal, or systemic divergence persists. This reflects a regulatory strategy that seeks to balance uniformity with subsidiarity in areas of heightened sensitivity, particularly concerning health data.
The GDPR, despite its status as a regulation intended to ensure EU-wide uniformity, allows Member States to introduce additional conditions or limitations on the processing of special categories of personal data, including health and biometric data, pursuant to Article 9(4). Moreover, it permits national variation regarding the age of consent for children.xii The DGA similarly allows Member States discretion in implementing key institutional elements, such as the establishment and design of competent authorities and oversight bodies.xiii The EHDSR also accommodates national flexibility, particularly in defining rules for secondary data use and in structuring Health Data Access Bodies, where existing national laws governing sensitive health data remain operative.xiv
These regulatory divergences reflect underlying differences in legal bases and policy objectives. The GDPR is founded on Article 16 TFEU, underscoring its fundamental rights focus, specifically the rights to privacy and data protection enshrined in Article 8 ECHR and Articles 7 and 8 of the EU Charter of Fundamental Rights. Jurisprudence from both the ECtHR and the CJEU has reinforced the need for effective enforcement of privacy rights. In I v. Finland, for instance, the ECtHR found a violation of Article 8 ECHR after the state failed to prevent unauthorized access to a patient’s medical records, highlighting that legal protections must be accompanied by robust enforcement mechanisms.xv Similarly, the CJEU in Digital Rights Ireland emphasized that any derogation from data protection norms must be limited to what is strictly necessary to preserve fundamental rights.xvi
In contrast, the DGA is rooted in Article 114 TFEU, which provides the legal basis for measures aimed at the establishment and functioning of the internal market. The EHDSR relies on a dual legal basis: Articles 114 and 16 TFEU, reflecting its ambition to simultaneously promote internal market efficiency and uphold fundamental rights. This hybrid foundation seeks to align economic integration with the EU’s data protection framework, thereby enhancing legal certainty and policy coherence. It also helps insulate the regulation from claims of competence overreach, particularly in sensitive policy areas such as healthcare and privacy.xvii
The expansive use of Article 114 TFEU has sparked critical debate over its compatibility with the principle of subsidiarity under Article 5 TEU, which requires that EU action be taken only where Member States cannot adequately achieve the desired objectives.xviii Scholars and policymakers have cautioned that Article 114 has been employed to justify legislation advancing non-market aims, thereby risking overreach into national regulatory domains.xix The CJEU addressed this concern in Tobacco Advertising I,xx ruling that measures lacking a clear link to market harmonization fall outside Article 114’s scope. Similarly, Norra Stockholm Bygg AB reaffirmed that a direct connection to internal market objectives is essential.xxi
Proponents of Article 114, however, argue that regulatory harmonization is essential to prevent market fragmentation and ensure a level playing field across the EU.xxii They emphasize the democratic legitimacy of EU legislative processes, which require the joint approval of the European Parliament and the Council. Yet, to preserve institutional balance and the legitimacy of EU action, Article 114 must not function as a catch-all provision for legislation that serves primarily non-market goals.
In my view, while Article 114 is an indispensable tool for achieving regulatory coherence across the internal market, its application must be calibrated to avoid undermining the principle of subsidiarity and national sovereignty. A more narrowly defined scope of application, increased transparency in legislative processes, and enhanced roles for national parliaments could mitigate the risks of competence creep while supporting a coherent and rights-respecting framework for EU data governance.
3. Regulatory Scope and Interaction of the Three Frameworks
The GDPR is the principal legal framework for data protection in the EU, harmonizing personal data processing rules to uphold fundamental rights.xxiii It provides the foundation for sector-specific instruments such as the DGA and EHDSR, which promote responsible data sharing and innovation while preserving the GDPR’s privacy protections. According to Recital 35, health data means information revealing a person’s physical or mental condition, whether past, present, or future. Article 4 highlights three sensitive categories: genetic data, biometric data, and health data. The GDPR applies to any processing of personal data involving identifiable individuals, even by entities outside the EU that target EU residents.xxiv Recital 26 excludes anonymized data, enabling research with non-personal health data.
The GDPR is principles-based, as detailed in Article 5. Personal data must be collected for specific, legitimate purposes and further processed only if a compatible legal basis exists. Consent is one such basis, alongside performance of a contract, legal obligations, vital interests, public interest, and legitimate interests.xxv Health, genetic, and biometric data are classified as “special categories” under Article 9 and are generally prohibited from processing unless exceptions apply, such as explicit consentxxvi or substantial public interest.xxvii Recital 33 allows broad consent for scientific research when it is not feasible to fully specify research purposes at the time of data collection.
Data subjects are granted rights including access,xxviii rectification,xxix restriction,xxx objection,xxxi and erasure.xxxii Controllers must conduct Data Protection Impact Assessments (DPIAs),xxxiii appoint Data Protection Officers (DPOs)xxxiv where necessary, and implement robust safeguards.xxxv Article 89(1) allows further processing for research and statistical purposes under safeguards, and Recital 50 emphasizes that such reuse must be foreseeable and not incompatible from the data subject’s perspective.xxxvi
The DGA complements the GDPR by introducing frameworks for trustworthy data sharing while confirming the GDPR’s primacy. Under Article 1(3), the DGA does not create new legal bases for processing personal data or override existing GDPR obligations.xxxvii Recital 4 reinforces this, emphasizing that the DGA must not be interpreted as interfering with cross-border transfers under Chapter V of the GDPR.
This distinction is especially relevant for health data and scientific research, which remain subject to Article 9 of the GDPR. Member States may further elaborate on the GDPR’s exceptions in national law.xxxviii The DGA introduces the concept of Data Altruism Organisations (DAOs), entities that facilitate voluntary data sharing for purposes of general interest such as research or public health.xxxix However, DAOs are fully subject to GDPR obligations if they act as controllers or processors. Supervisory authorities under the GDPR maintain oversight to ensure compliance.xl
The DGA also facilitates the reuse of data held by public sector bodies,xli including personal data, trade secrets, and intellectual property.xlii Reuse simply means use for purposes other than those for which the data was originally collected.xliii While this overlaps with the GDPR’s concept of “further processing,” the DGA adds structural safeguards: reuse is only allowed if data are anonymized or aggregated,xliv with re-identification strictly prohibited.xlv Public sector bodies may allow access under transparent, proportionate, and non-discriminatory conditions,xlvi including the possibility of charging fees for access.xlvii
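As an illustration of the kind of safeguard the DGA envisages, the following Python sketch aggregates record-level data and suppresses small cells before release. The dataset, the threshold, and the field choices are hypothetical assumptions for illustration, not requirements drawn from the Regulation itself.

```python
# Minimal sketch (illustrative assumptions, not prescribed by the DGA):
# aggregate record-level data and withhold small cells before release,
# so that no published figure relates to only a handful of individuals.
from collections import Counter

# Hypothetical record-level data: (region, diagnosis_code) pairs.
records = [
    ("North", "E11"), ("North", "E11"), ("North", "I10"),
    ("South", "E11"), ("South", "I10"), ("South", "I10"), ("South", "I10"),
]

THRESHOLD = 3  # illustrative suppression threshold

counts = Counter(records)
released = {
    key: count for key, count in counts.items()
    if count >= THRESHOLD  # small cells are suppressed, not published
}

print(released)  # {('South', 'I10'): 3}
```

Aggregation of this kind reduces, but does not eliminate, re-identification risk, which is one reason the DGA pairs it with an outright prohibition on re-identification.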
Expanding upon the foundations laid by GDPR and DGA, the EHDSR introduces a tailored regulatory structure for health data governance, facilitating both primary uses in clinical care and secondary uses such as research and public health policymaking.xlviii It formalizes the concept of data altruism within the healthcare context, enabling individuals to voluntarily contribute their health data for initiatives serving the public good.xlix In doing so, the EHDSR gives practical effect to the DGA’s normative aspirations while remaining consistent with the GDPR’s commitment to fundamental rights and data protection.
The EHDSR reflects DGA’s data altruism model by adopting language such as “common standards” and “common data spaces,” echoing the discourse of data commons.l However, it diverges from broader knowledge commons models in healthcare, which govern
intangible resources like scientific data and ideas as cumulative public goods.li Knowledge commons emphasize both the creation and reuse of resources for long-term societal benefit.lii Unlike material commons with natural boundaries, knowledge commons are legally constructed, often shaped by intellectual property regimes.liii This marks a key distinction in scope, governance, and intent between the two approaches.
The EHDSR distinguishes primary use (diagnostics, treatment, prescriptions) from secondary use (research, innovation, policymaking).liv Article 34 authorizes access to health data for secondary use under strict ethical and confidentiality conditions, including access to data protected by intellectual property or trade secrets where justified by the public interest.lv
The regulation enhances individual rights in the healthcare context.lvi Individuals can access their digital health records,lvii request corrections, restrict access,lviii and authorize third-party providers.lix These rights, rooted in the GDPR, are tailored to the needs of digital patient autonomy.lx
To ensure data portability, the EHDSR mandates interoperability across Member States through platforms like MyHealth@EU, requiring that specific categories of data be structured and exchangeable.lxi However, this requirement excludes paper records, non-standard clinical notes, and non-interoperable systems unless otherwise specified.lxii It extends to wellness app and wearable device data under safeguards, capturing emerging digital health domains.lxiii
However, the EHDSR includes jurisdictional limitations, excluding personal data processing for law enforcement or national security purposes, as well as processing falling outside the scope of Union law, consistent with broader EU data protection principles.lxiv
To manage secondary use, the EHDSR establishes national health data access bodies responsible for evaluating requests from public and private entities (excluding individual researchers and microenterprises).lxv Access is granted only for socially beneficial purposes such as public health or drug development, while commercial uses such as insurance profiling are explicitly prohibited.lxvi Results from secondary use must be published within 18 months to ensure transparency.lxvii The EHDSR also includes opt-out mechanisms, allowing individuals to opt out of secondary data use without providing justification. While opt-out from primary use is theoretically possible,lxviii it may be limited by overriding public interest.lxix
Lastly, the EHDSR emphasizes semantic and technical interoperability to ensure consistent use and interpretation of health data. It includes provisions for inferred data, such as AI-generated diagnoses, and enforces data minimization.lxx Re-identification risks remain, particularly for wellness data and algorithmic outputs. Regulatory authorities such as the EDPB and EDPS have flagged these concerns, urging continuous oversight and safeguards.lxxi
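A minimal sketch of the data minimization principle referenced above is given below. The purpose label, field names, and permitted-field mapping are illustrative assumptions rather than anything specified in the EHDSR.

```python
# Minimal sketch (hypothetical field names and purpose mapping): releasing only
# the attributes needed for a stated research purpose, i.e. data minimization.
from typing import Dict

# Hypothetical mapping of research purposes to permitted fields.
PERMITTED_FIELDS = {
    "diabetes_prevalence_study": ["year_of_birth", "region", "diagnosis_codes"],
}

def minimise(record: Dict, purpose: str) -> Dict:
    """Return only the attributes permitted for the given purpose."""
    allowed = PERMITTED_FIELDS.get(purpose, [])
    return {field: record[field] for field in allowed if field in record}

record = {
    "patient_id": "12345",          # direct identifier, never released
    "year_of_birth": 1970,
    "region": "North",
    "diagnosis_codes": ["E11"],
    "free_text_notes": "...",       # not needed for the stated purpose
}

print(minimise(record, "diabetes_prevalence_study"))
# {'year_of_birth': 1970, 'region': 'North', 'diagnosis_codes': ['E11']}
```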
Together, the GDPR, DGA, and EHDSR constitute a layered and complementary EU data governance framework. GDPR establishes the core rights and principles; DGA introduces regulated sharing mechanisms like data altruism; and EHDSR applies these mechanisms in healthcare, enabling ethical, secure, and socially beneficial reuse. Collectively, they balance innovation and individual rights, ensuring that the EU’s data-driven future remains anchored in trust and fundamental values.
4. Conditions for the Secondary Use of Non-Personal Data in the Health Sector
AI applications in healthcare critically depend on access to large, heterogeneous datasets. This reliance raises concerns about privacy, data breaches, and potential misuse of sensitive medical information.lxxii A foundational condition for the lawful secondary use of health-related data, particularly non-personal data, is the implementation of robust technical and organisational safeguards, most notably through anonymisation and pseudonymisation.
Anonymisation serves as a primary means of protecting confidentiality and safeguarding individual privacy. The OECD (2015) defines de-identified data as data that cannot directly identify an individual.lxxiii While this reduces privacy risks, the potential for reidentification remains, particularly in high-dimensional datasets such as genomic sequences or biobank records. The distinction between direct and indirect identifiers, and the increasing sophistication of data linkage techniques, mean that even anonymised data may, in certain contexts, become re-identifiable.
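The linkage risk described here can be made concrete with a simple k-anonymity check over quasi-identifiers, sketched below in Python. The attribute names and the interpretation of the result are assumptions for illustration, not a legal or technical standard endorsed by the instruments discussed.

```python
# Minimal sketch (illustrative, not a legal standard): measuring k-anonymity over
# quasi-identifiers to gauge linkage-based re-identification risk after direct
# identifiers have been removed. Field names are assumptions.
from collections import Counter
from typing import Dict, List

QUASI_IDENTIFIERS = ["year_of_birth", "postcode_prefix", "sex"]

def min_group_size(records: List[Dict], quasi_identifiers: List[str]) -> int:
    """Smallest equivalence class over the quasi-identifiers (the 'k')."""
    groups = Counter(
        tuple(record[field] for field in quasi_identifiers) for record in records
    )
    return min(groups.values())

records = [
    {"year_of_birth": 1970, "postcode_prefix": "10", "sex": "F"},
    {"year_of_birth": 1970, "postcode_prefix": "10", "sex": "F"},
    {"year_of_birth": 1982, "postcode_prefix": "22", "sex": "M"},  # unique combination
]

print(min_group_size(records, QUASI_IDENTIFIERS))  # 1
```

A minimum group size of 1 signals that at least one record is unique on those attributes and could, in principle, be re-identified by linkage with auxiliary data, which is precisely the concern raised about high-dimensional health datasets.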
Confidentiality is a duty, often owed by a professional to an individual in specific contexts, while privacy is a broader personal right.lxxiv In sectors such as law and medicine,
individuals routinely disclose sensitive information with the expectation that professionals will uphold a legal and ethical duty of confidentiality. This duty, enshrined in instruments like the Hippocratic Oathlxxv and the Geneva Declaration, ceases to apply once data have been anonymised. Notably, the Geneva Declaration also includes a duty to “share my medical knowledge for the benefit of the patient and the advancement of healthcare,” underscoring a broader public interest dimension.lxxvi
The CJEU, in Norra Stockholm Bygg AB, underscored the difficulty in distinguishing personal from non-personal data in healthcare, particularly where anonymised datasets may still permit identification when cross-referenced with auxiliary information.lxxvii In response, the EHDSR mandates that Health Data Access Bodies (HDABs) grant access only to the data necessary for a specific research purpose,lxxviii and only within Secure Processing Environments (SPEs).lxxix As defined under the DGA, SPEs are technical and organisational infrastructures designed to ensure lawful, secure, and confidential data processing.lxxx
Though the GDPR primarily governs personal data, its principles guide the use of non-personal data where there is a realistic risk of reidentification.lxxxi Article 89(1) GDPR, read in conjunction with Article 9(2)(j), permits processing of special categories of personal data for scientific research purposes, contingent upon the implementation of “appropriate safeguards.” The CJEU reaffirmed in VB v. Natsionalna agentsia za prihodite that these safeguards must be continuously updated to reflect the evolving risk landscape.lxxxii Yet the term “appropriate safeguards” remains undefined in EU law, resulting in interpretive variation across Member States.
The European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) have proposed non-binding safeguard measures, including pseudonymisation, data protection impact assessments (DPIAs), and secure environments.lxxxiii These safeguards, together with advanced encryption protocols and the data minimisation principle, aim to protect the rights and freedoms of data subjects. However, scholars increasingly advocate for a broader, more integrated view of safeguards, one that includes transparency, independent oversight, public engagement, ethical governance mechanisms, and informed consent.lxxxiv
While consent is key to respecting autonomy and transparency, it is often impractical for retrospective health data due to time, resource, or ethical constraints. Legal frameworks therefore allow secondary use based on public interest or research purposes, provided adequate safeguards are in place. These ensure lawful and responsible data processing. Whether data is collected via consent or opt-out, individuals delegate stewardship to Data Altruism Organisations (DAOs), which hold both rights and responsibilities.lxxxv
The EHDSR builds upon these foundations by introducing sector-specific mechanisms tailored to the health domain. It mandates both “appropriate” and “sufficient” safeguards, underscoring the legitimacy of secondary data use when conducted under conditions that respect individual rights.lxxxvi The EHDSR adopts a decentralised model, allowing data to remain with their original holders and facilitating access through nationally designated HDABs. Data processing must occur within SPEs that uphold core principles such as purpose limitation, data minimisation, and secure access.lxxxvii
The EHDSR applies to both personal and non-personal electronic health data, requiring contextual assessment of reidentification risks, even for anonymised datasets. Recital 93 distinguishes this from the Cyber Resilience Act,lxxxviii embedding cybersecurity directly into health data systems such as electronic health records. The DGA supports this by banning reidentification and requiring breach notifications.lxxxix When personal and non-personal data are combined, the dataset is treated as personal, underscoring the need for early anonymisation or pseudonymisation, an essential safeguard involving the separation of identifiable information.
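One common way to realise the “separation of identifiable information” mentioned above is keyed pseudonymisation, sketched below. This is a generic technique offered as an assumption for illustration, not a method prescribed by the EHDSR or the DGA, and the key-management arrangement shown is purely hypothetical.

```python
# Minimal sketch (assumed technique, not the Regulation's prescribed method):
# keyed pseudonymisation with the secret key held separately from the research
# dataset, illustrating separation of identifiable information.
import hmac
import hashlib

# In practice the key would be held by the data holder or HDAB in a separate,
# access-controlled system, never alongside the pseudonymised records.
PSEUDONYMISATION_KEY = b"replace-with-a-secret-managed-out-of-band"

def pseudonymise(patient_id: str) -> str:
    """Derive a stable pseudonym; linking back requires access to the separate key."""
    return hmac.new(PSEUDONYMISATION_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

clinical_record = {"patient_id": "NHS-12345", "diagnosis": "E11", "year_of_birth": 1970}

research_record = {
    "pseudonym": pseudonymise(clinical_record["patient_id"]),
    "diagnosis": clinical_record["diagnosis"],
    "year_of_birth": clinical_record["year_of_birth"],
}
print(research_record["pseudonym"][:16], "...")
```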
Nonetheless, anonymisation is often difficult to achieve in practice, especially in the context of granular health data.xc The trade-off between data utility and privacy protection necessitates careful calibration within SPEs, supported by rigorous risk assessments and multi-layered safeguards.xci However, the absence of harmonised EU-wide standards for SPE certification and inconsistent national adoption of Trusted Research Environments (TREs) impede cross-border research and regulatory coherence. Responsibility for implementing safeguards also remains ambiguously distributed. The EHDSR permits either HDABs or data holders (or their processors) to anonymise or pseudonymise data,xcii although preference is given to data holders due to their contextual familiarity.xciii This pragmatic approach seeks to leverage domain expertise while ensuring legal compliance.
The EHDSR also intersects with other regulatory domains, including intellectual property and trade secrets.xciv Data holders must disclose any proprietary protections to HDABs, which are tasked with ensuring proportional safeguards. However, concerns remain that insufficient procedural guarantees, such as avenues for redress or contractual protections, may discourage data sharing and innovation.
Finally, conceptual ambiguities persist. The EHDSR does not clarify whether anonymised data must be anonymised ex post or whether data collected anonymously qualify. Nor does it specify whether anonymity must be absolute or contextually assessed. The CJEU’s decision in Breyer v. Bundesrepublik Deutschland supports a contextual standard based on the reasonable likelihood of identification. If anonymised data prove insufficient for research, the DGA allows for recontacting individuals to obtain consent, thereby preserving both research viability and individual autonomy.
5. Fragmentation and Legal Ambiguity in the Governance of Secondary Health Data
The interaction among the three frameworks reflects the EU’s goal of a harmonised digital health data space. However, differing scopes and logics (the GDPR’s rights-based focus versus the DGA’s and EHDSR’s emphasis on reuse) create tensions, especially when data transitions between personal and non-personal status under overlapping obligations.
The EHDSR, grounded in both Articles 114 and 16 TFEU, attempts to reconcile market integration and data protection. It extends GDPR-style requirements, such as data minimisation and secure processing, to all electronic health data, regardless of identifiability.xcv This effectively imposes data protection duties on non-personal data, even though the GDPR explicitly excludes such data. As a result, data controllers and access bodies must navigate a blurred regulatory landscape, where compliance obligations are unclear and not easily traced to a single legal source.
This ambiguity is exacerbated by the emergence of powerful digital intermediaries. These platform-based actors play a crucial “matching” role, connecting developers, healthcare
providers, and end-users through the continuous analysis of vast datasets. A prominent example is Apple’s App Store, which integrates health data from devices via HealthKit and controls access to thousands of third-party health applications. Apple’s role in orchestrating this ecosystem exemplifies the architectural power of digital intermediaries, entities that not only structure access to markets but also set quasi-regulatory conditions for data use and monetisation.xcvi
The Apple App Store casexcvii before the European Commission reveals deeper structural issues about such power in the context of EU competition law. Originally targeting Apple’s mandatory use of its in-app purchase system, the case was narrowed to focus on the anti-steering provisions imposed on app developers. These rules prevented companies like Spotify from informing users on iOS devices about alternative, potentially cheaper subscription options outside the App Store. Framed under Article 102 TFEU as exploitative unfair trading conditions, the case marked a shift from exclusionary conduct to the exploitation of end-users through architecture-driven market control.
The European Commission characterised the App Store as a multi-sided platform with developers on one side and consumers on the other. Apple’s anti-steering rules were deemed detrimental to end-users, who faced inflated prices and restricted choices. Moreover, these rules were not necessary or proportionate to achieving a legitimate objective. Notably, although the case arose from a platform-to-business (P2B) context, the Commission’s analysis centred on platform-to-consumer (P2C) harm. It emphasised how Apple’s unilateral rule-making powers, which could be changed at will, constituted a form of quasi-regulatory authority, a concern echoed in the CJEU’s Superleague ruling,xcviii albeit from a procedural fairness rather than substantive competition perspective.
These findings resonate with broader concerns about the governance of digital health ecosystems. When intermediaries such as Apple act as both gatekeepers and infrastructure providers, their private rule-making significantly impacts data access, interoperability, and fairness. Yet DGA and EHDSR do not adequately address these quasi-regulatory dynamics. They fail to impose specific obligations on ecosystem orchestrators regarding transparency, proportionality, or fairness in access conditions. This omission risks
embedding asymmetries of power into the EU’s digital health architecture, potentially stifling innovation, competition, and data-driven research.
Moreover, legal uncertainty is further deepened by inconsistencies in key concepts such as anonymisation.xcix The GDPR adopts a flexible, context-dependent definition, affirmed in Breyer v. Bundesrepublik Deutschland, which considers the means reasonably available for reidentification.c However, EHDSR and DGA reference anonymisation without aligning clearly with this jurisprudence.ci Health data, especially when generated by mobile and wearable devices, often includes high-dimensional or biometric information, making it particularly vulnerable to reidentification through AI-driven analytics.cii
A similar lack of coherence affects the articulation of safeguards. Article 89(1) GDPR mandates safeguards in research involving personal data, but leaves the concept vague. The EHDSR echoes this requirement without elaboration, while the DGA introduces secure processing environments (SPEs) without integrating them into a broader evaluative framework. As a result, a patchwork of national or sectoral standards governs what constitutes adequate protection, hindering cross-border data use.
Moreover, the procedural architecture of these frameworks lacks a formal conflict resolution mechanism. Despite GDPR’s primacy in matters of fundamental rights, neither the DGA nor the EHDSR provides clear guidance for reconciling conflicting provisions.ciii In the absence of judicial or legislative clarification, stakeholders must rely on fragmented, risk-based interpretations that may diverge across jurisdictions.
Finally, the lack of harmonised certification schemes for key infrastructures such as Trusted Research Environments or SPEs exacerbates fragmentation. While both the DGA and EHDSR acknowledge their importance, no EU-wide technical or governance standards currently exist.civ This opens the door to national divergence, impeding interoperability and undermining the scalability of the European Health Data Space.
6. Conclusion
While the GDPR, DGA, and EHDSR each bring essential normative contributions, respectively grounding data protection, facilitating trust-based data sharing, and enabling health-specific data reuse, their combined framework remains fraught with fragmentation and legal ambiguity. Despite the EU’s commitment to balancing innovation with individual rights, unclear standards on anonymisation, safeguards, and procedural requirements often result in overly cautious interpretations that stifle legitimate data use.cv This is particularly detrimental to under-resourced actors like individual researchers and microenterprises, who face disproportionate compliance burdens.
Moreover, uneven implementation of secure processing environments and exclusions in access eligibility further undermine the vision of a cohesive European Health Data Space. Realizing this vision will require not only legislative coherence but also practical reforms: harmonized certification schemes, consistent interpretative guidance, and streamlined legal pathways for research. Without these structural improvements, the EU’s ambition for a balanced, interoperable, and innovation-friendly health data governance regime will remain aspirational rather than actionable.
Bibliography
i Lianos, I. (2024, August). Access to health data: Competition and regulatory alternatives – Three dimensions of fairness (CLES Policy Paper Series). Centre for Law, Economics and Society, Faculty of Laws, University College London. https://www.ucl.ac.uk/cles/research-papers
ii Talagala, N. (2022, March 4). Data as the new oil is not enough: Four principles for avoiding data fires. Forbes. https://www.forbes.com/sites/nishatalagala/2022/03/02/data-as-the-new-oil-is-not-enough-four-principles-for-avoiding-data-fires/
iii Lessig, L. (1999). Code and other laws of cyberspace. New York, NY: Basic Books
iv European Commission. (2020). A European strategy for data (COM/2020/66 final). https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52020DC0066
v Savage, L., Gaynor, M., & Adler-Milstein, J. (2019). Digital health data and information sharing: A new frontier for healthcare competition. Antitrust Law Journal, 82(2), 593–626.
vi Lianos, I. (2024, August). ibid
vii Schneider, G. (2022). Health data pools under European data protection and competition law: Health as a digital business. Springer.
viii Okunleye, O. J. (2024). The role of open data in driving sectoral innovation and global economic development. Journal of Engineering Research and Reports, 26(7), 122–134. https://doi.org/10.9734/jerr/2024/v26i71205
ix Pfenninger, S., DeCarolis, J., Hirth, L., Quoilin, S., & Staffell, I. (2017). The importance of open data and software: Is energy research lagging behind? Energy Policy, 101, 211–215. https://doi.org/10.1016/j.enpol.2016.11.046
x Purtova, N. (2018). The law of everything. Broad concept of personal data and future of EU data protection law. Law, Innovation and Technology, 10(1), 40–81. https://doi.org/10.1080/17579961.2018.1452176
xi Didier, A., Nathaniel, A., Scott, H., Look, S., Benaroyo, L., & Zumstein-Shaha, M. (2023). Protecting personhood: A classic grounded theory. Qualitative Health Research, 33(13), 1177–1189. https://doi.org/10.1177/10497323231188491
xii Article 8(1) GDPR
xiii Article 23 DGA
xiv Article 11 (3) EHDSR
xv European Court of Human Rights. (2008). I. v. Finland, no. 20511/03, ECHR 2008.
xvi Court of Justice of the European Union. (2014). Digital Rights Ireland Ltd v. Minister for Communications, Marine and Natural Resources and Others, Joined Cases C-293/12 and C-594/12, ECLI:EU:C:2014:238.
xvii Craig, P., & De Búrca, G. (2021). EU law: Text, cases, and materials (7th ed.). Oxford University Press.
xviii Barnard, C. (2019). The substantive law of the EU: The four freedoms (6th ed.). Oxford University Press.
xix Verfassungsblog. (2023). Why the words “but” and “however” determine the EMFA’s legal basis. https://verfassungsblog.de/why-the-words-but-and-however-determine-the-emfas-legal-basis/
xx C-376/98
xxi Case: C-268/21 – Norra Stockholm Bygg AB v Per Nycander AB
xxii Lenaerts, K. (2022). EU constitutional law. Oxford University Press.
xxiii Purtova, N. (2018). The law of everything
xxiv Article 2 GDPR
xxv Article 6 GDPR
xxvi Article 9(2) GDPR
xxvii Article 89 GDPR
xxviii Article 15 GDPR
xxix Article 16 GDPR
xxx Article 18 GDPR
xxxi Article 21 GDPR
xxxii Article 17 GDPR
xxxiii Article 35 GDPR
xxxiv Article 37 GDPR
xxxv Article 32 GDPR
xxxvi El Mestari, S. Z., Doğan, F. S., & Botes, W. M. (2024). Technical and legal aspects relating to the (re)use of health data obtained from repurposing machine learning models in the EU. University of Luxembourg and Jagiellonian University.
xxxvii Chassang, G., & Feriol, L. (2024). Data altruism, personal health data and the consent challenge in scientific research: A difficult interplay between EU acts. European Data Protection Law Review, 10(1), 57–68. https://doi.org/10.21552/edpl/2024/1/9
xxxviii Article 9 (4) GDPR
xxxix Article 23 DGA
xl Articles 24 and 25 DGA
xli Recital 6 DGA
xlii Article 2 DGA
xliii Article 2(2) DGA
xliv Article 6(4) and Recital 50 GDPR.
xlv Article 5(3), DGA
xlvi Article 3(5) DGA
xlvii Article 6 DGA.
xlviii Article 16 DGA
xlix Article 11(3) EHDSR
l Article 2(16) DGA
li Terzis, P., & Santamaria Echeverria, (E.) O. (2021). Interoperability and governance in the European Health Data Space regulation. University College London & Erasmus School of Law.
lii Strandburg, K., Frischmann, B., & Madison, M. (Eds.). (2017). The knowledge commons framework. In K. Strandburg, B. Frischmann, & M. Madison (Eds.), Governing medical knowledge commons (pp. [page range]). Cambridge University Press.
liii Hesse, C., & Ostrom, E. (2005). Introduction: An overview of the knowledge commons. In C. Hesse & E. Ostrom (Eds.), Understanding knowledge as a commons: From theory to practice (pp. 7–8). MIT Press.
liv Article 33(4)
lv Article 52 EHDSR
lvi Article 4 EHDSR
lvii Article 5 EHDSR
lviii Article 6 EHDSR
lix Article 7 EHDSR
lx Article 10 EHDSR
lxi Article 23 EHDSR
lxii Following from Article 14, Annex I EHDSR
lxiii Chapter III, Article 35 EHDSR
lxiv Article 1 (9)(a)(b)
lxv Article 87 EHDSR
lxvi Article 54 EHDSR
lxvii Article 61(4) EHDSR
lxviii Article 10 EHDSR
lxix Article 53 EHDSR
lxx Article 66 EHDSR
lxxi European Data Protection Board (EDPB) & European Data Protection Supervisor (EDPS). (2021). Joint Opinion 03/2021 on the proposal for a regulation of the European Parliament and of the Council on European data governance (Data Governance Act) (paras. 13, 27, 28). https://edpb.europa.eu/our-work-tools/our-documents/edpbedps-joint-opinion/edpb-edps-joint-opinion-032021-proposal_en
lxxii Rasool, S., Ali, M., Shahroz, H. M., Hussain, H. K., & Gill, A. Y. (2024). Innovations in AI-powered healthcare: Transforming cancer treatment with innovative methods. BULLET: Jurnal Multidisiplin Ilmu, 3(1), 118–128. https://journal.mediapublikasi.id/index.php/bullet/article/view/4094
lxxiii Organisation for Economic Co-operation and Development (OECD). (2015). OECD recommendations on health data governance. https://www.oecd.org/health/health-data-governance.htm
lxxiv Rumbold, J., & Pierscionek, B. (2018). Contextual anonymization for secondary use of big data in biomedical research: Proposal for an anonymization matrix. JMIR Medical Informatics, 6(4), e47. https://doi.org/10.2196/medinform.7096
lxxv Higgins, G. L. (1989). The history of confidentiality in medicine: The physician-patient relationship. Canadian Family Physician, 35, 921–926. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2281020/
lxxvi Parsa-Parsi, R. W. (2017). The revised Declaration of Geneva: A modern-day physician’s pledge. JAMA, 318(20), 1971–1972. https://doi.org/10.1001/jama.2017.16230
lxxvii Ohm, P. (2009). Broken promises of privacy: Responding to the surprising failure of anonymization. UCLA Law Review, 57, 1701–1778.
lxxviii Article 53(1) EHDSR
lxxix Lippert, C., Sabatini, R., Maher, M. C., Kang, E. Y., Lee, S., Arikan, O., … & Venter, J. C. (2017). Identification of individuals by trait prediction using whole-genome sequencing data. Proceedings of the National Academy of Sciences, 114(38), 10166–10171. https://doi.org/10.1073/pnas.1711125114
lxxx Article 2(20) DGA
lxxxi Chassang, G., & Feriol, L. (2024). Supra.
lxxxii Case C-340/21, VB v. Natsionalna agentsia za prihodite (NAP)
lxxxiii Article 29 Data Protection Working Party. (2017). Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is “likely to result in a high risk” for the purposes of Regulation 2016/679 (WP 248 rev.01, pp. 9–11). Endorsed by the European Data Protection Board (EDPB).
lxxxiv Chassang, G., & Feriol, L. (2024). supra
lxxxv Kondylakis, H., Kalokyri, V., Sfakianakis, S., Marias, K., Tsiknakis, M., Jimenez-Pastor, A., Camacho-Ramos, E., Blanquer, I., Segrelles, J. D., López-Huguet, S., Barelle, C., Kogut-Czarkowska, M., Tsakou, G., Siopis, N., Sakellariou, Z., Bizopoulos, P., Drossou, V., Lalas, A., Votis, K., … Lekadir, K. (2023). Data infrastructures for AI in medical imaging: A report on the experiences of five EU projects. European Radiology Experimental, 7, Article 45. https://doi.org/10.1186/s41747-023-00336-x
lxxxvi Article 58 and Recital 4 EHDSR
lxxxvii Article 66 EHDSR
lxxxviii European Union. (2024). Regulation (EU) 2024/2847 of the European Parliament and of the Council of 23 October 2024 on horizontal cybersecurity requirements for products with digital elements and amending Regulations (EU) No 168/2013 and (EU) No 2019/1020 and Directive (EU) 2020/1828 (Cyber Resilience Act). Official Journal of the European Union, L 2847, 20 November 2024.
lxxxix Recital 15 DGA
xc Scheibner, J., Ienca, M., Kechagia, S., Troncoso-Pastoriza, J. R., Raisaro, J. L., Hubaux, J.-P., Fellay, J., & Vayena, E. (2020). Data protection and ethics requirements for multisite research with health data: A comparative examination of legislative governance frameworks and the role of data protection technologies. Journal of Law and the Biosciences, 7(1), lsaa010. https://doi.org/10.1093/jlb/lsaa010
xci Chassang, G., & Feriol, L. (2024). supra
xcii Article 55 EHDSR
xciii Chassang, G., & Feriol, L. (2024). ibid
xciv Article 52 EHDSR
xcv Article 54, 66 and Recital 93 EHDSR
xcvi Lianos, I. (2024, August). Access to health data: supra
xcvii European Commission. (2024). CASE AT.40437 – Apple – App Store practices (music streaming). https://competition-cases.ec.europa.eu/cases/at40437
xcviii Court of Justice of the European Union. (2023). Case C-333/21, European Superleague Company v. Union of European Football Associations (UEFA), ECLI:EU:C:2023:1011, paras. 135–138.
xcix Recital 26 GDPR
c Court of Justice of the European Union. (2016). Patrick Breyer v Bundesrepublik Deutschland, C-582/14, ECLI:EU:C:2016:779. https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A62014CJ0582
ci Porsdam Mann, S., Savulescu, J., & Sahakian, B. J. (2021). Facilitating health data sharing: Reconciling individual rights and public benefit. PLOS Medicine, 18(4), e1003592.
cii Purtova, N. (2018). The law of everything. Broad concept of personal data and future of EU data protection law. Law, Innovation and Technology, 10(1), 40–81. https://doi.org/10.1080/17579961.2018.1452176
ciii de Hert, P., & Papakonstantinou, V. (2012). The proposed data protection regulation replacing Directive 95/46/EC: A sound system for the protection of individuals. Computer Law & Security Review, 28(2), 130–142.
civ Bourgeois, F. T., Murthy, S., Mandl, K. D., & de Moor, G. (2023). Trusted research environments: A pathway to secure and trustworthy cross-border data sharing in Europe. Health Policy, 127(3), 345–352.
cv El Mestari, S. Z., Doğan, F. S., & Botes, W. M. (2024). Supra