When the GDPR began to apply in 2018, many commentators treated the Data Protection Impact Assessment (DPIA) as a novel invention. In truth, DPIAs are not the immaculate offspring of Brussels policymakers. They are the messy product of decades of experimentation, scandal, and philosophical borrowing: from the environmental movement, from early privacy law, and from the push for accountability in a data-driven world.
Understanding this genealogy matters. It shows that DPIAs are more than compliance paperwork: they are a compass for navigating the risks of innovation. And their story is not one of pristine origin but of emergence, shaped by the clash of principles, politics, and power that gave them form.
Early forces: Principles without a process (1960s–1980s)
The Fair Information Practices (FIPs), which grew out of late-1960s privacy debates and were codified in the early 1970s, and the OECD Privacy Guidelines (1980) planted the first seeds. They emphasised accountability, proportionality, and purpose limitation. Yet they lacked teeth: there was no method for systematically testing whether organisations actually lived up to these ideals.
In this period, privacy was still largely reactive. Breaches were punished, but proactive evaluation was rare. The concept of an “impact assessment” had not yet crossed from environmental law into the privacy domain.
Borrowing a framework: Environmental law meets privacy (1990s)
The intellectual breakthrough came from borrowing the impact assessment model from environmental regulation. Just as property developers were forced to prove that their projects would not poison rivers or destroy ecosystems, organisations would now be asked to prove that their data projects would not irreparably damage privacy.
This analogy was transformative: it shifted privacy from a remedial lens (fix harm later) to a preventative lens (anticipate and mitigate harm before launch).
Privacy by Design and the rise of PIAs (1990s–2000s)
Dr Ann Cavoukian’s Privacy by Design (PbD), developed in the mid-1990s, codified this proactive ethos. PbD’s first principle, “proactive not reactive; preventative not remedial”, found practical expression in the Privacy Impact Assessment (PIA).
Governments became early adopters:
- New Zealand (1993) required PIAs for government data matching.
- U.S. E-Government Act (2002) mandated PIAs for federal IT systems.
- Canada (2002) and Australia (2006) issued PIA frameworks.
- UK ICO (2007) released one of the first PIA handbooks.
These early PIAs were a response to growing data scandals. As databases expanded and leaks multiplied, regulators sought tools to restore public trust.
The European turning point: From prior checking to DPIAs
The EU Data Protection Directive 95/46/EC introduced the idea of “prior checking”: regulators would vet risky processing before it began. The mechanism was imperfect, burying supervisory authorities in paperwork, but it established the principle that some processing deserves special scrutiny in advance.
By the late 2000s, international resolutions (e.g. the 2009 Madrid Resolution) and EU-funded studies like the PIAF Project (2011) called for stronger, standardised PIA requirements. Scholars such as Roger Clarke and David Wright pressed for making PIAs mandatory.
The stage was set for a regulatory shift: from regulators doing the checking to organisations taking on that responsibility themselves.
Birth of the DPIA: GDPR (2016/2018)
Article 35 of the GDPR formalised the DPIA. Three features marked the transition from PIA to DPIA:
- Terminology shift: from “privacy” to “data protection,” reflecting the EU’s rights-based framing.
- Risk-based trigger: DPIAs are mandatory when processing is “likely to result in a high risk” to individuals’ rights and freedoms.
- Accountability pivot: the duty lies with the controller to self-assess, mitigate, and document risk—consulting regulators only if high residual risks remain.
This was more than a procedural tweak. It represented a philosophical pivot: organisations could no longer hide behind paperwork filed with authorities. They had to internalise risk assessment as part of their governance.
Global spread and mutation: POPIA, CCPA, and beyond
- South Africa’s POPIA (2013, effective 2021) requires Personal Information Impact Assessments (PIIAs), and arguably casts the net more broadly than the GDPR: every responsible party must conduct them, not only those running high-risk projects. POPIA also introduced “prior authorisation” for certain activities, such as credit reporting, a stricter gatekeeping model.
- California’s CPRA (2020) amended the CCPA to require risk assessments, and independent cybersecurity audits, for processing that presents significant risk to consumers’ privacy or security. The state’s separate Age-Appropriate Design Code (2022) even uses the term “Data Protection Impact Assessment” for services likely to be accessed by children.
- Brazil’s LGPD (2018), China’s PIPL (2021), and numerous U.S. state laws have adopted similar assessment obligations. ISO’s 29134 standard (2017/2023) has globalised the methodology, while the UN Special Rapporteur on the right to privacy has called for PIAs for surveillance technologies.
Each regime reflects a local philosophy: GDPR’s rights-based accountability, POPIA’s paternalistic authorisation, CPRA’s consumer-market pragmatism.
What the genealogy reveals
The genealogy of DPIAs reveals messy emergence, not pure origin.
- From FIPs and OECD came proportionality and accountability.
- From environmental law came the very idea of “impact assessment”.
- From Privacy by Design came the ethos of proactivity.
- From Directive 95/46/EC came the recognition of high-risk processing.
DPIAs thus embody decades of rupture and convergence. They represent a regulatory philosophy that forces organisations to think before they act, a philosophy now spreading worldwide.
DPIAs as a compass, not a checkbox
Seen through their genealogy, DPIAs are not dull paperwork but strategic tools for trust. They emerged through conflict and borrowing, not immaculate design. Their messy ancestry is their strength: they blend human rights law, risk management, and organisational accountability into a single, practical process.
As AI, biometrics, and algorithmic profiling intensify, DPIAs stand as our best available compass. Not perfect, not pure, but indispensable.
The future of responsible innovation will be written by those who understand that DPIAs are not boxes to be ticked, but legacies of a long struggle to balance progress with protection.