Article 25 of the GDPR requires data controllers to adopt technical and organisational measures that implement the principles of data protection by design and by default. This paper explores the meaning of Article 25 with reference to privacy engineering. The privacy engineering community distinguishes between hard privacy and soft privacy. Hard privacy seeks to avoid, as far as possible, placing any trust in third parties; hard privacy engineering therefore seeks to minimize any disclosure of personal data to controllers and relies on cryptographic protocols to ensure this outcome. Soft privacy assumes that data subjects will lose some control over their data and must place some trust in controllers; soft privacy engineering therefore builds tools that help users make good decisions about data sharing while satisfying informed-consent requirements (e.g., policy preference languages such as P3P, cookie managers, privacy dashboards, and ad icons). Which type of privacy engineering does Article 25 require? Does Article 25 even permit an answer to this question?
This paper argues that Article 25 offers very little guidance about privacy engineering in practice. One reason for this shortcoming is a broader tension in the GDPR between principles that seek to uphold the rights of data subjects (and therefore require data controllers to implement soft privacy measures) and principles like data minimisation that attempt to eliminate or reduce data sharing (and therefore require hard privacy measures). The other reasons lie in Article 25 itself, which is deficient in several respects. First, Article 25 suffers from several drafting flaws: it is written in vague and abstract language and offers very few examples of suggested technical and organisational measures other than pseudonymisation, which is a poor choice; its scope is uncertain and it overlaps with numerous "accountability" provisions in confusing ways; and, in the name of "technological neutrality," it avoids any mention of specific privacy engineering techniques, which only heightens the confusion over what it requires. Second, the mechanisms on which Article 25 relies to clarify how it works in practice (namely, delegated and implementing acts, certification, and codes of conduct) either have been removed from the GDPR or are ill-suited to the task. Finally, Article 25 is misaligned with privacy engineering methods and practices. This third objection is central to this paper, which uses case studies and a road map of privacy-enhancing technologies (PETs) and privacy engineering to analyze these shortcomings in depth. The paper concludes with a few modest suggestions about how regulators might better align Article 25 with privacy engineering in practice.
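Since pseudonymisation is the only technical measure Article 25 itself names, a concrete illustration may help. The following minimal Python sketch is not drawn from the paper; the key value and function name are purely illustrative. It shows one common implementation of pseudonymisation, replacing a direct identifier with a keyed hash, and hints at why the paper regards it as a poor exemplar: the mapping is reversible by anyone holding the key, so the data remain personal data under the GDPR and disclosure to the controller is barely reduced.

```python
import hmac
import hashlib

# Hypothetical key, for illustration only. Under Art. 4(5) GDPR, this
# "additional information" must be kept separately from the data set,
# subject to technical and organisational safeguards.
SECRET_KEY = b"replace-with-a-securely-generated-key"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA-256)."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# A record whose direct identifier is swapped for a pseudonym.
record = {"email": "alice@example.com", "purchase": "book"}
record["email"] = pseudonymise(record["email"])
print(record)
# Anyone holding SECRET_KEY can recompute the mapping, so the record
# is pseudonymous, not anonymous: it remains personal data.
```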
Ira Rubinstein is a Senior Fellow at the Information Law Institute at New York University School of Law, where he is also an Adjunct Professor.
Mr. Rubinstein is a Yale Law School graduate. His research interests include electronic surveillance law, big data, voters’ privacy, and privacy by design. He has lectured and published widely on these and other topics and testified before the U.S. Congress on several occasions.