Part 2 on PETs.

This article is the second part of, and continues, the one already published entitled “Privacy Enhancing Technologies (PETs): an ever-present category part 1.”

This contribution highlights some operational and practical aspects of PETs, describing what we believe are currently the leading solutions.

Classifying PETs.

In our previous article, we mentioned the report entitled “Emerging Privacy Enhancing Technologies - current regulatory and policy approaches”, published by the OECD.

That report acknowledges that there is no single, agreed definition of PET; the OECD document nevertheless proposes some working definitions, drawing on institutional reports and doctrinal contributions.

Moreover, given the considerable technological evolution of recent years, there are now numerous solutions that can be regarded as Privacy Enhancing Technologies.

This has led to attempts to classify PETs, especially according to the areas in which they are applied.

In reality, classifying PETs is not easy. On the one hand, the protection of personal data concerns every natural person, and it would be reductive to restrict the effects of algorithmic solutions to certain areas only. On the other hand, technological evolution keeps producing highly innovative solutions (e.g., Web3, blockchain) for which a system-by-system classification would be difficult to maintain.

Nevertheless, the OECD report mentioned above, in an attempt at classification, identifies four general categories of PETs, namely:

  1. Data obfuscation tools;
  2. Encrypted data processing tools;
  3. Federated and distributed analytics;
  4. Data accountability tools.

Concerning the first category, “Data obfuscation tools”, the OECD includes techniques such as anonymization, pseudonymization, differential privacy, synthetic data, and zero-knowledge proofs (ZKP).
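
To make the first category more concrete, below is a minimal, illustrative Python sketch of differential privacy: a counting query is answered only after adding Laplace noise calibrated to the query’s sensitivity and to a chosen privacy budget epsilon. The function names, the dataset, and the epsilon value are our own assumptions for the example and do not come from the OECD report.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Sample Laplace(0, scale) noise via inverse-CDF sampling.
    u = random.uniform(-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing one person's
    # record changes the true count by at most 1, so Laplace noise with
    # scale 1/epsilon gives epsilon-differential privacy for this query.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Illustrative use: release the number of people over 40 with epsilon = 0.5.
ages = [23, 35, 46, 52, 29, 61, 44]
print(dp_count(ages, lambda age: age > 40, epsilon=0.5))
```

Real deployments also track the cumulative budget spent across repeated queries; the sketch deliberately omits that accounting.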

The OECD likewise places federated learning and distributed analytics in the third category, “Federated and distributed analytics”.
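
As a rough illustration of the federated approach, the sketch below shows one round of federated averaging for a linear model: every client computes an update on data that never leaves it, and the coordinator only averages the returned weights. It is our own simplified example (plain FedAvg with equally weighted clients), not code taken from the OECD report.

```python
import numpy as np

def local_update(global_weights, X, y, lr=0.1):
    # One local gradient step for a linear model on a client's own data.
    # Only the updated weight vector leaves the client; the raw data stays put.
    preds = X @ global_weights
    grad = X.T @ (preds - y) / len(y)
    return global_weights - lr * grad

def federated_round(global_weights, clients):
    # The coordinator receives only weight vectors and averages them.
    updates = [local_update(global_weights, X, y) for X, y in clients]
    return np.mean(updates, axis=0)

# Illustrative use with two clients holding separate synthetic datasets.
rng = np.random.default_rng(0)
clients = [
    (rng.normal(size=(20, 3)), rng.normal(size=20)),
    (rng.normal(size=(30, 3)), rng.normal(size=30)),
]
weights = np.zeros(3)
for _ in range(10):
    weights = federated_round(weights, clients)
print(weights)
```

In practice, secure aggregation or differential privacy is often layered on top of the weight updates, since the updates themselves can still leak information about the local data.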

Staying with categorization, some authors [1] identify three general categories, with the corresponding classification, as follows:

  • Algorithmic PETs:
    1. Homomorphic encryption;
    2. Differential privacy;
    3. Zero-knowledge proofs;
  • Architectural PETs:
    1. Federated learning;
    2. Multi-party computation (a minimal sketch follows this list);
  • Augmentation PETs:
    1. Synthetic data;
    2. Digital twinning.
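
To illustrate the multi-party computation entry above, here is a toy Python sketch based on additive secret sharing: several inputs are summed without any single party ever seeing an individual value. The modulus, the function names, and the salary figures are arbitrary choices made for this example.

```python
import secrets

PRIME = 2**61 - 1  # public modulus used for the share arithmetic

def share(value: int, n_parties: int) -> list[int]:
    # Split a secret into n shares that look random on their own
    # but add up to the secret modulo PRIME.
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def secure_sum(inputs: list[int], n_parties: int = 3) -> int:
    # Each input owner distributes shares among the parties; every party
    # adds up the shares it holds, and only the overall total is rebuilt.
    held = [0] * n_parties
    for value in inputs:
        for i, s in enumerate(share(value, n_parties)):
            held[i] = (held[i] + s) % PRIME
    return sum(held) % PRIME

# Illustrative use: three salaries are summed without being revealed.
print(secure_sum([52_000, 61_000, 48_000]))  # prints 161000
```

Production-grade protocols also handle malicious parties, multiplication, and comparison; the sketch only covers the honest-but-curious summation case.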

Another classification attempt is the one carried out in 2017 by the Office of the Privacy Commissioner of Canada in Ottawa, with the document titled “Privacy Enhancing Technologies - A Review of Tools and Techniques.”

That paper describes a taxonomy of privacy enhancing technologies (PETs) that classifies them according to the functionality and capabilities they provide to the end user.

The Ottawa Office mentioned above chose this taxonomy because it allows tools and techniques to be categorized in a granular way; at the same time, as the document itself notes, it makes categorization complex, because some tools and techniques provide more than one functionality.

The functionalities referred to in that paper are:

  • Informed Consent;
  • Data Minimization;
  • Data Tracking;
  • Anonymity;
  • Control;
  • Negotiate Terms and Conditions;
  • Technical Enforcement;
  • Remote Audit of Enforcement;
  • Use of Legal Rights.

The OECD report cites that taxonomy in a footnote.

In light of the above, there is clearly no single, unambiguous way of categorizing PETs.

What are the most well-known and popular PETs?

It is difficult to compile a list of PETs because, especially in the open source world, developers’ awareness of “privacy” (more precisely, personal data protection and privacy) has grown significantly in recent years, so much so that they often refer explicitly to fundamental rights.

Artificial Intelligence, which has recently been at the center of debate, also fits into this context, since some of its applications can be used to improve and enhance users’ privacy.

Thus, many applications can be considered PETs.

Moreover, as we are known to care about digital communication, for which instant messaging systems are used daily, and, where possible, about the use of open source resources, specific indications relating to these contexts are not lacking.

We have compiled a short list, organized by the categories to which the application solutions belong. Among the most well-known and popular PETs, we mention the following:

We reiterate that this is a non-exhaustive list, compiled on the basis of our experience.
In addition, we point out that some services should also be examined more closely in light of where they are based.

Moreover, we stress that in Europe the Digital Markets Act provides for interoperability between the messaging solutions developed by so-called gatekeepers (the major platforms) and those of other providers.

Conclusions.

PETs, therefore, can be identified with any solution that is aimed, through the use of specific technologies, at improving the protection of personal data and the privacy of individuals.

Although PETs have existed for over forty years, they are in continuous development, driven by the evolution of the most innovative technologies as these are implemented (think, for example, of blockchain and Web3).

An in-depth study of Privacy Enhancing Technologies shows how this phenomenon has affected other areas, including data governance, personal data protection, and privacy. For this reason, we favour an approach that refers to existing legislative instruments and to the technical norms contained in standards approved by institutional bodies (UNI, CEN, ISO). Indeed, in the first part we referred to management systems.

The scenario we have tried to outline makes it necessary to study existing models in depth and to evaluate them, possibly developing new ones, so that it is always possible to adopt an innovative approach (our DAPPREMO, an acronym for “Data Protection and Privacy Relationships Model”, is an example) suited to the times and to the most current technologies.

Using PETs calls for strengthening the training and culture of personal data protection and privacy, especially in Europe, where they are fundamental rights.







  1. Jordan Sara, Fontaine Clara, Hendricks-Sturrup Rachele, “Selecting Privacy-Enhancing Technologies for Managing Health Data Use”, Frontiers in Public Health, vol. 10, 2022, https://www.frontiersin.org/articles/10.3389/fpubh.2022.814163