The Next Big Thing?
... on Privacy-Enhancing Technologies
You probably already know that I am a techno-optimist, positive about AI, and generally excited about the possibilities of a wide range of digital technologies. This might incline you to discount my enthusiasms, particularly when I venture somewhat off the beaten track.
I would like, however, to persuade you to pay close attention to an important set of technologies, Privacy-Enhancing Technologies (PETs), that I believe have transformational possibilities and that have, in my judgment, received insufficient attention. This relative neglect arises, I suspect, from the complexity and variety of the technical foundations on which these technologies depend. I hope that in this brief account I can cut through this and render it accessible.
Privacy-Enhancing Technologies (PETs) comprise methods, tools, and architectures that allow data to be used, shared, analysed or learnt from, without exposing the underlying information, which may be sensitive. The term has been broadly applied, and variously taxonomised. I will focus principally on PETs in data analysis although there are also important applications in computation and communication.
In essence, PETs enable data use while, at the same time, providing privacy assurance by technical means. PETs rest upon the counter-intuitive idea that you can analyse data whilst limiting access to that data.
PETs enable the development of secure data ecosystems with potential application in government, industry, and sectors such as health, finance, and, of course, national security. The benefits of PETs are that they: allow systems to avoid holding, or exposing, ‘raw’ personal data; enable data collaboration across multiple organisations that cannot or should not share datasets; mitigate insider and external security threats by reducing the attack surface and limiting the consequences of compromise; and support regulatory compliance through data minimisation and ‘purpose limitation’, a core principle of data protection.
The key strands of technology are:
Homomorphic Encryption. Permits secure outsourcing of computation by allowing computations to be performed directly on encrypted data without ever decrypting it. It achieves this by preserving the underlying mathematical structure of the data during encryption, so that operations on ciphertexts correspond to meaningful operations on the data. It is computationally demanding, and hence slow and resource-intensive, but pragmatic schemes (partially homomorphic or similar) are becoming more realistic.
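To make the idea concrete, here is a toy sketch in Python using unpadded ('textbook') RSA, which happens to be multiplicatively homomorphic: multiplying two ciphertexts yields a ciphertext of the product of the plaintexts. The parameters are deliberately tiny and insecure; this is an illustration of the principle, not a usable scheme.

```python
# Toy demonstration of the homomorphic idea using unpadded ("textbook") RSA.
# Multiplying two ciphertexts gives a ciphertext of the product of the
# plaintexts. Tiny, insecure parameters -- for illustration only.

def make_toy_rsa_keys():
    p, q = 61, 53                 # tiny primes -- NOT secure
    n = p * q                     # public modulus
    phi = (p - 1) * (q - 1)
    e = 17                        # public exponent, coprime with phi
    d = pow(e, -1, phi)           # private exponent (Python 3.8+ modular inverse)
    return (e, n), (d, n)

def encrypt(m, pub):
    e, n = pub
    return pow(m, e, n)

def decrypt(c, priv):
    d, n = priv
    return pow(c, d, n)

pub, priv = make_toy_rsa_keys()
c1, c2 = encrypt(7, pub), encrypt(3, pub)
# Multiply the ciphertexts without ever decrypting them; the result
# decrypts to 7 * 3 = 21.
product_cipher = (c1 * c2) % pub[1]
assert decrypt(product_cipher, priv) == 21
```

A fully homomorphic scheme supports both addition and multiplication on ciphertexts, which is what makes arbitrary computation on encrypted data possible, at considerable computational cost.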
Zero-Knowledge Proofs. Permit verifiable credentials and privacy-preserving authentication. A zero-knowledge proof demonstrates that something is true without revealing any of the underlying information. It works by presenting a challenge that can only be answered correctly by somebody in possession of a ‘secret’, without revealing the secret itself.
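The flavour of such a protocol can be sketched with a toy Schnorr-style proof, in which a prover demonstrates knowledge of a secret exponent x satisfying y = g^x mod p without disclosing x. The group parameters here are tiny and insecure, chosen purely for illustration.

```python
import random

# Toy interactive Schnorr proof: the prover convinces the verifier that it
# knows x such that y = g^x mod p, without revealing x.
# Tiny, insecure group parameters -- for illustration only.
p, q, g = 23, 11, 2          # g generates a subgroup of prime order q mod p

x = 7                        # prover's secret
y = pow(g, x, p)             # public value derived from the secret

def prove_round(x):
    k = random.randrange(1, q)        # prover's fresh random nonce
    commitment = pow(g, k, p)         # sent to the verifier first
    challenge = random.randrange(q)   # verifier's random challenge
    response = (k + challenge * x) % q
    return commitment, challenge, response

def verify(commitment, challenge, response, y):
    # Accept iff g^response == commitment * y^challenge (mod p),
    # which holds exactly when the prover knew x.
    return pow(g, response, p) == (commitment * pow(y, challenge, p)) % p

c, ch, s = prove_round(x)
assert verify(c, ch, s, y)
```

The verifier learns that the prover knows x, and nothing else: the response is masked by the random nonce k, so it reveals no information about the secret.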
Differential Privacy. Protects information through the addition of calibrated statistical noise so that outputs do not reveal information about any individual. The noise hides the individuals but maintains the truth about the group. It yields mathematically provable privacy guarantees.
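A minimal sketch of the classic Laplace mechanism makes this concrete: a counting query has sensitivity 1 (adding or removing one person changes the count by at most one), so adding Laplace noise of scale 1/ε to the count gives ε-differential privacy. The dataset and query below are invented for illustration.

```python
import math
import random

# Minimal sketch of the Laplace mechanism for a counting query.
# A count has sensitivity 1, so Laplace noise of scale 1/epsilon yields an
# epsilon-differentially-private release. Illustrative data only.

def laplace_noise(scale, rng):
    # Inverse-CDF sampling of a Laplace(0, scale) variate.
    u = rng.random() - 0.5
    sign = 1 if u >= 0 else -1
    return -scale * sign * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon, rng):
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon, rng)

ages = [34, 61, 29, 45, 52, 38, 70, 23]
noisy = private_count(ages, lambda a: a >= 40, epsilon=0.5, rng=random.Random(0))
# The true count is 4; what is released is the count plus calibrated noise,
# so no single individual's presence or absence is discernible.
```

Smaller ε means stronger privacy but noisier answers; choosing ε is precisely the utility–privacy trade-off the technique makes explicit.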
Synthetic Data. Supports machine-learning model training and software testing by generating artificial or synthetic datasets with statistical properties similar to real data but without exposing real individuals. Gives rise to a potential utility–privacy trade-off that must be assessed in each case, but synthetic data protects vulnerable points in the development process.
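The simplest possible illustration is to fit a crude parametric model (a single Gaussian) to a 'real' numeric column and sample a synthetic column from it. Real synthetic-data generators use far richer models, and a naive fit like this carries no formal privacy guarantee on its own; the figures below are invented.

```python
import random
import statistics

# Fit a single Gaussian to a "real" numeric column and sample a synthetic
# column with similar statistical properties. Illustrative figures only;
# this naive approach carries no formal privacy guarantee by itself.

real_incomes = [31_000, 45_500, 52_000, 38_200, 61_750, 47_300, 55_100, 42_900]

mu = statistics.mean(real_incomes)       # fitted mean
sigma = statistics.stdev(real_incomes)   # fitted standard deviation

rng = random.Random(42)
synthetic_incomes = [rng.gauss(mu, sigma) for _ in range(1000)]

# The synthetic column mimics the real column's location and spread and can
# be handed to developers or testers in place of the real records.
```
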
Secure Multi-Party Computation. Protects information by allowing multiple parties to jointly compute a result without revealing their inputs to the computation. Each party splits its data into pieces, distributing them so that no single party can reconstruct the original input. The parties then run a coordinated (and potentially scalable) cryptographic protocol, with strong guarantees, that combines the pieces to produce the correct final result without exposing the data.
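The splitting step can be illustrated with additive secret sharing over a finite field, used here for a secure sum of three parties' private values. This sketches the principle only; production protocols add authentication, robustness, and support for richer computations.

```python
import random

# Secure summation via additive secret sharing: each party splits its
# private input into random shares that sum to the input modulo a prime.
# Any proper subset of shares looks uniformly random and reveals nothing.

PRIME = 2_147_483_647  # field modulus (a Mersenne prime)

def share(value, n_parties, rng):
    shares = [rng.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)  # shares sum to value mod PRIME
    return shares

rng = random.Random(1)
inputs = [120, 75, 310]          # each party's private value

# Party i sends its j-th share to party j; each party then publishes only
# the sum of the shares it received, never a raw input.
all_shares = [share(v, 3, rng) for v in inputs]
partial_sums = [sum(col) % PRIME for col in zip(*all_shares)]

total = sum(partial_sums) % PRIME
assert total == sum(inputs)  # correct joint result, no pooled raw data
```
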
Federated Learning. Enables machine-learning models to be trained across multiple data sets held by different parties without centralising data. This is achieved by retaining data locally and sending updates to a model that is centrally held.
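A minimal federated-averaging sketch illustrates the mechanism: several simulated clients each fit a one-parameter model on local data and send only the learned weight to a 'server', which averages the weights. The raw data never leaves the clients. The data and model here are invented for illustration.

```python
import random

# Minimal FedAvg-style sketch: clients hold local data drawn from y ~ 2x,
# train a one-parameter model y = w * x locally, and share only the weight.

def make_client_data(rng, n=50):
    xs = [rng.uniform(0, 10) for _ in range(n)]
    ys = [2.0 * x + rng.gauss(0, 0.1) for x in xs]
    return xs, ys

def local_train(w, xs, ys, lr=0.01, epochs=20):
    # Plain gradient descent on squared error for the model y = w * x.
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad
    return w

rng = random.Random(0)
clients = [make_client_data(rng) for _ in range(4)]

w_global = 0.0
for _ in range(5):
    # Each client trains locally; only the resulting weights travel.
    local_weights = [local_train(w_global, xs, ys) for xs, ys in clients]
    w_global = sum(local_weights) / len(local_weights)  # server-side averaging

# w_global converges towards the true slope of 2.0 without any client
# ever exposing its raw (x, y) records.
```

In practice the updates themselves can still leak information, which is why federated learning is often combined with differential privacy or secure aggregation.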
Confidential Computing. Provides for protection against leaks, tampering, or misuse through Trusted Execution Environments, hardware-based secure areas (enclaves) inside a computer processor that isolate code and data during computation and thus allow sensitive data to be processed without exposing it to the wider system.
These strands should not be viewed separately. They combine into ‘architectural patterns’ for secure data ecosystems providing design solutions that enable confidential computing, federated data, zero-trust infrastructure, and policy-based access with cryptographic enforcement.
The transformational impact of PETs requires emphasis. Data no longer needs to move to be useful, thereby reducing the legal, ethical, and security barriers that currently inhibit collaboration. Sectors that are data-siloed, such as health, finance, policing, national security, and education, can safely combine insights. PETs enable data coalitions that were previously impossible due to confidentiality constraints, with information governance refocussed on outcomes rather than raw access.
In the world of PETs, privacy becomes a technical property not simply a policy promise, and trust shifts from agreements and compliance to architecture and code with mathematically provable or cryptographically enforced privacy guarantees. Digital identity and trust frameworks allow individuals to access services with minimal exposure. PETs additionally underpin security because when data remains encrypted, local, or hidden, even successful attacks reveal less. From a national security standpoint PETs rebalance the tension between surveillance and privacy and support the development of competitive sovereign data-driven industries without compromising privacy.
Clearly, PETs have their limitations. They can be costly, complex, and hard to operationalise. Multi-party computation, homomorphic encryption and differential privacy can, in particular, require significant computational resources and cannot be readily grafted onto legacy infrastructure. PETs obviously do not fix low-quality data. PETs address only one part of the security challenge and there are emerging threat models that include side-channel attacks, supply chain compromises and hardware backdoors. They are certainly not an answer to poor cyber defence. No technical mechanism can wholly protect from a coercive jurisdiction. Ultimately PETs reduce exposure but cannot substitute for properly constituted legal, ethical, or organisational controls ... they are not a magic fix.
The future of PETs lies in making secure computation practical, scalable, and seamlessly embedded in real-world systems. Research is delivering faster and more usable forms of homomorphic encryption, efficient multi-party computation that can support complex analytics, and differential privacy mechanisms that adapt dynamically to context and risk. Confidential computing is moving towards verifiable, decentralised trust models that reduce dependence on any single hardware vendor. A key area of focus remains usability: PETs that can be deployed by non-experts, with formal verification ensuring they are implemented correctly. A further area of focus is policy alignment: linking legal requirements for privacy and data protection directly to cryptographic enforcement.
I do not think it is overblown to suggest that PETs will disrupt the data economy in ways that are fundamental and difficult to readily anticipate. Privacy need not necessarily inhibit data exploitation, and the benefits of data exploitation need not justify undermining privacy. What we need now is research, standards, investment, and a willingness to rethink how we design systems and govern data.

