Can pseudonymisation make personal data anonymous? Lessons for New Zealand from EDPS v SRB
- R O'Brien
- Sep 9
On 4 September 2025, the Court of Justice of the European Union (CJEU) delivered a landmark judgment in European Data Protection Supervisor v Single Resolution Board (Case C-413/23 P, EU:C:2025:645). The Court addressed a question at the heart of modern data protection law: when pseudonymised data is shared, is it still personal data?
For years, the prevailing assumption has been that pseudonymised data is always personal data, because a re-identification key exists somewhere. This decision shifts that approach and introduces important nuance for data-driven sectors.
Background
The Single Resolution Board (SRB), an EU institution, invited shareholders and creditors affected by one of its resolution decisions to submit comments as part of a right-to-be-heard process. To analyse those comments, the SRB sent them to Deloitte.
Before sharing, the SRB replaced names with alphanumeric codes. Deloitte received only the coded comments and had no access to the re-identification key. Several stakeholders complained that the SRB’s privacy notice had not mentioned Deloitte as a recipient.
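The mechanism at issue can be sketched in a few lines. This is an illustrative sketch only, not the SRB's actual process: names, code format, and data structures are assumed for the example.

```python
import secrets

def pseudonymise(comments):
    """Replace each author's name with a random alphanumeric code.

    Returns the coded comments (shareable with a recipient) and the
    re-identification key, which the discloser retains and never shares.
    All names here are hypothetical.
    """
    key = {}      # name -> code; stays with the discloser (the SRB's position)
    coded = []
    for name, text in comments:
        if name not in key:
            key[name] = secrets.token_hex(4)  # random 8-char code
        coded.append((key[name], text))       # recipient sees only the code
    return coded, key

coded, key = pseudonymise([("Jane Doe", "I object to the valuation.")])
# The recipient (Deloitte's position) would receive `coded` but never `key`,
# so it has no realistic means of linking a code back to a person.
```

The legal question in the case turns on exactly this separation: whether data in the recipient's hands remains "personal" when the key sits elsewhere.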
The European Data Protection Supervisor (EDPS) agreed, finding a breach of transparency obligations. The General Court overturned that decision, but on appeal the CJEU sided largely with the EDPS, providing detailed guidance.
The Court’s findings
Personal opinions are personal data: The Court held that opinions and views, as expressions of a person’s thoughts, necessarily “relate to” that person. No further analysis was required.
A relative test for pseudonymised data: Pseudonymised data is not personal data in every case. Whether information is personal depends on whether re-identification is reasonably likely for the party holding it.
For the SRB (which retained the key), the comments remained personal data.
For Deloitte (which had no key and no realistic means of re-identification), the data could be considered anonymous.
This confirms a contextual approach: the same dataset may be personal for one party but not for another.
Transparency obligations remain with the controller: The duty to inform individuals about recipients is assessed at the time of collection and from the perspective of the controller. Controllers cannot avoid transparency obligations by pseudonymising data.
Broader implications in Europe
AI training: Data providers remain responsible for pseudonymised data, but AI developers who receive safeguarded datasets may, in some contexts, fall outside the scope of the GDPR.
Contractual safeguards are critical: Organisations disclosing pseudonymised data must implement strict contractual terms prohibiting recipients from attempting re-identification. These clauses, combined with technical and organisational measures, are essential to reduce risk and support a contextual assessment that the data may not be 'personal data' in the recipient’s hands.
Transparency as a baseline: The ruling underlines that pseudonymisation is not a compliance shortcut. Privacy notices and governance documents must still identify recipients, even if the recipient cannot re-identify individuals.
Why this matters for New Zealand
New Zealand has traditionally taken a broad approach to identifiability. In Proceedings Commissioner v Commissioner of Police [2000] NZAR 277 (CRT) at 285, it was held that identifiability “can be made on the basis of a link identifying the individual, whether that link is obtained from the recipient's own knowledge or by other means.”
The Privacy Act 2020 reinforces this by including exceptions for agencies that believe information will not be used in identifiable form (for example, under IPPs 2, 3, 10 and 11). Those exceptions would be unnecessary if such information were not already treated as 'personal information'.
In practice, not all forms of pseudonymisation are the same or provide the same level of protection. New Zealand's interpretation may yet prove context-specific, particularly where pseudonymisation achieves protection that is functionally equivalent to anonymisation.
For New Zealand organisations, the implications include:
Opportunity: This opens the door to more nuanced conversations (or legal challenges in court) about whether pseudonymised data should always be captured by the Privacy Act. It could enable greater flexibility for innovation, particularly in AI and research.
Contractual safeguards are critical: Agencies disclosing pseudonymised data should use contracts to prohibit re-identification and to set clear technical and organisational limits on how the data can be used. This will be key to arguing that data is effectively anonymous for the recipient.
Uncertainty: It remains to be seen whether New Zealand courts would follow Europe’s contextual approach or continue with the broader “if anyone can link it, it’s personal” view.
Either way, the message is clear: pseudonymisation can reduce risk, but it does not remove accountability. Transparency, governance, and contractual controls remain essential.
Takeaway
The CJEU’s decision in EDPS v SRB is a turning point. It confirms that pseudonymisation can, in some contexts, push data into anonymity, but not for the original controller, and never at the expense of transparency.
For New Zealand, it raises an important question: will our courts stick with the traditional broad definition of personal information, or align with Europe’s risk-based, contextual approach?
Either way, agencies handling data need to be prepared. If your organisation is exploring AI projects, marketing partnerships, or new ways to share information, now is the time to revisit your privacy notices, contracts, and governance frameworks. O’Brien Legal can help you design for innovation while staying compliant and transparent.