
Untying the Global Dataflows Mess


One of Harry Houdini’s most difficult tricks consisted of escaping from a nail-fastened and rope-bound wooden crate with manacles on his hands and feet, while submerged in New York’s East River. That feat is starting to look straightforward when compared to the prospect of lawfully exporting personal data out of the European Union. The restrictions on transfers of data to jurisdictions that do not provide an adequate level of protection have been in place for more than 20 years. And while these restrictions have not prevented the development of the digital economy, judging by this issue’s current direction of travel, we could be facing a situation from which not even the great Houdini could escape.

The world of global dataflows radically changed in October 2015 when the Court of Justice of the European Union – influenced by the seriousness of Snowden’s disclosures and skillfully persuaded by Max Schrems – established a tough new adequacy test while invalidating the Safe Harbor framework as an adequacy mechanism. Since then, overcoming the restrictions on exporting personal data to countries that do not match European standards of data protection involves a two-fold exercise:

  • ensuring that the importer of the data applies data privacy and security measures equivalent to those required by European data protection law; and crucially

  • ensuring that the public authorities in the importing jurisdictions – namely government and law enforcement agencies as well as intelligence services – do not have unnecessary, disproportionate and uncontrolled access to such data.

In principle, this is not an unreasonable expectation – after all, the point of the data export restrictions is to protect people’s fundamental right to privacy and their personal information. The challenge is the degree to which these two points need to be demonstrated and met. The second element of the test in particular has now become a nearly insurmountable task.

With the Safe Harbor ruling, it became clear that any claim of adequacy must meet the standards of the Charter of Fundamental Rights of the European Union. So when the Privacy Shield was unveiled in February 2016 following two years of negotiations between the European Commission and the U.S. Department of Commerce, the key focus was on the assurances offered by the U.S. government in respect of any potential access to data and its oversight. Shortly afterwards, the Article 29 Working Party issued a detailed statement pointing out the deficiencies of the Privacy Shield in this respect. More recently both the European Parliament and the European Data Protection Supervisor have seen the glass half empty and dismissed the protections and controls of the Privacy Shield as insufficient.

Worryingly, since access to data by public authorities is not an issue on which adequacy decisions have focused in the past, all existing adequacy findings by the European Commission could potentially be at risk if a new Max Schrems decides to target any countries currently deemed to be adequate. And since, according to the CJEU, this aspect of data protection adequacy needs to be present no matter what, any established tools – like standard contractual clauses – used to deploy EU data privacy and security measures may also end up being insufficient to overcome the restrictions on transfers.

This is now in the process of being tested following the Irish Data Protection Commissioner’s action requesting the High Court of Ireland to refer the status of standard contractual clauses to the CJEU for a preliminary ruling. It may still take a year or two for the CJEU to rule on this but, given the precedent of the Safe Harbor decision, it is not inconceivable that by the time the General Data Protection Regulation comes into force in May 2018, the standard contractual clauses adopted by the European Commission will have become unsuitable to legitimise dataflows. The same could of course be true of any legal tool that does not incorporate a legally valid and effective redress mechanism against government access to data.

And then what?

The prospect of a digitally isolated Europe in the 21st century seems unrealistic, but this may be a price that European regulators and judges are willing to exact for the sake of protecting fundamental rights. The question right now is what can possibly be done by any organisation that wishes to avoid digital isolation in the absence of a politically perfect solution. Part of the answer may well be persuading the democratic governments of the world to create legal frameworks that enable the levels of control and oversight that the CJEU demands. But more realistically and in the short term, organisations can attempt to complement existing transfer tools with additional protections aimed at limiting disproportionate disclosures of personal data to public authorities.

More than 100 years after Houdini’s miraculous escapes, it is still a mystery how he managed to pull them off, but he did. Untying the current mess affecting transfers of personal data from the EU will require similar skills, but it can be done. The trick in this case is to pay attention to what the CJEU has identified as risks to our privacy and address those risks in a sufficiently credible way, so that whatever the uses and disclosures of personal information, our digital freedom is not unjustifiably compromised.

This article was first published in IAPP’s Privacy Perspectives on June 2, 2016.

 

Authored by Eduardo Ustaran
