
London, 26 April 2012: IAPP Europe is currently holding its Data Protection Intensive 2012 in London, with Hogan Lovells as a sponsor. On the second day the conference heard keynote speeches on the regulatory landscape in Ireland and on the economics of privacy.
Billy Hawkes, Data Protection Commissioner for Ireland, spoke in particular about the audit of Facebook carried out by his office. Alessandro Acquisti of Carnegie Mellon University illustrated, with some interesting examples, why transparency and control are not enough to ensure privacy: privacy, he argued, is more about protection from control.
In describing the general approach of his office, Billy Hawkes said that privacy concerns should ideally not inhibit innovation. The office should act as an enabler, following the "yes, but" approach of the UK Information Commissioner.
There had been a lot of focus on social networks. They were designed for data sharing, offering a free service in exchange for personal data used to target advertising. People make their own choices, so much of the responsibility lies with the individual, but those choices must be made clear to them. Data protection law guarantees some control, but once you’ve signed up it’s not hugely different from any other deal – as long as the user knows what the deal is.
The Article 29 Working Party’s WP 163 is helpful in analysing the legal relationships. A lot of activities fall outside data protection law. But there is frequently a record kept of what may be essentially a private conversation. The network provider must guarantee basic access rights as well as other fundamental data protection rights.
He went into some detail on the audit of Facebook which his office had undertaken in 2011. This had been a public audit, conducted with the agreement of the company. The scope of the audit was wide, since users in Europe, the Middle East and Africa contract with Facebook Ireland. The office conducts about 30 audits a year, but these are not usually made public.
In conducting the audit the Commissioner took account of complaints and comments received from the Norwegian Consumer Council and from an organisation called "Europe v Facebook", run from Austria. The audit took three months and Facebook fully co-operated. The report was published just before Christmas 2011. It made various recommendations, with a timetable for implementation over six months and a formal review in July 2012.
In relation to transparency, the recommendations were that there should be a clearer data use policy, more prominent information on photo tagging, and clearer information on ad targeting and third party applications. On user control there should be informed choices on data sharing, rights of access, membership of groups, and control over applications. There should be an ability to opt out of tagging, and control over the extent of the audience for data posted on the site.
On data retention there needed to be a proper data retention policy, and a greater ability to delete data items and whole accounts. Regarding security and compliance there should be more oversight over third party apps and employee access to data. The European compliance function should be strengthened.
Much action had already been taken by Facebook, and the Commissioner’s office had remained in contact with the company. Based on their dealings with Facebook to date, they did not expect to use enforcement powers. It was possible that some issues would be raised in the courts, if for example the Europe v Facebook group did not agree with the results.
Under the proposed EU Framework other DPAs would have the right to take part. The case highlighted the need for international co-operation, in particular with US and Canadian authorities. The new Regulation should recognise this international dimension. There must not be a "Fortress Europe" approach.
A principal contribution of EU data protection law was to give the user control over their data. One key conundrum was the extent to which terms and conditions can be offered on a "take it or leave it" basis. Another was how far the legitimate interest test could really extend.
Alessandro Acquisti of Carnegie Mellon University spoke about the economics of privacy.
He said that privacy is commonly equated with transparency and control, but these were perhaps not enough to guarantee it. Much depended on how the user was actually treated.
As an illustration he described how framing a problem influences the solution. His team had gone to a mall outside Pittsburgh and offered people free VISA spending cards: a 10-dollar card whose use was not tracked, and a 12-dollar card whose use was tracked. Half the group were first offered the 10-dollar card and later told the 12-dollar card was available; the other half were offered the 12-dollar card first. People’s attitude to privacy seemed to depend on which card they were offered first. One in two people given the anonymous 10-dollar card first did not want to trade up to the 12-dollar card, but only 10% of those first offered the 12-dollar tracked card opted to trade down to the anonymous 10-dollar card.
Control can lead to paradoxical outcomes. Can more control lead to less privacy?
He had devised a questionnaire on ethical behaviour. Some questions asked about sensitive matters, others not. All respondents were told that answers to all questions were voluntary. However, half the respondents were in addition given a permission check box which they could fill in so that they could positively consent to release of data on certain answers. It seems that people given the check box option felt that they had more control, and as a consequence those respondents allowed more publication of their responses, particularly in relation to the more intrusive questions.
As for transparency, fewer than 3% of users read privacy policies, 75% of people think that the mere existence of a privacy policy implies protection, and 54% of policies were beyond the grasp of 57% of internet users.
There could be "sleights of privacy", as a conjuror may have a sleight of hand. As an example he cited a questionnaire given to students. Some students were told that only their fellow students would see their responses, while others were told that both students and faculty members would see them. There was a higher propensity to answer questions in the first group, as one would expect. However this difference vanished in the second group if there was a delay between notice and answer – even a delay of 10 seconds. The same happened if respondents were distracted by another question.
There was also the issue of pervasive influence, or biased judgments. He gave the example of tests in which respondents were asked to consider information about a fictional character and decide whether they liked him or not. The results showed that respondents gave weight to a recent good deed but ignored an old one; a bad deed, however, counted against him no matter how long ago it had taken place.
Technology now provided the ability to find out a great deal about people. He gave the example of how it was possible to identify someone from a photograph using facial recognition software and social network data, from which social security numbers and even credit scores could be derived. All of this was publicly available.
In conclusion he said that privacy was more about protection from control. Regulation was important, as there was so much opportunity for deception. Default settings were crucial. Privacy enhancing technologies were increasingly good, and if we could incentivise the deployment of PETs we would have our cake and eat it!
Authored by Quentin Archer.