News

President's Council of Advisors on Science and Technology Stresses "Use" Issues in Report


On May 1, the President's Council of Advisors on Science and Technology (PCAST) released Big Data: A Technological Perspective. The report is billed as a technical accompaniment to the 90-day Big Data review led by Presidential Counselor John Podesta and addresses "the nature of current technologies for managing and analyzing big data and for preserving privacy," as well as how those technologies are evolving. While the PCAST report has received less media attention than the Podesta review, its findings may influence the Administration's information-governance expectations of businesses.

In its report, the PCAST offers a clear account of the tradeoffs between privacy and opportunity in the era of Big Data, walking through a number of concrete information-related examples that are "happening today or very soon" and evaluating how those new technologies may affect current approaches to privacy. The PCAST also proposes recognition of new privacy harms (e.g., invasion of private communications and public disclosure of inferred private facts) to account for the emerging privacy challenges posed by Big Data. Importantly, the PCAST "believes strongly" that the positive benefits of technology "are (or can be) greater" than any new harms.

Most important for businesses are the areas in which the PCAST drills down to offer reflections on practices that are currently part of most organizations' and policymakers' approaches to good information stewardship.

In line with Counselor Podesta's review, the PCAST questions the effectiveness of "notice and choice" in today's connected Big Data environment. Beyond the familiar problem that users rarely read notices about how data is collected and used before giving consent, the report identifies a "conceptual problem" with notice and choice: it places the burden of privacy protection on the individual, who is often ill-equipped to evaluate complex privacy considerations, creating an uneven playing field between user and service provider.

The PCAST ultimately characterizes the notice and choice scenario as “a kind of market failure,” and proposes, as a successor to notice and choice, that third-party intermediaries be established to vet the information collection and use practices of organizations and ensure that they are in line with the “privacy profile” chosen by consumers.

The PCAST also proposes a paradigm shift from privacy protection based on the collection of data to a framework based on uses of data. According to the PCAST, a use-based framework would better account for the data and metadata already circulating in the digital environment, generated in digital form since the advent of computing and in analog form long before that. A focus on uses of data would also address a looming issue of the Big Data era: information collected from individuals under one pretext today could be combined with existing datasets tomorrow to draw new inferences about those individuals.

Finally, the PCAST recognizes the role that privacy enhancing technologies can play in protecting individuals’ privacy. However, the report notes that technical solutions cannot provide an adequate level of privacy protection without rules and regulations to encourage their development and use.

Notably for businesses, the PCAST discredits some emerging and established technical means for protecting individual privacy in the era of Big Data. Some of these tools, namely "anonymization, data deletion, and distinguishing data from metadata," are already being implemented by organizations as key aspects of their information-governance frameworks.

The PCAST has examined privacy and Big Data from both academic and practical perspectives, using a holistic approach to drill down on certain shortcomings of current approaches to protecting privacy. While the PCAST was not instructed to make specific policy recommendations, it offers the following high-level recommendations:

  • Policy attention should focus more on the actual uses of Big Data and less on its collection and analysis.

  • Policies and regulation, at all levels of government, should not embed particular technological solutions, but rather should be stated in terms of intended outcomes.

  • With coordination and encouragement from the White House Office of Science and Technology Policy (OSTP), the Networking and Information Technology Research and Development (NITRD) agencies should strengthen U.S. research in privacy-related technologies and in the relevant areas of social science that inform the successful application of those technologies.

  • OSTP, together with the appropriate educational institutions and professional societies, should encourage increased education and training opportunities concerning privacy protection, including career paths for professionals.

  • The United States should take the lead both in the international arena and at home by adopting policies that stimulate the use of practical privacy-protecting technologies that exist today. It can exhibit leadership both through its convening power (for instance, by promoting the creation and adoption of standards) and through its own procurement practices (such as its own use of privacy-preserving cloud services).

The PCAST report is also notable for the diverse range of advisors and outside experts, including industry representatives, appointed to help shape the group's approach. The report accounts for the potential "unintended negative consequences" that overly restrictive or misguided privacy rules and regulations may have on economic growth, and instead seeks to offer workable solutions for the shifting technological and regulatory landscape created by Big Data.

Special thanks to Julian Flamant for his assistance in the preparation of this entry.


Authored by the HL Chronicle of Data Protection
