Insights on the Consumer Privacy Bill of Rights Act of 2015

On Friday, February 27, the White House released its promised draft privacy and data security legislation.

The proposed Consumer Privacy Bill of Rights Act of 2015 (the “Act”) contains few, if any, surprises and would codify the framework that the White House proposed in 2012, imposing privacy and data security requirements across sectors and industries (our analysis of the 2012 proposal can be found here). The proposal has drawn criticism from the Federal Trade Commission (FTC) and privacy advocates for not containing enough consumer protections, and from the business community for its lack of clarity and its potential to stifle innovation and create other unintended consequences.

It appears unlikely that the Act will get much traction in this Congress. However, the White House proposal serves as a potentially useful data point for organizations seeking to develop or maintain privacy programs and industry codes of conduct that address the concerns of consumers and policymakers.

The proposed legislation would apply, subject to specific exemptions discussed below, to “covered entities” that collect, create, process, retain, use, or disclose personal data in or affecting interstate commerce. The bill defines “personal data” broadly. Personal data would generally include data linked to a specific individual or device but not otherwise generally available to the public through lawful means. Personal data also would encompass “unique persistent identifiers” and other identifiers or uniquely “descriptive information about personal computing or communication devices.”

In line with the Administration’s 2012 proposal, the Act identifies seven key principles for safeguarding personal information:

  • Transparency. Covered entities would have to provide individuals with concise, conspicuous, and easily understandable notices that offer accurate, clear, and timely information about the entities’ privacy and security practices. The Act specifies content requirements for such notices, including information about retention practices, disclosures, and mechanisms for obtaining access to personal data.

  • Individual Control. Covered entities would need to provide individuals with reasonable means to control the processing of their personal data that are proportionate to the privacy risks. The Act defines privacy risk as “the potential for personal data, on its own or when linked to other information about an individual, to cause emotional distress or physical, financial, professional or other harm to an individual.” The Act would require that covered entities afford individuals the means to withdraw their consent to the processing of personal data, subject to exceptions such as preventing fraud, protecting communications systems, and where a covered entity’s First Amendment interests or legal obligations require continued processing.

  • Respect for Context. If a covered entity processes personal data in a manner that is not reasonable in light of its “context,” the Act would require the entity to conduct a privacy risk analysis and provide individuals with “heightened transparency and individual control” (i.e., notices reasonably designed to inform individuals of the privacy risks as well as a mechanism allowing individuals to reduce the privacy risk). Context would be determined by, among other things, the interactions between an entity and individuals and what reasonable individuals would understand about the covered entity’s practices. The privacy risk analysis would include, but not be limited to, “reviews of data sources, systems, information flows, partnering entities, and data and analysis uses.” FTC-approved industry Privacy Review Boards could, however, exempt covered entities from providing heightened notice and individual control where the Privacy Review Boards supervise data processing that is otherwise not reasonable in terms of context.

  • Focused Collection and Responsible Use. Covered entities would be permitted to collect, retain and use personal data only as reasonable in light of context. Covered entities would have to delete, destroy, or de-identify personal data within a reasonable time after fulfilling the purposes for which the data were collected.

  • Security. To secure personal data against loss, compromise, alteration, and unauthorized use or disclosure, covered entities would be required to conduct security risk assessments and implement reasonable security safeguards in light of those assessments.

  • Access and Accuracy. Covered entities would generally be required to provide individuals, upon request and subject to identity verification, with reasonable access to the personal data about them that the entities control. In general, covered entities would need to take reasonable steps, appropriate to any associated privacy risks, to ensure that personal data held by the entities are accurate.

  • Accountability. Covered entities would be required to provide training to employees, conduct privacy assessments, adopt privacy-by-design processes, require recipients of personal data to use the data in a manner consistent with the covered entities’ obligations, and take other reasonable steps to ensure compliance with the Act.

The Act would exempt certain covered entities from the framework and would provide several significant exceptions to what constitutes personal data. The following entities would not be subject to the Act: organizations with twenty-five or fewer employees that process only employee and job applicant personal data; and organizations that do not process “sensitive data” and either (1) collect personal data from fewer than 10,000 individuals and devices during any 12-month period or (2) have five or fewer employees. Sensitive data would include personal data relating to medical histories, national origin, sexual orientation, financial information, gender identity, precise geolocation information, biometrics, and Social Security numbers. It is worth noting, however, that sensitive data would not be subject to heightened protections under the Act.

Personal data would not include employee information used in connection with employment, information about cyber threat indicators, or de-identified data. The Act’s definition of “de-identification” is similar to that of the Federal Trade Commission. Personal data are de-identified if:

  • it is not reasonable to link the data to a specific individual or device;

  • the covered entity publicly commits to not re-identify the data and adopts controls to prevent re-identification; and

  • the covered entity requires each entity that receives the de-identified data to refrain from re-identifying the data and to make a public commitment to that end.

Despite these exemptions and limitations, the Act, if enacted, would likely create significant compliance burdens for covered entities and for companies doing business with covered entities. The Act’s mandates regarding context, for example, may require covered entities to collect additional information from consumers in order to develop a reasonable understanding of the context of the relationship. In addition, the Act’s access rights and accountability requirements would create significant new obligations for covered entities. The Act defines these obligations vaguely and, to a certain extent, establishes untested standards that would likely be challenging to implement. Companies would likely struggle to determine how and to what extent consumers should have access to their personal data, whether and when to modify personal data in response to consumer requests, and what types of emotional distress would be deemed to constitute a privacy risk.

The Act does not explicitly distinguish between first and third party data collectors, which could result in consumers being inundated with privacy notices and choice mechanisms. Third parties such as service providers, data brokers, and business partners appear to be subject to the Act’s transparency and individual control obligations. Compliance with those requirements would be challenging for entities that have no direct relationship with consumers. If third parties chose to comply with the Act by directly contacting consumers, the scope and variety of notifications could very well overwhelm consumers and provide them little tangible benefit.

The issues arising from the potential lack of clarity and the compliance burden are exacerbated by the potential for significant monetary penalties. Violations of the Act would be treated as unfair or deceptive acts or practices under Section 5 of the Federal Trade Commission Act, and the FTC would be able to impose civil penalties of up to $25,000,000. Penalties would be calculated either by multiplying the number of days an entity violated the Act by an amount set by the FTC, not to exceed $35,000 per day, or by multiplying the number of directly affected consumers by an amount not to exceed $5,000 per consumer. The latter method would be permitted only if the FTC provided the covered entity with notice of its alleged violations and afforded the covered entity an opportunity to respond within 45 days. State attorneys general would be able to pursue injunctive relief for violations on behalf of their constituents, but they would not be authorized to seek other relief. The Act provides no private right of action.
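
To illustrate with hypothetical figures: if the FTC set the per-day amount at the $35,000 maximum, a violation persisting for 100 days could yield a penalty of $3,500,000, while a violation directly affecting 5,000 consumers, assessed at the maximum of $5,000 per consumer, would reach the $25,000,000 cap.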

The draft bill provides for the creation of industry codes of conduct, which the FTC would review and could approve. Covered entities that abided by approved codes of conduct would benefit from a “safe harbor” against enforcement. Furthermore, in what appears to be an attempt to avoid stifling technological innovation, a covered entity would not be subject to civil penalties during the first eighteen months after it first creates or processes personal data.

The bill would not give the FTC broad rulemaking authority, but the FTC would be authorized to issue rules governing the development of binding industry codes of conduct, establishing further exceptions from the definition of “covered entity,” and setting requirements for industry Privacy Review Boards. The bill would not preempt state laws of general application but would preempt state laws that explicitly regulate personal data processing.

We will continue to monitor the Act and other privacy-related legislative developments and report on them here.

Katherine Armstrong, Counsel in our Washington, D.C. office, contributed to this post.

 

Authored by James Denvil and Patrick Kane
