The EU’s General Data Protection Regulation (the “GDPR”) changed the global privacy landscape and has been called the “gold standard” for data protection regulation. Recently, a number of U.S. states have introduced privacy laws that borrow certain GDPR concepts (the “State Privacy Laws”): the California Privacy Rights Act of 2020 (the “CPRA”), which amends the California Consumer Privacy Act (the “CCPA”); the Virginia Consumer Data Protection Act; the Colorado Privacy Act (the “CPA”); the Connecticut Act Concerning Personal Data Privacy and Online Monitoring; and the Utah Consumer Privacy Act.

Thus far, only the CCPA has regulations in force and a record of enforcement. Of the laws not yet in force, only the CPA envisions rulemaking. With little domestic guidance, regulation, or enforcement to draw from, businesses are likely to look to EU and Member State guidance and enforcement actions to understand key concepts in the State Privacy Laws.

This blog post explores some of the borrowed GDPR concepts and suggests resources companies might use as they develop their compliance programs.

Data Controllers & Data Processors

With the exception of California’s CPRA and CCPA, the State Privacy Laws adopt the terms “data controller” and “data processor,” which are borrowed directly from the GDPR.

Under the GDPR, data controllers are the main decision-makers about the data handling being undertaken; they determine the purposes and means of processing personal data. UK Information Commissioner’s Office (the “ICO”) guidance suggests that entities that decide the following are likely to be data controllers:

  • Whether to collect personal data;
  • What personal data to collect and from whom;
  • What the personal data is used for;
  • The lawful basis for collection;
  • What to tell individuals about the processing;
  • How to respond to individuals’ rights requests; and
  • How long to retain the personal data.

In contrast, data processors act on behalf of, and only on the instructions of, the relevant controller. Per UK ICO guidance, this means that processors may – subject to any restrictions imposed on them by contract – make decisions on various operational matters, such as:

  • What IT systems to use;
  • How to collect the personal data;
  • How to store the personal data;
  • The security measures used to safeguard the personal data;
  • How to receive and transfer the personal data; and
  • How to delete or dispose of the personal data.

The data controller and data processor designations are not mutually exclusive. A company can be both a controller and a processor of the same personal data if it processes the data on behalf of another company (as a processor) and also on its own behalf for different purposes (as a controller), for example for internal legal or regulatory compliance. Companies should undertake a holistic review of each of their data processing activities to identify whether they act as a controller or a processor for each activity. Identifying the correct role matters because the State Privacy Laws attach different obligations to each.

In addition to the UK ICO guidance, the European Data Protection Board (the “EDPB”) has issued extensive guidance that might also help companies navigate these concepts under the State Privacy Laws.

Consent Requirements

What constitutes valid consent is important under the State Privacy Laws because, in several circumstances, personal data can only be processed with a user’s consent.

The State Privacy Laws all broadly contain the same definition of consent: a clear, affirmative act signifying a consumer’s freely given, specific, informed and unambiguous agreement to the data processing. This is closely aligned with the GDPR. The EDPB has issued in-depth guidance, including best practices, on what constitutes valid consent under the GDPR, and many of these points have been adopted in draft rulemaking under the CPRA and CPA. Some key points to note:

  • The request for consent should contain straightforward language that clearly explains how an individual’s data will be used.
  • Consent requires a statement or clear affirmative act from the individual. An individual’s silence or inactivity does not constitute consent. In particular, companies cannot just continue to provide services until an individual objects.
  • If consent is bundled up as a non-negotiable part of broader terms and conditions, it is presumed not to have been freely given or sufficiently specific.
  • There is no specific time limit for how long consent will last; it depends on the context of the processing, the scope of the original consent, and the individual’s expectations when consenting.
  • The burden is on the controller to prove that it obtained valid consent. UK ICO guidance recommends that controllers maintain consent records detailing:
    • Who consented;
    • When and how they consented;
    • What they were told when they consented; and
    • Whether consent has subsequently been withdrawn.

The records should be kept for as long as the controller is processing the data on the basis of consent. While the State Privacy Laws do not specifically require consent records to be maintained, keeping them is good practice and helps evidence compliance with the law.
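For companies implementing this record-keeping in software, the ICO-recommended fields above map naturally onto a simple record structure. A minimal sketch in Python (the class and field names are illustrative assumptions, not anything prescribed by the GDPR or the State Privacy Laws):

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """Illustrative consent record covering the ICO-recommended fields."""
    subject_id: str                          # who consented
    consented_at: datetime                   # when they consented
    method: str                              # how they consented
    disclosure_text: str                     # what they were told at the time
    withdrawn_at: Optional[datetime] = None  # whether/when consent was withdrawn

    @property
    def is_active(self) -> bool:
        # Consent is a valid basis only until it is withdrawn.
        return self.withdrawn_at is None

# Hypothetical usage: record consent, then log a later withdrawal.
record = ConsentRecord(
    subject_id="user-123",
    consented_at=datetime(2023, 1, 15, tzinfo=timezone.utc),
    method="web form checkbox",
    disclosure_text="Marketing emails about new products",
)
assert record.is_active
record.withdrawn_at = datetime(2023, 6, 1, tzinfo=timezone.utc)
assert not record.is_active
```

The point of the structure is that each of the ICO’s four evidentiary questions (who, when/how, what they were told, whether withdrawn) is answerable from a single stored record.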

European enforcement action in this area highlights that transparency is key to ensuring that consent remains valid. For example, in January 2019, the French CNIL fined Google €50 million after finding (among other things) that it was too difficult for individuals to know what they were consenting to. The information was “excessively disseminated” across multiple website pages and required too many clicks for individuals to piece it all together; this lack of upfront transparency meant that the users’ consent was invalid.

Using Data for a Secondary Purpose

The State Privacy Laws also introduce a restriction on using personal data for a secondary purpose. Businesses cannot use personal data for another purpose that “is not reasonably necessary to, or compatible with” the specific purpose(s) for which the data was originally collected (i.e., a secondary purpose), unless the individual consents to the secondary purpose.

The EDPB guidance on consent, which appears to have been heavily relied upon by the Colorado AG in its draft regulations, states that, in order to determine whether the data can be used for a secondary purpose, businesses must assess: (i) which purposes the data was initially collected for; and (ii) whether the new purposes are compatible with the initial ones.

Factors relevant for that assessment include:

  • What processing the individual reasonably believed would occur when their data was originally collected;
  • Whether the subsequent processing is customary or a generally expected practice in the context in which the data was initially collected;
  • The individual’s reasonable expectations regarding the future use of their data;
  • Whether the data being processed is sensitive (and therefore attracts additional protection); and
  • The impact of the additional processing on the individual, including any appropriate safeguards that can be implemented to minimize that impact.

The EDPB guidance and European Commission guidance on secondary purposes also contain multiple examples of when consent for future processing is required. These include:

  • If a bank had a contract with a client to provide them with a bank account and personal loan, the bank would be able to use the client’s personal data to check whether they are eligible for a better type of loan without needing to obtain additional consent. However, if the bank decided to share the client’s data with insurance firms to check if the client is eligible for an insurance product, that processing is not compatible with the original purpose, so the bank would have to obtain the customer’s consent.
  • If a company installs CCTV at the entrance to a building, with a sign that says the footage is for security purposes, the company would not be able to use any footage showing employees failing to fulfil their duties without obtaining the employees’ consent for this secondary purpose. Using the footage to monitor whether employees are performing their duties is unrelated to the purpose for which it was originally collected (building security) and would not be reasonably expected by individuals given the notice stating that the footage is for security purposes.
  • If a medical practitioner enters into a partnership with a travel company, the doctor would have to obtain the patient’s consent before providing the patient’s details to the travel company so the company can provide the patient with tailored “get well quick” holidays.

There are indications that using personal data for a secondary purpose is becoming an increasing area of enforcement for European regulators. For example, in November 2022, the UK ICO issued a reprimand against the Department for Education after its poor due diligence measures enabled a third-party company to access a database of children’s learning records to check whether persons opening online gambling accounts were 18 or over; the UK ICO noted that it would have issued a £10 million fine against the Department for Education had it been a private company.

The Colorado, Connecticut and Virginia laws suggest that secondary use limitations apply where a controller intends to process data in a manner that was not previously disclosed to consumers, as they call for consent for purposes that are neither reasonably necessary to, nor compatible with, the “disclosed” or “specified” purposes for which such personal data was processed. The CPRA and draft regulations implementing it, on the other hand, appear to go beyond this and require consent for processing of data for purposes that are not compatible with the context in which the data was originally collected – irrespective of whether the purpose was disclosed. In such circumstances, companies would not be able to rely on notice of the processing in a privacy policy (issued at the time the data was collected) as a basis for saying the individual has consented to the secondary purpose.

Data Protection Assessments

With the exception of Utah, the State Privacy Laws require businesses to conduct a data protection assessment – sometimes called a privacy risk assessment – under specific circumstances. A data protection assessment helps companies analyze, identify, and minimize the data protection risks of a project or plan to a level of risk that is acceptable in the circumstances.

When Are Data Protection Assessments Required?

These assessments are largely inspired by the Data Protection Impact Assessments (the “Assessments”) required under the GDPR where the processing is “likely to result in a high risk” to individuals.

One challenge for companies is determining when Assessments are required. The Virginia, Colorado, and Connecticut laws require Assessments when:

  • Personal data is processed for targeted advertising;
  • Personal data is sold;
  • Personal data is processed for profiling, where such profiling creates a risk to consumers, including a risk of financial, physical or reputational injury;
  • Sensitive personal data is processed; and
  • Other activities that present a significant or heightened risk to consumers.

The CCPA, on the other hand, has merely directed the CPPA to issue regulations requiring businesses to perform cybersecurity audits for processing that “presents a significant risk” to consumers’ privacy or security, as well as regulations on risk assessments more generally.

The EDPB, the Irish Data Protection Commission, and the UK ICO have issued detailed guidance, including on when the assessments are required.

The UK ICO and the Irish DPC have identified types of processing that trigger the Assessments (something the GDPR directs supervisory authorities to do) and that may influence what activities a U.S. regulator would deem to “present a significant or heightened risk to consumers.” These include:

  1. Profiling: Profiling individuals on a large scale (NB: there is no universal definition of “large scale” as it depends on the duration of the processing and the number of people involved).
  2. Biometrics: Processing biometric data to uniquely identify an individual – including through facial recognition systems, and workplace access or verification systems.
  3. AI & New Technologies: Processing that involves new technologies, or novel applications of existing technologies – including AI products, connected and autonomous vehicles, smart technologies, and some IoT applications.
  4. Automated Decision Making: Processing, using automated decision-making, that determines whether an individual can access a product, service, opportunity or benefit.
  5. Special Category Data: Processing, using special category data such as information on an individual’s race or ethnicity, trade union membership, health information, or their genetic or biometric data, that determines whether an individual can access a product, service, opportunity, or benefit.
  6. Data Aggregation: Processing that compares or matches personal data from multiple sources (data matching) – including where the processing is done for fraud prevention or direct marketing purposes.
  7. Geolocation Data: Processing that tracks individuals’ geolocation or behavior.
  8. Children’s/Vulnerable People’s Data: Processing children’s or vulnerable individuals’ data in order to offer them targeted marketing, create a profile of them for automated decision-making purposes, or offer online services to them.
  9. Indirectly Obtained Data/Non-transparent Processing: Processing of personal data that has been indirectly sourced and has not been obtained in compliance with the GDPR’s transparency requirements – especially where the personal data was obtained under the impossibility or disproportionate effort transparency exemptions.

EDPB guidance includes examples of different types of processing activities and possible relevant criteria for determining whether a DPIA is required for those activities.

What Should an Assessment Look Like?

The Colorado AG’s draft regulations provide welcome color on what the scope and content of the Assessment should entail under the CPA. They also provide that if an entity conducts an Assessment to comply with another jurisdiction’s law, the Assessment will satisfy the requirements of the CPA as long as it is reasonably similar in scope to the Assessment called for by the CPA. The CPA draft regulations lay out a number of topics the Assessment must address, many of which echo what is required under the GDPR.

Following the GDPR example, the Assessment process consists of seven stages:

  1. Describing the envisaged data processing;
  2. Assessing the necessity and proportionality of the processing;
  3. Assessing the scope and impact of data protection measures already envisioned;
  4. Assessing the processing’s risks to individual rights and freedoms;
  5. Considering any additional measures that can be implemented to reduce those risks;
  6. Documenting the process; and
  7. Monitoring and periodically reviewing the assessment.

Companies looking for a template might refer to the UK ICO’s DPIA template, which can be adapted to the State Privacy Law context.

Next Steps

Businesses can use these lessons learned from the GDPR as a starting point for understanding the potential scope and application of these concepts under the State Privacy Laws. To the extent they are subject to the State Privacy Laws, businesses should begin reviewing their existing data protection policies and procedures against the European guidance to identify areas that may need amending ahead of the State Privacy Laws’ implementation. Businesses should also continue to monitor for state-specific guidance in case state regulators adopt interpretations that differ from those under the GDPR.

Author

Johanna Skrzypczyk (pronounced “Scrip-zik”) is a counsel in the Data Strategy and Security practice of Debevoise & Plimpton LLP. Her practice focuses on advising on AI matters and privacy-oriented work, particularly related to the California Consumer Privacy Act. She can be reached at jnskrzypczyk@debevoise.com.

Author

Robert Maddox is International Counsel and a member of Debevoise & Plimpton LLP’s Data Strategy & Security practice and White Collar & Regulatory Defense Group in London. His work focuses on cybersecurity incident preparation and response, data protection and strategy, internal investigations, compliance reviews, and regulatory defense. In 2021, Robert was named to Global Data Review’s “40 Under 40”. He is described as “a rising star” in cyber law by The Legal 500 US (2022). He can be reached at rmaddox@debevoise.com.

Author

Martha Hirst is an associate in Debevoise's Litigation Department based in the London office. She is a member of the firm’s White Collar & Regulatory Defense Group, and the Data Strategy & Security practice. She can be reached at mhirst@debevoise.com.