European Data Protection Roundup – September 2022

Key takeaways this September include:

  • Google Analytics: Continue to carefully assess any use of Google Analytics. The Danish Data Protection Agency became the latest supervisory authority to suggest that the cross-border transfers involved in the use of Google Analytics in the European Union, without more, violate the GDPR;
  • Data Transfers: Regulators are continuing to scrutinise the use of non-EU-based service providers. A German regulator ordered Berlin University to stop using Cisco Webex out of concern that it was not GDPR-compliant;
  • Cybersecurity Resilience: Businesses in the hardware and software spaces may have to contend with new regulatory requirements. The European Commission proposed legislation to address widespread cybersecurity vulnerabilities in Internet of Things products, expanding a legislative push to improve cybersecurity resilience across financial services and information and communications technology;
  • Data Policies: The French CNIL fined a company €250,000 for failings that included the use of weak passwords and not complying with its own data retention schedule;
  • Privacy Enhancing Technology: New guidance from the ICO provides valuable insights about how privacy enhancing technologies (“PETs”) can reduce privacy and data protection risk by achieving privacy by design and by default;
  • Direct Marketing: A single individual’s complaint can lead to enforcement action, as highlighted in the UK ICO’s £30,000 fine against a retailer; and
  • Children’s Privacy: Processing children’s data remains an enforcement focus. The Irish DPC fined Meta €405 million for Instagram’s settings that were alleged to have publicly disclosed children’s contact details and accounts by default. The UK ICO announced it intends to fine TikTok up to £27 million for processing children’s data without consent or sufficient legal basis.

These developments, and more, are covered below.

Danish DPA thinks Google Analytics is unlawful

What happened: Following similar decisions by the French, Italian and Austrian data protection authorities, the Danish Data Protection Agency (Datatilsynet) found that the use of Google Analytics, without more, violates the GDPR’s cross-border transfer restrictions.

In broad terms, the regulatory concern is that the European Commission-approved Standard Contractual Clauses (“SCCs”), entered into between website operators and Google, do not provide an adequate level of protection under the GDPR. Specifically, the technical, organisational and contractual arrangements generally adopted by Google with respect to Google Analytics are not sufficient to anonymise individual users, do not eliminate the possibility of surveillance by U.S. authorities, and do not provide adequate levels of transparency and accountability in relation to access requests. While Google permits IP anonymisation when using Google Analytics, users are – in the regulators’ eyes – potentially still identifiable from other data and the anonymisation is only effected once the data is transferred to Google in the U.S.

What to do: The Danish Data Protection Agency referred entities to the CNIL’s detailed guidance on making Google Analytics GDPR-compliant. This includes potential technical solutions (e.g., proxy set-ups) and other practical steps businesses may wish to take (including considering moving away from Google Analytics).
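The CNIL’s proxy guidance centres on one idea: the visitor’s browser never talks to Google directly; a server the operator controls receives the analytics hit, strips or coarsens anything identifying, and relays only the sanitised payload. As a minimal sketch of that sanitising step (the field names below are illustrative, not Google’s actual API schema):

```python
# Hypothetical sketch of the payload-sanitising step in a server-side
# analytics proxy, along the lines of the CNIL's guidance: the proxy,
# not the visitor's browser, contacts the analytics provider, and it
# strips or coarsens identifying fields before relaying the hit.
# Field names are illustrative only.

def sanitise_hit(hit: dict) -> dict:
    """Return a copy of an analytics hit with identifying data removed."""
    cleaned = dict(hit)
    # Never forward the visitor's IP address; the proxy's own IP is seen instead.
    cleaned.pop("client_ip", None)
    # Drop cross-site and user identifiers that could allow re-identification.
    for key in ("user_id", "client_id", "third_party_cookie"):
        cleaned.pop(key, None)
    # Coarsen the user agent to the browser family only.
    ua = cleaned.get("user_agent", "")
    cleaned["user_agent"] = ua.split("/")[0] if ua else ""
    # Strip URL query parameters, which may embed identifiers.
    cleaned["page"] = cleaned.get("page", "").split("?")[0]
    return cleaned
```

A real deployment would also need to address referrer URLs, fingerprintable parameters and cookie handling, as the CNIL guidance describes.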

The upcoming EU-U.S. Data Privacy Framework may solve many of these issues after President Biden recently signed an Executive Order on Enhancing Safeguards for United States Signals Intelligence Activities.

Berlin University continues use of Cisco Webex despite German state regulator’s order to stop

What happened: Following years of concern over web conferencing services’ data transfers to the U.S., Berlin’s Commissioner for Data Protection ordered Berlin University to stop using Cisco Webex by September 30 or face enforcement action. It sent an initial notice to Berlin University in November 2021, which included recommendations to adopt stronger technical measures such as end-to-end encryption, but discussions between the University and regulator over how to use the service in a GDPR-compliant manner seemingly broke down this summer.

The University stated that it intends to continue using Cisco Webex and that such use was GDPR-compliant in light of the enhanced technical measures it took in the past year. The University hopes to resolve the conflict by meeting with the Commissioner.

What to do: While not a guarantee, consideration of the enhanced security measures that regulators have endorsed in connection with the use of Webex or similar services may put an organisation in the best possible position. For Webex, this includes the use of end-to-end encryption, anonymising the technical data, providing users with a VPN, and mandating the use of pseudonyms for calls. It also in practice likely necessitates engaging in negotiations with the vendor to adjust terms, as the University did to ensure that at least content data is processed via EU server locations.

European Commission’s Cyber Resilience Act proposes Internet of Things (IoT) cybersecurity requirements

What happened: In response to the increasing number of software and hardware products targeted by damaging cyberattacks, the European Commission released its proposed EU Cyber Resilience Act, mandating cybersecurity requirements for products that have “digital elements,” encompassing all products that are connected “either directly or indirectly to another device or network.” Such products include, for example, password managers, remote access/sharing software, and smartcards and corresponding readers.

If passed, software and hardware manufacturers would be required to comply with increased security standards throughout the product lifecycle (design, development, production, and monitoring). Requirements include:

  • security by default configuration, meaning that a product’s default configuration is set to the most secure available settings;
  • protection from unauthorised access by appropriate control mechanisms (e.g., authentication, identity or access management systems); and
  • minimisation of cybersecurity risks and incidents through security updates that are set to update automatically and additional user notifications when available.

The Act follows a series of EU initiatives to enhance cyber resiliency across industries, including the forthcoming EU Digital Operational Resilience Act (“DORA”) that will regulate a broad range of financial entities and certain of their information and communications technology third-party service providers.

What to do: Nothing immediate. The draft proposal has a long road ahead before it would become law and, if and when passed, has a proposed two-year time frame for manufacturers to come into full compliance.

French DPA fines company €250,000 for excessive data retention and insufficient data security

What happened: The French data protection authority, the CNIL, fined Infogreffe, a French pay-per-click database compiling legal information on companies, €250,000. Following a website user complaint, the CNIL found that the personal data of 25% of users was kept beyond the company-determined 36-month retention period. Anonymisation was performed manually, only on users’ request, and only for a very small number of accounts. The CNIL also found that Infogreffe did not require the use of a strong password when creating an account, sent non-temporary passwords by email in clear text, and stored secret questions and answers in its database in clear text. The CNIL considered that Infogreffe (i) failed to keep data only for a period proportionate to the purpose of the processing and (ii) had taken insufficient measures to safeguard personal data.

What to do: Companies may want to review data retention periods and ensure that they are appropriate and applied in practice; a mismatch between policy and practice is likely to stand out to a regulator. Further, companies should consider whether additional measures, such as avoiding clear-text storage and transmission of credentials and enforcing password complexity requirements, are warranted and feasible. While the GDPR does not mandate specific security measures, safeguards including strong passwords and multi-factor authentication are increasingly expected by regulators.
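By way of illustration, the clear-text storage the CNIL criticised contrasts with the long-standing practice of storing only a salted hash of each password. A minimal sketch using only the Python standard library (the iteration count and salt length are illustrative choices, not regulatory requirements):

```python
# A minimal sketch of salted password hashing with the standard library,
# as one alternative to storing or emailing passwords in clear text.
# Parameter choices (iteration count, salt length) are illustrative;
# production systems should follow current industry guidance.
import hashlib
import hmac
import os

ITERATIONS = 600_000  # illustrative work factor


def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest); store both, never the password itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest


def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)
```

Because only the salt and digest are stored, a database compromise does not directly expose users’ passwords, and passwords need never be sent by email at all.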

ICO releases new draft guidance on privacy enhancing technologies (“PETs”)

What happened: The ICO released the most recent instalment of its guidance on the use of privacy enhancing technologies in the context of anonymisation and pseudonymisation.

PETs are software and hardware solutions which specifically serve privacy and data protection functions. A simple example of a PET is a tool that hides and shields certain types of user data (e.g., the user’s name). Other PETs include secure multiparty computation, private set intersection, zero-knowledge proofs, and synthetic data.
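One of the simplest PETs of the kind the ICO describes is keyed pseudonymisation: replacing a direct identifier with a stable token so records remain linkable across a dataset without exposing the identifier itself. A minimal sketch (the key handling and token length shown are illustrative assumptions, not ICO-prescribed parameters):

```python
# A minimal sketch of keyed pseudonymisation, a simple PET: a direct
# identifier (here, a name) is replaced with a stable token derived via
# HMAC-SHA256. With the secret key held separately from the dataset,
# records stay linkable without revealing the identifier. The token
# length is an illustrative choice.
import hashlib
import hmac


def pseudonymise(identifier: str, key: bytes) -> str:
    """Replace an identifier with a stable, non-reversible token."""
    return hmac.new(key, identifier.encode(), hashlib.sha256).hexdigest()[:16]
```

The same identifier and key always yield the same token (preserving linkability), while a party without the key cannot map tokens back to identities. Note that under the GDPR pseudonymised data generally remains personal data.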

The ICO’s draft guidance outlines:

  • how PETs relate to data protection law;
  • the different types of PETs (and their benefits and risks);
  • when entities should consider using PETs; and
  • how to assess the maturity of PETs.

Consultation on the draft guidance closes on 31 December 2022, and we expect the guidance to be finalised soon after.

What to do: Entities should see the appropriate and effective use of PETs as an opportunity to address mounting regulatory concern around the implementation of privacy by design and by default principles: collecting and processing only the data needed for the entity’s purposes, improving the security of data, and minimising the risk that arises from personal data breaches.

But, as there are risks in using PETs which may arise from their lack of maturity or mistakes in their implementation, organisations should carefully consider when and how they can use PETs to realise their benefits while managing their risks, including by performing a Data Protection Impact Assessment where appropriate.

UK retailer fined £30,000 for sending marketing emails without consent

What happened: After investigating a single complaint, the ICO fined Halfords, a major cycling and motoring retailer, £30,000 for sending nearly 500,000 “downright annoying” marketing emails during a one-day email campaign.

The ICO found that Halfords failed to ensure that it had the individuals’ consent and that the “soft opt-in” exemption did not apply. That exemption permits an organisation to send marketing emails to customers without their consent if the customers’ details were obtained during the course of a sale or negotiation for service and they were given the opportunity to opt out. The ICO determined that the targeted recipients included customers who had already opted out of marketing and that Halfords’ email message did not contain an “unsubscribe” link, which is required in every message that an organisation sends under the soft opt-in exemption.

The ICO has prioritised enforcement of direct marketing violations and has issued a number of fines in this area. Earlier this year, it emphasised its continued attention to direct marketing violations, and to forthcoming statutory updates, in its ICO25 strategic plan.

What to do: Organisations that carry out direct marketing may want to:

  • review the ICO’s guidance to ensure they meet the latest regulatory expectations;
  • if relying on the “soft opt-in,” review consent collection practices, including the timing of opt-outs, to make sure that the exemption applies for each use and that marketing contact lists are properly updated; and
  • review the contents and structure of marketing materials to ensure they include easy-to-use unsubscribe channels.

UK ICO follows other regulators in issuing notice of intent and up to £27 million fine against TikTok for failing to protect children’s privacy

What happened: The UK ICO confirmed that it had issued TikTok with a “notice of intent” that sets out the ICO’s provisional view that TikTok breached UK data protection law between May 2018 and July 2020 by processing the data of children under the age of 13 without appropriate parental consent; failing to provide proper information to its users in a transparent manner; and processing special category data without legal grounds to do so.

TikTok, which faces a fine of up to £27 million, will have an opportunity to respond before the decision is finalised. This penalty follows a $5.7 million fine from the U.S. Federal Trade Commission, a £123,000 fine from South Korea’s Korea Communications Commission for related conduct, and the Ireland DPC’s submission of a preliminary enforcement decision to the European Data Protection Board for review.

The ICO stated it is “currently looking into how over 50 different online services are conforming with the Children’s code and [has] six ongoing investigations looking into companies providing digital services,” so we expect continued focus on children’s privacy rights.

What to do: While the final enforcement outcomes remain to be determined, in light of the alleged grounds for violations cited by the ICO, companies may wish to:

  • ensure that age verification and parental consent collection mechanisms are functioning as intended and cover all appropriate processing activities;
  • confirm that privacy notices adequately cover the collection of any children’s data, and use clear, understandable language; and
  • review special data processing activities to ensure that an assessment has been conducted to establish the legal grounds for such processing.

Irish Data Protection Commission fines Meta €405 million following investigation into Instagram’s handling of children’s data

What happened: The Ireland Data Protection Commission announced it was imposing a €405 million fine and various corrective measures in connection with prior Instagram settings that publicly disclosed contact details of children using the business account feature, and, by default, set children’s personal accounts to public. The fine was finalised following an Article 60 process where six “Concerned Supervisory Authorities” raised objections to a draft decision, and the European Data Protection Board issued a binding decision to resolve the dispute, which included certain mandatory adjustments.

What to do: As noted above, businesses processing children’s data may want to review their current collection practices and surrounding compliance framework to ensure they can stand up to increasing regulatory scrutiny.


The authors would like to thank legal trainee Clara Montillet for her contribution to this article.

Author

Luke Dembosky is a Debevoise litigation partner based in the firm’s Washington, D.C. office. He is Co-Chair of the firm’s Data Strategy & Security practice and a member of the White Collar & Regulatory Defense Group. His practice focuses on cybersecurity incident preparation and response, internal investigations, civil litigation and regulatory defense, as well as national security issues. He can be reached at ldembosky@debevoise.com.

Author

Avi Gesser is Co-Chair of the Debevoise Data Strategy & Security Group. His practice focuses on advising major companies on a wide range of cybersecurity, privacy and artificial intelligence matters. He can be reached at agesser@debevoise.com.

Author

Erez is a litigation partner and a member of the Debevoise Data Strategy & Security Group. His practice focuses on advising major businesses on a wide range of complex, high-impact cyber-incident response matters and on data-related regulatory requirements. Erez can be reached at eliebermann@debevoise.com.

Author

Robert Maddox is International Counsel and a member of Debevoise & Plimpton LLP’s Data Strategy & Security practice and White Collar & Regulatory Defense Group in London. His work focuses on cybersecurity incident preparation and response, data protection and strategy, internal investigations, compliance reviews, and regulatory defense. In 2021, Robert was named to Global Data Review’s “40 Under 40”. He is described as “a rising star” in cyber law by The Legal 500 US (2022). He can be reached at rmaddox@debevoise.com.

Author

Dr. Friedrich Popp is an international counsel in the Frankfurt office and a member of the firm’s Litigation Department. His practice focuses on arbitration, litigation, internal investigations, corporate law, data protection and anti-money laundering. In addition, he is experienced in Mergers & Acquisitions, private equity, banking and capital markets and has published various articles on banking law.

Author

Fanny Gauthier is an associate in Debevoise's Litigation Department, based in the Paris office. Ms. Gauthier is a member of the firm’s International Dispute Resolution Group, as well as the firm’s Data Strategy & Security practice. Her practice focuses on complex commercial litigation, international arbitration and data protection. She can be reached at fgauthier@debevoise.com.

Author

Stephanie D. Thomas is an associate in the Litigation Department and a member of the firm’s Data Strategy & Security Group and the White Collar & Regulatory Defense Group. She can be reached at sdthomas@debevoise.com.

Author

Tristan Lockwood is an associate in the firm’s Data Strategy & Security practice. He can be reached at tlockwood@debevoise.com.

Author

Melissa Muse is an associate in the Litigation Department based in the New York office. She is a member of the firm’s Data Strategy & Security Group, and the Intellectual Property practice. She can be reached at mmuse@debevoise.com.