Key takeaways from this February include:

  • Enforcement: Businesses that use third party data to conduct marketing should review the lawful basis on which each party relies to collect and process the data in light of a UK tribunal’s limiting of the ICO’s enforcement notice to Experian on appeal;
  • Digital Services Act: Covered entities should ensure they are adhering to reporting guidelines and expectations as compliance deadlines began to pass, following criticism from the EU Commission on one submission’s lack of detail;
  • EDPB work programme: Companies should monitor the ongoing rollout of new EDPB guidance and tools as part of its 2023/24 work programme that may clarify open compliance questions, including forthcoming direction on assessing legitimate interests and the interplay between the GDPR and the EU AI Act;
  • Third country data transfers: New EDPB guidance has clarified that GDPR-covered data importers established outside the EU must still implement third country data transfer mechanisms;
  • AI: Businesses should take note of the CNIL’s growing interest in AI issues as it creates a specialised AI department and prepares recommendations on learning databases, following its increased enforcement activity;
  • Gaming: Game developers with players under 18 should review the ICO’s latest guidance outlining children’s data protection requirements in the gaming context, and take note of obligations to identify minor users, marketing restrictions, and the recommendation to avoid “nudge” techniques;
  • Data access requests: Controllers in Germany may be able to probe the legitimacy of a data access request following a German Court of Appeal ruling finding that such requests must be motivated by data protection interests—another in a growing series of cases on either side of the debate;
  • Cumulative violations: Organisations should be aware of the enforcement risk posed by multiple minor GDPR violations, in light of the Norwegian DPA’s recent fine of almost €1m on a fitness chain for minor violations relating to the handling of data access and erasure requests, retention, and data transfers;
  • Cookie walls: Businesses that make users’ access conditional on their consent to data processing should consider recent guidance from the Danish DPA to ensure that, among other criteria, they are providing a reasonable alternative at a reasonable price (if any); and
  • Age-appropriate AI: AI businesses with minor users may want to revisit age-appropriate restrictions detailed in the Italian Garante’s order against a U.S. AI bot company that had to halt operations in Italy.

These developments, and more, are covered below.

UK tribunal limits ICO enforcement order but partially upholds lawful basis objection

What happened: A tribunal rejected certain aspects of the UK ICO’s October 2020 enforcement notice against Experian, a credit reference agency that holds and processes data relating to essentially the whole of the UK’s adult population.

As we previously discussed, the ICO took issue with Experian’s reliance on legitimate interests as the lawful basis for direct marketing data processing, and with its privacy notice, which the ICO considered insufficiently transparent and which Experian had not provided at all to millions of UK residents.

The tribunal agreed with the ICO that where Experian obtained personal data for marketing purposes from a third party that had collected it on the basis of consent, that consent was no longer informed and its withdrawal was complicated, though Experian has since changed this practice.

However, the tribunal disagreed with the ICO and found that the use of credit reference data for direct marketing purposes is not inherently unfair, and that Experian’s privacy notice was sufficiently transparent. It also found the order to provide privacy notices to millions of data subjects to be disproportionate. The ICO can appeal.

What to do: Notwithstanding the decision, businesses may want to review the lawful bases on which data obtained from third parties was collected, recognising that consent is likely insufficient or operationally challenging in many cases. The recipient entity may be able to rely on legitimate interests in certain circumstances.

Digital Services Act reporting deadline results in the first warning to an online platform

What happened: 17 February 2023 marked the first Digital Services Act (“DSA”) reporting deadline, by which all “very large” online platforms and search engines were required to report, among other data points, average monthly active user figures, advertising revenue, and political ad purchases. All covered entities published this baseline transparency report on time and used a common template. However, the EU Commissioner for Internal Market raised significant concerns over one online platform’s “disappointing” submission as being “short of data, with no information on commitments to empower the fact-checking community,” and other Commission officials said they expected “a more serious commitment to their obligations” in light of identified disinformation threats from Russia.

Earlier in the month, the EU Commission published DSA reporting guidance for covered entities that clarifies, among other topics, who qualifies as an “active recipient” of the service, for example, by including in that definition those able to access content without being a registered user and third party advertisers on platforms, while discounting bots, scrapers, and duplicative devices for a single user.

What to do: Entities subject to the DSA’s transparency reporting obligations should refer to the new guidance and keep the legislative requirements in mind ahead of the next reporting deadline in August. They may also wish to review the broader set of DSA obligations and deadlines, which we previously outlined here.

EDPB’s new work programme prioritises new technologies and cooperation amongst supervisory authorities

What happened: The EDPB adopted its work programme for 2023/24. Its overarching goals build on recent draft and finalised guidance and include:

  • Developing implementation tools and key concepts guidance. The EDPB aims to produce tools to aid in GDPR implementation and publish further guidance on breach notification and the legitimate interest lawful basis.
  • A fundamental rights approach to new technologies. In pursuit of a common EU approach, the EDPB will publish guidance on the use of facial recognition by law enforcement, blockchain, and the interplay between the AI Act and the GDPR.
  • Facilitating harmonisation amongst national supervisory authorities. As multi-jurisdiction data protection concerns expand and opportunities to rely on a lead supervisory authority may narrow, the EDPB is emphasising consistency of decisions between national supervisory authorities through, among other measures, the development of approval procedures that require a cooperation phase and the creation of task forces.
  • A global approach. To promote GDPR data protection beyond EU borders, the EDPB will provide guidance on the use of data transfer tools, including certification as a tool for transfers, and engage with third country supervisory authorities on issues relating to government access to personal data.

What to do: Companies may want to review existing procedures with reference to the draft and newly adopted EDPB guidelines noted in this latest work programme as they become available, and in light of supervisory authority enforcement trends, especially as applied to operations in emerging technologies, use of cookies, and deceptive design patterns in social media platform interfaces.

Newly adopted EDPB guidelines clarify criteria for third country personal data transfers

What happened: Following public consultation, the EDPB adopted Guidelines on the Interplay between the GDPR’s territorial scope and data transfer restrictions to third countries.

The guidance clarifies that data importing controllers and processors established outside the EU but whose processing is covered by the GDPR’s extraterritorial scope have to implement third country data transfer mechanisms just as data exporting controllers and processors established inside the EU must do.

As the GDPR does not define the “transfer of personal data to a third country,” the new guidelines identify three cumulative criteria for identifying an international transfer that requires specific conditions, such as the provision of appropriate safeguards.

Namely: (i) a controller or processor is subject to the GDPR for the given processing; (ii) that controller or processor (the exporter) discloses or otherwise makes the personal data available to another controller or processor (the importer); and (iii) the importer is in a third country, irrespective of whether or not it is subject to the GDPR for the given processing. Where all three criteria are met, the transfer must be covered by an adequate third country data transfer mechanism.

What to do: Controllers and processors established outside the EU but whose processing is subject to the GDPR’s extraterritorial scope should review their third country data transfer mechanisms and processes to ensure adequate compliance.

CNIL establishes an AI department and announces forthcoming guidance on learning databases

What happened: The CNIL announced that it is creating a new department fully dedicated to artificial intelligence. The department, composed of legal experts and specialised engineers, aims to strengthen the CNIL’s AI expertise and solidify its position as the lead French regulator for AI privacy concerns, while preparing for the implementation of the EU AI Act. The CNIL will soon propose initial compliance recommendations on learning databases used for the training and development of AI, for both private and public entities.

What to do: Businesses should keep an eye out for further statements and recommendations from the CNIL on these issues, which may echo concerns raised in its enforcement actions to date. The new department signals the CNIL’s strong interest in AI-linked privacy issues. As we covered here, the CNIL fined Clearview AI €20 million for “intrusive and massive” data processing without consent or a valid legitimate interest, among other failings.

UK ICO issues guidance to game developers to comply with the Children’s code

What happened: The ICO issued guidance to game developers and providers to assist in embedding child data protection considerations early on in gameplay design and aligning practices to the Children’s code. The recommendations are based on a series of voluntary audits of developers, studios, and publishers within the gaming industry.

In particular, the guidance recommends:

  • risk assessments during game design, with regular review following launch to identify any unexpected user groups;
  • processes to identify users under age 18, ensure a reasonable degree of certainty, and prevent false age declarations;
  • the inclusion of check-points and age-appropriate prompts to take breaks from extended sessions to prevent detriments to health and well-being;
  • restrictions on behavioural profiling for marketing including opt-in default settings and controls to monitor product placement, advertising, and sponsorship; and
  • a thorough review of, and general discouragement of, “nudge techniques”—those that encourage poor privacy decision-making—in social media competition and partnership marketing, especially where social media account linking is tied to user rewards.

What to do: Businesses that have engagement from minors, especially game designers and gameplay providers, should consider revisiting policies and procedures to ensure they align with this latest guidance and the Children’s code, and consider using the ICO’s linked design guidance to assist in implementation.

German court confirms that access requests may constitute an abuse of rights if not motivated by data protection interests

What happened: On appeal, a German court upheld the rejection of a data subject’s access request as an “abuse of rights” where the customer sought detailed information about nine years of health insurance premium adjustments for the purpose of calculating whether the premium increases were legitimate. The customer asserted a right to receive this information under the GDPR, which the court rejected because the purpose of the request was not to assess whether the personal data processing was in conformity with the GDPR. This reversed the court’s previous position that the motivation of a data subject’s access request is irrelevant to its lawfulness.

What to do: Given the unsettled law in this area, and significant divides between jurisdictions, businesses should tread carefully before seeking to reject access requests on the basis of the underlying motivation.

Norwegian DPA fines fitness chain almost €1,000,000 for multiple minor violations related to DSARs and unlawful processing 

What happened: The Norwegian Datatilsynet fined a fitness chain NOK 10,000,000 (close to €1,000,000) for a series of GDPR violations found following multiple complaints.

The Datatilsynet found that the business:

  • failed to action and/or respond to two access requests promptly, even where one requestor sent reminders;
  • failed to action three erasure requests promptly;
  • failed to inform data subjects properly about its data retention policy concerning banned members;
  • retained a terminated customer’s personal data beyond the retention period established by the banned members policy; and
  • failed to establish a lawful basis to process the training history data of the members of its fitness centers.

The Datatilsynet noted that although the individual episodes of mishandling data subjects’ requests were not very grave, the repeated occurrences over time signalled systemic issues.

What to do: Organisations should take note that multiple minor violations occurring over time may be interpreted by regulators as a systemic problem worthy of enforcement action instead of a warning. The case further serves as a reminder that regulators expect an organisation’s data protection policies to match actual practice, in particular on data retention.

Following multiple inquiries, Danish DPA publishes cookie wall guidance

What happened: Following its investigations into complaints about two companies’ use of cookie walls, for an online marketplace and a media group, the Danish Datatilsynet issued guidelines for companies that make access to their websites, services or content conditional on consent to processing personal data.

The guidelines establish four baseline criteria for compliance:

  • A reasonable alternative. Users who do not wish to consent to data processing must be offered a reasonable alternative, such as access via payment. The Datatilsynet found that the media group provided more content (i.e., a better service) to paying users than to consenting users, which was not a reasonable alternative.
  • A reasonable price. Where the alternative to user consent is payment, companies have wide discretion in determining the pricing but must not set an unreasonably high price that undermines the user’s freedom of choice.
  • Limited to what is necessary. Companies offering a choice between payment or consent must be able to demonstrate that all of the purposes for which consent is requested form a necessary part of the alternative paid access. Neither company that the Datatilsynet investigated met this criterion, and both were asked either to demonstrate the necessity of processing data for statistical purposes in the paid version or to formulate a consent solution.
  • Processing of personal data when users have paid. Where users have paid for access, companies cannot process personal data for purposes beyond what is necessary to provide the content or service.

What to do: Businesses, especially those wholly or partly financed by advertising revenue, may want to review their use of cookie walls, particularly to ensure that a reasonable alternative is provided in lieu of access via consent to data processing. When doing so, they should consult the EDPB’s Consent Guidelines 05/2020, keeping in mind that local regulators, such as the French CNIL, have also issued guidance.

Italian Garante bans AI-powered chatbot Replika over GDPR violations and concerns for children and emotionally vulnerable individuals

What happened: On 3 February 2023, Italy’s data protection agency effectively banned the AI chatbot Replika by ordering its developer, U.S. company Luka Inc., to stop processing the personal data of Italian users until further notice. Replika generates a virtual friend for users using text and video interfaces.

Finding that Replika violates the GDPR and poses a risk to children and emotionally vulnerable individuals, the Garante determined that Replika:

  • provides age-inappropriate content in its replies;
  • lacks age verification and blocking mechanisms, requiring only a name, email address, and gender to create an account;
  • does not comply with transparency requirements; and
  • processes personal data unlawfully, as reliance on performance of a contract is invalid as applied to children under Italian law.

Luka must inform the Garante within 20 days of the measures taken to remedy the violations. If it fails to do so, the Garante may impose a fine. It is unclear if or when Luka can resume Replika’s Italian operations.

What to do: Businesses that may face the risk of children accessing their content should review their age verification mechanisms and ensure that they are sufficiently robust. They should also consider implementing filters to block child users, or particular users or devices, where needed. The enforcement action also shows that regulators can take measures that effectively block businesses from operating where there are significant failings.

To subscribe to the Data Blog, please click here.

The cover art used in this blog post was generated by DALL-E.

Author

Robert Maddox is International Counsel and a member of Debevoise & Plimpton LLP’s Data Strategy & Security practice and White Collar & Regulatory Defense Group in London. His work focuses on cybersecurity incident preparation and response, data protection and strategy, internal investigations, compliance reviews, and regulatory defense. In 2021, Robert was named to Global Data Review’s “40 Under 40”. He is described as “a rising star” in cyber law by The Legal 500 US (2022). He can be reached at rmaddox@debevoise.com.

Author

Dr. Friedrich Popp is an international counsel in the Frankfurt office and a member of the firm’s Litigation Department. His practice focuses on arbitration, litigation, internal investigations, corporate law, data protection and anti-money laundering. In addition, he is experienced in Mergers & Acquisitions, private equity, banking and capital markets and has published various articles on banking law.

Author

Fanny Gauthier is an associate in Debevoise's Litigation Department, based in the Paris office. Ms. Gauthier is a member of the firm’s International Dispute Resolution Group, as well as the firm’s Data Strategy & Security practice. Her practice focuses on complex commercial litigation, international arbitration and data protection. She can be reached at fgauthier@debevoise.com.

Author

Aisling Cowell is an associate in the Litigation Department based in the London office. She is a member of the firm’s White Collar & Regulatory Defense Group. She can be reached at acowell@debevoise.com.

Author

Stephanie D. Thomas is an associate in the Litigation Department and a member of the firm’s Data Strategy & Security Group and the White Collar & Regulatory Defense Group. She can be reached at sdthomas@debevoise.com.

Author

Tristan Lockwood is an associate in the firm’s Data Strategy & Security practice. He can be reached at tlockwood@debevoise.com.

Author

Maria Epishkina is a corporate associate and a member of the Mergers & Acquisitions, Capital Markets and Private Equity Groups. She can be reached at mepishkina@debevoise.com.

Author

Sophie Michalski is a trainee at the Debevoise London office.

Author

Alexandre Pous is a legal intern and trainee in the Litigation Department at the Debevoise Paris office.

Author

Maria Santos is a trainee associate in the Litigation Department.