Key takeaways this April include:

  • UK children’s data protection focus continues: Businesses may wish to review policies and procedures for dealing with children’s data in light of recent UK ICO fines and guidance, especially to ensure that terms of use are adequately enforced.
  • Updated EU “One-Stop-Shop” guidance: Businesses established outside the EEA may want to revisit their breach notification procedures after updated EDPB guidelines stated that the mere presence of an Art. 27 representative in an EU Member State does not constitute a “main establishment” for purposes of triggering the One-Stop-Shop system, potentially requiring businesses to notify in every Member State in which affected individuals reside.
  • Italy lifts ChatGPT ban but EDPB kicks off investigation: Businesses operating or using Generative AI in Europe, or using data sourced from individuals in Europe, should consider the regulatory risks arising from European data protection authorities’ emerging focus on developments in AI.
  • New UK National Cyber Security Centre (NCSC) joint guide: Software designers and manufacturers may want to review existing development procedures and products against the new guidance, which emphasises the importance of security by design and by default. See also our previous insights about Privacy by Design from the UK ICO’s Product Design Forum.

These developments, and more, are covered below.

UK ICO fines TikTok £12.7 million for misuse of children’s data

What happened: The ICO fined TikTok £12.7 million after finding that up to 1.4 million children under 13 were allowed to use the video-sharing platform between 2018 and 2020, contrary to TikTok’s own terms of service. The fine was originally set at £27 million, but the regulator ultimately dropped an additional, provisional finding that TikTok had unlawfully handled special category data.

In the UK, parent or carer consent is required where businesses offer “information society services” to children under 13. The ICO criticised TikTok for failing to:

  • obtain parent or carer consent in circumstances where it should have been aware that children under 13 were using the platform;
  • carry out adequate checks to identify and remove children under 13 from its platform;
  • adequately respond to internal concerns raised with senior employees about children under 13 using the platform;
  • provide proper and easy-to-understand information to users about how data was collected, used, and shared to ensure informed decision-making about whether and how to engage with the platform; and
  • ensure that personal data was processed lawfully, fairly, and in a transparent manner, in light of the above.

What to do: Businesses may wish to review policies and procedures for dealing with children’s data. In particular, businesses may want to consider the adequacy of existing policies for monitoring whether terms of use are adequately enforced and whether, more broadly, sufficient processes have been implemented to identify and limit use of covered services by children under 13.

UK ICO issues guidance on when an online service is “likely to be accessed by children” and needs to comply with its Age-Appropriate Design Code

What happened: The UK ICO published draft guidance addressing the meaning of “likely to be accessed” by under-18s in the context of its 2020 Age-Appropriate Design Code. Advocating a common-sense approach, the regulator confirmed that businesses must assess whether the “nature and content of their service” has “particular appeal for children”, as well as “the way in which the service was accessed and any measures put in place to prevent children gaining access.” The ICO reiterated that the Code applies equally to services that are not intended for use by children, but are nonetheless likely to be used by them, as it does to services aimed at children. Adult-only and age-gated services may therefore still be in scope.

What to do: Consultation on the draft guidance closes on 19 May 2023. Once the guidance is finalised, businesses may want to re-assess the Age-Appropriate Design Code’s applicability; the guidance includes helpful examples and case studies. If the Code applies, businesses may wish to document this assessment and take steps to build compliance with the Code into existing product development and monitoring processes.

Amsterdam Court of Appeal allows automated decision-making and profiling-related access requests

What happened: In three linked appeals (here, here and here), the Amsterdam Court of Appeal ruled that drivers for the ride-hailing companies Uber and Ola could use their data subject access rights to obtain certain data used in processes for determining rates, fares and fraud probability scores. Those rights give individuals a right to request information about automated decision-making affecting them, including meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.

The decisions outlined important principles relevant to automated decision-making under the GDPR, including:

  • the right of access to information about automated decision-making mirrors the category of automated decision-making subject to GDPR Art. 22 (i.e., decision-making based solely on automated processing, including profiling, which produces legal effects concerning the individual or similarly significantly affects them);
  • the circumstances that meet the materiality threshold in Art. 22(1) likely include decisions affecting a person’s: (i) financial situation, such as their ability to qualify for a loan; (ii) access to healthcare services; (iii) employment opportunities; and (iv) access to education, for example admission to a university;
  • human intervention, in order to take a decision-making process outside the concept of automated decision-making in Art. 22: (i) must be “more than a mere symbolic act”; (ii) requires intervention by someone with the authority to change the outcome; and (iii) may, depending on the circumstances, require that the individual has the opportunity to be heard on the decision; and
  • while the growth and complexity of machine learning can make it difficult to understand how an automated decision-making process or profiling works, controllers must find simple ways to explain to individuals the rationale behind the decision-making or the criteria used.

What to do: Businesses may wish to review existing decision-making processes to assess whether they fall within the scope of automated decision-making under GDPR Art. 22, as set out by the court. Where they do, businesses may wish to review, in particular, whether they comply with the requirements that follow and consider the adequacy of existing processes for responding to data subject access requests.

EU EDPB’s updated personal data breach notification guidelines suggest that non-EEA businesses cannot rely on the “One-Stop-Shop” exception

What happened: The EDPB finalised a previously proposed update to its guidelines which states that the mere presence of a GDPR Art. 27 representative in an EU Member State does not constitute a “main establishment” for purposes of triggering the One-Stop-Shop system, meaning that a controller subject to the GDPR’s extraterritorial scope may have to notify the relevant authority in each Member State where affected individuals reside.

We discussed the draft guidance in our October 2022 European Data Protection Roundup.

What to do: Businesses established outside the EEA that have appointed an Art. 27 representative and relied on previous EDPB guidance may want to prepare for situations where notifications are required to authorities in numerous Member States (i.e., up to 45 when including the European Economic Area and the 16 German state data protection authorities) within the short timescale prescribed by the GDPR.

Italy agrees to lift ChatGPT ban

What happened: The Italian Data Protection Authority, the Garante, announced that it would lift its ban on ChatGPT, on the basis of an agreement by ChatGPT’s owner, OpenAI, to implement a number of measures outlined by the supervisory authority. The stated aim of those measures is to bring ChatGPT into compliance with EU data protection regulation, focussing on transparency, user rights, and protection of minors.

Despite the Garante’s change of course, other European data protection authorities have continued to express concern about Generative AI tools such as ChatGPT, with the EDPB appointing a taskforce to investigate.

What to do: Businesses operating or using Generative AI in Europe, or using data sourced from individuals in Europe, should consider the regulatory risks arising from European data protection authorities’ emerging focus on developments in AI. In a number of recent blog posts, we have summarised key developments and outlined practical tips for businesses.

UK NCSC issues new joint guide on security by design and default to prioritise security and reduce vulnerabilities

What happened: The NCSC, along with international cybersecurity authorities and the Government Communications Headquarters (GCHQ), issued guidance recommending that software manufacturers build security into their design processes prior to developing, configuring, and shipping their products. In particular, the joint guidance recommends:

  • embedding secure-by-design principles into design processes, treating the security of the customer as a core business goal from the development stage (rather than as an add-on feature);
  • embedding secure-by-default principles into design processes, so that products are protected against cybersecurity risk “out of the box”, without any additional configuration changes or cost; and
  • using the guidance to evolve development processes, such as tracking and reducing “hardening guides”, which list methods of securing products, in favour of “loosening guides”, which explain the changes that users can make along with the associated security risks of each decision.

Privacy by design has been an increasing focus in the UK. We recently outlined insights from the UK ICO’s Product Design Forum, which aimed to help product designers and managers incorporate “privacy by design” or “data protection by design and by default” principles into their work. In particular, the ICO offered practical guidance that may help companies better understand current market practice, the ICO’s expectations, and the direction of forthcoming regulatory guidance on privacy by design issues.

What to do: Software designers and manufacturers may wish to review existing development procedures and products against the new guidance, which emphasises the importance of security by design and by default. The joint guidance signals increasing focus on these design principles by authorities around the world and the increasing likelihood of future legislation in the UK and elsewhere mandating such requirements.

 

To subscribe to the Data Blog, please click here.

The cover art used in this blog post was generated by DALL-E.

Author

Robert Maddox is International Counsel and a member of Debevoise & Plimpton LLP’s Data Strategy & Security practice and White Collar & Regulatory Defense Group in London. His work focuses on cybersecurity incident preparation and response, data protection and strategy, internal investigations, compliance reviews, and regulatory defense. In 2021, Robert was named to Global Data Review’s “40 Under 40”. He is described as “a rising star” in cyber law by The Legal 500 US (2022). He can be reached at rmaddox@debevoise.com.

Author

Dr. Friedrich Popp is an international counsel in the Frankfurt office and a member of the firm’s Litigation Department. His practice focuses on arbitration, litigation, internal investigations, corporate law, data protection and anti-money laundering. In addition, he is experienced in Mergers & Acquisitions, private equity, banking and capital markets and has published various articles on banking law.

Author

Aisling Cowell is an associate in the Litigation Department based in the London office. She is a member of the firm’s White Collar & Regulatory Defense Group. She can be reached at acowell@debevoise.com.

Author

Martha Hirst is an associate in Debevoise’s Litigation Department based in the London office. She is a member of the firm’s White Collar & Regulatory Defense Group and the Data Strategy & Security practice. She can be reached at mhirst@debevoise.com.

Author

Stephanie D. Thomas is an associate in the Litigation Department and a member of the firm’s Data Strategy & Security Group and the White Collar & Regulatory Defense Group. She can be reached at sdthomas@debevoise.com.

Author

Tristan Lockwood is an associate in the firm’s Data Strategy & Security practice. He can be reached at tlockwood@debevoise.com.

Author

Maria Epishkina is a corporate associate and a member of the Mergers & Acquisitions, Capital Markets and Private Equity Groups. She can be reached at mepishkina@debevoise.com.

Author

Maria Santos is a trainee associate in the Litigation Department.