Last year, yet again, saw significant GDPR enforcement actions, important regulatory guidance, and an abundance of European legislative activity touching on cyber, data protection and AI-regulatory issues. Here, we unpack five trends that defined 2022 and signal areas to watch in 2023:

  1. Increased AI regulation;
  2. Prioritization of operational resilience;
  3. Focus on children’s privacy;
  4. Continued complication in data transfers out of the EEA/UK; and
  5. Heightened scrutiny on data minimisation.

The robust regulatory environment was evident in the CNIL’s series of year-end announcements, which fined Microsoft, TikTok, and Apple, among others, a total of more than 70 million euros.

Increased AI Regulation

Direct regulation of AI has continued to escalate through 2022 and a number of significant enforcement decisions highlighted mounting regulatory concern.

Originally introduced in 2021, the AI Act edged toward a final form in 2022 and was complemented by the European Commission’s publication of a proposed AI Liability Directive in September (covered here). The EU’s focus on regulating AI highlights growing concerns around fairness, transparency, and the potential harms that may arise from the misuse of AI. Mirroring concerns across the Atlantic, the draft AI Act also looks set to impose significant obligations relating to the accuracy, completeness and appropriateness of data used in AI models. These concerns are echoed in the fines imposed on Clearview AI by multiple regulators across Europe (covered here).

The UK meanwhile has taken a slightly different approach, declining to introduce AI-specific regulation and instead framing AI-related challenges principally through a data protection lens. The UK’s approach reflects a broader concern to ensure that AI regulation does not inadvertently stymie digital innovation. With that said, UK ICO guidance (e.g., on biometric technologies) and the ICO’s own enforcement action against Clearview AI signal concerns similar to those emerging in the EU.

While 2023 seems unlikely to see the introduction of any other big-ticket pieces of AI regulation, there is much to anticipate as the AI Act and AI Liability Directive move toward adoption and AI-focused enforcement action continues to gain momentum. Increasing regulatory scrutiny highlights the need for businesses not only to prepare for compliance with these new laws, but also to ensure that their use of AI complies with existing obligations under the GDPR and other applicable laws.

Operational Resilience

In 2022, EU and UK lawmakers remained firmly focused on digital operational resilience, expanding measures for both critical infrastructure and the financial services sector.

The EU made big moves. In November 2022, the European Parliament passed both the Second Network and Information Systems Directive (“NIS2”) and the Digital Operational Resilience Act (“DORA”). In addition to imposing new supply chain security and incident reporting obligations, NIS2 significantly expands the scope of existing critical infrastructure regulation to cover additional sectors and entities, including digital infrastructure (data centre service providers, content delivery service providers, trust service providers, public electronic communications networks or electronic communications services providers); space, postal and courier services, and digital providers (social networking services platforms).

Likewise, DORA, a directly applicable Regulation, will impose far-reaching operational resilience requirements and management oversight obligations on financial services firms – including banks, insurers, payment services providers and private equity firms – as well as critical service providers that, for the first time, will be directly regulated by EU financial services regulators.

UK lawmakers meanwhile took a more nuanced approach. In July 2022, the Financial Services and Markets Bill proposed, among other things, oversight of critical service providers by the FCA and PRA in a manner similar to DORA. However, the UK has thus far refrained from imposing obligations on all financial entities as broadly as DORA does. Likewise, the UK Government has confirmed plans to move forward with a planned expansion that will both heighten cybersecurity incident reporting and other obligations for critical infrastructure, and permit the UK Government to bring additional sectors within scope without the need for further legislation. It remains unclear whether these changes will be as far-reaching as NIS2.

While more detail on the UK approach is set to emerge on all proposals through 2023, the general trend toward more rather than less burdensome requirements seems likely to continue. In the meantime, businesses may want to assess whether they are affected by any of the above changes and, if so, develop compliance strategies as soon as possible.

Children’s Privacy

Multiple regulators turned their attention to companies that process children’s personal data.

On the legislative front, in the UK, discussion has centred on the Online Safety Bill, which remains pending before the House of Lords but looks set to pass imminently. The Bill would impose obligations on social media platforms, which would include duties to enforce age-based access measures, remove illegal content, prevent children from accessing harmful content, and make transparent the risks and dangers posed to children on their platforms, including by publishing risk assessments.

The UK ICO has also made children’s privacy rights a top enforcement priority, stating it was “looking into how over 50 different online services are conforming with the Children’s code and [has] six ongoing investigations looking into companies providing digital services.” Last September, one of those companies, TikTok, received a notice of intent from the ICO and faces a fine of up to 27 million pounds for allegedly processing the data of children under the age of 13 without appropriate parental consent, providing proper transparency to users, or having a legal basis for processing special category data.

The Irish DPC also pursued multiple enforcement actions related to the processing of children’s data, including an action against TikTok for conduct related to that at issue in the UK case; there, the authority has submitted a preliminary enforcement decision to the EDPB for review. It also fined Meta 405 million euros in connection with prior Instagram settings that publicly disclosed contact details of children using the business account feature and, by default, set children’s personal accounts to public. That fine remains pending under EDPB review.

We anticipate that the regulatory focus on the processing of children’s data online will continue throughout this year. Recognising this risk, businesses working with children may want to review past enforcement action and guidance to ensure that existing and planned programs meet regulatory expectations, sufficiently address how children interact with their platforms or services, and account for increasing enforcement risk.

Cross-border Data Transfers

Despite ongoing efforts by lawmakers in Europe and the United States, personal data transfers out of the EEA and UK remain complicated.

Almost two years on from the CJEU’s Schrems II decision (covered here and here), European lawmakers and U.S. officials in April 2022 finally struck a deal to enable a GDPR Article 45 adequacy decision for the U.S. (covered here). The European Commission published the draft decision in December 2022, and it is now working its way through various EU institutional checks. Despite U.S. promises of enhanced measures for the protection of EU personal data in the context of U.S. surveillance activities, divergence between EU Member States and a concerned European Data Protection Board may pose hurdles to adoption, which is expected later this year. If adopted, a challenge to the adequacy decision before the CJEU is highly likely, with the EU’s justice commissioner, Didier Reynders, giving it a “seven or eight out of ten” chance of surviving such a challenge.

Amidst this uncertainty, the EU Standard Contractual Clauses and UK Data Transfer Agreement and Addendum will likely continue to play an important role in facilitating cross-border data transfers. However, 2022 was also the year when reliance on the standard contractual clauses became more fraught. In 2022, the Austrian, Italian, French and Danish data protection authorities published decisions to the effect that the EU’s standard contractual clauses and Google’s technical and organisational measures did not, without more, comply with the GDPR, especially given the prospect of U.S. intelligence use of exported data. While the decisions have focused attention on Google Analytics, they have not prompted a broader shift in data transfer practices. Moreover, EU divergence on the point was highlighted in December, when the Spanish DPA published a decision going the other way, focusing on the fact that the target of the complaint did not intend to identify users.

In addition to divergence in the form of standard contractual clauses, the EU and UK have now also parted ways in their approach to cross-border transfer risk assessments. In November, the UK Information Commissioner’s Office published a new risk assessment tool reflecting a holistic and transfer-specific risk assessment that focuses on “whether the transfer significantly increases the risk of either a privacy or other human rights breach”. This approach marks a modest shift from the approach recommended by EDPB Guidance, which involves a relatively formalistic comparison of the laws and practices of the exporting country to the laws and practices of the importing country. While the UK approach is potentially more permissive, entities also subject to the EU GDPR should consider the risk that EU regulators might not consider a UK risk assessment adequate.

Businesses engaged in international data transfers from the EU and UK should continue to monitor developments, guidance and enforcement action, as we expect further changes in the landscape through 2023.

Data Minimisation and Hygiene

While data minimisation has always been a core focus of GDPR, enforcement has increasingly highlighted compliance failures – either as core or supporting conduct warranting a fine and remedial action.

In particular, the CNIL took a leading role on this issue in 2022, calling out data minimisation and retention failures of Discord and Infogreffe, among others. The CNIL found that Discord did not have a written data retention policy that covered how to handle user accounts that had been inactive for over three years and did not provide its users with clear and complete information about its data retention periods. In Infogreffe’s case, while it had a company-determined retention period, the CNIL found that personal data of 25% of its users was stored beyond that period, thus failing to keep personal data only for a period proportionate to the purpose of the processing.

In a similar vein, the Italian Garante criticized Deliveroo Italy’s data retention policies within a broader enforcement order for a failure to identify retention periods for each type of personal data and processing purpose. Instead, the regulator found that Deliveroo Italy had retained all personal data for six years, without regard to the nature of the data or processing. The Spanish AEPD also fined two internet domains for, among other GDPR violations, the companies’ practice of storing all registered users’ personal data indefinitely, unless and until the user withdrew consent for data processing.

Collectively, these enforcement actions make clear that companies must have comprehensive data minimisation and retention policies in place, tailored to the particular types of data collected and purposes for data processing, and that throughout 2023 and beyond, regulators are likely to take issue when those policies are lacking or are not actually followed in practice.

We will continue to cover updates related to these trends, GDPR enforcement, and more, on the blog.

To subscribe to the Data Blog, please click here.

Author

Robert Maddox is International Counsel and a member of Debevoise & Plimpton LLP’s Data Strategy & Security practice and White Collar & Regulatory Defense Group in London. His work focuses on cybersecurity incident preparation and response, data protection and strategy, internal investigations, compliance reviews, and regulatory defense. In 2021, Robert was named to Global Data Review’s “40 Under 40”. He is described as “a rising star” in cyber law by The Legal 500 US (2022). He can be reached at rmaddox@debevoise.com.

Author

Dr. Friedrich Popp is an international counsel in the Frankfurt office and a member of the firm’s Litigation Department. His practice focuses on arbitration, litigation, internal investigations, corporate law, data protection and anti-money laundering. In addition, he is experienced in Mergers & Acquisitions, private equity, banking and capital markets and has published various articles on banking law.

Author

Fanny Gauthier is an associate in Debevoise's Litigation Department, based in the Paris office. Ms. Gauthier is a member of the firm’s International Dispute Resolution Group, as well as the firm’s Data Strategy & Security practice. Her practice focuses on complex commercial litigation, international arbitration and data protection. She can be reached at fgauthier@debevoise.com.

Author

Stephanie D. Thomas is an associate in the Litigation Department and a member of the firm’s Data Strategy & Security Group and the White Collar & Regulatory Defense Group. She can be reached at sdthomas@debevoise.com.

Author

Tristan Lockwood is an associate in the firm’s Data Strategy & Security practice. He can be reached at tlockwood@debevoise.com.