Our top five European data protection developments from July are:
- EU AI guidance: Businesses should consider reviewing their AI policies and practices following guidance from the French CNIL and the Irish DPC recommending AI risk assessments and documented AI policies and procedures, alongside the EDPB’s statement supporting the appointment of DPAs as the national authorities responsible for AI regulation.
- Enforcement: Following UK Ofcom’s fine against TikTok for failing to respond accurately to a formal request related to parental controls, businesses should ensure adequate measures are in place to cooperate fully with regulatory requests, avoid errors and inaccuracies, and, in the event of an error, correct it promptly.
- UK legislative proposals: Businesses should take note of the UK Labour government’s proposed Digital Information and Smart Data Bill and Cyber Security and Resilience Bill; the latter, if passed, would align UK data protection and cyber resilience requirements more closely with EU standards and potentially expand incident notification obligations (including an obligation to report ransom demands).
- Implementation of NIS2: With the second network and information systems directive (“NIS2”) implementation deadline of 17 October 2024 fast approaching, businesses operating in Germany may wish to review Germany’s draft law implementing NIS2 to ensure they are prepared to comply when the law enters into force by, for example, registering their business with the German Federal Office for Information Security and updating incident notification protocols.
- Representative actions: In light of the CJEU ruling that a violation of the controller’s information obligations can be subject to a representative action, businesses should consider the potentially increased risk of litigation following GDPR breach allegations and how to respond.
EDPB, French CNIL, and Irish DPC each publish guidance as EU AI Act officially adopted
What happened: Shortly after the EU AI Act came into force on 1 August 2024 (as previously covered in detail here), the EDPB published a non-binding statement highlighting supervision and coordination issues that could arise in the implementation of the AI Act among EU member states, and recommending that the AI Act and the EU data protection laws, particularly the GDPR, be considered and interpreted in tandem. To that end, the EDPB proposed designating DPAs as the “national competent authorities” under the AI Act to create a single point of contact.
Building on prior European guidance, the French and Irish DPAs published guidance on the deployment of generative AI, large language models and data protection. Both DPAs cautioned that businesses should conduct risk assessments and prepare policies and procedures regarding their use of AI systems. For example, the French CNIL recommends that businesses deploying generative AI systems: (i) identify a specific goal for the systems; (ii) define a list of authorised and prohibited uses; (iii) remain conscious of the limitations of the systems; (iv) use fine-tuned systems; (v) inform end users of prohibited uses and risks; and (vi) ensure compliance with the GDPR through appropriate governance. The Irish DPC’s guidance recommends that businesses using AI systems should put processes in place to facilitate the exercise of data subject rights and that developers keep in mind that publicly accessible personal data still falls within the scope of the GDPR.
What to do: Businesses that use or develop AI products should consider reviewing their AI policies and practices to keep pace with a rapidly evolving regulatory environment, including the requirements of the AI Act, while establishing governance frameworks and processes to ensure compliance with existing applicable data protection laws and DPAs’ expectations.
UK Ofcom fines TikTok for inaccurate information on parental control systems
What happened: UK Ofcom fined TikTok £1.875 million for failing to respond accurately to, and fully cooperate with, a formal request for information on the company’s parental controls for use in UK Ofcom’s Child Safety Report. The requested information covered the implementation of the controls, as well as their uptake and efficacy. UK Ofcom’s investigation cited failings in TikTok’s data governance processes, including inadequate checks to ensure data accuracy, delayed escalation of the error (TikTok had discovered the issue three weeks before notifying UK Ofcom), and delayed and incomplete revised data submissions stretching more than seven months past the original deadline, in violation of the Communications Act 2003.
Three factors affected the penalty amount: (i) TikTok is large and well-resourced; (ii) the data inaccuracies and delays directly affected UK Ofcom’s regulatory work by delaying the report; and (iii) TikTok is a first-time offender and self-reported the error. The fine includes a 25 percent reduction because TikTok admitted liability and agreed to settle the matter.
What to do: Before any regulatory inquiry arrives, businesses, especially large and well-resourced ones, should review their internal controls and procedures, including data collection, escalation, and approval processes, to ensure adequate measures are in place to respond promptly and accurately to potential requests. When responding to an inquiry, including one that begins outside an enforcement context, businesses should cooperate fully and promptly notify the regulator of any errors or inaccuracies later uncovered, keeping in mind any context-specific urgency.
UK government proposes new data protection and cybersecurity legislation
What happened: Following the UK Labour Party’s election victory, the new UK government has announced plans to introduce legislation in place of the proposed Data Protection and Digital Information Bill (“DPDIB”), as previously covered here, with different, albeit overlapping, aims.
The Digital Information and Smart Data Bill emphasises economic growth, and while it retains aspects of the DPDIB, such as the proposed reforms to modernise the UK ICO’s governance structure, the bill also proposes to: (i) establish Digital Verification Services that would streamline routine tasks such as pre-employment checks and buying age-restricted goods; (ii) amend the Digital Economy Act to help the Government share data about businesses that use public services; and (iii) set up Smart Data Schemes to simplify the secure exchange of customer data at the customer’s request.
The Cyber Security and Resilience Bill appears designed to address cybersecurity concerns and strengthen UK cyber defences by: (i) expanding the government’s current regulatory remit to cover a broader range of supply chains and digital services, bringing it more closely in line with the EU’s; (ii) providing cost-recovery funding for regulators to support enforcement efforts; and (iii) mandating increased incident reporting, including potential notification of ransom demands, to give the government better data on cyber-attacks. This approach sits within a broader UK realignment towards EU standards.
What to do: Nothing for now, as the proposed legislation is not yet published; however, general monitoring of EU data protection developments – including NIS2 – may now provide a stronger indicator of proposed changes in the UK than under the prior government.
Germany adopts draft law implementing NIS2
What happened: Germany adopted a draft law implementing new requirements under NIS2, which we previously covered in detail here and here. In addition to transposing NIS2 requirements, the adopted draft, largely unchanged from the German Ministry of the Interior’s May draft, designates the Federal Office for Information Security (“BSI”) as a central point of contact with new, broad investigative and enforcement powers, and expands the scope of covered businesses beyond NIS2 by treating the large-enterprise size criteria as alternative thresholds (employee count or revenue) rather than cumulative ones.
Under Germany’s draft law, applicable requirements vary based on an entity’s classification as “very important” (NIS2’s “essential”) or “important”. For example, important entities are subject only to ex-post investigation and enforcement mechanisms, while very important entities must demonstrate proactive compliance. Maximum fines for non-compliance range from €7 million to €10 million, depending on the entity’s classification.
Member states have until 17 October 2024 to adopt and publish legislation implementing NIS2. Since the adopted draft has yet to be approved by Germany’s parliament (on recess until early September) and then signed into law by the Federal President, it remains to be seen whether this deadline will be met. Germany is not alone: a number of other EU member states are at similar legislative stages or even further behind.
What to do: Businesses operating in Germany or providing services in Germany should consider whether they are covered by NIS2 and Germany’s implementing law, taking into account applicability nuances in the German draft law. Covered businesses may wish to update their existing governance frameworks and cyber incident management policies ahead of the NIS2 implementation deadline to reflect expected German-specific details, such as notification to and registration with the BSI, in addition to uniform NIS2 requirements, such as the two-stage incident reporting framework, and proposed regulation from the European Commission detailing further security measures and incident reporting thresholds.
CJEU permits representative action against Meta to proceed
What happened: The CJEU ruled that a violation of the controller’s information obligations can be subject to a representative action under the GDPR. Importantly, the CJEU’s decision does not decide whether Meta complied with its information obligations – that question will now be considered by the relevant national court.
The case involved Meta’s “App Centre” where users are informed that: (i) the third-party providers of the applications may collect personal data; and (ii) the users are consenting to the applications’ general conditions and data protection policy. A consumer association brought an action against Meta at a German regional court for failing to comply with the requirements to obtain valid consent from its users under the GDPR, even though no specific infringement or claim by a particular user was cited. The claim was eventually referred to the CJEU regarding its admissibility.
The court initially held that the GDPR did not prevent national legislation from authorising consumer protection associations to bring legal proceedings independently of any infringement of a data subject’s rights. Following a second referral, the CJEU clarified that a representative action can be based on a violation of the controller’s information obligations. The court noted the preventative function of the representative action mechanism and reasoned that the controller had to comply with its information obligations to: (i) observe the principles of transparency and fairness; and (ii) obtain informed and valid consent for the processing under the GDPR. Accordingly, non-compliance with these obligations would lead to rights being infringed “as a result of the processing” within the meaning of GDPR Art. 80(2).
What to do: The decision increases the risk of litigation for businesses as consumer associations and similar organisations are potentially more likely to have the necessary time and resources to bring claims for potential breaches of the GDPR. Businesses may wish to review their policies and procedures for responding to such actions to ensure that they reflect the latest understanding of the scope of that mechanism.
The cover art used in this blog post was generated by DALL-E.