The key development from April is the European Data Protection Board (“EDPB”) approving the European Commission’s (the “Commission”) draft UK adequacy decisions. Companies will be relieved that they are one step closer to maintaining the seamless flow of data between the EU and the UK. Other notable developments this month include the publication of the Commission’s highly anticipated draft AI regulation, and the launch of a class action in the UK against TikTok.
EDPB approves draft UK adequacy decisions
What happened: The EDPB has approved the Commission’s draft UK adequacy decisions. The EDPB highlighted key areas of strong alignment between EU and UK data protection laws, including provisions on the legal grounds to process personal data, proportionality, and automated decision-making. It also noted various potential areas of concern which the Commission will monitor, and which may result in the suspension of the decisions, including concerns about the UK ‘immigration exception’ to data protection rights, onward transfers of data from the EU via the UK to other third countries, and public authorities’ access to data transferred to the UK. The Commission will now seek final approval of the decisions from Member States’ representatives.
What to do: Companies should continue to monitor the progress of the decisions. Given that the current transitional measures are only valid until the end of June, a final decision is likely within the next couple of months. If the draft decisions are approved, companies will be able to send personal data from the EU to the UK for the next four years without needing to adopt other data transfer mechanisms, such as standard contractual clauses.
Nonetheless, companies should remain prepared to implement such mechanisms, as there remains a risk that the adequacy decisions will not be finalised. Even if they are, the decisions could later be invalidated on the basis of the identified areas of concern or future regulatory divergence, so the issue may well resurface.
EU Commission drafts AI regulation
What happened: The Commission published its proposal for a regulation governing the use of AI. It has a wide scope, covering all aspects of the development, sale and use of AI systems, and also applies to non-EU providers and users of AI systems if they are either placing AI systems into service in the EU, or using the data resulting from systems operating in the EU.
The proposal focuses on four categories of risk:
- unacceptable risk – these forms of AI would be banned;
- high risk – these would be subject to stringent regulatory and disclosure requirements;
- limited risk – the Commission will introduce specific, limited transparency obligations for systems classified as limited risk; and
- minimal risk – the proposal includes no new rules for minimal risk AI systems.
In addition, the Commission proposes a labelling regime (a ‘CE marking’) whereby certain AI systems would need to be assessed and certified for conformity by a qualifying notified body prior to entering the market. See our blog post for more information on the proposed regulation.
What to do: Although the draft legislation will probably not take effect for a number of years, organisations using or providing AI products should prepare to carry out risk assessments, review their data, log activity, and begin to assign internal responsibility for compliance with new AI regulations. If passed, the rules will allow each EU Member State to designate one or more national regulators to enforce the law, with the possibility of fines of up to 6% of global turnover.
UK group litigation against TikTok
What happened: A representative action (essentially the UK version of a US class action) has been commenced against TikTok in England regarding its collection of children’s data in the UK and EEA. The action cites the personal data that TikTok collects, particularly from children, which appears to include dates of birth, email addresses, phone numbers, device location information, biometric data, and browsing history. The claim alleges that this data is being collected from users without TikTok having adequate safeguarding measures in place. For example, the claim alleges that:
- there are no measures that prevent children from downloading or using TikTok;
- there is no process for acquiring the necessary consent of children’s parents or guardians;
- there is no effective contractual basis or legitimate interest relied upon by TikTok in the collection of data; and
- the data collected by TikTok is subsequently sold to advertisers, with advertising accounting for as much as two-thirds of TikTok’s $30 billion revenue in 2020.
While registration to join the represented class is ongoing, it is estimated that around 44% of 8-12 year olds in the UK alone use TikTok.
What to do: For now, nothing. The proceedings have been stayed pending the UK Supreme Court’s determination in Lloyd v Google. In that case, the Court of Appeal held that damages may be awarded for loss of control over data without further proof of ‘actual’ damage – a finding about which the Supreme Court justices appeared skeptical at the hearing. Whether that finding is upheld will have a significant impact on the viability of the claim against TikTok, as well as other class action data protection disputes.
Spanish DPA fines Equifax €1 million for GDPR violations
What happened: The Spanish data protection authority (AEPD) fined Equifax €1 million in connection with adding publicly available debt information, published by local authorities, to its debt database. The AEPD found that Equifax breached the following GDPR principles:
- Purpose limitation: as the information was not originally processed for the purpose of inclusion on Equifax’s debt database, the AEPD held that Equifax had violated the purpose limitation principle.
- Lawful basis for processing: the AEPD found that Equifax’s legitimate interest in publishing the debt information was not a valid legal basis for processing the data.
- Transparency and notification: Equifax had not informed data subjects that it had processed their data.
- Accuracy: The data published by Equifax was, in some cases, incorrect and out of date. Individuals also did not have the opportunity to update their information.
- Data minimization: As the purpose limitation was violated, the AEPD found that, by extension, it was impossible for Equifax to comply with the data minimization principle.
The size of the penalty reflected the number of GDPR principles Equifax breached, as well as the following factors: the large number of complaints the AEPD received about the behaviour; the high number of people potentially affected by the violations and the significant harm they suffered; and the fact that Equifax had knowingly breached the GDPR.
What to do: Companies should take note that the prior publication of personal data in the public domain does not mean it can be republished or reused consistently with the GDPR. Companies must always establish a lawful basis for processing personal data, or risk the imposition of significant penalties under the GDPR. Companies should also be mindful that the AEPD is handing out large fines to companies that, in its view, consistently breach data protection law, as seen with the Vodafone Spain fine last month.
European courts divided over data retention
What happened: The highest administrative courts of Belgium and France issued starkly opposing judgments on whether their respective domestic data retention regimes complied with the CJEU’s Privacy International judgment. In that judgment, the CJEU held that, in light of the ePrivacy Directive, Member States can only compel companies to retain data where there is a serious threat to national security, and only for as long as strictly necessary.
The Belgian Constitutional Court ruled that the Belgian data retention regime did not comply with Privacy International, as it provided for generalized electronic communications data retention without considering whether individuals’ behaviour was linked to any wrongdoing, or was a risk to national security. On a strict reading of the judgment, the Belgian Constitutional Court invalidated the regime, and called on Parliament to establish a new system, where data retention is the exception rather than the rule.
Meanwhile, the French Conseil d’Etat found that France’s generalized retention regime was, in fact, broadly compatible with the judgment and that only certain minor amendments were required. The court held that the existing threat to national security currently justifies the French authorities’ generalized retention of personal data. The court nevertheless ordered the French government to regularly reassess the threat to national security to justify the general retention of data and to make the exploitation of this data by French intelligence services conditional on the authorization of an independent authority.
The judgments highlight the tensions that exist between the GDPR’s data protection requirements, and states’ obligations on matters such as national security, terrorism and human trafficking – a point that was particularly highlighted by the CJEU’s Schrems II judgment.
What to do: Telecoms companies in France and Belgium should, for the time being, consider following the findings of the respective judgments. Both regimes will be amended in the coming months, so companies should also monitor those developments to ensure compliance. It seems likely that the EU will issue further guidance on this question, given that it has already caused significant divergence between Member States.
Post-Schrems II data transfer to Cloudflare deemed unlawful
What happened: The Portuguese DPA (CNPD) ordered the Portuguese National Institute for Statistics (NIS) to suspend the transfer to the US of personal data collected during the 2021 Portuguese Census. After receiving a number of complaints about NIS’ handling of the census data, CNPD found that Cloudflare, Inc., a Californian software company which ran the census, transferred the data to the US for processing. While the services agreement between NIS and Cloudflare contained the Commission-approved standard contractual clauses for data transfers, CNPD held that these clauses did not provide adequate safeguards given that Cloudflare is directly subject to US surveillance legislation. As the CJEU noted in Schrems II, this means that US authorities have unrestricted access to personal data held by Cloudflare – access against which the standard contractual clauses cannot protect. CNPD therefore ordered that the data transfer stop with immediate effect.
What to do: As covered in previous posts, following Schrems II, companies may want to carefully review data transfers of all types to recipients in non-EEA jurisdictions not covered by an adequacy decision, to ensure that appropriate measures are in place to protect personal data. In particular, following this decision, companies using Cloudflare or similar providers may want to revisit those transfers.
The authors would like to thank Olivia Collin, a trainee in the firm’s London office, for her contribution to this article.
To subscribe to the Data Blog, please click here.