Key takeaways this May include:

  • Facial recognition: Businesses, including those with no presence in the EEA, face continued challenges in deploying GDPR-compliant facial recognition technology after the French CNIL fined Clearview AI an additional €5.2 million for failing to comply with its previous order against the company.
  • GDPR individuals’ rights: Businesses should look to two new CJEU decisions on the scope of the GDPR’s right to compensation (confirming that there is no minimum level of damage required for the right to apply) and the right to receive a copy of personal data being processed (confirming that individuals must be given a “faithful and intelligible” reproduction of all of their personal data in response to a subject access request). Individuals commonly exercise both rights following personal data breaches.
  • Third country data transfers: Businesses that transfer personal data outside of the EEA may want to review their transfer mechanisms in light of new joint guidance on the EU and ASEAN SCCs, and the DPC’s record-breaking €1.2 billion fine against Meta.
  • Automated decision-making: Businesses that utilise automated decision-making processes, including on the basis of AI, should be aware of the additional GDPR compliance and transparency requirements surrounding the use of these tools.

These developments, and more, are covered below.

(1) CNIL imposes an additional sanction on Clearview AI for not complying with its previous order

What happened: The CNIL fined Clearview AI €5.2 million for failing to comply with the CNIL’s previous order. As we covered here, last October the CNIL fined Clearview AI €20 million for various data protection violations, including “intrusive and massive” data processing without consent or a valid legitimate interest. The CNIL, therefore, ordered Clearview AI to: (i) stop collecting and using personal data linked to individuals in France without any legal basis; and (ii) delete the data already collected, with an additional penalty of €100,000 per day of non-compliance after a two-month grace period. However, Clearview AI has not provided the CNIL with evidence that it is now compliant with that order, so the CNIL decided to enforce the additional penalty. If the company’s failings continue, further penalties are likely to be imposed.

This is not the first time that Clearview AI has been subject to enforcement; the company’s business operations have caught the attention of multiple data protection regulators around the world. Most recently, the Austrian data protection authority (“Datenschutzbehörde”) decided that Clearview AI may no longer process a particular complainant’s biometric data and must delete all personal data relating to that complainant. In contrast to other EU data protection authorities (“DPAs”), such as the Italian (March 2022) and Hellenic (July 2022) authorities, the Datenschutzbehörde did not impose a general ban on Clearview AI’s operations within Austria or issue a fine against the company. However, it expressly reserved these measures for a separate investigation.

What to do: Businesses should make sure to follow up on compliance orders within the allotted time to avoid additional sanctions. With enforcement action by several European DPAs against Clearview AI, businesses with no presence in the EEA should keep in mind the GDPR’s potential extra-territorial reach, especially given the complexities of complying with the GDPR in the AI, algorithmic decision-making and machine learning contexts.

(2) CJEU rules on GDPR compensation and data subject rights

What happened: The CJEU, the highest EU court, handed down two decisions regarding the GDPR.

In Case C-300/21 (UI v Österreichische Post AG), the CJEU considered the scope of the highly contentious right to compensation for non-material damage caused by a breach of the GDPR, under GDPR Art. 82 (see our May 2021, August 2021, and October 2022 blog posts for previous developments). It confirmed that the mere infringement of the GDPR does not, in itself, entitle affected individuals to compensation; there also must be some level of damage caused by the breach. However, the court noted that there is no minimum level of seriousness that the damage must meet for entitlement to compensation. The amount of compensation should be assessed by Member State courts under their domestic rules.

In Case C-487/21 (Österreichische Datenschutzbehörde and CRIF), the CJEU reviewed the data subject access right under GDPR Art. 15(3). The CJEU held that the right to obtain a “copy” of personal data held by the controller means that individuals must be given a “faithful and intelligible” reproduction of all of their personal data. This means that copies of extracts from documents – or even entire documents – should be provided to individuals if that is the only way to accurately show all of their personal data being processed. The CJEU also held that the right to receive “information” refers exclusively to the “personal data undergoing processing”, and not to other details.

What to do: The rights to compensation for non-material damage, and to access a copy of any personal data being processed, are rights that individuals commonly exercise following a personal data breach. Businesses should be mindful of these decisions when navigating such requests, as they provide helpful guidance on the scope of the rights and businesses’ obligations when responding to them. Businesses may also want to review their policies and procedures for responding to subject access requests to ensure that they reflect the latest understanding of the scope of that right in light of this caselaw.

(3) Meta issued with record GDPR fine by Ireland’s DPC

What happened: Ireland’s Data Protection Commission (“DPC”) announced the conclusion of its three-year investigation into Meta Platforms Ireland Limited’s (“Meta”) transfers of users’ personal data from the EEA to the U.S. when providing its Facebook service. Following the EDPB’s binding dispute resolution decision, the DPC fined Meta €1.2 billion – the largest GDPR fine to date – for breaching the GDPR’s third country data transfer rules. While Meta had implemented the standard contractual clauses (“SCCs”) and a range of other supplementary measures to protect users’ personal data once transferred, the EDPB held that these measures could not “compensate for the inadequate protection provided by U.S. law” and, in particular, did not address the issues identified by the CJEU in the Schrems II decision. In addition to the record penalty, the DPC ordered Meta to suspend all transfers of personal data to the U.S. within five months, and to ensure that all of its processing is GDPR-compliant within six months.

What to do: The DPC’s penalty highlights the continued complexities surrounding how businesses can validly transfer personal data to the U.S. in the absence of an adequacy decision. The EDPB did not shed any light on what additional measures, in the circumstances, could have been sufficient to ensure GDPR compliance. Nonetheless, businesses that transfer personal data to the U.S. may wish to review their transfer mechanisms and consider whether they sufficiently address any data protection disparities under U.S. law. Relief may come if the EU Commission adopts the EU-U.S. Data Privacy Framework, an adequacy decision that would replace the Privacy Shield invalidated by the CJEU in Schrems II in 2020.

(4) CNIL fines Doctissimo €380,000 for multiple GDPR and cookies violations

What happened: The CNIL fined Doctissimo, a health and wellbeing website, €380,000 for GDPR violations (€280,000) and cookies law breaches (€100,000).

With respect to the GDPR, the CNIL held that Doctissimo had breached multiple requirements by failing to: (i) store data for no longer than necessary; (ii) obtain valid consent from individuals to collect their health data; (iii) adequately secure personal data; and (iv) comply with documentation obligations for the joint processing operations it conducted with other data controllers for marketing purposes.

In addition, the CNIL once again enforced the French Data Protection Act’s cookies requirements, finding that Doctissimo breached the law by placing advertising cookies on users’ devices without consent: (i) as soon as they arrived on the website; and (ii) even after users clicked “refuse all.”

What to do: Businesses may want to review their existing cookie regime and ensure alignment with the latest regulatory expectations, including by confirming that advertising cookies are placed only after valid consent has been obtained. They should also verify their processes for storing personal data, making sure that they have obtained consent to collect and process it.
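For illustration only, the sketch below shows the general pattern the CNIL’s cookies findings point towards: advertising cookies should not be set when a visitor first arrives, nor after they click “refuse all”, but only once consent has been affirmatively given. The function names, the localStorage key and the simplified consent model are hypothetical; in practice, most sites rely on a consent management platform rather than hand-rolled logic.

```typescript
// A minimal, hypothetical consent gate: advertising cookies are only written
// once the visitor has affirmatively accepted them.
type ConsentChoice = "accepted" | "refused" | "unset";

// Hypothetical storage key; real sites typically use a consent management platform.
const AD_CONSENT_KEY = "ad_consent";

function getAdConsent(): ConsentChoice {
  const value = window.localStorage.getItem(AD_CONSENT_KEY);
  return value === "accepted" || value === "refused" ? value : "unset";
}

// Called from the consent banner's "accept all" / "refuse all" buttons.
function recordAdConsent(choice: Exclude<ConsentChoice, "unset">): void {
  window.localStorage.setItem(AD_CONSENT_KEY, choice);
}

function setAdvertisingCookie(name: string, value: string, days: number): void {
  // No cookie on arrival, and none after "refuse all": only write it once
  // the visitor has explicitly accepted.
  if (getAdConsent() !== "accepted") {
    return;
  }
  const expires = new Date(Date.now() + days * 24 * 60 * 60 * 1000).toUTCString();
  document.cookie = `${name}=${encodeURIComponent(value)}; expires=${expires}; path=/; Secure; SameSite=Lax`;
}
```

Whatever the implementation, the operative point from the decision is the same: no advertising cookies before consent, and none after a refusal.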

(5) EU and ASEAN issue joint guidance on navigating their SCCs

What happened: The European Commission and the Association of Southeast Asian Nations (“ASEAN”) released joint guidance on the application and use of their respective SCCs for personal data transfers between the EEA and South East Asia. The guide explains the similarities and differences between the ASEAN and EU SCCs to help facilitate businesses’ compliance with their respective requirements. The guide also envisages a yet-to-be-published set of best practices drawn from businesses that use the SCCs for transferring data.

What to do: Businesses that regularly transfer personal data between the EEA and South East Asia may wish to review their SCCs or other data transfer mechanisms in light of this guidance to ensure that they are fully compliant with the GDPR’s and ASEAN law’s requirements, as applicable.

(6) CNIL publishes AI action plan

What happened: The CNIL published an AI action plan that aims to provide a framework for the development of generative AI, i.e., systems that create text, images or music, such as ChatGPT. This follows the creation of a dedicated AI department at the CNIL (as we reported here), and is also intended to help businesses prepare for the future implementation of the draft EU AI Act.

The CNIL’s plan has four main components:

  • Understanding how AI systems work and their impact on individuals, with particular attention to the transparency of data processing, the protection of publicly accessible data and data transferred by users, the consequences for individual rights, protection against discrimination, and security issues;
  • Enabling and supervising AI developments that respect data privacy, including by publishing guidelines;
  • Supporting innovative businesses in the French and European AI ecosystem; and
  • Auditing and assessing AI systems, particularly those involving AI-enhanced video surveillance and the use of AI to tackle fraud.

What to do: This action plan signals the CNIL’s growing interest in AI-related privacy issues, with a special focus on generative AI. Businesses should keep an eye out for further statements and recommendations from the CNIL on these issues, which may echo concerns raised in its enforcement actions to date (such as the CNIL’s fine and additional sanction against Clearview AI, discussed above).

(7) Berlin DPA fines a bank €300,000 for automated decision-making failings

What happened: The Berlin DPA fined a bank €300,000 for failing to explain to a customer why their credit card application was rejected by an automated decision-making (“ADM”) algorithm. Where ADM is utilised, the GDPR requires not only up-front transparency (GDPR Art. 13(2)(f) and Art. 14(2)(g)), but also transparency upon the exercise of the access right (GDPR Art. 15(1)(h)), which requires businesses, amongst other things, to provide meaningful information on the logic underpinning the ADM decision (GDPR Art. 22(3)).

In this case, the bank, in response to an access request, only provided the customer with generic information about how individuals’ credit ratings are scored by the ADM algorithm. The Berlin DPA held that the bank should have provided the individual with targeted information, including (i) what personal data was used as input; (ii) what factors the algorithm assessed when making its decision; and (iii) the criteria it used to reject the application.

What to do: As businesses integrate more AI tools into their systems, in particular ones with ADM capabilities, they should be mindful of the additional compliance obligations under GDPR Art. 22 for these types of technologies, including the right not to be subject to a decision based solely on automated processing. Businesses that utilise ADM tools may want to revisit their policies and procedures to ensure that they address this right and provide the required transparency. Enquiries about these decisions must be answered with sufficiently detailed, individualised responses.

 

To subscribe to the Data Blog, please click here.

The cover art used in this blog post was generated by DALL-E.

Author

Robert Maddox is International Counsel and a member of Debevoise & Plimpton LLP’s Data Strategy & Security practice and White Collar & Regulatory Defense Group in London. His work focuses on cybersecurity incident preparation and response, data protection and strategy, internal investigations, compliance reviews, and regulatory defense. In 2021, Robert was named to Global Data Review’s “40 Under 40”. He is described as “a rising star” in cyber law by The Legal 500 US (2022). He can be reached at rmaddox@debevoise.com.

Author

Dr. Friedrich Popp is an international counsel in the Frankfurt office and a member of the firm’s Litigation Department. His practice focuses on arbitration, litigation, internal investigations, corporate law, data protection and anti-money laundering. In addition, he is experienced in Mergers & Acquisitions, private equity, banking and capital markets and has published various articles on banking law.

Author

Fanny Gauthier is an associate in Debevoise's Litigation Department, based in the Paris office. Ms. Gauthier is a member of the firm’s International Dispute Resolution Group, as well as the firm’s Data Strategy & Security practice. Her practice focuses on complex commercial litigation, international arbitration and data protection. She can be reached at fgauthier@debevoise.com.

Author

Martha Hirst is an associate in Debevoise's Litigation Department based in the London office. She is a member of the firm’s White Collar & Regulatory Defense Group, and the Data Strategy & Security practice. She can be reached at mhirst@debevoise.com.

Author

Aisling Cowell is an associate in the Litigation Department based in the London office. She is a member of the firm’s White Collar & Regulatory Defense Group. She can be reached at acowell@debevoise.com.

Author

Tristan Lockwood is an associate in the firm’s Data Strategy & Security practice. He can be reached at tlockwood@debevoise.com.

Author

Maria Epishkina is a corporate associate and a member of the Mergers & Acquisitions, Capital Markets and Private Equity Groups. She can be reached at mepishkina@debevoise.com.

Author

Alexandre Pous is a legal intern and trainee in the Litigation Department at the Debevoise Paris office.