Key takeaways from January include:

  • Transparency about data processing and retention: In a reminder of the importance of transparency under the GDPR, and the need for companies to make their data subject access request processes easy to navigate, the Dutch data protection authority fined Uber €10 million for, amongst other failings: (i) not specifying to drivers how long it retained their personal data; and (ii) making data subject access requests “unnecessarily complicated”.
  • GDPR one-stop-shop: Businesses wishing to take advantage of the GDPR one-stop-shop system should take note of a new digest, published by the European Data Protection Board, which analyses the decisions made by so-called Lead Supervisory Authorities in this context. In particular, businesses should note the case-by-case approach adopted by Lead Supervisory Authorities when considering the appropriateness of technical and organizational measures.
  • Generative AI: Businesses that currently (or intend to) use or develop generative AI tools may wish to monitor the UK ICO’s new consultation series on the impact of data protection law on the use and development of generative AI. In particular, businesses who train AI models should consider the ICO’s latest guidance on the legality of data scraping in the UK.
  • CNIL enforcement action: Businesses should take note of the French regulator’s most recent series of enforcement actions, which highlight its position as a proactive DPA, including: (i) fining Amazon Logistics €32 million for excessive employee monitoring; (ii) fining Yahoo €10 million for cookies violations; and (iii) fining a data broker €75,000 for consent failings.
  • Data protection impact assessments: Businesses introducing new types of data processing, particularly when sensitive or special category data is involved, may wish to consider whether they are obliged to carry out a data protection impact assessment, after the Dutch data protection authority fined a credit card company for failing to conduct such an assessment in relation to its identification and verification process.
  • Transfer impact assessment: The CNIL has organized a public consultation on a draft guide regarding transfer impact assessments for organizations transferring personal data outside of the EEA. Businesses operating under the CNIL’s supervision may, therefore, want to revisit their existing approach to TIAs to assess whether they align with the CNIL’s latest expectations.
  • Cybercrime enforcement in France: The French police have increased their firepower against cybercrime, by formally announcing the creation of a specialized office. However, it remains to be seen how this new office will interact with companies in the context of cyber incidents.
  • EU Data Act: Legislation has entered into force which will impose a range of new obligations on businesses from September 2025. These new obligations will include greater regulation of contractual terms in data sharing contracts and greater rights to access and share data generated from “connected devices” and related services.
  • Adequacy decisions: Personal data will continue to be able to be easily transferred from the EU to Andorra, Argentina, Canada, the Faroe Islands, Guernsey, the Isle of Man, Israel, Jersey, New Zealand, Switzerland and Uruguay, after the EU Commission confirmed that pre-GDPR adequacy decisions for all eleven jurisdictions will remain in place.

These developments, and more, are covered below.

Dutch DPA fines Uber for transparency failings relating to data retention policies and access requests

What happened: The Dutch Autoriteit Persoonsgegevens fined Uber €10 million for failing to specify how long it would retain Uber drivers’ personal data, the non-EU countries to which the data would be sent, and the specific security measures it would put in place when sending this data outside of the EU.

The authority also held that Uber had violated the GDPR by making data subject access requests (“DSARs”) “unnecessarily complicated” as the relevant form was located “deep within the app” and spread across various menus. Further, when Uber did provide drivers with copies of their personal data, it was not presented in an accessible format.

What to do: Businesses that share GDPR-protected personal data outside of the EU may wish to review their disclosures to ensure that there is sufficient transparency about where this personal data is shared, and what measures are taken to ensure its security. Businesses should also consider reviewing their policies and practices for responding to DSARs to ensure that the DSAR process is easy to identify and navigate.

EDPB publishes guidance on “one-stop-shop”

What happened: The European Data Protection Board published guidance on GDPR one-stop-shop cases, examining a diverse range of scenarios and considering how national DPAs have interpreted GDPR security and breach notification requirements. Common themes which emerge from the EDPB’s digest include:

  • When considering the appropriateness of technical and organizational measures, national DPAs have adopted a case-by-case approach, which often considers remedial action taken by the company after the data breach.
  • Despite the fact that the GDPR only requires data controllers to notify supervisory authorities of a data breach if they think that it is “likely to result in a risk to the rights and freedoms of natural persons”, in practice, where there is any doubt about whether this bar has been met, controllers tend to notify authorities instead of taking the risk of non-notification.
  • In relation to the individual notification obligation, only one DPA (of those reviewed by the EDPB) appears to have actually carried out an evaluation of the specific risk posed by the data breach to the individuals concerned and whether it was “likely to result in a risk to the rights and freedoms” of those individuals.

What to do: Businesses may wish to keep this guidance in mind when considering their response to data breaches and whether to engage in the one-stop-shop system. While determinations of DPAs are made on a case-by-case basis, and can be highly fact specific, the EDPB digest provides helpful examples and analogies for assessing the likely response of national DPAs.

UK ICO launches consultation on data protection law and generative AI

What happened: The UK Information Commissioner’s Office launched a new consultation series on how data protection law should apply to the development and use of generative AI. The ICO intends to share a series of chapters, over the coming months, examining the impact of UK GDPR and the Data Protection Act 2018 on AI development.

The ICO’s first chapter, which examined the lawful basis for data scraping for the purpose of training generative AI models, was published to coincide with the announcement of the consultation series. The ICO reiterated that, in order for training data to be collected lawfully under UK law, the processing must: (i) not be in breach of any laws; and (ii) have a valid lawful basis under UK GDPR – with the ICO noting that the only basis which is likely to be available for training generative AI on web-scraped data is the “legitimate interests” basis.

The consultation series responds to the UK government’s recent confirmation of its AI regulatory strategy: to rely on individual regulators, including the ICO, to oversee AI within their areas of competence using their existing powers. In connection with this strategy, the UK government has requested that the ICO, and multiple other UK regulators, publish an update outlining their strategic approach to AI by 30 April 2024.

What to do: Businesses that currently (or intend to) use or develop generative AI tools may wish to keep track of this series and should consider how the upcoming guidance will affect their internal data protection and compliance procedures.  In particular, businesses may want to review and/or update their records of data processing and legitimate interests assessments to align with the latest guidance. Businesses should also keep an eye out for the ICO’s upcoming update on its strategic approach to AI regulation.

CNIL fines Amazon Logistics €32 million for excessive employee monitoring

What happened: The CNIL fined Amazon France Logistique, which manages the Amazon group’s large warehouses in France, €32 million for: (i) setting up an excessively intrusive system to monitor employee activity and performance; and (ii) using video surveillance without proper information and sufficient security.  In particular, the CNIL found that each Amazon employee was issued with a scanner device to conduct certain tasks, and Amazon was collecting and analyzing data from the scanners in real time as a proxy for employee productivity. The CNIL identified various issues with this activity, including that the monitoring was excessive and often inaccurate.

Meanwhile, across the Rhine, a German administrative court upheld Amazon’s appeal concerning the company’s continuous monitoring of employee activity and performance (but not behaviour), which it conducts based on its stated legitimate interest in processing the data to control logistics, and with prior notice given to the employees.  This decision has been appealed by the data protection authority and will be reviewed by a higher German administrative court.

What to do: The CNIL appears to be placing increased scrutiny on employee monitoring practices, as we have previously covered. GDPR-compliant employee monitoring can be a tricky landscape to navigate, and one that employers should be particularly mindful of given the recent rise in AI-based employment and productivity tools, many of which may constitute employee monitoring. Businesses with employees in the EU and UK should ensure that their approach to employee monitoring is proportionate and accurate, that employees are informed, and that personal data collected through this monitoring is processed and retained in accordance with GDPR requirements.

CNIL fines Yahoo €10 million over alleged cookie consent violations

What happened: The French data protection authority fined Yahoo EMEA Limited €10 million over alleged violations of the French Data Protection Act’s cookies requirements.  The CNIL held that the company implemented advertising cookies on “Yahoo.com” users’ devices without consent and failed to allow “Yahoo! Mail” users to freely withdraw their consent to cookies.

What to do: As previously noted, the CNIL continues to be a leading enforcer in cookies practices.  Businesses may want to review their existing cookie practices and ensure alignment with the authority’s expectations.  As a reminder, under the CNIL’s guidance, refusing cookies must be as easy as accepting them.

CNIL fines data broker €75,000 for collecting prospect data without valid consent

What happened: The CNIL fined Tagadamedia, a French digital acquisition marketing company, €75,000 for several GDPR violations.  The French DPA held that the forms used by the company to collect data from prospective customers did not allow free, informed and unambiguous consent, as their presentation (including the highlighting of buttons and the text size) strongly encouraged users to agree to the transmission of their data to partners.  The CNIL also noted that the company had failed to correctly implement its register of processing activities.

What to do: Investigating direct marketing was one of the CNIL’s top priorities in 2022, and the practices of professionals in this sector, including data brokers, are apparently still on the authority’s radar.  Businesses engaging in commercial prospecting in France should verify that their consent forms are aligned with the authority’s expectations, including by ensuring that refusing transmission of personal data to partners is as easy as accepting it.

CJEU clarifies the right to compensation under GDPR Art. 82 

What happened: The German Local Court of Hagen requested a preliminary ruling from the CJEU on the correct interpretation of GDPR Art.82, which covers the right to compensation.

The request arose from a personal data breach affecting a customer who was attempting to purchase a household appliance. Contracts relating to this purchase, which contained the personal data of the individual (including their name, address, employer, income and bank details), were handed over to a third party, in error, by an employee of the data controller and were not recovered for a period of 30 minutes. The data subject sought compensation from the data controller under Art.82 GDPR.

In its preliminary ruling, the CJEU confirmed that:

  • An individual claiming compensation under Art.82 must prove not only that there was an infringement of the obligations under the GDPR, but also that this infringement caused material or non-material damage;
  • A mere feeling of discomfort from the possibility of a third party accessing an individual’s personal data is not sufficient to qualify as non-material damage;
  • The function of Art.82 is compensatory, not punitive;
  • The quantum of compensation under Art.82 should be based on the damage suffered by the claimant rather than the severity of the infringement; and
  • The mere fact that an employee had passed the contracts to a third party was not, by itself, sufficient for a finding that the controller’s technical and organisational measures were unsuitable. Instead, the assessment of whether measures were suitable must be made in a “concrete manner”.

What to do: While this guidance is broadly consistent with previous CJEU judgments, the CJEU’s specific clarifications should be kept in mind when businesses are deciding how to respond to GDPR compensation claims. In particular, the judgment demonstrates that the right to compensation under the GDPR is more narrowly construed than is sometimes presented on public forums: it requires both that the business has breached its GDPR obligations and that the breach has caused material or non-material damage. Further, judgments such as this should serve as a salient reminder for businesses to ensure that they have suitable technical and organisational measures in place when processing personal data, so that data subjects do not suffer material or non-material harm if/when personal data breaches occur.

Dutch DPA fines International Card Services for failing to complete a data protection impact assessment

What happened: The Dutch Autoriteit Persoonsgegevens fined credit card company, International Card Services (“ICS”), €150,000 for using customers’ personal data without conducting a data protection impact assessment (“DPIA”).

Under the GDPR, a DPIA is required where processing is likely to result in a “high risk to the rights and freedoms of natural persons”. A valid DPIA will consider the risks that arise from that particular processing of sensitive information and is used to determine the necessary measures to counteract these risks.

The Dutch DPA held that ICS should have carried out a DPIA before introducing a system of online identity checks involving approximately 1.5 million people in the Netherlands. The Autoriteit Persoonsgegevens held that this processing presented a high risk to rights and freedoms because of the significant number of people impacted and the sensitive nature of the personal data involved (e.g., pictures of customers and government-issued identification numbers).

The authority also stated that the limited impact assessment which ICS had carried out was insufficient because it focused on financial services regulation rather than data protection requirements and did not include a sufficient role for the company’s data protection officer.

What to do: Before commencing a new type of data processing, businesses should carefully consider whether a DPIA is required. Businesses should be particularly mindful of this requirement if the type of data processing involves: (i) a systematic evaluation of natural persons based on automated decision making; (ii) processing, on a large scale, of special category data or personal data relating to criminal convictions; or (iii) the systematic monitoring of public areas. Businesses should also be mindful of mandatory DPIA lists issued by member states, which supplement the broad requirements under the GDPR itself.  If they determine that a DPIA is required, businesses may wish to evaluate their DPIA processes to ensure that there is sufficient involvement of their data protection officer.

CNIL organizes public consultation on draft Transfer Impact Assessment guide

What happened: Before transferring personal data outside of the European Economic Area (“EEA”), organisations must assess the level of data protection in the destination country – an exercise known as a Transfer Impact Assessment (“TIA”) – and, in most cases, guarantee essentially the same level of protection as the GDPR if the transfer is to go ahead.  To assist with this process, the CNIL organised a public consultation, ending 12 February 2024, on a draft guide for TIAs.  The TIA guide will provide a methodology and a checklist identifying the various elements to be considered when carrying out a TIA.

What to do: Businesses transferring or receiving personal data from individuals in France should keep an eye out for this CNIL guide, as it should provide some practical tips for TIAs.  While technical developments such as cloud services allow actors to easily transfer and process personal data outside the EEA, businesses still find it challenging to ensure that transferred data is processed in compliance with GDPR requirements.

French police announce creation of cybercrime office, OFAC

What happened: The French national police formally announced the creation of a new anti-cybercrime office, OFAC, that will help combat specialised, organised and transnational cybercrime. The group was created following growing cyber threat concerns around the 2024 Paris Olympic Games. It replaces two previous investigative units within the national police and has reportedly already assisted French law enforcement in the arrest of a member of the Hive ransomware group.

What to do: The creation of this specialized office is expected to boost cybercrime investigations in France and will likely facilitate international cooperation and cross-border investigations.

EU Data Act enters into force

What happened: The EU Data Act entered into force. This new legislation is intended to ensure fairness in the data economy by: (i) stipulating who can create value from data and under which conditions; and (ii) stimulating a competitive data market by unlocking industrial data. It emerged in response to the rapid growth in “connected devices” (i.e., devices connected to the internet which generate or collect data concerning their use and/or environment), and the subsequent increase in the volume of data being generated. The Act will impose a range of obligations and protections, including:

  • Enabling users of connected devices to access and share the data generated by the device;
  • Introducing mechanisms for public bodies to access and use data held by the private sector to help respond to public emergencies;
  • Protecting businesses from unfair contractual terms in data sharing contracts;
  • Allowing customers to switch seamlessly between different cloud providers to promote choice and competition on the market;
  • Safeguarding against unlawful requests by non-EU authorities to transfer or access non-personal data held in the EU; and
  • Promoting the development of interoperability standards for data sharing and data processing services.

These new obligations and protections will take effect from September 2025.

What to do: Businesses that manufacture connected devices or provide related services should begin reviewing and auditing their operations to ascertain how they are likely to be impacted by the Act. Such businesses may wish to assess their compliance strategies well in advance of September 2025, when the Act’s obligations begin to apply, to allow sufficient time for technical and operational adjustments.

EU Commission confirms that eleven pre-GDPR adequacy decisions will remain in place

What happened: Under its obligation to periodically review all adequacy decisions, the EU Commission upheld the eleven adequacy decisions adopted pre-GDPR (covering Andorra, Argentina, Canada, the Faroe Islands, Guernsey, the Isle of Man, Israel, Jersey, New Zealand, Switzerland and Uruguay). Therefore, these adequacy decisions will remain in place, and personal data can continue to flow freely from the EU to these jurisdictions without the need for further safeguards.

What to do: Nothing – businesses can welcome the continued legal certainty offered by the adequacy decisions. Moreover, businesses should note that the Commission is actively exploring the possibility of launching adequacy talks with its Asian and Latin American partners, which could further improve businesses’ ability to transfer personal data across borders.


The cover art used in this blog post was generated by DALL-E.

 

Author

Robert Maddox is International Counsel and a member of Debevoise & Plimpton LLP’s Data Strategy & Security practice and White Collar & Regulatory Defense Group in London. His work focuses on cybersecurity incident preparation and response, data protection and strategy, internal investigations, compliance reviews, and regulatory defense. In 2021, Robert was named to Global Data Review’s “40 Under 40”. He is described as “a rising star” in cyber law by The Legal 500 US (2022). He can be reached at rmaddox@debevoise.com.

Author

Dr. Friedrich Popp is an international counsel in the Frankfurt office and a member of the firm’s Litigation Department. His practice focuses on arbitration, litigation, internal investigations, corporate law, data protection and anti-money laundering. In addition, he is experienced in Mergers & Acquisitions, private equity, banking and capital markets and has published various articles on banking law.

Author

Fanny Gauthier is an associate in Debevoise's Litigation Department, based in the Paris office. Ms. Gauthier is a member of the firm’s International Dispute Resolution Group, as well as the firm’s Data Strategy & Security practice. Her practice focuses on complex commercial litigation, international arbitration and data protection. She can be reached at fgauthier@debevoise.com.

Author

Martha Hirst is an associate in Debevoise's Litigation Department based in the London office. She is a member of the firm’s White Collar & Regulatory Defense Group, and the Data Strategy & Security practice. She can be reached at mhirst@debevoise.com.

Author

Jeevika Bali is a trainee associate in the Debevoise London office.

Author

Samuel Thomson is a trainee associate in the Debevoise London office.

Author

Alexia Turpin is a trainee associate in the Debevoise Paris office.