Our top-five European data protection developments from August are:

  • Uber fined for personal data transfer: The Dutch Data Protection Authority fined Uber €290 million for the unlawful transfer of European drivers’ personal data to the U.S., following Uber’s move away from relying on the standard contractual clauses (“SCCs”) in 2021. Businesses may wish to review their own cross-border data transfer arrangements against the decision to assess whether they are compliant.
  • Breach enforcement: The UK ICO announced a provisional decision to fine a service provider to the UK National Health Service £6.09 million for failing to implement measures to protect personal data following a ransomware incident in August 2022. In light of the decision, businesses – including data processors – may wish to review their ransomware preparedness, including by reference to the ICO’s ransomware guidance.
  • Criminal enforcement action for unlawfully accessing personal data: A former employee of Enterprise Rent-A-Car was fined £10,000 by a UK criminal court for unlawfully accessing personal data. Businesses might consider using this case as a reminder to employees of the potential criminal sanctions for data protection-related misconduct.
  • Italian DPA fines bank €1 million for GDPR breaches: The Italian DPA fined Credit Agricole Auto Bank for failing to obtain the necessary permission when performing fraud checks on vehicle financing customers. Businesses might want to consider whether their current methods and policies are suitable to ensure compliance with data subject access requests.
  • AI chatbots: Following the Dutch DPA’s warning of a rise in data leaks from employees using AI chatbots, businesses may wish to assess whether internal policies and controls adequately cover the risks associated with chatbot usage.

Dutch DPA fines Uber €290 million for data transfer violations

What happened: The Dutch Data Protection Authority fined Uber €290 million for transferring European drivers’ personal data to the U.S. without sufficient safeguards. The DPA investigated after the French human rights organisation Ligue des droits de l’Homme complained to the French DPA on behalf of 170 French drivers.

The Dutch DPA found that Uber collected account details, taxi licences, location data, photos, payment details, identity documents and, in some cases, criminal and medical data. For over two years, Uber transferred that data from the Netherlands to its U.S. headquarters unlawfully after it stopped relying on the SCCs to transfer data in August 2021. Instructions accompanying the SCCs stated that they were not to be used where the importer is subject to the GDPR. Uber therefore relied instead on the GDPR derogations for contractual necessity or for a contract concluded in the interest of the data subject. The Dutch DPA held that Uber could not rely on those derogations because, in its view, the transfers were neither “incidental” nor “necessary” and therefore did not meet the relevant requirements. The transfers were systematic and repetitive, and the mere existence of an agreement to share data could not itself constitute necessity.

Uber has since signed up to the EU-U.S. Data Privacy Framework, introduced in 2023, which addresses the issue, and has signaled its intent to challenge the fine.

What to do: Businesses that transfer personal data to third countries in reliance on the derogation for contractual necessity may wish to review those transfers to consider whether they align with the Dutch DPA’s interpretation of the derogations. Those wishing to rely on the SCCs for transfers to GDPR-covered parties in third countries might also see the decision as support for that choice.

ICO proposes £6.09 million fine against NHS service provider for August 2022 ransomware incident

What happened: Following a cyberattack in August 2022, the UK Information Commissioner’s Office (ICO) has provisionally decided to fine Advanced Computer Software Group (OneAdvanced), an IT software and services provider to the NHS, £6.09 million for failing to implement appropriate measures to protect personal data. This is a relatively rare example of a data processor facing enforcement action for a data breach, although the ICO will consider OneAdvanced’s representations before reaching a final decision. The ICO has provisionally found that, during the ransomware incident, attackers accessed several of OneAdvanced’s systems via a customer account that did not have multifactor authentication enabled. The attackers exfiltrated the personal data of 82,946 people, including phone numbers and medical records, as well as details of how to gain entry to the homes of 890 people who were receiving care at home.

What to do: Businesses may wish to review their ransomware preparedness, including by reference to the ICO’s ransomware guidance, as well as the ICO’s guidance on the responsibilities and liabilities of both data processors and controllers, explicitly referred to in the ICO’s press release. The proposed penalty also highlights the importance of implementing secure log-in mechanisms including multifactor authentication.

Cardiff Crown Court fines UK individual £10,000 for unlawfully accessing personal data

What happened: The UK ICO announced that a former employee of car rental business Enterprise Rent-A-Car (“Enterprise”) was fined £10,000, plus costs of £1,700, after pleading guilty to the criminal offence of unlawfully accessing personal data contrary to section 55 of the Data Protection Act 1998 (“DPA 1998”).

Under section 55 DPA 1998, it is an offence for a person to knowingly or recklessly, without the consent of the data controller, obtain or disclose personal data, or procure the disclosure to another person of the information contained in personal data.

This offence now exists under section 170 of the Data Protection Act 2018 (“DPA 2018”) in addition to a new offence of knowingly retaining personal data without the consent of the data controller, even if the data has been lawfully obtained.

The former employee left Enterprise in 2009 and established his own personal injury firm. He remained in contact with former colleagues at Enterprise and, between 2009 and 2011, used these connections to obtain the details of individuals who had been involved in road traffic accidents in order to contact them and offer legal services. The former employee was previously ordered to pay a civil settlement of £300,000 to Enterprise. Following the guilty plea, the ICO thanked Enterprise for promptly informing the ICO of the breach and for supporting the ICO in its case.

What to do: Businesses may wish to use this case as a reminder to employees of the potential criminal sanctions for data protection-related misconduct, alongside ensuring that data protection systems and internal controls are sufficient to mitigate the risk of similar personal data breaches involving current and former employees.

Dutch DPA warns businesses of potential data leaks when employees use AI chatbots

What happened: The Dutch DPA warned of a rise in data leaks in the Netherlands following employees sharing customers’ personal data with AI-powered chatbots. The DPA highlighted its position that most AI chatbot suppliers store all data entered into the chatbot, creating the risk of GDPR violations where, for example, personal data ends up stored on a service provider’s servers without consent or knowledge.

The warning follows reports of employees using AI chatbots without their employer’s knowledge (and in some cases, in express breach of internal data policies). For example, the DPA pointed to a data leak where an employee at a medical practice entered patients’ medical data into an AI chatbot. The DPA considered this a major violation of the affected patients’ privacy. Similarly, the DPA shared details of a report from a telecom company where an employee had entered customer addresses into an AI chatbot.

What to do: Businesses might consider establishing clear policies regarding the use of AI chatbots. If businesses allow employees to use AI chatbots in their work, it is important to clarify what data employees may enter. Where possible and practicable, businesses may wish to take steps to ensure that third-party service providers neither store the data entered nor use it for their own purposes. For more on AI chatbots, read our guidance on mitigating AI risks for customer service chatbots.

Italian DPA fines Credit Agricole Auto Bank €1 million for breaching GDPR

What happened: The Italian DPA fined Credit Agricole Auto Bank (CA Auto Bank) €1 million for GDPR breaches related to the unlawful processing of personal and income data of customers seeking financing for long-term car rentals.

The DPA intervened after a customer complained that they had not received a response explaining why they had been denied financing or why they had been included on a blacklist. The DPA’s investigation found that CA Auto Bank breached GDPR provisions regarding data subjects’ right of access to certain information and the obligation to provide information to the data subject in response to a request without undue delay.

The DPA also fined Drivalia €250,000 in connection with CA Auto Bank’s unauthorised access to the Scipafi (Centralised Fraud Prevention System) database. When performing fraud checks, CA Auto Bank accessed the Scipafi database on behalf of Drivalia (the bank’s car leasing subsidiary) without the requisite permission of the Ministry of Economy and Finance. In its decision, the DPA noted that, when denying access to financing, Drivalia did not provide customers with adequate information to identify: (i) the type and origin of the data processed; (ii) the databases consulted to verify customers’ incomes; and (iii) whether the databases were consulted by Drivalia or through CA Auto Bank.

What to do: The DPA’s decision serves as a reminder of the significant financial consequences for failing to adequately comply with data subject access requests. Businesses performing fraud checks in Italy using the Scipafi database (whether directly or through a third party) may want to consider ensuring that the necessary permission is obtained from the Italian Ministry of Economy and Finance.

***

To subscribe to the Data Blog, please click here.

The cover art used in this blog post was generated by Microsoft Copilot.

Author

Robert Maddox is International Counsel and a member of Debevoise & Plimpton LLP’s Data Strategy & Security practice and White Collar & Regulatory Defense Group in London. His work focuses on cybersecurity incident preparation and response, data protection and strategy, internal investigations, compliance reviews, and regulatory defense. In 2021, Robert was named to Global Data Review’s “40 Under 40”. He is described as “a rising star” in cyber law by The Legal 500 US (2022). He can be reached at rmaddox@debevoise.com.

Author

Dr. Friedrich Popp is an international counsel in the Frankfurt office and a member of the firm’s Litigation Department. His practice focuses on arbitration, litigation, internal investigations, corporate law, data protection and anti-money laundering. In addition, he is experienced in Mergers & Acquisitions, private equity, banking and capital markets and has published various articles on banking law.

Author

Aisling Cowell is an associate in the Litigation Department based in the London office. She is a member of the firm’s White Collar & Regulatory Defense Group. She can be reached at acowell@debevoise.com.

Author

Deniz Tanyolac is a trainee associate in the Debevoise London office.

Author

Oliver Binns is a trainee associate in the Debevoise London office.

Author

Anna Chirniciuc is a trainee associate in the Debevoise London office.

Author

Olivia Halderthay is a trainee associate in the Debevoise London office.

Author

Samuel Thomson is a trainee associate in the Debevoise London office.