Our top five European data protection developments from May are:

  • UK guidance on ransom payments: The UK NCSC and various insurance industry bodies co-published guidance on key considerations for ransomware payments. The guidance does not introduce new restrictions or obligations, and is consistent with prior industry standards, as well as UK NCSC and UK ICO messaging. However, there may be an added expectation on insured entities to closely follow guidance recommendations as a condition of claiming on their cyber-insurance coverage.
  • Guidance on AI and data protection: Consolidating prior advice from various German DPAs on generative AI and data protection, the DSK has published guidance on the development, implementation and deployment of generative AI systems in compliance with the GDPR. While the guidance does not mark a major shift in the German DPAs’ approach to AI regulation, it helpfully consolidates the existing advice on this important topic.
  • Security failings enforcement: In a reminder of the importance of implementing adequate security measures, the Spanish DPA fined a lending institution €360,000 for, among other things, failing to have two-factor authentication in place for user accounts used for loan applications. This serves as a reminder that, while not specifically required under the GDPR, multi-factor authentication is increasingly considered a standard security control.
  • Facial recognition: The Dutch DPA has published guidance on its approach to the use of facial recognition software on employees and members of the public. As a general rule, the use of such software is not permitted under the GDPR and local law unless a specific exemption applies.
  • Enforcement for employee monitoring failings: Following the trend of DPAs across Europe scrutinising workplace monitoring, the Italian DPA fined a local municipality for relatively minor breaches of employee monitoring requirements, emphasising the importance of compliance for businesses, and providing insights into regulator expectations.

These developments are covered in more detail below.

UK NCSC and insurance industry bodies publish guidance on ransom payments

What happened: The UK National Cyber Security Centre (“NCSC”) and several insurance industry bodies have jointly published guidance on key considerations for ransomware payments.

Importantly, the guidance does not prohibit, or put additional restrictions on, cyber ransom payments. Instead, it outlines general recommendations which are consistent with current industry standards, and with prior NCSC and ICO communications aimed at reducing the number and size of ransom payments in the UK. The guidance reiterates the NCSC's and ICO's prior joint letter, clarifying that payment of a ransom will not be viewed as a mitigating factor when assessing the impact of a data breach, nor will the ICO reduce any regulatory penalties in light of a ransom payment. However, the guidance does highlight that engagement with the NCSC during a ransomware attack could be a mitigating factor in any associated ICO penalty. This follows the spirit of the prior blogpost by the NCSC and ICO, which urged greater transparency with UK authorities on cyber-attacks, and in particular ransomware attacks. Similarly, the UK Joint Committee on National Security Strategy recently called on the government to require companies to report all ransomware incidents to a central database within three months of their occurrence.

What to do: The guidance does not introduce new restrictions or obligations, and is consistent with industry standards and prior NCSC and ICO messaging around ransom payments. However, the guidance is a helpful reminder of the potential benefits of engaging with law enforcement during cyber incidents. From a commercial standpoint, insurers may also more closely monitor whether insured entities follow the recommendations outlined in the guidance, and may expect adherence as a condition of cyber insurance coverage.

Germany’s DSK issues guidance on generative AI and data protection

What happened: The joint body of German DPAs (“DSK”) published guidance consolidating prior advice from various state DPAs on generative AI and data protection. It considers how generative AI systems, particularly large language models, can be deployed in compliance with data protection law by focusing on three phases of the AI lifecycle: (i) development; (ii) implementation; and (iii) deployment. The guidance follows the DSK’s position paper on the interrelationship between data protection and the EU AI Act, and aligns with advice provided by other European DPAs (including the CNIL and the ICO).

For the development stage, the guidance notes that where an AI system’s use of personal data constitutes automated decision-making under GDPR Art. 22, there must be sufficient human involvement in that processing for it to be GDPR-compliant. For example, the guidance suggests that having an AI system independently evaluate job applications and invite candidates for an interview, without any human involvement, would breach GDPR Art. 22. The guidance also indicates that: (i) using closed AI systems is preferable from a data protection standpoint; and (ii) businesses should consider giving individuals the option to opt out of having their personal data used to train the AI system, where applicable.

For the implementation phase, the guidance notes that: (i) each data controller must have its own legal basis for processing any personal data entered into, or associated with, the AI system; (ii) businesses should issue internal policies about the use of generative AI systems; and (iii) a data protection impact assessment should be carried out where high-risk data is involved. Finally, for deployment, the guidance notes the importance of reviewing outputs for accuracy and avoiding discrimination.

What to do: Data privacy considerations continue to be an important topic for AI use given the vast amount of data that is entered into, or associated with, an AI system. Data protection authorities have been the regulatory forerunners in bringing AI-related enforcement action. While the guidance does not mark a major shift in the approach of the German DPAs to AI regulation, it helpfully consolidates the existing advice on the topic. Businesses operating in Germany that have meaningful AI deployments may wish to review the guidance closely.

Spanish DPA fines lending institution for inadequate data security measures

What happened: Spain’s AEPD fined 4Finance Spain Financial Services S.A.U (“4Finance Spain”), a lending institution, €360,000 for inadequate security measures when approving loans, including a lack of two-factor authentication.

Last year, 4Finance Spain experienced a data breach after a threat actor used a brute-force attack to gain access to the financial accounts of approximately 9,000 clients and employees. The threat actor then took out loans in the data subjects’ names, which 4Finance Spain approved. It took several months for 4Finance Spain to identify the data breach, despite repeated client reports of unauthorised loan activity. Further, once the breach was identified, 4Finance Spain apparently initially assessed it as not presenting a high risk of harm to the underlying data subjects, despite the fact that the exposed data elements included names, birth dates, national identification numbers, passports, payment data and contact information, and had resulted in certain individuals experiencing financial fraud.

The AEPD found that 4Finance Spain had likely violated both GDPR Art. 5(1)(f) and Art. 32 by failing to: (i) process personal data in a way that ensures appropriate security protection against unauthorised processing and loss; and (ii) implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk. In making the latter finding, the AEPD reminded 4Finance Spain of the enhanced level of caution required when processing financial or identification data and criticised the controller for lacking two-factor authentication in relation to its systems. The AEPD also criticised 4Finance Spain’s internal policies and processes for identifying when a data breach had occurred, and then assessing the severity of the breach, as being inadequate.

What to do: Businesses processing sensitive categories of personal data (in particular, financial data or information which could be used to commit identity fraud) may wish to undertake a review of their technical and organisational measures to verify that they offer a level of security appropriate to the risk. In particular, this is another timely reminder of the increasing regulatory expectations around implementing multi-factor authentication, especially for financial accounts or those containing other types of more sensitive personal data.

Dutch data authority publishes guidance on use of facial recognition software

What happened: The Dutch DPA published guidance on the requirements for using facial recognition software on employees or members of the public under both the GDPR and local law. According to the guidance, the general rule is that the use of facial recognition software is prohibited, unless one of the specific exemptions applies. The Dutch DPA clarified that, in its view:

  • The most common exemption is where the use of facial recognition is necessary for public security (e.g., verifying the identity of nuclear power plant employees or individuals purchasing hazardous substances). In these circumstances, the guidance reminds businesses that they need to complete a data protection impact assessment in advance of using the facial recognition software.
  • The GDPR’s general prohibition on processing “special category” personal data (which includes race and religion) without consent also applies to data collected via facial recognition software for the purpose of confirming someone’s identity.
  • Facial recognition software can be used in device applications (e.g., to unlock your phone) as long as the data collected is stored locally on users’ devices, and the user retains the choice to use another means of identification, such as a PIN.

What to do: Businesses with operations or employees in the Netherlands should review any current or planned use of facial recognition software in light of the guidance. If any exemptions potentially apply, businesses should consider what documents (such as an impact assessment) they might need to complete before deploying the system. The guidance may also be indicative of the approach to, and considerations around, this issue in other European markets.

Italian DPA fines local municipality for unlawful employee monitoring

What happened: The Italian DPA fined the Municipality of Madignano €3,000 for unlawful employee monitoring conducted without first implementing a trade union agreement, or obtaining public authority authorisation, per local law requirements. The Municipality installed CCTV cameras over the clock-in machine employees use to record their attendance. It subsequently used CCTV footage in disciplinary proceedings against an employee as evidence that they had not adhered to their contracted working hours.

The Italian DPA determined that the Municipality failed to comply with the relevant domestic legislation which required it to obtain trade union agreement or public authority authorisation before undertaking employee monitoring. Further, the Italian DPA held that the Municipality’s use of these recordings amounted to unlawful processing of personal data contrary to GDPR Arts. 5(1)(a) and 6. Finally, the Municipality’s failure to properly inform the employee of the monitoring constituted a breach of the GDPR’s transparency obligations.

What to do: Employee monitoring continues to be a focal topic for European DPAs. As previously covered, the UK, French and Finnish DPAs have all recently taken similar actions in relation to employee monitoring. Further, the increasing scope and wide-scale accessibility of AI technology, which could be used in a way that constitutes employee monitoring, has triggered renewed attention in this area. This penalty is a timely reminder of the importance of ensuring any employee monitoring complies with local privacy and employment law requirements, with a particular focus on transparency and proportionality, even if the monitoring ultimately proves wrongdoing by the employee in question. Given the Italian DPA’s willingness to pursue a relatively minor breach, businesses should also be mindful that enforcement action is not necessarily reserved only for the most egregious breaches.

To subscribe to the Data Blog, please click here.

The cover art used in this blog post was generated by DALL-E.


Robert Maddox is International Counsel and a member of Debevoise & Plimpton LLP’s Data Strategy & Security practice and White Collar & Regulatory Defense Group in London. His work focuses on cybersecurity incident preparation and response, data protection and strategy, internal investigations, compliance reviews, and regulatory defense. In 2021, Robert was named to Global Data Review’s “40 Under 40”. He is described as “a rising star” in cyber law by The Legal 500 US (2022). He can be reached at rmaddox@debevoise.com.


Dr. Friedrich Popp is an international counsel in the Frankfurt office and a member of the firm’s Litigation Department. His practice focuses on arbitration, litigation, internal investigations, corporate law, data protection and anti-money laundering. In addition, he is experienced in Mergers & Acquisitions, private equity, banking and capital markets and has published various articles on banking law.


Martha Hirst is an associate in Debevoise's Litigation Department based in the London office. She is a member of the firm’s White Collar & Regulatory Defense Group, and the Data Strategy & Security practice. She can be reached at mhirst@debevoise.com.


Michiko Wongso is an associate in the firm’s Data Strategy & Security Group. She can be reached at mwongso@debevoise.com.


Anna Chirniciuc is a trainee associate in the Debevoise London office.


Alexander McGinley is a trainee associate. He can be reached at amcginley@debevoise.com.


Deniz Tanyolac is a trainee associate in the Debevoise London office.


Samuel Thomson is a trainee associate in the Debevoise London office.