Key takeaways from October include:

Employee monitoring: Following new guidance issued by the UK ICO, employers may want to review their existing employee monitoring to ensure it meets the regulator’s latest expectations, including ensuring that any monitoring is necessary, proportionate, and conducted transparently.

Data protection & AI: In particular: (i) the French CNIL published its first set of guidance on GDPR compliance when developing AI tools; and (ii) the UK ICO issued a preliminary enforcement notice against Snap over its AI chatbot, alleging that Snap had not adequately assessed the privacy risks posed to child users of the tool.

Data subject access requests: The CJEU reminds businesses that: (i) those making subject access requests do not need to provide a reason for their request; and (ii) the first copy of a data subject's personal data should be provided free of charge and must include all documents required to understand the personal data they contain.

Consent for use of personal data: The German competition regulator used national competition laws to extend the Digital Markets Act’s user consent requirements to additional Google services offered within Germany.

UK FCA’s £11m data breach penalty: Failing to maintain adequate oversight of outsourced data processing activities, even when performed by group entities, may breach UK-regulated financial services firms’ duty to implement adequate risk management systems, as highlighted by the UK Financial Conduct Authority’s £11 million fine against Equifax Ltd for its 2017 data breach. The penalty also highlights the importance of promptly notifying affected individuals and providing accurate information, as a key part of regulated entities’ obligations to treat customers fairly and not mislead them.

Jigsaw identification: A photo of an individual being arrested, in the absence of that individual’s name, may still constitute personal data where individuals who witnessed the arrest knew the individual in question, according to the UK First Tier Tribunal. Businesses may want to review their data subject rights request policies, in particular, to ensure they address the potentially broad scope of jigsaw identification.

EU-U.S. Data Privacy Framework: EU businesses intending to take advantage of the EU-U.S. Data Privacy Framework to transfer personal data to the U.S. may want to consider guidance issued by the French and German data protection authorities. While guidance from the French CNIL, alongside that of the Bavarian DPA and the German Data Protection Conference, indicated a general acceptance of the DPF, the Thuringia DPA advised businesses to exercise caution and consider not transferring sensitive data to U.S. providers, notwithstanding the framework.

GDPR’s jurisdictional scope: Businesses that collect personal data of UK/EEA-based individuals may be “monitoring” them, and therefore subject to the GDPR, even if the businesses themselves are based elsewhere. “Monitoring” is a highly fact-specific concept but can include single incidents. However, in a reminder of the GDPR’s jurisdictional scope limitations, the English courts overturned the ICO’s penalty against Clearview AI on the basis that Clearview’s customers – law enforcement and national security agencies – fall outside the GDPR’s scope.

DMA gatekeepers reporting templates: The EU Commission has published a template for the compliance report that DMA gatekeepers will have to submit on an annual basis.

These developments, and more, are covered below.

UK ICO clarifies employee monitoring requirements under the UK GDPR

What happened: The UK ICO issued new guidance on the relationship between employee monitoring and data protection law. The ICO published the guidance in response to perceived increases in employee monitoring, e.g., by tracking calls, messages and keystrokes, and in some instances, taking screenshots and accessing webcam and audio data.

The guidance clarifies how employers can monitor employees lawfully and fairly, consistent with data protection laws. The ICO emphasises that:

  • Whilst employers may monitor employees under data protection law, any monitoring must be necessary, proportionate, and respect workers’ rights.
  • Employers must only collect data that is relevant to a clearly defined purpose and must use the least intrusive means of collection (including collecting the minimum amount of personal data necessary to meet the purpose).
  • Workers should be made aware of the nature, extent and reasons for monitoring in a way that is easy to understand. This includes providing information collected through monitoring in response to subject access requests.
  • Employers must carry out a Data Protection Impact Assessment before implementing any monitoring that is likely to result in a high risk to the (privacy) rights of workers (e.g., keystroke monitoring).

What to do: As we have previously covered, workplace monitoring practices continue to be scrutinised closely. Businesses with UK employees may want to review the guidance to help ensure that their employee monitoring complies with the recommendations. In particular, businesses may want to assess whether monitoring is truly necessary and, if it is, proportionate to achieve a clearly defined purpose, and whether employees are provided with sufficient information about it. Tools that continuously monitor employees’ behaviour, for example, may be unlikely to meet these requirements in the regulators’ eyes.

UK financial regulator fines Equifax £11 million for 2017 data breach

What happened: The UK Financial Conduct Authority (“FCA”) fined consumer credit rating agency, Equifax Ltd., £11 million for failings related to its 2017 cybersecurity breach that exposed 13.8 million UK individuals’ data, including names, phone numbers, addresses, dates of birth and partially exposed credit card details. The penalty follows the ICO’s 2018 £500,000 fine (the maximum permitted at the time).

The FCA found that Equifax Ltd. failed to meet its duties to implement adequate risk management systems, treat customers fairly and not mislead them by, among other things:

  • Independence and Oversight: Failing to treat its U.S. parent, Equifax Inc., as an outsourced service provider, leading to what the FCA viewed as inadequate oversight. This included criticism from the FCA that the “Equifax Security Incident Handling Policy & Procedures” created a risk that the interests of other parts of the Equifax group “could be placed above the interests of Equifax Ltd”. The FCA said this risk was potentially exacerbated by the Equifax Ltd. “Security Executive” reporting to Equifax Inc.’s global security executive; and
  • Notifications and Complaints Handling: Making public statements that gave consumers an inaccurate impression of the scale of the incident, and maintaining inadequate complaints handling procedures. The FCA found this was due to, among other factors, Equifax Ltd. only being informed by Equifax Inc. about the incident roughly five minutes before it was publicly announced. The FCA also criticised what it saw as an unreasonable delay in notifying UK-based individuals, caused in part by Equifax Inc. preventing Equifax Ltd.’s direct access to certain U.S. systems in the wake of the incident.

What to do: In light of the FCA’s findings, regulated firms may want to review their cybersecurity and technology outsourcing policies, procedures and structures to ensure that they adequately address risks posed by, and exercise appropriate oversight over, intra-group service arrangements. In particular, the FCA’s findings of potential intra-group conflicts of interest may provide useful guidance for firms considering how best to structure reporting lines. Finally, when responding to incidents, firms may want to ensure that internal policies, procedures and contractual arrangements provide for incidents to be escalated to group entities promptly and for appropriate information to be shared so that notifications can take place in a timely fashion.

French CNIL issues guidance on GDPR considerations when developing AI tools

What happened: The CNIL issued guidance on GDPR considerations when using personal data in developing AI systems. It consists of seven ‘how-to’ sheets covering: (i) determining whether the GDPR applies to the data in question; (ii) defining the purpose of data collection; (iii) determining the legal status of AI system providers; (iv) ensuring the lawfulness of data processing; (v) when to conduct a data protection impact assessment; (vi) embedding data privacy in system design; and (vii) responsible management of learning data. This is the first set of guidelines published by CNIL as part of an ongoing public consultation into the data protection issues raised by AI systems.

What to do: As previously covered on the blog, the intersection of AI and data protection – especially the GDPR – can be challenging to navigate, and there are a number of associated privacy considerations when inputting personal data into AI tools (either as prompts or as training data). Businesses that are developing or deploying AI systems in France that involve personal data may want to review the CNIL’s guidance to help ensure they comply with applicable data protection laws.

UK ICO issues Snap with preliminary enforcement notice over its AI chatbot

What happened: The UK ICO issued Snap, Inc. with a preliminary enforcement notice for a possible failure to assess privacy risks posed by its generative AI chatbot, “My AI”. The chatbot, which leverages OpenAI’s ChatGPT technology, is available to all UK Snapchat users. The ICO highlighted concerns that Snap did not adequately assess the privacy risks to child users of the chatbot, and stressed the importance of undertaking thorough risk assessments before making such products available to the public (including children). Snap now has the opportunity to respond to the preliminary notice before the ICO reaches its final decision.

This enforcement notice comes as part of a broader regulatory trend on upholding children’s privacy rights. Recent examples include the UK’s new Online Safety Bill and the Irish DPC’s €345 million fine against TikTok.

What to do: Businesses need to be mindful of their data protection obligations when developing or utilising AI tools that use personal data, and ensure that they have completed suitable risk assessments (including data protection impact assessments where necessary) before the tools are rolled out. Businesses should expect increased scrutiny from the ICO and other DPAs when such products or services are being offered to children.

CJEU preliminary ruling on charging data subjects for copies of their personal data

What happened: The Court of Justice of the European Union (CJEU) gave a preliminary ruling on when data controllers can charge data subjects a fee for providing them with a copy of their personal data. The ruling was handed down following a complaint from a German dental patient that their dentist was only willing to provide them with a copy of their medical records if the patient agreed to cover the costs, as is provided for in German law. The CJEU confirmed that:

  • Under the GDPR, a data subject has the right to obtain a first copy of their personal data free of charge; this right supersedes national laws protecting the economic interests of data controllers. A data controller can then levy a charge for any subsequent additional copies.
  • A data subject is not obliged to provide reasons for their request.
  • The right to a copy means, in the context of medical records, the right to receive a faithful and intelligible reproduction of all documents essential to understanding the personal data contained therein.

This judgment comes amid a heightened focus on access rights, following the EDPB’s announcement that data controllers’ implementation of the right of access will be the topic of its third coordinated enforcement action in 2024.

What to do: Companies that hold personal data may want to review their processes for responding to data subject access requests, to ensure that charges are not levied on data subjects requesting a first copy of their personal data, and that requests are handled irrespective of the reason (if any) given for them.

French and German DPAs publish guidance on implementing the EU-US Data Privacy Framework

What happened: The French CNIL has published an FAQ on the recent adequacy decision for the EU-U.S. Data Privacy Framework (the “DPF”), which enables businesses in the EU to transfer personal data to DPF-certified U.S. businesses without having to implement additional data protection safeguards. The FAQ emphasises that French businesses can only transfer GDPR-covered personal data to non-DPF certified U.S. businesses if other appropriate safeguards have been implemented (such as Standard Contractual Clauses).

The CNIL FAQ follows the publication of guidance from various German DPAs in September. While the Bavarian DPA and the German Data Protection Conference, a body convening the German DPAs, were generally accepting of the DPF, recognising it as an effective data transfer method when properly implemented, the Thuringia DPA objected to the views of the Conference and, in separate guidance, advised companies to exercise caution and consider not transferring sensitive data to U.S. providers.

What to do: EU businesses wishing to rely on the DPF may want to take the guidance into account when transferring personal data to DPF-certified U.S. businesses, and in particular consider whether any further protections or mitigating measures are appropriate when transferring sensitive personal data.

German Federal Cartel Office gives Google users increased control over personal data

What happened: The German Federal Cartel Office (“FCO”) terminated its legal proceedings against Google under German competition law (the Act against Restraints of Competition, “ARC”) following Google’s commitments to give users better choices about how Google processes their data. Google will have to ensure that it obtains free, specific, informed and unambiguous user consent whenever it seeks to combine personal data from multiple Google services or different sources, or cross-use personal data in different Google services.

As a designated “gatekeeper” under the Digital Markets Act (“DMA”; for more detail see our client update), Google already faces similar consent obligations for its core platform services, such as Google Shopping, Google Play, Google Maps, or Google Search. Google’s FCO commitments extend those DMA consent obligations to additional Google services within Germany.

What to do: The decision demonstrates that the DMA and ARC complement each other. Businesses should be mindful that the FCO will resolutely apply competition rules and may extend obligations for large digital companies to further services not covered by the DMA.

Clearview AI successfully appeals £7.5 million ICO fine

What happened: Clearview AI successfully appealed its £7.5 million fine, imposed by the UK ICO in 2021, on the basis that the company’s operations fall outside the material scope of the UK GDPR. The First Tier Tribunal held that, while Clearview did process the personal data of UK persons, its operations fall outside the scope of UK GDPR as Clearview’s clients were all law enforcement or national security bodies, and the UK GDPR does not apply to the acts of foreign governments. In response, the ICO has issued a statement clarifying that this judgement does not “remove the ICO’s ability to act against companies based internationally” and re-iterating that the judgement was reached on the basis of a specific exemption for foreign governments.

Clearview, which offers its clients access to a database of up to 30 billion images of individuals, has previously been fined by the French, Italian, Austrian, Greek, Canadian and Australian data protection authorities. It remains to be seen whether the UK appeal will have any effect on the decisions of other data protection authorities.

The First Tier Tribunal also took the opportunity to provide guidance on what constitutes “monitoring of behaviour” under Article 3(2)(b) GDPR. It held that the concept was intensely fact-specific but that it did not require continuous activity and could include single incidents, for example, establishing where a person is at a particular point in time.

The ICO has announced that it is seeking the First Tier Tribunal’s permission to appeal this decision.

What to do: Data scraping and mass data processing by AI technologies continue to be hot topics for regulators, and businesses should be mindful that Clearview’s successful appeal was due to a specific, narrow limitation in the GDPR’s scope. Businesses that engage in these activities may want to review their data processing operations to confirm that they comply with all of their obligations under (UK) GDPR.

Is jigsaw identification an easy puzzle to solve? The ICO and courts appear to think so, as they reaffirm a low threshold for personal data.

What happened: Jigsaw identification refers to the principle that information that does not identify a specific individual can still constitute personal data if it can be used in connection with other information to identify a specific person.

In this case, a member of the public submitted a Freedom of Information Act request to the Home Office for information about an unknown individual they had witnessed being arrested during an immigration raid. The Home Office refused to provide the information inter alia for data protection reasons; this decision was upheld by the ICO. The member of the public appealed the decision, claiming that as the requested information did not identify the name of the individual who was arrested, it did not constitute personal data and so could be disclosed. The Tribunal held that the information constituted personal data because the individual in question had been arrested in front of people who knew them so, if the requested information was disclosed, those people might be able to identify the individual via a ‘jigsaw’ of information. Nonetheless, the decision emphasises that what is deemed (in)sufficient to create the possibility of “jigsaw identification” may vary greatly, and relies heavily on assumptions made about the possibility of identification from the available evidence.

What to do: While this case was based on the Freedom of Information Act, the decision aligns with, and is helpful for interpreting, the definition of personal data under the GDPR. Businesses should be mindful of the possibility for jigsaw identification when reviewing their data privacy policies and procedures. In particular, businesses may want to review their approach to data subject access requests, since this case appears to set a low threshold for determining what constitutes personal data.

EU Commission publishes compliance report template for DMA gatekeepers

What happened: The European Commission published a template for the compliance report that gatekeepers under the Digital Markets Act will have to submit. The template seeks detailed and transparent information for each core platform service for which the business has been designated a gatekeeper. It calls for specific information for each measure implemented in the context of the DMA, such as its scope, any changes to customer experience and remuneration flows, and whether any market analysis/testing was undertaken. Following submission, the Commission will publish the non-confidential summary.

Reports must be submitted within six months of designation as a gatekeeper, and updated at least annually thereafter. Alphabet, Amazon, Apple, ByteDance, Meta and Microsoft, designated as gatekeepers in the initial round on 6 September 2023, will have until 7 March 2024 to file their first compliance report. The Commission, in addition to the template for the compliance report, continuously issues rules and further templates relating to the DMA implementation.

What to do: Already designated gatekeepers should start compiling the requisite information to meet the 7 March 2024 deadline. Those anticipating a future gatekeeper designation should be aware of the reporting requirements that are now in place.

To subscribe to the Data Blog, please click here.

The cover art used in this blog post was generated by DALL-E.

Author

Robert Maddox is International Counsel and a member of Debevoise & Plimpton LLP’s Data Strategy & Security practice and White Collar & Regulatory Defense Group in London. His work focuses on cybersecurity incident preparation and response, data protection and strategy, internal investigations, compliance reviews, and regulatory defense. In 2021, Robert was named to Global Data Review’s “40 Under 40”. He is described as “a rising star” in cyber law by The Legal 500 US (2022). He can be reached at rmaddox@debevoise.com.

Author

Dr. Friedrich Popp is an international counsel in the Frankfurt office and a member of the firm’s Litigation Department. His practice focuses on arbitration, litigation, internal investigations, corporate law, data protection and anti-money laundering. In addition, he is experienced in Mergers & Acquisitions, private equity, banking and capital markets and has published various articles on banking law.

Author

Sergej Bräuer is an international counsel in the firm’s Frankfurt office and a member of the Antitrust & Competition Group. He advises clients on the full spectrum of antitrust and competition matters as well as foreign direct investments. He has deep experience in cartel damages cases and complex merger control proceedings, including coordination of worldwide merger control filings and approvals.

Author

Martha Hirst is an associate in Debevoise's Litigation Department based in the London office. She is a member of the firm’s White Collar & Regulatory Defense Group, and the Data Strategy & Security practice. She can be reached at mhirst@debevoise.com.

Author

Aisling Cowell is an associate in the Litigation Department based in the London office. She is a member of the firm’s White Collar & Regulatory Defense Group. She can be reached at acowell@debevoise.com

Author

Fanny Gauthier is an associate in Debevoise's Litigation Department, based in the Paris office. Ms. Gauthier is a member of the firm’s International Dispute Resolution Group, as well as the firm’s Data Strategy & Security practice. Her practice focuses on complex commercial litigation, international arbitration and data protection. She can be reached at fgauthier@debevoise.com.

Author

Sophie Michalski is an associate in the Litigation Department based in the London office. She is a member of the firm’s White Collar & Regulatory Defense Group. She can be reached at smichalski@debevoise.com.

Author

Oliver Binns is a trainee associate in the Debevoise London office.

Author

Ryan Fincham is a trainee associate in the Debevoise London office.

Author

Lucas Orchard-Clark is a trainee associate in the Debevoise London office.

Author

Barney Lynock is a trainee associate in the Debevoise London office.

Author

Alfred Scott is a trainee associate in the Debevoise London office.

Author

Samuel Thomson is a trainee associate in the Debevoise London office.