Key takeaways from April include:

  • UK FCA’s AI regulation: UK FCA-regulated firms should take note of the FCA’s newly confirmed approach to AI regulation, which seeks to be outcome-focused, principle-led, and flexible, and should consider whether their use of AI is consistent with the FCA’s objectives of mitigating risks to consumer protection, market competition, and market integrity.
  • UK Generative AI: Adding to a busy month of AI-related announcements, businesses deploying generative AI models should consider the UK ICO’s consultation on the accuracy of such models, specifically to: (i) identify and mitigate the risks of inaccurate training data; (ii) provide clear and targeted information on statistical accuracy; and (iii) monitor how the AI system is used to inform improvements and restrictions.
  • UK Internet of things: To comply with the UK’s new regime on the security of ‘internet of things’ (“IoT”) products, effective 29 April, covered manufacturers, importers, and distributors should ensure that: (i) IoT devices have unique, secure passwords; (ii) they provide information on how to report security issues; and (iii) they publish information on minimum security update periods.
  • UK Health data: Businesses operating in the health and social care sectors that process personal health data should consider new UK ICO guidance to ensure compliant transparency measures, including through the identification of potential harms associated with failing to provide adequate levels of transparency information and, in some cases, employing public engagement processes.
  • UK Privacy notices: Businesses may wish to familiarise themselves with the UK ICO’s new privacy notice generator and consider using it when revising their existing privacy notices.
  • UK Enforcement: Businesses that use third-party data to conduct marketing may wish to review their practices to ensure they comply with the UK First-tier and Upper Tribunals’ (“FTT” and “UT” respectively) interpretation of the transparency principle under UK GDPR, which supported user-friendly consumer information portals and prominently displayed privacy notices that avoid industry jargon.
  • CNIL AI and privacy: Organisations regulated by the French CNIL that are developing AI systems and handling personal data may wish to review the authority’s recommendations on reconciling innovation and privacy rights, which recommend that AI system providers follow seven steps, including: (i) defining a purpose of processing; (ii) determining their legal role as a controller or processor; and (iii) determining a lawful basis for processing.
  • German Data protection law amendments: Businesses subject to German data protection law should take note that the government’s attempt to amend the federal data protection law to, among other things, reduce the burden of rules around the remit of individual DPAs in Germany, such that joint data controllers subject to multiple DPAs could designate just one German DPA for supervision, has been met with criticism that could delay or alter its passage.
  • EDPB “Consent or pay” models: Businesses operating large online platforms should consider the European Data Protection Board’s recent opinion indicating that “consent or pay” models are unlikely to be GDPR-compliant. Although the opinion is non-binding, it is likely to influence the decisions of national DPAs in the EEA on “consent or pay” models going forward.
  • EU DSA and DMA Whistle-blowing: Businesses subject to the Digital Services Act and the Digital Markets Act may wish to consider updating their relevant policies or procedures to note the availability of the European Commission’s new encrypted reporting tools that enable whistle-blowers to anonymously report violations of both laws.

These developments, and more, are covered below.

UK FCA publishes its approach to AI regulation and sets 12-month plan

What happened: On 22 April 2024, the UK Financial Conduct Authority (“UK FCA”) published its ‘AI Update’, detailing its approach to the regulation and supervision of AI. Its approach aims to be principles-based, informed by proportionality, and “outcomes-focused” on identifying and mitigating risks to consumer protection, market competition, and market integrity rather than mandating or prohibiting specific technologies. The UK FCA aims to allow regulated firms flexibility to adapt and innovate.

The AI Update assesses the key AI principles outlined in the UK government’s AI White Paper (safety, security, and robustness; appropriate transparency and explainability; fairness; accountability and governance; contestability and redress), which we previously covered here, and considers how these are addressed within the UK FCA’s existing guidance and regulations. For example, the FCA Principles for Businesses, including Principle 2 on due skill, care, and diligence, map onto the need for businesses to ensure their AI systems function in a secure and safe manner. Similarly, the FCA’s Consumer Duty aligns with the fairness principle, such that businesses should provide fair value and access to consumers.

The UK FCA also sets out its 12-month plan, which includes working with other regulators within the Digital Regulation Cooperation Forum to deliver a pilot AI and Digital Hub (a multi-regulator service advising businesses on AI or digital products), and exploring opportunities for an AI Sandbox allowing businesses to test and refine ideas in a controlled environment. The CNIL implemented a similar sandbox initiative last year.

What to do: FCA-regulated entities should review existing UK FCA publications alongside the AI Update to ensure their AI use complies with the FCA’s existing expectations. Although businesses may welcome the UK FCA’s intended flexibility, those with an EU presence may want to consider how this approach may overlap with or diverge from the more prescriptive requirements under the upcoming EU AI Act.

UK ICO consults on the accuracy of generative AI

What happened: The UK ICO ran the third instalment in its consultation series on how data protection law should apply to the development and use of generative AI, seeking evidence on the ways in which decisions taken at different stages of the generative AI lifecycle affect the accuracy of the outputs (specifically, accurate and up-to-date personal data), with a view to preventing misinformation, reputational damage and other related harms. The UK ICO will use the responses to inform its position on the interpretation of specific requirements of UK GDPR and provide guidance, including expectations for generative AI developers to understand how accuracy impacts their models and for generative AI deployers to monitor and mitigate associated risks.

More specifically, expectations on developers include consideration of:

  • data protection accuracy, via the types of systems and development stages that require accuracy of the underlying input;
  • statistical accuracy, as to whether the purpose of the generative AI system requires that the outputs be accurate;
  • the impact of training data on statistical accuracy; and
  • the need to communicate their conclusions on each of these points to deployers.

Expectations on businesses deploying generative AI include:

  • consideration and mitigation of the risks of inaccurate training data;
  • the provision of clear and targeted information on statistical accuracy; and
  • monitoring of how the AI system is used to inform improvements and, if needed, restrictions.

The fourth instalment of this consultation series, on engineering individual rights into generative AI models, is live until 10 June 2024.

What to do: Both developers and deployers of generative AI should consider the UK ICO’s current expectations in light of its expected forthcoming guidance, and, for those entities that will be subject to multiple AI regulatory regimes, consider how the UK ICO’s expectations may align with or diverge from those of other regulators.

Upgraded UK IoT regime comes into force

What happened: The UK’s Product Security and Telecommunications Infrastructure regime came into effect on 29 April, adding a new layer of regulation to IoT devices. The regime applies to manufacturers, importers, and distributors of ‘smart’ devices (connectable to the internet or a home network, other than some excluded products) and will be enforced by the Office for Product Safety and Standards.

Covered businesses must ensure that:

  • products have unique passwords that are not based on incremental counters, publicly available information or product identifiers (e.g., password1, password2, password3, the MAC address or Wi-Fi SSID, or the model number), unless encrypted using good industry practice;
  • product security issues are consistently reported to the user;
  • mechanisms for users to report security issues are easily accessible;
  • clear status update timelines are communicated until the resolution of security issues; and
  • information on minimum security update periods is made available to consumers in a clear, accessible and transparent manner.
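By way of illustration only, the password requirement can be pictured as a screening step in a manufacturer’s provisioning process. The sketch below is hypothetical (the regime does not prescribe any particular implementation, and the function name and pattern list are our own assumptions): it rejects candidate default passwords built on incremental counters or on device identifiers such as the MAC address or model number.

```python
import re

# Hypothetical sketch: screen a candidate default password against the
# kinds of weak patterns the UK PSTI regime prohibits (incremental
# counters, publicly available information, product identifiers).
# The specific checks here are illustrative assumptions, not the
# regime's own test.
def is_acceptable_default_password(password: str, device_identifiers: list[str]) -> bool:
    lowered = password.lower()
    # Reject passwords based on incremental counters, e.g. "password1", "admin2".
    if re.fullmatch(r"(password|admin|user)\d*", lowered):
        return False
    # Reject passwords derived from device identifiers such as the
    # MAC address, Wi-Fi SSID, or model number.
    for identifier in device_identifiers:
        if identifier and identifier.lower().replace(":", "") in lowered.replace(":", ""):
            return False
    return True

# A password matching the device's MAC address fails; a unique value passes.
print(is_acceptable_default_password("AA:BB:CC:11:22:33", ["AA:BB:CC:11:22:33"]))  # False
print(is_acceptable_default_password("x7#Kd9!qLm", ["AA:BB:CC:11:22:33"]))  # True
```

In practice, a compliant approach would also need to cover per-device uniqueness and the "good industry practice" encryption carve-out, which this sketch does not attempt to model.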

What to do: Manufacturers, importers, and distributors of ‘smart’ devices should ensure that they comply with the new requirements and have incorporated them in processes and procedures at the operational level, for example, by reviewing their password-setting mechanisms and complaints procedures.

UK ICO pushes for transparency in health and social care

What happened: On 15 April 2024, the UK ICO published guidance to help organisations processing personal health and social care data to be transparent with people about how their personal information is used in accordance with the DPA 2018 and UK GDPR. The guidance sets out:

  • what data protection transparency means in the context of health and social care, with a focus on the extra measures that are required to protect special category information that is frequently encountered in health and social care settings;
  • how organisations can comply with the obligation to be open and honest with the public by developing effective transparency information (with a case study on a medical research organisation); and
  • what factors organisations should consider when assessing their level of transparency, including considering whether patient engagement processes should be employed to evaluate transparency material.

Ultimately, the UK ICO envisages that greater transparency will empower people to share their health data and boost the efficiency and public benefit of health and social care services.

What to do: Businesses that process UK personal data in health and social care settings should consult the UK ICO’s guidance and related compliance checklist when assessing or reviewing transparency requirements under data protection law. This includes consideration of existing privacy notices, identifying potential harms associated with failing to provide adequate levels of transparency information and employing public engagement processes to refine transparency material.

UK ICO unveils a privacy notice generator

What happened: The UK ICO has announced a new online Privacy Notice Generator. It replaces the UK ICO’s previous templates and allows smaller organisations to tailor their privacy notices to their particular needs. Organisations can choose to generate a privacy notice for either: (i) customers and suppliers; or (ii) staff and volunteers, in each case customising the explanation of how the organisation processes personal information.

The currently available privacy notice generator is designed for small businesses in “general” sectors, such as retail or manufacturing, with sector-specific generators to follow by mid-2024 for (i) professional services (including finance, insurance and legal services); (ii) education and childcare; (iii) health and social care; and (iv) charities and the voluntary sector.

What to do: Businesses may wish to consider whether to revise their privacy notices, either based on the already-available generator, or following the publication of sector-specific options. While businesses should note that privacy notices created from these templates likely still require tailoring to the business, they may serve as a useful gap assessment and benchmark of regulatory expectations around transparent and clear, user-friendly language.

UK UT rejects UK ICO’s appeal of Experian’s enforcement notice

What happened: The UK UT dismissed the UK ICO’s appeal of the first-instance decision to overturn an enforcement notice against Experian, a credit reference agency that holds and processes data relating to essentially the whole of the UK’s adult population.

In 2020, the UK ICO issued an enforcement notice against Experian for its reliance on legitimate interests as a legal basis for direct marketing data processing and because its privacy notice was not sufficiently transparent or available to UK residents. On appeal by Experian, the FTT disagreed with the UK ICO in part, narrowed the enforcement action, and in relevant part found that Experian’s privacy notice was sufficiently transparent under UK GDPR Arts. 5(1)(a) and 14.

The UK ICO appealed, arguing that the FTT misapplied the transparency principle by over-focusing on the benign consequences to data subjects of Experian’s processing of their personal data instead of considering their rights, protections and the expectations they would have in this particular context. The UK UT dismissed the UK ICO’s appeal, finding that the FTT’s context-specific analysis properly took into account that Experian’s processing was not significantly privacy-intrusive or otherwise likely to be harmful to data subjects in practice. It relied, among other things, on the FTT’s finding that Experian had taken extensive measures to inform data subjects via its consumer information portal and the credit reference agency information notice. Either party can seek permission to appeal the UT’s decision to the Court of Appeal, and the UK ICO stated that it is considering next steps, including whether to appeal.

What to do: Businesses may want to review their policies around providing information to data subjects regarding their data processing activities to ensure that they comply with the principles of transparency and proportionality under the UK GDPR as interpreted by the UK FTT and UT. For instance, businesses may want to update their consumer information portals to provide concise information about data processing activities “at a glance” and should ensure that privacy notices are prominently displayed and easy to understand, avoiding industry jargon.

CNIL publishes first AI data protection recommendations

What happened: The CNIL published its first recommendations on the development of AI systems in order to assist businesses in their GDPR compliance efforts. These recommendations follow from the CNIL’s AI action plan published in May 2023 (examined here). The CNIL’s “how-to” sheets, previously covered here, were enhanced and consolidated following a public consultation with various AI actors. The CNIL recommends seven steps to AI system providers dealing with personal data: (i) defining a purpose of processing; (ii) determining their legal role as controller or processor; (iii) determining a lawful basis for processing; (iv) checking whether re-use of data is permitted; (v) minimising personal data; (vi) defining a retention period; and (vii) conducting a data protection impact assessment. Over the coming months, the CNIL will be supplementing these initial recommendations with other “how-to” sheets, which will also be submitted for public consultation.

What to do: Businesses that are developing or deploying AI systems in France or that process French personal data may want to review the CNIL’s guidance to help ensure they meet local regulatory expectations and comply with applicable data protection laws. Businesses should also keep an eye out for additional recommendations to come, which we will cover on the blog.

German regulators criticise pending amendments to the Federal Data Protection Act

What happened: Awaiting final approval from the German parliament, a new law to change the German Federal Data Protection Act (the “Amendment”) has come under criticism from the Independent Data Protection Conference, which assembles the state and federal regulators, and, in a separate statement, from the federal data protection regulator.

Among other things, the Amendment: (i) confirms that businesses which act as joint controllers and which process personal data for scientific, historical or statistical reasons are no longer subject to all the DPAs where their operations are located and instead are subject to supervision by just one German DPA; (ii) confirms that joint controllers which were previously subject to separate DPAs may now designate the same supervisory authority for both controllers; (iii) confirms that data controllers can be exempted from data subject access request obligations where the data subject’s interest in access is outweighed by the confidentiality of business or trade secrets; and (iv) creates some limited exceptions to the ban on automated decision-making for scoring purposes under Art. 22(1) GDPR.

The Amendment has been criticised by German regulators for not altering the requirement for German DPAs to attribute violations to named individuals or entities before they can issue fines in light of a recent CJEU decision previously covered here and here. The German regulators have requested a range of further clarifications and changes to the Amendment.

What to do: Nothing for now—the criticism may delay or alter the finalisation of the Amendment, which we will monitor on the blog. If passed, businesses subject to supervision by multiple German DPAs can welcome the simplification of the supervisory regime and should update their policies and procedures to address the revised exemptions and exceptions.

EDPB considers “consent or pay” models unlikely to be GDPR-compliant

What happened: The EDPB adopted a non-binding opinion regarding the validity of consent to process personal data for behavioural advertising purposes under “consent or pay” models on large online platforms. The EDPB concluded that the models are unlikely to be GDPR-compliant if they only offer the potential user a choice between consenting to such processing or paying a fee.

The EDPB recommended that large online platforms (including “very large online platforms” within the meaning of the DSA, previously covered here) consider providing a free alternative which does not involve behavioural advertising and clarified that the validity of consent in this context depends more broadly on four factors:

  • Conditionality: whether the fee is likely to compel consent from potential users because, for example, it is large;
  • Detriment: whether the withdrawal of consent will result in harm or damage to the user;
  • Imbalance of power: whether there is an imbalance between the potential user and the platform, assessed by reference to, among other considerations: (i) the position of the large online platform in the market; (ii) the extent to which the individual relies on the service; and (iii) the main audience of the service; and
  • Granularity: whether the potential user can select particular purposes for processing rather than only one bundle of multiple purposes.

What to do: While the opinion itself is non-binding, it is likely to be applied by EEA DPAs in assessing the GDPR compliance of “consent or pay” models on large online platforms. Businesses operating such platforms in the EEA should consider how their “consent or pay” models match up against the EDPB’s factors and consider available modifications or alternatives, where potentially needed.

European Commission establishes encrypted DSA and DMA whistle-blower reporting tools

What happened: The European Commission has established tools to enable whistle-blowers to anonymously report violations of the Digital Services Act (“DSA”) and the Digital Markets Act (“DMA”). The tools utilise encryption to improve the privacy and integrity of the communications between the whistle-blower and the European Commission.

Whistle-blowers could use the tools to report violations under the DSA in relation to, among other things: (i) content moderation practices; (ii) the functioning of recommender systems; and (iii) advertising practices. Under the DMA, whistle-blower reports might include alleged violations such as a gatekeeper’s use of non-public data of business users.

What to do: Businesses subject to the DSA or the DMA may wish to consider updating any applicable whistle-blower policies or procedures to highlight the availability of these tools.

To subscribe to the Data Blog, please click here.

The cover art used in this blog post was generated by Microsoft Copilot.


Robert Maddox is International Counsel and a member of Debevoise & Plimpton LLP’s Data Strategy & Security practice and White Collar & Regulatory Defense Group in London. His work focuses on cybersecurity incident preparation and response, data protection and strategy, internal investigations, compliance reviews, and regulatory defense. In 2021, Robert was named to Global Data Review’s “40 Under 40”. He is described as “a rising star” in cyber law by The Legal 500 US (2022). He can be reached at


Dr. Friedrich Popp is an international counsel in the Frankfurt office and a member of the firm’s Litigation Department. His practice focuses on arbitration, litigation, internal investigations, corporate law, data protection and anti-money laundering. In addition, he is experienced in Mergers & Acquisitions, private equity, banking and capital markets and has published various articles on banking law.


Fanny Gauthier is an associate in Debevoise's Litigation Department, based in the Paris office. Ms. Gauthier is a member of the firm’s International Dispute Resolution Group, as well as the firm’s Data Strategy & Security practice. Her practice focuses on complex commercial litigation, international arbitration and data protection. She can be reached at


Stephanie D. Thomas is an associate in the Litigation Department and a member of the firm’s Data Strategy & Security Group and the White Collar & Regulatory Defense Group. She can be reached at


Michiko Wongso is an associate in the firm’s Data Strategy & Security Group. She can be reached at


Oliver Binns is a trainee associate in the Debevoise London office.


Anna Chirniciuc is a trainee associate in the Debevoise London office.


Barney Lynock is a trainee associate in the Debevoise London office.


Alfred Scott is a trainee associate in the Debevoise London office.


Deniz Tanyolac is a trainee associate in the Debevoise London office.