Key takeaways from September include:

UK-U.S. data bridge: From 12 October 2023, UK businesses will be able to transfer personal data to certain U.S. organisations certified under a UK-specific extension to the EU-U.S. Data Privacy Framework, without additional GDPR safeguards;

AI Foundation Models: Developers and consumers of AI Foundation Models (“FMs”) should take care to ensure they respect existing data protection legislation when training and using FMs, as the UK Competition and Markets Authority highlights potential market consolidation and competition-related risks in its Initial Report on AI Foundation Models;

Handling children’s data: Businesses who process children’s data may wish to review their processes to ensure that they align with regulators’ latest expectations in light of the Irish DPC’s €345 million TikTok fine, in particular with regard to social media companies’ use of “public by default” settings for children;

Rights under the UK GDPR: Businesses handling UK GDPR-covered data may need to consider the implications of UK Government amendments to the “fundamental rights and freedoms” protected by these regulations;

Recruitment processes: Following enforcement action brought by the French CNIL against SAF Logistics, businesses should consider reviewing their recruitment processes to ensure that they comply with the “data minimization” principle and take particular care to ensure special category data is collected only with a valid lawful basis and limited to what is strictly necessary; and

Bulk emails: Businesses should consider using alternatives to the blind carbon copy (“BCC”) function when sending emails containing personal data, following new ICO guidance.

These developments, and more, are covered below.

UK-U.S. data bridge comes into force 12 October 2023

What happened: Parliament passed Regulations giving effect to the UK-U.S. data bridge from 12 October 2023. For the wider UK-U.S.-EU context, see this Debevoise blog.

From 12 October, UK businesses can transfer UK GDPR-covered personal data to certified U.S. organisations without the need for further safeguards such as the UK International Data Transfer Agreement.

Additional points to note:

  • U.S. organisations in certain sectors including banking, insurance, and telecommunications cannot participate in the data bridge, for now;
  • “Sensitive” data, including genetic and biometric data and data concerning sexual orientation, must be flagged as such to the U.S. organisation receiving the data; and
  • HR data can only be transferred to U.S. organisations which have specifically covered such data in their EU-U.S. Data Privacy Framework commitments.

What to do: A UK business that wishes to transfer personal data to the U.S. under the data bridge should go to the DPF List, search for the U.S. organisation in question, and confirm that it has signed up to the “UK Extension”. The UK business should also consider what kinds of information it is transferring and check that any additional requirements are met. U.S. businesses wanting to benefit from the data bridge will need to complete a process of self-certification by opting to extend their DPF certification to include UK-U.S. data transfers.

The UK’s Competition & Markets Authority publishes Initial Report on AI Foundation Models

What happened: The UK’s Competition & Markets Authority (“CMA”) published its Initial Report on AI Foundation Models (“FMs”), considering the current and potential impact of these rapidly developing models on innovation and competition.

In the Report, the CMA highlights a need to balance the huge promise of FMs against the risks they pose to the protection of consumers’ data and intellectual property rights. The CMA gathered feedback from key stakeholders, including FM experts and businesses, and proposed six principles for FM developers and deployers to follow, aimed at promoting accountability: (i) access; (ii) diversity; (iii) choice; (iv) flexibility; (v) fair dealing; and (vi) transparency.

The CMA further identified a competition risk around whether FM solutions will remain available to businesses of all sizes. The high upfront cost of developing FMs, together with limited access to data and computing power, may restrict that availability in the future. The CMA also highlighted that transparency around the data used to train FMs is critical to reducing bias and improving the accuracy of outputs, and to ensuring accountability. Transparency and accountability are two areas that have previously been stressed in the data protection compliance context.

What to do: Creators of FMs may need to anticipate concerns about anti-competitive behaviour as the market develops. The CMA has indicated the types of behaviour it is concerned about and will issue a follow-up report considering the extent to which its principles have been adopted. Businesses may wish to take note of those findings.

Irish DPC fines TikTok €345 million for children-related GDPR violations

What happened: The DPC fined TikTok Technology Ltd €345 million for GDPR violations related to its processing of children’s data, and gave TikTok three months to correct the contraventions.

The DPC held that TikTok violated GDPR requirements by: (i) using “public by default” settings for children’s accounts; (ii) designing pop-up notifications, targeted at children, which failed to present opt-in options in an “objective and neutral way”; and (iii) failing to properly take into account the risks to children under the age of thirteen who could circumvent TikTok’s age verification procedures to access the platform.

This was the second major enforcement action taken against TikTok in recent months, after the ICO fined the company £12.7 million for illegally processing the data of 1.4 million children without parental consent.

What to do: Businesses processing children’s personal data may want to review their policies, interfaces and practices to align with the latest regulatory expectations set out in the penalty. In particular, they should make sure that opt-in options are presented to children in an objective and neutral way and review any use of “dark pattern” techniques which may undermine users’ ability to freely consent.

UK government amends the “fundamental rights and freedoms” protected by UK GDPR

What happened: The UK government published a statutory instrument which amended the “fundamental rights and freedoms” protected by UK data protection law.

Previously, the stated objective of the UK GDPR was to protect “the fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data” (Article 1(2)). This concept was repeated throughout the legislation and aligned with Article 8 of the EU Charter of Fundamental Rights, which contains a specific right to the “protection of personal data”.

The amended version removes the reference to a right to the “protection of personal data” and refers instead to “the protection of individuals’ rights and freedoms”. This reflects the government’s view that the protection of personal data falls within the more general right to “respect for private and family life” under Article 8 of the European Convention on Human Rights.

Article 8 of the Convention has a narrower scope than Article 8 of the Charter: it allows individuals to rely on a right to privacy in relation to their data, but that right may not extend to personal data which is already in the public sphere. This change may therefore weaken the rights of data subjects and make it easier for data controllers to establish that they meet the requirements of necessity, fairness and proportionality when processing personal data.

What to do: Businesses should keep this amendment in mind when considering the implications of the UK GDPR for their operations, in particular with regard to:

  • Making Legitimate Interest Assessments;
  • Determining whether to make regulator and individual data breach notifications;
  • Processing personal data relating to criminal convictions; and
  • Determining appropriate data security measures.

For many businesses complying with both the EU and UK GDPR in parallel, though, the change may have limited practical significance.

French CNIL fines SAF Logistics €200,000 for GDPR violations

What happened: The CNIL fined air freight company SAF Logistics €200,000 for violating the GDPR through its internal recruitment processes. Following an onsite investigation, the CNIL found that SAF:

  • Infringed the data minimization principle by unnecessarily collecting employees’ family members’ personal data (including their identity, contact details, employer and marital status) when conducting internal recruiting;
  • Unlawfully processed special category data by requiring employees to disclose their blood type, ethnicity and political affiliation without a lawful basis;
  • Unlawfully processed personal data relating to criminal convictions and offences by improperly keeping extracts from the criminal records of employees; and
  • Breached its statutory duty of cooperation by providing incomplete translations of documents the CNIL requested.

What to do: Businesses should review their recruitment processes to ensure that the personal data they collect is adequate, relevant and limited to what is necessary for their purposes, and that a valid lawful basis exists (and is recorded). The penalty also highlights the obligation to cooperate with supervisory authorities’ inquiries.

UK ICO warns against using the BCC email function

What happened: The ICO has encouraged organisations to consider using alternatives to the blind carbon copy (“BCC”) email function when sending emails containing sensitive personal data. Failure to use BCC correctly (i.e., using carbon copy (“CC”) instead) is consistently within the top 10 non-cyber data breaches reported to the ICO, with almost a thousand such breaches reported since 2019. Businesses must implement appropriate technical and organisational measures to ensure personal data is kept safe and not inappropriately disclosed to others.

The importance of using appropriate methods to send bulk communications is demonstrated by recent ICO enforcement action. In August, the ICO reprimanded two Northern Irish organisations for inadvertently disclosing people’s information via email when using inappropriate group email options, leading to recipients being able to view other recipients’ email addresses. In March, the ICO reprimanded NHS Highland for a “serious breach of trust” after it emailed 37 people likely to be accessing HIV services using CC instead of BCC, resulting in recipients being able to view the email addresses of other recipients.

What to do: Businesses should consider using alternatives to BCC, such as bulk email services, mail merge, or secure data transfer services to ensure personal data is not accidentally disclosed when communicating with third parties.
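
By way of illustration only, the sketch below shows one such alternative in practice: sending a separate message to each recipient so that no recipient ever sees another recipient’s email address. It is a minimal sketch in Python; the SMTP server, credentials and addresses are hypothetical placeholders rather than details drawn from the ICO guidance.

    # Illustrative sketch only: one message per recipient, so no recipient's
    # address is ever exposed to the others. The server, login details and
    # recipient list below are placeholders, not real values.
    import smtplib
    from email.message import EmailMessage

    RECIPIENTS = ["alice@example.com", "bob@example.com"]  # hypothetical recipient list

    def send_individually(subject: str, body: str) -> None:
        # Assumed SMTP gateway; substitute your organisation's mail server.
        with smtplib.SMTP("smtp.example.com", 587) as server:
            server.starttls()
            server.login("sender@example.com", "app-password")  # placeholder credentials
            for address in RECIPIENTS:
                msg = EmailMessage()
                msg["From"] = "sender@example.com"
                msg["To"] = address              # a single recipient per message
                msg["Subject"] = subject
                msg.set_content(body)
                server.send_message(msg)         # other addresses never appear in any header

Dedicated bulk email or mail merge services achieve the same separation at scale, typically with added delivery and audit controls.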

To subscribe to the Data Blog, please click here.

Author

Robert Maddox is International Counsel and a member of Debevoise & Plimpton LLP’s Data Strategy & Security practice and White Collar & Regulatory Defense Group in London. His work focuses on cybersecurity incident preparation and response, data protection and strategy, internal investigations, compliance reviews, and regulatory defense. In 2021, Robert was named to Global Data Review’s “40 Under 40”. He is described as “a rising star” in cyber law by The Legal 500 US (2022). He can be reached at rmaddox@debevoise.com.

Author

Dr. Friedrich Popp is an international counsel in the Frankfurt office and a member of the firm’s Litigation Department. His practice focuses on arbitration, litigation, internal investigations, corporate law, data protection and anti-money laundering. In addition, he is experienced in Mergers & Acquisitions, private equity, banking and capital markets and has published various articles on banking law.

Author

Martha Hirst is an associate in Debevoise's Litigation Department based in the London office. She is a member of the firm’s White Collar & Regulatory Defense Group, and the Data Strategy & Security practice. She can be reached at mhirst@debevoise.com.

Author

Aisling Cowell is an associate in the Litigation Department based in the London office. She is a member of the firm’s White Collar & Regulatory Defense Group. She can be reached at acowell@debevoise.com.

Author

Anna Chirniciuc is a trainee associate in the Debevoise London office.

Author

Samuel Thomson is a trainee associate in the Debevoise London office.

Author

Jeevika Bali is a trainee associate in the Debevoise London office.

Author

Lucas Orchard-Clark is a trainee associate in the Debevoise London office.