Our top-eleven European data protection developments for the end of 2024 are:
- EU Cyber Resilience Act: The Council of the European Union approved the Cyber Resilience Act, introducing cybersecurity requirements for digital products sold in the EU. Businesses may wish to start applying the requirements to products and processes ahead of the Act becoming fully enforceable on 11 December 2027.
- UK Upper Tribunal judgment could affect data breach enforcement: The UK Upper Tribunal has overturned a key finding in a UK ICO data breach decision against DSG Retail Limited. The Tribunal held that although pseudonymized information is personal data in the hands of the original data controller, it may not be personal data in the hands of third parties who cannot identify individuals from it.
- LinkedIn fined €310 million for GDPR breach: The Irish DPC fined LinkedIn for breaching GDPR requirements of lawfulness, fairness and transparency. Businesses may wish to review their data processing activities to ensure they are relying on a correct legal basis and are providing adequate information to users about data practices undertaken.
- Data Protection Authority powers: The European Court of Justice (“CJEU”) has ruled that Data Protection Authorities are not obliged to exercise corrective powers in the event of a breach. If a data breach occurs, businesses may wish to take steps to rectify it, prevent recurrence and maintain a record of their response.
- Misuse of cookies: Following further UK ICO enforcement action against the misuse of cookies, businesses may wish to assess whether their cookies policy is compliant. In particular, that all non-essential cookies are deployed only after valid user consent is obtained.
- CJEU rules on legitimate interest: The CJEU’s judgment does not rule out the possibility that the scope of “legitimate interest” under the GDPR can extend to a controller’s commercial interest, provided that such interest is lawful and that the legitimate interest test is met. Although the judgment is a welcome clarification, it likely only reinforces existing expectations under the legitimate interest test, rather than expanding the scope of legitimate interest as a legal basis.
- CJEU issues judgment concerning the processing of sensitive data under the GDPR: The CJEU issued a judgment against Meta Platforms Ireland Ltd (“Meta”) relating to the processing of special categories of personal data under the GDPR and the principle of data minimisation.
- Formal warning against sharing personal data with OpenAI: In a reminder of increasing regulatory pressure on the use and development of AI, the Italian DPA has issued a formal warning against publishing group, GEDI, concerning its data sharing agreement with OpenAI. Businesses who carry out large-scale processing activities or train algorithms may wish to ensure that they have a clear lawful basis for processing personal data.
- OpenAI GDPR breach: Businesses processing personal data in the context of AI platforms should note the Italian Data Protection Authority’s €15 million fine against OpenAI OpCo regarding its management of ChatGPT for violating GDPR principles of transparency and information obligations, alongside processing personal data without a legal basis.
- Meta fined following data breach: Businesses should be reminded of their notification and documentation obligations following a personal data breach and ensure that only personal data necessary for specific purposes is processed, following the Irish Data Protection Commission’s €251 million fine against Meta for GDPR violations connected to a 2018 Facebook user token exploit that was found to have exposed users’ personal information.
- GDPR right to information: Businesses may wish to look to a new CJEU decision on the right to information when personal data is collected indirectly, allowing an exception to the obligation of informing data subjects about their data collection, regardless of the data source.
These developments are covered below.
New EU Cyber Resilience Act introduces cybersecurity requirements for digital products
What happened: The Cyber Resilience Act (“CRA”) has been approved by the Council of the European Union and was published in the Official Journal of the European Union on 20 November 2024, entering into force on 10 December 2024.
As discussed in our September 2022 and December 2023 roundups, the Act introduces binding cybersecurity requirements for digital products sold in the EU that are directly or indirectly connected to another device or network. This includes products such as software, webcams and smart TVs. The CRA will apply to manufacturers, distributors and importers of hardware and software. Key provisions include:
- mandatory cybersecurity requirements for digital products from the design phase onwards;
- making manufacturers, importers and distributors responsible for the safety and security of digital products; and
- introducing mandatory security updates and reporting of security vulnerabilities.
Businesses that fail to comply can be fined up to 2.5% of global turnover.
The enforcement of vulnerability reporting requirements under the CRA will apply from 11 September 2026 and the main obligations introduced by the CRA will apply from 11 December 2027, including the requirements for security updates and vulnerability management.
What to do: Businesses may want to start work to align ahead of the “go live” date, given the challenges of retrofitting compliance. Businesses may consider formulating plans to adapt existing products and product lifecycles to the CRA requirements.
Upper Tribunal considers data protection compliance and standards for safeguarding personal data
What happened: In DSG Retail Limited v The Information Commissioner [2024] UKUT 287 (AAC), the UK Upper Tribunal considered the interpretation of personal data under the ‘old’ data protection regime under the Data Protection Act 1998 (“DPA 1998”). The UK Upper Tribunal did not consider the provisions under the UK GDPR. The Tribunal allowed an appeal by DSG Retail Limited (“DSG”) against an earlier decision by the First-tier Tribunal, which had upheld the Information Commissioner’s Office’s (“ICO”) decision to issue a Monetary Penalty Notice of £500,000 following a cyber-attack on the company’s in-store payment systems.
The ICO issued the penalty in 2020 against DSG on the basis that there were serious inadequacies in DSG’s security systems over an extended period and that substantial volumes of personal data were alleged to have been unlawfully accessed as a result.
The UK Upper Tribunal held that the unique 16-digit numbers on credit/debit cards (the PAN) and the expiry dates on credit/debit cards (together known as EMV data) alone could not be considered “personal data” because they do not identify an individual directly. EMV data will only be personal data if it can be combined with other personal data in the hands of the data controller or a third party. The First-tier Tribunal was therefore incorrect to have found that payment card data consisting solely of payment card numbers and expiry dates could be considered personal data when accessed by hackers who could not identify any individuals from it.
What to do: It remains to be seen whether the ICO will successfully challenge this decision, but as it currently stands the decision suggests that pseudonymized data may in fact not be personal data in the hands of someone who has no way of re-identifying it. Businesses collecting pseudonymized information may consider limiting any further information they receive, to potentially limit the “personal data” held on their systems. Businesses storing pseudonymized information may wish to ensure that any further information that may identify the individuals is maintained separately in a secure manner to reduce the risks of a “personal data breach.”
Irish DPC fines LinkedIn €310 million for GDPR breach
What happened: The Irish Data Protection Commission (“DPC”) fined LinkedIn Ireland Unlimited Company (“LinkedIn”) €310 million for GDPR breaches relating to the lawfulness, fairness and transparency of the processing of users’ personal data for behavioral analysis and targeted advertising.
The DPC found that LinkedIn failed to obtain valid consent for processing its members’ personal data and improperly relied on “legitimate interests” and “contractual necessity” as legal grounds for its processing practices. Additionally, LinkedIn failed to provide adequate information to its users about its data practices. In addition to the fine, the DPC issued a reprimand and an order for LinkedIn to bring its processing into compliance with the GDPR. This penalty highlights regulators’ emphasis on the lawfulness of processing as a key factor in evaluating potential violations of users’ data protection rights.
What to do: Businesses handling personal data may wish to review their data processing activities to ensure that they are validly relying on the correct legal bases under the GDPR and ensure that the correct legal grounds are reflected in their privacy notices.
CJEU finds that data protection authorities are not required to issue corrective measures for breaches of GDPR
What happened: The European Court of Justice (“CJEU”) has ruled that Data Protection Authorities (“DPA”) are not obliged to exercise corrective powers, including issuing fines, where these are not appropriate, necessary or proportionate to remedy the breach and ensure that the GDPR is enforced.
The case in question concerned a German bank employee who accessed a customer’s personal data without authorization. The customer was not notified, as the bank did not consider that the incident resulted in a high risk of harm to the customer’s rights and freedoms. The bank also took several other measures to rectify the breach, including: (i) disciplinary measures against the employee; and (ii) promptly notifying the relevant DPA of the breach.
After incidentally discovering the breach, the customer complained to the relevant DPA and, dissatisfied with the DPA’s response, subsequently escalated the issue to the German courts, requesting that the DPA fine the bank. The German courts referred questions on the interpretation of the GDPR to the CJEU.
The CJEU found that no corrective measures are required when these are unnecessary to remedy the breach. For example, no measures are needed when, as soon as the business becomes aware of the breach, it takes appropriate and necessary measures to ensure that it is brought to an end and does not recur.
What to do: In the event of a data breach, businesses might want to take appropriate and necessary measures to rectify the breach and prevent recurrence, and to document their response. The latter will allow businesses to explain the decisions they have taken in response to the breach, if faced with a subsequent regulator query or data subject complaint.
UK ICO fines Sky Betting and Gaming for cookies violations
What happened: The UK ICO reprimanded Bonne Terre Limited (“Bonne Terre”), an online betting and gaming company trading as Sky Betting and Gaming, in connection with breaches of the UK GDPR related to Bonne Terre’s use of advertising cookies. This reprimand was issued further to the ICO’s broader concerns around gambling addictions being manipulated in the targeted advertising context.
The ICO found that, while Sky Betting’s website did include a cookies banner, advertising cookies were being placed on users’ devices as soon as they accessed the website, before users had given their consent to the processing of their personal data for those purposes (in breach of the UK GDPR requirement for the processing of personal data to be lawful, fair and transparent). Having been notified of the problem by the ICO, Bonne Terre promptly changed its website to ensure that users had sufficient opportunity to accept or reject cookies before their personal data was processed.
Whilst on this occasion the ICO found no evidence to support allegations of Sky Betting and Gaming deliberately targeting vulnerable gamblers, the case highlights the broader concerns around the misuse of customers’ personal data in this way, particularly in relation to data flows in the online gambling industry.
What to do: The ICO have indicated that this reprimand comes amid a wider crackdown on non-compliance relating to website cookies and targeted advertising. Businesses may wish to review and monitor their cookies processes to ensure compliance with the UK GDPR, in particular that all non-essential cookies are deployed only after valid user consent has been obtained.
CJEU clarifies scope of “legitimate interest” under the GDPR
What happened: In Koninklijke Nederlandse Lawn Tennisbond v Autoriteit Persoonsgegevens, the CJEU clarified the concept of “legitimate interest” as a lawful basis for processing. The CJEU did not rule out the possibility that a purely commercial interest may be regarded as a legitimate interest.
The case related to Koninklijke Nederlandse Lawn Tennisbond (“KNLTB”), a Dutch sports federation whose members include tennis associations and their respective members. On two separate occasions in 2018, KNLTB disclosed its members’ personal data (including names, addresses and telephone numbers) to sponsors, so that the sponsors could conduct marketing campaigns. The Dutch DPA subsequently fined KNLTB for breaching the GDPR as it concluded that the KNLTB could not rely on purely commercial interests as a legitimate interest. KNLTB appealed this decision to the Amsterdam District Court, which in turn stayed proceedings and referred the case to the CJEU.
The CJEU emphasized that a wide range of interests is, in principle, capable of being regarded as legitimate and that the concept of legitimate interest is not limited to interests provided for by law (although the alleged legitimate interest must be lawful), provided that it meets the other requirements of the legitimate interest test, namely that: (i) the processing is strictly necessary for the legitimate interest; and (ii) the interests or fundamental rights and freedoms of individuals do not override the legitimate interest. The CJEU left it to the national court to assess whether the commercial interest of the controller, which consists in the promotion and sale of advertising space for marketing purposes, may be regarded as a legitimate interest under the GDPR. The CJEU did opine, however, that in determining what is “strictly necessary”, the referring court should ascertain whether the processing activities could reasonably be achieved as effectively through means less restrictive of data subjects’ rights and freedoms, e.g., via consent; and that, in carrying out the balancing exercise against individuals’ rights, the referring court should consider whether individuals could reasonably expect, at the time their personal data were collected, that their data would be used in a particular way.
What to do: The CJEU’s judgment is welcome as it clarifies the position and may give businesses in Europe greater leeway to process personal data for commercial interests. However, in practice, it likely only reinforces existing expectations under the legitimate interest test, rather than expanding the scope of legitimate interest as a legal basis. Businesses may wish to continue to ensure that any such processing is strictly necessary and is weighed against the interests of data subjects.
CJEU issues judgment concerning the processing of sensitive data under the GDPR
What happened: The CJEU ruled that Meta Platforms Ireland Ltd. (“Meta”) cannot process special category data, such as information about sexual orientation, without the user’s explicit consent. The decision arose from a case initiated by an individual Facebook user who alleged that Meta had used his sexual orientation, which he had disclosed publicly during a panel discussion, for personalised advertising without his consent.
The CJEU underlined that if users do make sensitive information public, this does not authorise social network platforms to process other sensitive data, obtained outside that platform using partner third-party websites and apps, with a view to aggregating and analysing those data, in order to offer that person personalised advertising. Furthermore, the GDPR’s data minimisation principle must be interpreted as precluding such personal data obtained by a controller either on or outside of the platform, from being aggregated, analysed and processed for the purposes of targeted advertising without restriction as to time and without distinction as to type of data.
The ruling aims to reduce the information in Meta’s data pool which can be used for advertising without explicitly receiving users’ consent to do so. Furthermore, it reinforces the stringent consent requirements under the GDPR for the processing of sensitive personal data and will also apply to any other online advertising companies which do not have strict data deletion policies in place.
What to do: Businesses may wish to ensure they abide by the data minimisation principle contained in the GDPR, such that only permitted data is used for advertising even when users give their consent to their personal data being collected and used for directing personalised advertising at them.
Italian DPA issues formal warning to GEDI publishing group regarding potential GDPR violations
What happened: The Italian Data Protection Authority, the Garante, has issued a formal warning to GEDI Gruppo Editoriale S.p.A. (“GEDI”), an Italian publishing group, against sharing its personal data archives with OpenAI in order to train algorithms.
GEDI signed a Data Protection Agreement with OpenAI to give OpenAI users access to attributed quotes, content and links to GEDI publications and to improve the overall accuracy of its AI systems. The Garante stated that “the digital archives of newspapers contain the stories of millions of people, with information, details and even extremely sensitive personal data that cannot be licensed without due care for use by third parties to train artificial intelligence”.
The Garante stated in its formal warning that the legitimate interest of each GEDI subsidiary as a data controller to exercise journalistic activities according to innovative methods is not a sufficiently clear legal basis for carrying out large-scale processing activities or training algorithms. The Garante further warned that by sharing the personal data contained in its archives, GEDI could violate the provisions of the GDPR, which may result in sanctions against GEDI. No editorial content has yet been made available to OpenAI and GEDI is currently reviewing its position.
What to do: The formal warning highlights increasing regulatory pressure on the use and development of AI. Businesses may wish to ensure that they have a clear lawful basis for the processing of personal data and that they fulfil their transparency obligations under the GDPR.
Italian DPA fines OpenAI for ChatGPT breach and processing of personal information
What happened: The Italian Data Protection Authority, the Garante, fined OpenAI OpCo €15 million for GDPR violations comprising its failure to notify the Garante of a March 2023 breach of its generative artificial intelligence (“AI”) model, ChatGPT, its processing of users’ personal data to train ChatGPT without a legal basis, and its failure to meet the GDPR’s transparency and information obligations.
Following a March 2023 data breach where a vulnerability in the Redis open-source library affected ChatGPT users’ conversations and subscriber payment information, OpenAI notified affected users that their payment information may have been exposed but did not notify the Garante. The Garante initiated an investigation and subsequently found that OpenAI had processed personal data to train ChatGPT without first identifying an adequate legal basis and violated the principle of transparency and its information obligations by having a privacy policy available only in English and not easily accessible on the website. The Garante also stated that OpenAI failed to provide mechanisms for age verification.
OpenAI notified the Irish DPC, believing that the DPC would communicate the information to the other supervisory authorities. However, the Garante took the view that breach notification in Italy was still required, as OpenAI’s establishment in Ireland occurred after the data breach.
According to the Garante, OpenAI had not identified a lawful basis for training ChatGPT before its launch. The Garante referenced the recent adoption by the European Data Protection Board (“EDPB”) of an opinion on the use of personal data for the development and deployment of AI models in its statement on the fine. The opinion, which does not rule out relying on legitimate interests as a lawful basis for training AI models, emphasized the need for data controllers deploying AI models to carry out an appropriate assessment on whether the model was developed lawfully. The determination of the question on whether OpenAI can rely on legitimate interests to train its AI model has been transferred to the Irish DPC, who requested the EDPB opinion.
What to do: Businesses processing personal data in relation to the development and use of AI models may wish to ensure that they have documented a legal basis under the GDPR for those processing activities and note the EDPB’s recent opinion which provides guidance on relying on legitimate interests, including suggested measures to mitigate risks posed by processing personal data, such as ensuring that certain data categories are not collected and excluding collection from certain websites which clearly object to web scraping.
Irish DPC fines Meta in connection with data breach
What happened: The Irish DPC fined Meta Platforms Ireland Limited (“Meta”) €251 million for violations of four GDPR provisions arising from a Facebook personal data breach that Meta reported to the DPC in September 2018.
Unauthorized third parties exploited “user tokens” on Facebook’s platform and its “View As” feature, affecting approximately 29 million Facebook users globally and three million in the EU/EEA. The data involved included full name, email address, phone number, location, place of work, date of birth, gender and religion, including data of children. The DPC noted that the breach was remedied by Meta and its parent company promptly after it was discovered.
The DPC’s final decision outlined four instances of GDPR infringement and corresponding fines:
- €3 million for failing to adequately document the facts of the breach and the steps taken to remedy it (Article 33(5));
- €8 million for failing to include the required information in its data breach notification, including, for example, the nature of the breach and the measures taken to mitigate possible adverse effects of the breach (Article 33(3));
- €110 million for failing in its obligation, as a data controller, to ensure that, by default, only personal data that is necessary for specific purposes is processed (Article 25(2)); and
- €130 million for failing to ensure that data protection principles were protected in the design of processing systems (Article 25(1)).
What to do: In light of this significant penalty, businesses that process personal data should ensure that appropriate breach notification and documentation measures are included in policies and procedures to allow supervisory authorities to verify compliance, and confirm that controls and monitoring are in place to implement data protection principles “by design” and “by default” throughout the lifecycle of products and services, as discussed here. Once the Irish DPC publishes the full decision, additional context should become available to enable businesses to assess what specific steps should be taken to document incident facts and what default data processing may be of concern to regulators.
CJEU rules on an exception to the obligation for information to be provided for indirect personal data collection
What happened: In Case C-169/23 (Másdi), the CJEU ruled that the exception to the data controller’s obligation to inform data subjects about their data collection, where the data has been obtained indirectly, includes data generated by the controller.
The case concerned a data subject who obtained a COVID-19 immunity certificate from the issuing authority (the controller), which included data generated by the controller itself, such as an ID number and QR code. In April 2021, the data subject alleged that the controller had not published any data protection statement concerning the issue of these certificates. The controller claimed that the processing of personal data was covered by the exception in Art. 14(5)(c) GDPR, which permits the data controller to waive the information requirements outlined in Art. 14 GDPR when obtaining or disclosing personal data is explicitly mandated by Union or Member State law to which the controller is subject. The COVID-19 immunity certificate was issued as mandated under Hungarian law at the time. The exception provides that the controller does not have to provide information to the data subject if Member State law provides “appropriate measures to protect the data subject’s legitimate interests”.
The CJEU held that the exception concerns “all personal data, without distinction” that has been collected indirectly by the controller from the data subject, whether this data was generated by the controller itself or obtained from a person other than the data subject. The CJEU also held that in a complaints procedure, the supervisory authority is competent to verify whether the Member State law to which the controller is subject provides appropriate measures to protect the data subject’s legitimate interests, for the purposes of the application of Art. 14(5)(c).
What to do: Businesses may wish to be mindful of this decision when navigating and determining the level of information legally required to be outlined in privacy notices.
***
To subscribe to the Data Blog, please click here.
The cover art used in this blog post was generated by Dall-E.