Key takeaways from November include:
- AI Regulation: Businesses utilizing AI in the EU, particularly those in healthcare and generative AI, should keep in mind that European authorities and regulators continue to stress pre-existing obligations relating to AI in technology-neutral regulations, notwithstanding the recently reached political agreement on the EU AI Act;
- Employee monitoring: Businesses with employees in France should be mindful of the CNIL’s simplified sanctions procedure, which has galvanized a series of recent enforcement actions, in particular concerning excessive employee monitoring and breaches of data minimisation requirements;
- EU ePrivacy Directive: The EDPB has issued, subject to a public consultation, new guidelines to clarify which technical operations (in particular, tracking techniques) are covered by the Directive, as implemented by member state laws. Businesses may want to review the guidance against their existing technologies to ensure they are complying with the Directive’s requirements, where relevant;
- Data security: Organisations that process personal data may want to consider implementing measures such as multi-factor authentication, separation of functions and task-limited access rights, to comply with their GDPR data security obligations, according to new guidance from the Danish data protection authority, reiterating findings from many prior enforcement actions;
- Remote Access: A recent ICO reprimand highlighted the importance of businesses having appropriate safeguards, such as multi-factor authentication and formalised, regular IT infrastructure testing, in place to protect against the confidentiality risks inherent in using remote desktop services;
- Targeted advertising practices: Businesses that engage in targeted advertising practices may want to re-evaluate their compliance with the GDPR and options to obtain user consent following the EDPB’s binding decision that restricted Meta’s data processing for behavioural advertising on the legal bases of contractual necessity or legitimate interests; and
- GDPR concept of personal data: In a limit on the use of the GDPR as a defence against obligations to disclose information, the CJEU has ruled that vehicle identification numbers are not in all cases personal data (this depends on whether the recipient has the means to identify an individual from them), and that disclosure of personal data may otherwise be permissible where necessary to comply with a legal obligation. Businesses may want to consider how the court’s reasoning may apply in other circumstances when dealing with disclosure requests.
These developments, and more, are covered below.
Regulators publish AI-related guidance to advise businesses on their existing obligations
What happened: As discussed in previous blog posts, the EU AI Act, which has now concluded its passage through the EU “trilogue negotiations”, is expected to have a wide-reaching impact on businesses which use AI systems in, or sell them into, the EU. With the Act still not yet finalised, and with lead-in times of six or more months for key obligations once it is, several authorities in the EU, the UK and the U.S. have released guidance clarifying businesses’ existing obligations in relation to AI:
- The German Baden-Württemberg DPA has outlined its views on the legal protections for personal data which already exist in relation to AI systems. In particular, the authority advises businesses to consider the relationship between their data security obligations and their use of AI from the outset in order to promote sustainable digital development. For further discussion on the principle of “security by design”, see our previous blog post.
- The Italian Garante published guidance on the use of AI in the healthcare sector. This guidance, which draws on the GDPR as well as national and EU case law, contains relevant advice for using AI in the healthcare space more broadly. For example, the Garante notes the need to incorporate data protection by design and by default principles within any AI systems used in the healthcare space.
- The Confederation of European Data Protection Organisations, a private organization, has published a paper on the data protection implications for both creators and end-users of generative AI models that is informative of potential consensus views on GDPR and AI. In particular, the paper recommends the use of internal data access controls, regular auditing of data security measures, and the use of data protection impact assessments. The intersection between GDPR compliance and AI has been the subject of detailed analysis in a previous blog post.
- UK and U.S. regulators have developed new guidelines, endorsed by authorities in 18 countries, which aim to promote the implementation of secure-by-design, transparency, and accountability principles by AI developers. The guidance applies to all types of artificial intelligence and machine learning and is divided into four key topics: (i) secure design; (ii) secure development; (iii) secure deployment; and (iv) secure operation and maintenance.
Further, following its decision earlier this year to lift a ban on ChatGPT, the Italian Garante has now announced an investigation into the use of data collection for algorithm training by both public and private entities. According to the Garante, this investigation is intended to confirm that suitable security measures are in place to prevent mass scraping of personal data.
What to do: Despite the EU AI Act’s lead time, businesses may already be subject to a range of technology-neutral regulations that affect their implementation and use of AI. Businesses may want to consider the above publications and whether they are relevant to their specific operations. As noted here, businesses can take steps now to respond to growing regulatory attention on AI and prepare for the arrival of the EU AI Act. For example, businesses could consider designating a committee to oversee and monitor their use of AI tools and creating an AI-related incident response plan to ensure AI incidents are reported to regulatory authorities as required.
CNIL uses simplified enforcement process to fine ten entities for excessive employee monitoring
What happened: The French CNIL has fined ten entities a total of €97,000 under a simplified enforcement process first introduced in April 2022.
The CNIL’s “simplified sanction procedure” is available for matters which the CNIL determines to be of limited complexity or seriousness. It follows the same steps as the CNIL’s ordinary procedure, but the President of the “formation restreinte” rules alone and a public hearing is not held unless requested by the subject of the action. The repeated use of this procedure arguably indicates the CNIL’s desire to systematically bring enforcement actions for breaches of data protection law, even where specific breaches are of limited severity and against entities that may not typically expect to be enforcement targets.
While most details of the ten recent fines remain non-public, data minimisation and the tracking of employees were common themes in the decisions. Continuous geolocation tracking and video surveillance are, in the CNIL’s view and barring limited exceptions, a breach of employees’ freedoms, disproportionate to the aims being pursued, and inconsistent with data minimisation requirements.
What to do: As previously covered, DPAs across Europe are placing increasing scrutiny on workplace monitoring practices. In light of the CNIL’s views in particular, businesses with employees in France may wish to ensure that their approach to employee monitoring is proportionate, given the signal that the CNIL has increased its capacity to pursue enforcement actions in less serious and complex matters, which may reach a broader range of entities than those previously prioritised.
EDPB clarifies the technical scope of the ePrivacy Directive
What happened: The EDPB has published draft guidelines on the application of the EU’s ePrivacy Directive.
Under the Directive, the “storing of information” (or the gaining of access to information already stored) in the “terminal equipment of a subscriber or user” is only allowed on the basis of consent or necessity for a specified purpose set out in Art. 5(3). Following concerns that ambiguities in the definitions gave rise to a risk of circumvention, the Guidelines clarify that:
- “Information” includes both non-personal and personal data, regardless of how it was stored and by whom.
- “Storage of information” refers to placing information on a physical electronic storage medium. The Directive does not place any upper or lower limit on the amount of time that the information must remain on the storage medium to count as stored.
- “Terminal equipment of a subscriber or user” encompasses equipment directly or indirectly connected to the interface of a public telecommunications network to send, process or receive information, but only where that device is an endpoint of a communication. The user or subscriber may own, rent or otherwise be provided with this “terminal equipment”.
What to do: Businesses may want to study the Guidelines to assess whether their data processing operations fall within the technical scope of the ePrivacy Directive and the EU member state implementing law and, if so, confirm that appropriate compliance measures are in place.
Danish Data Protection Authority publishes data privacy safeguard catalogue
What happened: Based on its experience from inspections, data breaches, and existing EDPB guidelines, the Danish DPA published a catalogue of technical and organisational security measures to address common data security risks, including:
- ensuring that employee access to individuals’ personal data on internal IT systems is limited to work-related need, especially where the access rights can cause irreparable damage, such as the ability to delete personal data;
- automatically terminating employee access to IT systems after a certain period of inactivity;
- implementing multi-factor authentication for employees with access to personal data; and
- separating functions or access rights to reduce the likelihood of abuse, for example, by creating distinct functional separation between users (who enjoy access to particular data or systems), those responsible for authorisation (who approve access) and user administrators (who create or remove access).
What to do: While issued by a single DPA, the measures outlined in the catalogue broadly track those highlighted in enforcement actions and guidance across the EU. Businesses may therefore wish to use the catalogue as a guide to regulatory expectations in order to design and implement appropriate security measures, tailored to the organisation’s own risk assessment and existing data security framework.
EDPB acts against Meta’s targeted advertising practices
What happened: The EDPB announced an urgent binding decision against Meta’s processing of the personal data of Facebook and Instagram users across EU member states and EEA countries for the purpose of behavioural advertising on the legal bases of contractual performance or legitimate interests. The decision came in response to a request from the Norwegian data protection authority for the EDPB to make an existing Norwegian interim injunction permanent and applicable across the EU, and follows a series of EDPB and Irish DPC findings that contractual necessity was not a suitable legal basis for the processing of personal data.
Ahead of the EDPB decision being published, Meta announced that it was introducing a subscription model for ad-free Facebook and Instagram, which the EDPB is currently evaluating.
What to do: Businesses that use behavioural advertising may wish to consider this decision when assessing whether their data processing is compliant with the GDPR and whether they obtain user consent for such activities. In particular, businesses may wish to keep an eye on the response of the EDPB, and other authorities, to Meta’s new subscription model.
ICO reprimands GRS (Roadstone) Ltd over personal data breach
What happened: The UK ICO reprimanded, but did not fine, construction materials firm GRS (Roadstone) Ltd (“GRS”) in relation to a personal data breach affecting more than 2,000 people.
The ICO found that GRS had failed to ensure the ongoing confidentiality of its systems, in particular by using remote desktop services without sufficient safeguards (such as multi-factor authentication) and by failing to conduct regular, formalised testing of its IT infrastructure. As a result of these failings, a threat actor was able to deploy ransomware onto GRS’ systems and access the personal data of current and former employees.
Given the number of data subjects affected, this case is consistent with the ICO’s recent tendency to refrain from imposing fines in all but the most serious cases, while still publishing the reprimand, which can generate negative publicity for victim companies.
What to do: This reprimand highlights the importance for businesses of having robust technical and organizational measures to safeguard personal data which they process. In particular, businesses are reminded of the ICO guidance that “you should not use single-factor authentication on internet facing services, such as remote access, if it can lead to access to personal data”. Moreover, businesses may wish to refer to the ICO and NCSC guidance on the standards expected in relation to the testing of internal systems, with the NCSC recommending such testing at least monthly.
CJEU clarifies scope of “personal data” in vehicle manufacturer case
What happened: In a ruling on the extent of the obligation of car manufacturers to share vehicle identification numbers (“VINs”) with independent operators, the CJEU has clarified that VINs do not automatically fall within the definition of “personal data” under GDPR, and even when VINs may be personal data, sharing may be permissible as necessary for compliance with a legal obligation.
Under EU law, vehicle manufacturers are legally obliged to provide certain third parties, such as repairers and spare parts distributors, with the data necessary to repair and maintain vehicles. In its action against Scania, the German trade association for motor vehicle parts argued that Scania should share VINs in compliance with that obligation. Scania argued that VINs were personal data and refused to share the numbers on the basis that doing so would breach the GDPR.
The CJEU rejected Scania’s arguments and ruled that VINs are not, as such, personal data. They only become personal data when someone with access to the number has the means enabling them to identify the owner or other user of the vehicle. In cases where VINs can be considered personal data, the GDPR does not stop car manufacturers from sharing them, as disclosure is justified under GDPR Art. 6(1)(c), on the basis that processing is necessary for compliance with a legal obligation to which the controller is subject.
What to do: Businesses should bear this judgment in mind when considering the scope of “personal data” under the GDPR, and its significance for compliance with legal obligations in other contexts that may appear to conflict with GDPR requirements.
To subscribe to the Data Blog, please click here.
The cover art used in this blog post was generated by DALL-E.