The big news in June was the adoption of the new EU Standard Contractual Clauses for cross-border data transfers to non-EEA countries. There were also significant developments for companies engaged in employee surveillance, adtech, data scraping and the use of AI. Here are our highlights:
European Commission adopts new Standard Contractual Clauses
What happened: As reported in our blog post, the European Commission adopted its new Standard Contractual Clauses (“SCCs”) for the cross-border transfer of personal data from the EEA to “third countries”. Shortly after, the European Data Protection Board (“EDPB”) issued its final guidance on supplementary measures that businesses may need to implement to support transfers based on either SCCs or Binding Corporate Rules.
What to do: Read our blog post for five actions to take now.
Green light for EU-UK data transfers
What happened: The European Commission adopted adequacy decisions for both business and law enforcement transfers of personal data to the UK. Personal data can continue flowing from the EU to the UK without businesses needing to adopt data transfer mechanisms such as the SCCs or to consider supplementary protective measures to legitimise the transfer.
What to do: For now, nothing. The adequacy decisions will automatically expire in four years’ time, and the Commission can still alter them during this period if the UK deviates from its current level of data protection safeguards. We will continue to report on the Blog any developments that may threaten the decisions.
CJEU releases judgment on application of “One-Stop Shop” mechanism
What happened: The CJEU ruled that there are certain exceptional circumstances when a non-Lead Supervisory Authority can initiate enforcement proceedings for breaches of the GDPR.
The CJEU confirmed the general principle under the GDPR One-Stop Shop mechanism that only the Lead Supervisory Authority should initiate regulatory proceedings for GDPR infringements, although non-Lead Supervisory Authorities may support cross-border data processing investigations.
Significantly, the CJEU carved out several limited exceptions to the general rule. Non-Lead Supervisory Authorities can initiate proceedings against alleged infringements of the GDPR where:
- there is an urgent need to act to protect the rights and freedoms of data subjects; or
- a GDPR infringement substantially affects data subjects only in the non-Lead Supervisory Authority’s Member State.
What to do: The CJEU judgment takes a strong stance against companies using the One-Stop Shop mechanism to “forum shop”. Those based in jurisdictions with historically limited GDPR enforcement should note the increased risk of action from non-Lead Supervisory Authorities, particularly where there is an urgent need to act to protect individuals’ rights and freedoms. The risk of parallel proceedings for alleged GDPR breaches therefore remains.
IKEA France fined €1m for unlawful employee surveillance
What happened: The Versailles Criminal Court fined IKEA France €1 million for unlawful employee surveillance. Further, IKEA France’s former CEO and its former head of risk management both received suspended sentences and fines. The court found that IKEA France had conducted systematic surveillance of employees, union representatives and job applicants, as well as some customers. The surveillance included using private detectives to collect details about the individuals’ lifestyles and illegally searching names in police databases to check their criminal records.
What to do: Carefully consider employee-focussed and other monitoring schemes. Save for exceptional circumstances, covert monitoring can be unlawful in many European jurisdictions. Companies should therefore review and update their employee privacy policies to ensure that they have notified employees of any data collection and processing conducted in connection with employee monitoring. Companies should also establish a lawful basis for the associated processing of personal data and take steps to comply with applicable labour laws, which may require prior consultation with employees and/or their representatives. In addition, they should consider restrictions under laws governing the interception of communications data, which often carry criminal penalties for non-compliance.
French Competition Authority fines Google €220M for anti-competitive adtech practices
What happened: The French Competition Authority fined Alphabet/Google €220 million for abusing its dominant position in the adtech space to grant preferential treatment to its own proprietary technologies. The Authority found that Google had ensured that its DoubleClick for Publishers ad server favoured its own ad sales platform SSP AdX, and vice versa, at the expense of competitors and publishers. Separately, the UK Competition and Markets Authority and the European Commission have opened investigations into allegedly anti-competitive adtech practices by Facebook. Also this month, the Irish Council for Civil Liberties filed an action before the Hamburg District Court against a number of parties, challenging the real-time bidding adtech system, which it claims breaches the GDPR.
What to do: Adtech is a key priority for data protection and competition regulators (see the May and January Roundups); companies in the UK, the EU and globally should note the increased risk of regulatory action against anti-competitive practices and the heightened regulatory focus on the processing of personal data. Adtech players may need to take steps to ensure their algorithms do not unfairly favour their own products, or they risk enforcement action and significant fines. Companies must also ensure that their use of cookies does not breach the GDPR or the European e-Privacy regime.
Dutch consumer rights body files class action against TikTok
What happened: SOMI, a Dutch consumer and privacy protection association, brought a class action against TikTok in Amsterdam concerning TikTok’s collection of children’s data. The claim alleges that TikTok breached the GDPR by:
- collecting data from minors without properly obtaining consent;
- sharing data with third parties without a valid lawful basis;
- failing to specify to users the purposes for which data was collected;
- processing more data than necessary; and
- not adequately securing the data it collected.
The action further argues that TikTok has been negligent in failing to prevent children from viewing harmful videos and illegal content, and in exposing them to intrusive, personalised advertisements. Roughly 64,000 parents have reportedly opted in to the litigation so far.
What to do: In April, we reported that a similar representative action was brought against TikTok in the UK. The increasing number of claims reflects a wider trend of data protection-driven litigation. Companies should therefore be mindful of increased litigation risk, across a variety of jurisdictions, when implementing GDPR compliance frameworks and making risk-based business decisions.
EDPB and EDPS adopt joint opinion on the draft AI Regulation
What happened: The EDPB and the European Data Protection Supervisor (“EDPS”) adopted a joint opinion on the Commission’s draft regulation governing the use of AI (the “Draft AI Regulation” – see our Blog Post and the April Roundup). The EDPB and EDPS welcomed the aim of addressing the use of AI systems within the EU but expressed concern at the exclusion of international law enforcement cooperation from the scope of the Draft AI Regulation. The bodies also stressed that existing EU data protection legislation applies to any processing of personal data within the scope of the Draft AI Regulation, and called for bans on:
- any use of AI for automated recognition of human features in publicly accessible spaces, such as recognition of faces, fingerprints, DNA and voice; and
- AI systems using biometrics to categorise individuals into clusters based on grounds on which discrimination is prohibited under Article 21 of the Charter of Fundamental Rights, including ethnicity, gender and sexual orientation.
What to do: Looking ahead, companies using or providing AI products should monitor the Draft AI Regulation and take its (likely) obligations into account when building their AI models and compliance frameworks.
CJEU issues ruling on data scraping
What happened: The CJEU ruled that scraping and reusing data from online databases is prohibited only if it impacts the database maker’s investment. In particular, the CJEU held that:
- the Database Directive prevents specialised search engines from copying and indexing substantial parts of freely accessible online databases without the consent of the maker of the database; and
- for this prohibition to apply, the unauthorised copying and indexing must have deprived, or risk depriving, the database maker of revenue intended to enable them to redeem the cost of their investment in setting up and operating the databases.
What to do: Those scraping data potentially subject to EU database rights protections will need to consider how the judgment may affect their activities. When scraping without consent, companies should analyse whether the scraping deprives, or risks depriving, the database maker of revenue as a core element of determining whether the scraping is lawful. Where scraping involves personal data, GDPR-covered entities will also need to comply with the Regulation’s requirements; the French data protection authority, the CNIL, issued guidance on data scraping in April 2020.
French DPA fines Brico Privé €500,000 for GDPR and e-Privacy violations
What happened: The CNIL fined French DIY chain Brico Privé €500,000 for multiple violations of the GDPR and French e-Privacy rules. The CNIL found various failings in Brico Privé’s conduct, including:
- allowing customers and staff to choose weak passwords;
- retaining the data of more than 16,000 customers who had not placed orders for at least five years and 130,000 people who had not logged into their customer accounts for five years;
- only explaining its processing activities in general terms in its privacy policies and terms of sale;
- placing tracking cookies on users’ browsers without consent; and
- sending marketing emails to users who had created accounts but not made any purchases.
What to do: Companies need to be aware that, as happened here, regulators in some jurisdictions conduct data protection audits without cause. The penalty shows how regulators can group various discrete compliance failings to make a case for enforcement, and further highlights the need for robust, and enforced, data retention/deletion programs. Regularly reviewing and updating GDPR compliance frameworks so they are comprehensive and readily accessible can help minimise this risk.
The authors would like to thank Debevoise trainee associates Clementine Coudert, Olivia Collin and Diana Moise for their contribution to this article.
To subscribe to the Data Blog, please click here.