August proved to be another busy month for data protection developments in Europe, fuelled in part by the aftermath of the Court of Justice of the European Union’s (“CJEU”) decision in the “Schrems II” case.

Enforcement

The most noteworthy GDPR enforcement-related developments from August include:

Marriott indicates potentially significant decrease in anticipated ICO fine. The ICO announced in July 2019 that it intended to issue a record fine of £99 million against Marriott International for a security breach (further details below, in the “Litigation” section). However, according to its quarterly report, Marriott expects this fine to be reduced by roughly 50%, to US$65 million. The ICO’s final decision is currently expected by 30 September 2020, after multiple extensions of the regulatory process. In our July 2020 Round Up, we noted that the British Airways fine, another record ICO fine announced just a day before Marriott’s, is also expected to be significantly reduced.

Online retailer fined €250,000 for multiple GDPR failings. The CNIL issued its first fine as a lead supervisory authority in cooperation with other European data protection authorities (“DPAs”) against France-based online shoe retailer, Spartoo. Covering many of the GDPR’s fundamental principles, the penalty is a helpful reminder of basic GDPR hygiene:

  • Data minimisation – Spartoo breached the GDPR by, among other things: (i) recording all customer service calls even though only a small proportion of them were ever subsequently consulted; and (ii) collecting and retaining copies of Italian customers’ health identity cards for fraud prevention even where other alternative (and equally valid) forms of identification had been provided.
  • Storage limitation – Spartoo breached its obligation to retain data for no longer than necessary for the purposes for which it was collected, including by: (i) permanently retaining records of customer phone calls, including those recording bank details; and (ii) failing to establish and enforce data retention periods. See our blog post here for more on this topic.
  • Transparency – Spartoo failed to adequately inform its employees that their phone calls were recorded.
  • Technical and organisational safeguards – Spartoo had inadequate measures to protect personal data, including failing to require customers to use sufficiently robust passwords to protect their Spartoo accounts.

In view of the breadth of the failings, the CNIL has given Spartoo three months to become GDPR compliant, failing which it faces a further daily penalty of €250.

Forbes fined €13,005 for publishing top-50 wealthiest Hungarians and family-owned businesses. Driving home the fact that the GDPR applies even where personal data is already publicly available, in late July, Hungary’s DPA (“NAIH”) imposed two fines (available here and here) on Forbes Hungary for publishing lists of the 50 wealthiest Hungarians and the largest family-owned businesses. After Forbes rejected individuals’ erasure requests and related objections on the basis that the information was already in the public domain, NAIH found that Forbes had failed to meet its transparency obligations by providing inadequate information on the purpose and lawful basis for processing the data.

UK ICO issues £100,000 fine for unsolicited email marketing. Highlighting that enforcement relating to unsolicited marketing continues to be a hot topic, the ICO fined Koypo Laboratories Limited £100,000 for sending 21,166,574 unsolicited direct marketing emails to subscribers without their consent. The ICO has issued a significant number of penalties for such breaches, including a £500,000 fine against CRDNN Limited in March 2020.

Dutch National Credit Register fined €830,000 for charging for access to personal data. After an investigation into complaints about difficulties accessing data held by the Dutch National Credit Register (the “BKR”), the Dutch DPA has issued a fine of €830,000. The BKR charged individuals a fee for digital access to their data when they exercised their GDPR data subject access rights, and made free paper access available only once per year. The DPA found this overly restrictive and fined the BKR, which is now appealing. Nevertheless, the fine clearly shows the risks associated with placing administrative or other restrictions on the exercise of individuals’ data subject rights.

Hamburg Data Protection Commissioner requests information on Clearview AI biometric data collection. Following a complaint filed in February 2020, the Hamburg Data Protection Commissioner has issued an administrative order to Clearview AI requesting information about its collection and processing of biometric personal data associated with its facial recognition software. Clearview AI argued that the GDPR does not apply to its processing and that it is therefore not obliged to answer the questions. The Hamburg regulator disagreed, stating that the GDPR applies to Clearview AI because behavioural monitoring through placing cookies would target individuals in Europe. The Hamburg Data Protection Commissioner stated that “[b]usiness models that consist of collecting images on the internet in massive scale and without cause, and making the faces of people identifiable through biometric analysis, endanger privacy on a global scale[…]To protect those affected under the EU Charter of Fundamental Rights, such companies must be monitored, regulated and, if necessary, stopped on the basis of the GDPR”. Clearview AI must provide comprehensive information by mid-September 2020 or face a possible €10,000 fine for each unanswered inquiry. See our blog post on ways to reduce regulatory and reputational risks for AI-powered applications here.

The CNIL issues formal warnings to employers using automatic photo devices to monitor employee working hours. The CNIL has formally warned public and private organisations using devices which automatically take photographs to record when employees arrive at and leave work. The CNIL found that the systematic collection of photographs (two to four times a day) to monitor employees’ working hours was excessive and breached the GDPR’s data minimisation principle. Instead, the CNIL recommended using systems which record only the day and time of arrival. Organisations served with formal warnings have three months to comply. Demonstrating the risks associated with automated monitoring systems, the warnings follow a previous CNIL decision banning automatic number plate recognition technology used by four French municipalities to monitor parking and issue tickets.

German DPAs launch investigation into media cookie consent practices. The Baden-Württemberg DPA has launched an investigation, in cooperation with other German DPAs, into the use of cookie banners and consent practices on media websites. The announcement stressed that tracking technologies such as cookies are often not GDPR compliant. The regulator also emphasised that valid consent is needed if organisations want to use tracking technologies: consent must be informed, voluntary, provided in advance and capable of being revoked at any time. The announcement says that cookie banners often do not lead to a voluntary and genuinely informed choice. While the investigation is currently focused on media websites, its scope seems likely to expand in the future. Now may therefore be a good time for companies to confirm that their cookie consent practices are in line with the GDPR and e-privacy legislation.

Schrems II Developments

With the fallout from the CJEU “Schrems II” decision (covered here and here) continuing, the most noteworthy developments in August include:

First German state DPA issues guidance on Schrems II. The German Baden-Württemberg DPA has issued first-of-its-kind guidance on international data transfers following Schrems II that includes recommendations for companies and gives an idea of how the authority may approach enforcement. The guidance is relevant for businesses within the remit of the authority but may be instructive for others as well. The key takeaways include that:

  • While the Standard Contractual Clauses (“SCCs”) remain a valid mechanism for EU-U.S. data transfers, the adequacy requirements set out by the CJEU in Schrems II may only be met in rare cases.
  • For the SCCs to be effective, businesses would have to take additional measures to prevent access by U.S. intelligence services, such as encryption with the key remaining in the EU, anonymisation, or pseudonymisation where re-identification can only be carried out in the EU.
  • Companies should assess the cross-border transfer instruments they use and discuss Schrems II compliance with the non-EU data importer.
  • The guidance makes specific proposals for how, in particular, the 2010 controller-to-controller SCCs can be strengthened with additional clauses, which may also require the data importer not to comply with foreign authority disclosure requests until ordered to do so by the respective highest court.
  • The Baden-Württemberg authority says it will prohibit non-compliant data transfers unless the transferor can show that it cannot replace the service provider or other counterparty in the short or medium term with a European data protection-compliant alternative.

Facebook announces transition to SCCs. Following the Schrems II decision, Facebook has announced that it is updating its international data transfer mechanisms, and is migrating to SCCs for the transfer of data relating to its ads and measurement products, as well as for customers using the Workplace platform. The announcement states that Facebook will be migrating to SCCs for these products but does not specify whether any additional measures are being put in place.

EU and U.S. discuss potential enhanced Privacy Shield framework. A joint statement by the U.S. Department of Commerce and the European Commission announced discussions about the potential for an enhanced EU-U.S. Privacy Shield framework to comply with the Schrems II decision.

Litigation

In August, seven cases particularly piqued our interest, covering the UK, Germany and the Netherlands but reflecting a broader trend across the whole of the EU:

Personal data transfers to overseas High Commissions are lawful for legal proceedings. The English Court of Appeal was asked to consider whether the transfer of personal data (including sensitive personal data such as criminal records) to the British High Commission in Jamaica was unlawful. The appellant, Johnson, sought to appeal his deportation from the UK in 2016. When the appeal was heard, he was at the British High Commission in Jamaica, and argued that giving live evidence from Jamaica and the electronic transfer of the court bundle breached the GDPR. Before dismissing the appeal, the Court of Appeal considered rights of objection, rights to erasure and overseas transfers under the GDPR. The Court held that the appellant was not entitled to object to the transfer of his data under the GDPR, as the transfer was necessary for the establishment, exercise or defence of legal claims, specifically Johnson’s deportation proceedings. The judgment also noted that there was no infringement of the right to erasure under Article 17 of the GDPR, as the appellant had been assured that the transferred data would be destroyed after a week. On the question of whether the transfer of the data to Jamaica involved a transfer to a third country, the Court declined to decide the point, noting that even if the British High Commission in Jamaica is not considered part of the UK (which is disputed), the transfer was in any event necessary for the establishment, exercise or defence of legal claims.

Germany’s Federal Court of Justice rejects right-to-be-forgotten privacy claim against Google and seeks clarification from the CJEU on second claim. The German Federal Court of Justice (the “BGH”) has issued two decisions on right-to-be-forgotten privacy claims against Google. The BGH found in favour of Google in the first claim and sought guidance from the CJEU in the second claim.

The first claim was brought by a managing director of a regional charity who was named in different media articles in connection with the charity’s financial shortfall of nearly €1 million in 2011. The claimant wanted these articles to be removed from the results of Google searches against his name. The BGH sided with Google and rejected the claim, holding that the right to be forgotten under the GDPR requires a comprehensive consideration of fundamental rights of the claimant and the interests of the website users, the public and the content provider. There is no presumption of priority of the fundamental rights of the claimant, and conflicting interests should be weighed against each other equally. In the view of the BGH, search engines do not have to take action until they become aware of an obvious violation of the rights of the person concerned. In this case, the BGH held that there was no obvious violation and the de-listing request had to be rejected.

The second claim was brought by two individuals who offered financial services through various companies and were shown in photographs accompanying a critical media article. The website publishing the article faced allegations of blackmailing its subjects by publishing negative articles about them and offering to delete the articles in return for payment. The two individuals claimed that they had also been blackmailed and requested that Google remove the article in question from the results of searches against their names and their companies’ names, arguing that the allegations made in the article were false. Google stated that it was unable to assess whether the allegations were true.

The BGH has referred the case to the CJEU, asking for clarification of whether a right to be forgotten exists under the GDPR where the link sought to be removed leads to content containing factual claims and value judgements whose truthfulness and lawfulness are disputed, especially where reasonable legal means, such as a preliminary injunction, exist to obtain a preliminary clarification of the truthfulness of the content.

UK addresses the Crown Prosecution Service’s (“CPS”) duties and the ICO’s powers in responding to subject access requests. In a judicial review application in Dalton v Crown Prosecution Service [2020] EWHC 2013 (Admin), the High Court was asked to consider whether the CPS was required to respond to the claimant’s request for documents held by it. The claimant, in custody for tax evasion, wished to rely on these documents in an application to appeal against his sentence. The Court rejected the application, finding that the claimant should have exercised his right to complain to the ICO about the CPS’s refusal to comply with data protection legislation, rather than first resorting to the courts.

The Court noted that the common law duty to disclose material that may cast doubt on a conviction is not as wide-ranging as a subject access request. Such a request, if properly filed, may yield more detailed results, even in relation to an active case. A point-blank refusal to provide documents on the ground that disclosure might jeopardise the prosecution of criminal offences was deemed unsatisfactory. The Court considered that the CPS must provide reasons and show that an assessment has been made in respect of each relevant document.

The Court emphasised the ICO’s wide enforcement powers in relation to subject access requests. These include the power to review unredacted data in the hands of the data controller, perform an independent evaluation of redactions, and interrogate the data controller about missing categories of information or specific documents (such as faxes, emails or notes of telephone communications). The Court noted that the claimant should have asked the ICO to use these powers to assess the legitimacy of the CPS’s refusal to comply with his requests before resorting to the courts.

Since the CPS withdrew the arguments relying on legal professional privilege, the Court touched only briefly on this issue. The Court found that the position under the UK Data Protection Act 2018 is less straightforward than under the GDPR, as the Act, in its Schedule 15, restricts the power to issue a warrant regarding certain categories of privileged documents.

Salesforce and Oracle face cookie-related class action lawsuits in the Netherlands and United Kingdom. The Privacy Collective filed the first Dutch class action under the GDPR against Salesforce and Oracle for alleged failure to obtain consent from users for personal data collected through third-party tracking cookies. The cookies, known as BlueKai and Krux, are used in real-time bidding processes to link advertisements to individual users. It is alleged that personal data is collected and shared across a number of websites without providing adequate information for users to give informed consent. A similar case is expected to be brought in the United Kingdom. The Privacy Collective estimates that the combined claims could exceed €10 billion, with €500 to be distributed to each individual claimant.

Marriott faces a class action before the English courts following security breach. Another class action was brought against Marriott International following a hack that compromised personal data belonging to roughly 7 million British former guests—the same incident that led to the ICO’s announcement of an intention to fine (mentioned above). The claim filed in the English High Court relates to a security breach that allowed hackers to access records of bookings made between July 2014 and September 2018 through the Starwood Hotels group, later acquired by Marriott. The claimants are suing for an unspecified sum for the loss of control over their personal data, which included names, home addresses, telephone numbers, passport details, credit card details and email addresses. The claimants allege that, after acquiring Starwood, Marriott failed to adequately review Starwood’s data protection practices and systems.

Guidance

ICO issues Age Appropriate Design Code (the “AAD Code”). The ICO has announced that the AAD Code came into force on 2 September 2020, with a 12-month transition period for organisations to comply. The AAD Code applies to online services which are likely to be accessed by children in the UK. It sets out 15 flexible standards that online service providers should meet in order to ensure that their services have inbuilt privacy protection for children, including turning on “high privacy” settings, and turning off geolocation services, by default. Other resources published alongside the AAD Code include a data protection impact assessment template and an explanatory memorandum to the AAD Code issued by the Department for Digital, Culture, Media & Sport.

The Dutch DPA (“AP”) approves code of conduct for organisations in the information and communication technology sector (the “Data Pro Code”). The Data Pro Code was developed by NLdigital, an industry association, and is the first code of this kind approved by the AP. Aimed at helping organisations be GDPR compliant, the Data Pro Code offers practical applications of open standards and safeguards, including principles for privacy policies, standard clauses and agreements for processing personal data. NLdigital is currently awaiting approval by the AP as a supervisory body.

Author

Robert Maddox is an associate based in the London office and a member of Debevoise's White Collar & Regulatory Defense and International Dispute Resolution Groups, as well as the firm’s Data Strategy & Security practice. His practice focuses on complex multi-jurisdictional investigations, disputes and cybersecurity matters. He can be reached at rmaddox@debevoise.com.

Author

Christopher Garrett is an English-qualified associate who is a member of the Debevoise Data Strategy & Security practice and part of the Corporate Department, also specialising in employment law. He can be reached at cgarrett@debevoise.com.

Author

Dr. Friedrich Popp is a senior associate in Debevoise's Frankfurt office and a member of the firm’s Litigation Department. His practice focuses on arbitration, litigation, internal investigations, corporate law, data protection and anti-money laundering. In addition, he is experienced in Mergers & Acquisitions, private equity, banking and capital markets and has published various articles on banking law. He can be reached at fpopp@debevoise.com.

Author

Fanny Gauthier is an associate in Debevoise's Litigation Department, based in the Paris office. Ms. Gauthier is a member of the firm’s International Dispute Resolution Group, as well as the firm’s Data Strategy & Security practice. Her practice focuses on complex commercial litigation, international arbitration and data protection. She can be reached at fgauthier@debevoise.com.

Author

Hilary Davidson is a corporate associate and a member of Debevoise's Mergers & Acquisitions Group. Ms. Davidson’s practice focuses on private M&A, with particular experience advising private equity clients. This has included advising on joint ventures, cross-border mergers and acquisitions and secondary and co-invest transactions. She can be reached at hdavidson@debevoise.com.

Author

Jennifer Deschins is an associate in the Frankfurt office and a member of the firm’s Litigation Department. Her practice focuses on Arbitration, Litigation, Internal Investigations, Cyber Privacy, Data Protection, Anti-Money Laundering and Competition Law.

Author

Martha Hirst is an associate in Debevoise's Litigation Department based in the London office. She is a member of the firm’s White Collar & Regulatory Defense Group, and the Data Strategy & Security practice. She can be reached at mhirst@debevoise.com.

Author

Sara Ewad is an associate in the London office and a member of the firm’s International Dispute Resolution Group.

Author

Jeremy Feigelson is a Debevoise litigation partner, Co-Chair of the firm’s Data Strategy & Security practice, and a member of the firm’s Intellectual Property and Media Group. He frequently represents clients in litigations and government investigations that involve the Internet and new technologies. His practice includes litigation and counseling on cybersecurity, data privacy, trademark, right of publicity, false advertising, copyright, and defamation matters. He can be reached at jfeigelson@debevoise.com.