Throughout September, companies, regulators and policymakers have continued to respond to the fallout from Schrems II.  Since our last update we have also seen the second largest fine to date under the GDPR, the start of a major class action against YouTube, as well as a raft of new policy developments covering topics ranging from artificial intelligence to antitrust in digital markets and the application of the GDPR to connected vehicles.

Enforcement

H&M fined €35.3m for employee privacy violations.  While announced on 1 October 2020, and so not technically a “September” update, we did not want to wait another month to cover the breaking news that H&M, the Swedish-headquartered clothing retailer, was fined €35.3m by the Hamburg DPA following an investigation into the extensive recording of information relating to employees’ private lives, the second largest fine issued under the GDPR to date.  The fine related to information collected in interviews conducted following employee absences, which was collated and permanently stored.  Details of employees’ vacations, as well as symptoms of illness and medical diagnoses, were all recorded.  Some supervisors also acquired broad knowledge of their employees’ private lives through other discussions, ranging from relatively innocuous details to family issues and religious beliefs.

The press release issued by the DPA records that some of this knowledge was recorded, digitally stored and partly readable by up to 50 other managers throughout the company. In addition to being used for work performance evaluation, the data was used to obtain a detailed profile of employees for measures and decisions regarding their ongoing employment.  The combination of the collection of details about employees’ private lives and the recording of their activities led to a particularly severe encroachment on employees’ civil rights according to the DPA.

While the DPA praised H&M for its response to the investigation, that was not enough to spare H&M a significant penalty.  The fine highlights a number of fundamental GDPR principles: data minimisation, which requires that only the minimum amount of personal data necessary for the purpose of the processing be processed, and that it be retained only for as long as necessary for that purpose; transparency, which requires that individuals be provided with information about the processing of their personal data; and the rule that “special category” data (including health data) can be processed only in tightly prescribed circumstances.

Spanish Association fined €1,800 for breaching data confidentiality.  The Barcelona Airport Security Guard Association (“AVSAB”) was fined €1,800 by the Spanish DPA for breaching the GDPR’s confidentiality principle.  A member of the AVSAB trade union sent a screenshot of members’ personal data to their union WhatsApp group and asked the recipients to confirm the accuracy of their data.  The penalty serves as a reminder that the obligations established by the GDPR apply to all individuals working for a data controller, and that the data controller can be held liable for their actions.  The DPA also noted the lack of evidence that new measures had been introduced to ensure that a similar breach did not occur in the future as an aggravating factor.  While the fine was small, it reminds companies that DPAs will expect them to learn from past mistakes.

Fines for unsolicited marketing messages.  Two companies have received fines for sending unsolicited messages to customers, once again highlighting that unsolicited marketing continues to be a hot topic for DPAs.  The ICO fined Digital Growth Experts Limited £60,000 for sending thousands of nuisance texts promoting a hand sanitising product that claimed to be “effective against coronavirus”.  Similarly, the Czech DPA fined a company CZK 6 million (approximately €221,200) for sending almost 500,000 unsolicited emails to customers without their consent.  In both cases, the DPAs found that the companies had failed to obtain the necessary consent to send direct marketing messages.

Marriott ICO decision delayed again.  The UK regulator has once again pushed back its final decision in its investigation into Marriott’s widely reported data breach.  As discussed in our August Round Up, the delays suggest that the originally proposed £99 million fine could be reduced significantly.  The announcement mirrors the ICO’s action against British Airways, which has also faced a series of delays that could culminate in the airline paying only a fraction of the initial proposed penalty, as reported in our July Round Up.

Ride-hailing app seeks ICO support.  Wheely has reportedly written to the UK ICO claiming that the Moscow Department for Transportation (“MDOT”) attempted to pressure the company into sharing customer information.  In August, the London-based ride-hailing app was suspended from operating in Russia for 90 days after it refused to hand over customer details.  While MDOT claims that it is only interested in driver and route information, Wheely is reportedly seeking ICO guidance on whether this information can be used to identify customers.

Guidance

ICO launches Accountability Framework.  The ICO has issued practical guidance to help organisations meet their accountability obligations under the GDPR while stressing that accountability “is not about ticking boxes”.  The Accountability Framework covers 10 core areas, each containing a series of expectations and examples on achieving “accountability”.  The guidance is accompanied by two online tools – the accountability self-assessment and the accountability tracker – which can be used to ensure that current and future practices meet the ICO’s expectations.

French DPA issues guidance on voice assistants.  The French DPA issued an 80-page White Paper exploring the ethical, technical, and legal issues posed by voice assistants.  The paper makes clear that recording and analysing someone’s voice amounts to processing biometric data, which is subject to the GDPR.  The French regulator has provided a list of best practices for software developers, as well as practical tips for users, such as disabling targeted advertising settings and trying to avoid private conversations being recorded.

French Cybersecurity Agency publishes ransomware guidance.  The Agency’s new guidance reaffirms the importance of preparing for ransomware attacks by developing incident response plans and a communication strategy in advance, something we covered in our previous post on preparing for ransomware attacks.  The guidance recommends that, when attacks happen, companies keep a detailed record of the attack, establish internal communication channels to ensure that the company’s external communication is centralised, and file a complaint with the police.  The Agency also recommends against paying ransoms, given that doing so does not always ensure decryption and encourages attackers to continue their criminal endeavours.

EDPB publishes guidance on “controllers” and “processors”.  The EDPB has published draft guidance on the definitions, roles and responsibilities of “controllers”, “joint controllers” and “processors” under the GDPR, giving welcome clarity, in particular, on when two (or more) controllers will be considered joint controllers.  The accountability principle under Article 5 of the GDPR – the idea that those subject to the GDPR are responsible for demonstrating GDPR compliance (through appropriate documentation) – is at the heart of the draft guidance.  For example, the guidance calls for data processing agreements to contain a higher level of specificity and to go beyond the basic core elements listed in GDPR Article 28.  The draft document gives detailed guidance on the requirements for agreements between controllers and processors and will be important reading for anyone responsible for managing outsourced data processing.  It also brings greater clarity to the relationship between joint controllers and the determination of their respective responsibilities, something that companies engaging in novel data sharing or other joint-processing activities might find helpful to consult.

Bank of England official stresses importance of operational resilience.  Earlier in the month, Elisabeth Stheeman, of the Bank of England, delivered a speech citing cyber risk as one of the most prominent operational risks for firms over the past decade.  To build operational resilience – which Ms Stheeman describes as “the ability of firms and the system as a whole to prevent, respond to, recover and learn from operational disruptions” – the speech suggests that firms should conduct regular resilience tests and implement tested arrangements to respond to cyber incidents.  As with the French Cybersecurity Agency’s ransomware guidance, it is a reminder that operational resilience is a key aspect of cyber incident preparedness, which needs to be considered alongside protecting personal data.

European Banking Federation reviews PSD2 and GDPR guidelines.  The European Banking Federation (“EBF”) issued its response to the EDPB’s consultation on draft guidelines governing the relationship between the Second Payment Services Directive (“PSD2”) and the GDPR.  The guidelines, published in July, seek to clarify uncertainties between the two legal frameworks.  The EBF welcomed the confirmation that explicit consent under PSD2 Article 94 is different from consent under the GDPR, but called for further revisions to the guidelines’ data minimisation measures.  The EBF also suggested that a clear distinction be made between the respective GDPR responsibilities of payment service providers based on the roles described in the PSD2.

Litigation

YouTube facing class-action style litigation in the English High Court.  A major group litigation has been brought against YouTube, alleging that the platform has violated the UK Data Protection Act 2018 and the GDPR by targeting the data of up to five million children.  The representative action alleges that, because users do not need an account (which is subject to a minimum age restriction of 13) to watch the majority of its video content, YouTube is unable to verify the age of its users, and that when children watch videos, their data is collected, processed and monetised without obtaining parental permission or providing appropriate disclosures.

The success of this litigation will, to an extent, depend on the UK Supreme Court’s much anticipated decision in the Lloyd v Google case.  In 2019, the Court of Appeal established that the mere misuse of an individual’s personal data may constitute unlawful conduct, capable of being compensated.  If that decision is upheld, it will mean that the bar for actionable damage under the GDPR is set low, increasing the claimants’ chances of success.

Oracle puts an end to third-party data targeting.  Since our August 2020 Round Up, where we covered the €10 billion class action being brought against Oracle and its rival Salesforce, Oracle has announced that it will stop offering third-party data targeting services across Europe.

Schrems II

The continued fallout from Schrems II has prompted a variety of responses from DPAs, companies and policymakers.  For more information, see our previous updates on the decision and its aftermath as well as our July and August Roundups.

Irish DPA threatens to block Facebook’s data transfers.  Earlier this month, the Irish regulator commenced an investigation into Facebook’s EU-U.S. data transfers and issued a preliminary decision to suspend transfers based on Standard Contractual Clauses (“SCCs”).  Facebook subsequently brought a judicial review before the Irish High Court, which has put the DPA’s investigation on hold.  In an affidavit submitted to the court, Facebook Ireland’s head of data protection stated that it was not clear how the social media giant could offer services in the EU if transfers are banned.

Swiss DPA deems Swiss-U.S. Privacy Shield inadequate.  Following its annual assessment, the Swiss DPA concluded that the Swiss-U.S. Privacy Shield does not provide adequate levels of data protection.  Interestingly, the DPA went one step further than the Schrems II decision and made clear that SCCs could not prevent U.S. authorities from accessing personal data.  Although companies should no longer rely on the Privacy Shield for transfers from Switzerland to the U.S., the Swiss DPA’s assessment has no bearing on the continued existence of the Privacy Shield regime itself, which those concerned can continue to invoke until it is revoked by the U.S.

Companies shift towards SCCs.  Back in July, None of Your Business (“Noyb”) – the privacy NGO headed by Max Schrems – contacted prominent companies about their response to Schrems II.  Noyb asked the companies whether they transfer data outside of the EU, and if so, on what legal basis and with what protections for transfers to the U.S.  On 25 September, Noyb published the 33 responses it received, in which two-thirds of the companies confirmed that they were using SCCs or planned to do so.

Cloud computing community proposes new transfer safeguards.  The creators of the EU Cloud Code of Conduct (the “Code”) announced that work is underway on a mechanism to transfer personal data outside the EU in a GDPR-compliant manner. The creators are now looking to work with stakeholders to adapt the Code to incorporate an “effective but accessible” safeguard for third-country transfers. If successful, the Code might provide an alternative mechanism to the recently annulled Privacy Shield for transferring data to the U.S.

Council of Europe calls for the adoption of “Convention 108”.  Representatives from the Council of Europe have called upon states to use Schrems II as an opportunity to adopt a new international framework, similar to that of “Convention 108”.  Adopted in 1981, and recently modernised in 2018, Convention 108 imposes robust checks on national security authorities processing personal data during international transfers.  While it doesn’t deal with privacy issues surrounding mass surveillance, the Council of Europe believes that it is proof that an international rights-based framework is possible.

EDPB Task Force to review Facebook and Google complaints.  The EDPB has launched a taskforce to respond to complaints made by Noyb to multiple DPAs.  Noyb alleges that 101 companies still forward visitor data to Facebook or Google which, in turn, transfer the data to the U.S. via the now-defunct Privacy Shield, or by SCCs which lack sufficient protection.  The taskforce aims to ensure close cooperation among the Member States.

Competition

Germany approves Draft Act on Digitisation of German Competition Law.  Earlier this month, the German government approved the Draft Act on Digitisation of the German Competition Law, which seeks to tackle the antitrust challenges surrounding digital markets and tech companies.  Perhaps the most interesting aspect of the Draft Act is the introduction of a new category of market power: companies which hold “paramount significance across markets.”  If a company comes within this definition, the German Federal Cartel Office would be able to prohibit self-preferential treatment of its own services and the use of third-party data to create barriers to entry.  As we covered previously, the FCO has taken novel steps in recent years to curtail dominance in digital markets and it looks likely to continue down that path.

Italy’s competition authority investigates Apple, Google and Dropbox.  The Italian competition authority (“AGCM”) announced that it has opened six investigations into iCloud, Google Drive and Dropbox for possible unfair practices.  The AGCM claims that the companies may be failing to adequately explain to consumers how their personal data is collected and processed, casting doubt on the validity of customer consents.  The AGCM will also investigate whether the services’ terms and conditions violate domestic Italian consumer law, in particular terms giving the service providers the right to unilaterally suspend or interrupt access to the platforms, as well as exemptions from liability in the event of any data loss.  The companies face a €5 million fine for each unfair practice established.

Artificial Intelligence

AI civil liability regime study published.  The European Parliamentary Research Service published its study on the civil liability regime for AI-related bodily injury and harm.  The findings suggest that a clear and coherent EU civil liability regime for AI has the potential to increase safety, reduce legal uncertainty and enhance consumer rights and trust.  Establishing such a framework is likely to increase the uptake of AI technology, which could generate €54.8 billion in added value for the EU economy by 2030.

Expert report on the ethics of driverless vehicles.  The European Commission has published an independent expert report on the ethics of connected and automated vehicles (“CAVs”).  As well as tackling pertinent questions on road safety and accident liability, the report makes several recommendations relating to the role of data and AI in CAVs.  The report encourages policymakers and manufacturers alike to develop new models of consent, transparency strategies, interfaces which maintain AI-literacy, AI auditing procedures and anti-discrimination measures.

European Data Protection Supervisor publishes blog on the EU’s digital future. Wojciech Wiewiórowski, the European Data Protection Supervisor, published a blog article on the EU’s digital future and the role of AI.  Drawing reference to previous opinions on the European Strategy for Data and the Commission’s White Paper on AI, Wiewiórowski calls for a coherent yet prudent approach to AI.  A particular area of caution relates to facial recognition, where the blog supports the idea of a moratorium on automated recognition in public spaces.

European Parliament committee on AI holds first meeting.  The European Parliament’s new special committee on Artificial Intelligence in a Digital Age (“AIDA”) held its first meeting.  Upon being elected as AIDA’s first Chair, Dragoș Tudorache stated that his main priority is to build and support AI that drives innovation and prosperity, while ensuring that it respects fundamental rights and does not discriminate against individuals.  We will continue to monitor the committee’s work.


Author

Jeremy Feigelson is a Debevoise litigation partner, Co-Chair of the firm’s Data Strategy & Security practice, and a member of the firm’s Intellectual Property and Media Group. He frequently represents clients in litigations and government investigations that involve the Internet and new technologies. His practice includes litigation and counseling on cybersecurity, data privacy, trademark, right of publicity, false advertising, copyright, and defamation matters. He can be reached at jfeigelson@debevoise.com.

Author

Christopher Garrett is an English-qualified international counsel in the Corporate Department and a member of the Data Strategy & Security practice, practising employment law and data protection. He has significant experience advising employers on all aspects of employment law and advising companies on compliance with UK and EU data protection law. Mr. Garrett has substantial experience in advising on the employment aspects of mergers & acquisitions transactions, including transfers of employees or other issues arising under TUPE/the Acquired Rights Directive. Mr. Garrett has a wide range of experience advising on other matters such as boardroom disputes, senior executive contracts and terminations, disciplinary and grievance matters, a variety of employment tribunal claims (including high-value discrimination claims), advising employers faced with industrial action, consultation on changes to occupational pension schemes, and policy and handbook reviews. Mr. Garrett also has a particular focus on handling privacy and data protection issues relating to employees, as well as online privacy, marketing and safety practices, regularly advising clients on privacy policies, online marketing practices and related matters.

Author

Fanny Gauthier is an associate in Debevoise's Litigation Department, based in the Paris office. Ms. Gauthier is a member of the firm’s International Dispute Resolution Group, as well as the firm’s Data Strategy & Security practice. Her practice focuses on complex commercial litigation, international arbitration and data protection. She can be reached at fgauthier@debevoise.com.

Author

Robert Maddox is International Counsel and a member of Debevoise & Plimpton LLP’s Data Strategy & Security practice and White Collar & Regulatory Defense Group in London. His work focuses on cybersecurity incident preparation and response, data protection and strategy, internal investigations, compliance reviews, and regulatory defense. In 2021, Robert was named to Global Data Review’s “40 Under 40”. He is described as “a rising star” in cyber law by The Legal 500 US (2022). He can be reached at rmaddox@debevoise.com.

Author

Dr. Friedrich Popp is an international counsel in the Frankfurt office and a member of the firm’s Litigation Department. His practice focuses on arbitration, litigation, internal investigations, corporate law, data protection and anti-money laundering. In addition, he is experienced in Mergers & Acquisitions, private equity, banking and capital markets and has published various articles on banking law.

Author

Hilary Davidson is a corporate associate and a member of Debevoise's Mergers & Acquisitions Group. Ms. Davidson’s practice focuses on private M&A, with particular experience advising private equity clients. This has included advising on joint ventures, cross-border mergers and acquisitions and secondary and co-invest transactions. She can be reached at hdavidson@debevoise.com.

Author

Jennifer Deschins is an associate in the Frankfurt office and a member of the firm’s Litigation Department. Her practice focuses on arbitration, litigation, internal investigations, cyber privacy, data protection, anti-money laundering and competition law.

Author

Sara Ewad is an associate in the London office and a member of the firm’s International Dispute Resolution Group.

Author

Ariane Fleuriot is an associate in Debevoise's Litigation Department. She can be reached at afleuriot@debevoise.com.

Author

Martha Hirst is an associate in Debevoise's Litigation Department based in the London office. She is a member of the firm’s White Collar & Regulatory Defense Group, and the Data Strategy & Security practice. She can be reached at mhirst@debevoise.com.