Key takeaways from August include:

Conflicts of interest: Businesses should consider re-evaluating their data protection officer’s role and responsibilities, including any dual roles on boards and committees, to prevent conflicts of interest, in light of the Spanish AEPD’s €5,000 fine for related failures;

Automated decision-making: Businesses need not disclose the algorithms used in automated decision-making in response to data subject access requests, according to a recent Austrian court decision;

Biometric data: Organisations using or planning to deploy biometric recognition may want to assess their practices against the UK ICO’s draft guidance clarifying regulatory expectations, for example, that the data used will be considered special category data and that a data protection impact assessment is almost always needed;

Data scraping: Businesses should consider proactively strengthening their data protection measures against data scraping, including rate limiting, IP blocking, and available legal action, in response to a recent joint statement by a number of data protection authorities setting out their regulatory expectations;

Data minimisation: Businesses should consider a strict data minimisation policy for identity documents, following the Irish data protection authority’s reprimand of Airbnb for storing user identity documents after the verification process is completed;

Digital Services Act: Following the latest EU Digital Services Act implementation deadline, very large online platforms should continue to assess compliance with the Act’s obligations, including for advertising and content moderation; and

Dark patterns: Organisations should consider the steps set out by the UK ICO and Competition and Markets Authority in a new joint position paper to avoid potentially harmful digital design tactics used to obtain consumer data, such as default settings that are difficult for users to change or that users are discouraged from changing.

These developments, and more, are covered below.

The Spanish AEPD fines a controller €5,000 for its data protection officer’s conflicts of interest

What happened: Acting on a data subject complaint, Spain’s data protection authority, the AEPD, fined the Official College of Architects of Granada €5,000, finding that the data protection officer’s (“DPO”) responsibilities within the organisation posed a conflict of interest in violation of GDPR Art. 38(6).

The GDPR requires that there be no conflict of interest in the appointment of the DPO, which could arise from, for example: (i) a certain level of seniority; (ii) a financial interest in the business’s success; or (iii) responsibilities involving the business’s data processing activities.

The AEPD held that a DPO cannot hold a position that leads them to determine the purposes and means of data processing. Here, alongside the DPO role, the individual’s parallel appointments to the organisation’s governing board and permanent commission each involved responsibilities bearing on data processing, including those relating to admissions and applications, handling requests, communications and disciplinary measures, keeping minutes, distributing the College’s funds, and carrying out the governing board’s guidelines.

The controller was also fined €9,000 for not having the proper contact information for the DPO in its privacy policy, and for using cookies without user consent.

What to do: Businesses may wish to review the role and responsibilities of their DPO to ensure the proper separation of responsibilities between the DPO and roles involved in determining the purposes and means of data processing.

Austrian court limits required disclosure of automated decision-making logic

What happened: An Austrian court held that where a controller (in this case, a credit ranking agency) responds to a data subject access request involving automated decision-making, the obligation to provide “meaningful information about the logic involved” does not require disclosure of the algorithms or mathematical formulas used.

Instead, businesses are expected to provide: (i) the categories of personal data and their relevance; (ii) how the profile is created by automated means, including the statistical method used; (iii) why the profile is relevant for the decision; and (iv) how the profile is actually used for the decision.

What to do: Businesses utilising automated decision-making involving personal data may want to review their data subject access request policies and practices, and consider updates in accordance with this Austrian court’s expectations.

ICO draft guidance on biometric data confirms regulatory expectations

What happened: The ICO released draft guidance on the use of biometric recognition systems, following the increased use of such systems, including for access authentication and multi-factor authentication purposes. The scale and data protection risks associated with such technologies have been further complicated recently by their increasing integration with artificial intelligence systems.

Key points from the guidance include:

  • personal data used in biometric recognition systems (whether or not successful in identifying or verifying an individual’s identity) will satisfy the test for “biometric data” under the GDPR and will therefore be special category data;
  • using biometric recognition systems is “highly likely” to trigger the requirement to prepare a data protection impact assessment (“DPIA”);
  • providers of biometric recognition systems that use biometric data to further develop their models will ordinarily be controllers for that purpose;
  • in most cases, explicit consent is likely to be the only valid condition for processing biometric data (but there may be difficulties in obtaining such consent); and
  • assessing and dealing with security risks associated with processing biometric data is particularly important, and biometric data must be encrypted (see the illustrative sketch below).
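
By way of illustration only, the sketch below shows one way a business might encrypt a biometric template at rest. It assumes the third-party Python cryptography package and a hypothetical placeholder template; it is not a statement of what the ICO’s guidance requires, and in practice key management would sit in an HSM or cloud KMS rather than in application code.

```python
# Illustrative only: encrypting a biometric template at rest using the
# third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

# In production, the key would come from an HSM or cloud KMS and would
# never be stored alongside the encrypted data.
key = Fernet.generate_key()
cipher = Fernet(key)

template = b"hypothetical-biometric-template-bytes"  # placeholder data
encrypted = cipher.encrypt(template)  # persist only this ciphertext

# Decrypt only at the point of comparison, then discard the plaintext.
assert cipher.decrypt(encrypted) == template
```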

What to do: Businesses that have implemented, or are considering implementing, biometric recognition technologies should assess their own implementation against the draft guidance. While the guidance has not been finalised, major changes seem unlikely and the current draft provides useful insights into the ICO’s current expectations. The consultation period for the draft guidance closes on 20 October 2023, with a second phase of draft guidance on biometric classification and data protection expected in early 2024.

Coalition of data protection authorities issues statement on data scraping concerns

What happened: A coalition of data protection authorities spanning multiple continents, all members of the Global Privacy Assembly’s International Enforcement Cooperation Working Group, endorsed a statement outlining their expectations of social media platforms and other websites to safeguard against unlawful data scraping and to minimise the associated privacy risks, including targeted cyberattacks, identity fraud, monitoring, profiling and surveillance of individuals, unauthorised political or intelligence gathering, and unwanted direct marketing and spam.

Emphasising that publicly accessible personal data remains subject to data protection and privacy laws in many jurisdictions, and that the unauthorised collection of such data could constitute a reportable breach, the statement calls on businesses to do more to protect data from scraping. Specifically, it points to enhanced, multi-layered technical measures (such as rate limiting, usage monitoring, CAPTCHAs, IP blocking, and other proactive bot-detection steps), taking legal action where scraping is confirmed or suspected, and supporting users’ privacy decisions (such as proactively informing users about privacy-enhancing settings).
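
To make one of those measures concrete, the following is a minimal sketch of a per-IP sliding-window rate limiter, written in Python. The class name and thresholds are hypothetical, and real deployments would more typically rely on a CDN, WAF, or API gateway than on application code; this is simply an illustration of the kind of control the statement contemplates.

```python
# Illustrative only: a minimal per-client sliding-window rate limiter,
# one of the layered anti-scraping measures the statement mentions.
import time
from collections import defaultdict, deque


class SlidingWindowRateLimiter:
    """Allow at most `max_requests` per `window_seconds` per client."""

    def __init__(self, max_requests: int = 100, window_seconds: float = 60.0):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self._hits = defaultdict(deque)  # client id -> request timestamps

    def allow(self, client_ip: str) -> bool:
        now = time.monotonic()
        hits = self._hits[client_ip]
        # Evict timestamps that have fallen outside the window.
        while hits and now - hits[0] > self.window_seconds:
            hits.popleft()
        if len(hits) >= self.max_requests:
            return False  # Throttle: a candidate signal of scraping.
        hits.append(now)
        return True


# Usage: refuse the request, or escalate to another of the statement's
# suggested measures (e.g., a CAPTCHA or IP block), when allow() is False.
limiter = SlidingWindowRateLimiter(max_requests=100, window_seconds=60)
if not limiter.allow("203.0.113.7"):
    print("429 Too Many Requests")
```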

What to do: Businesses that make personal data publicly accessible should take note of the statement as a signal of regulatory interest in the perceived misuse of that data. Such businesses may wish to evaluate the adequacy of their technical and organisational measures for protecting such data in light of the specific measures identified in the statement. Given the limits of such technical measures, businesses may also wish to consider opportunities to minimise the volume and sensitivity of information made publicly available and to better inform users about the risk of data misuse. Businesses reliant on scraping may also want to consider how these expectations might affect their business model.

Airbnb faces reprimand and corrective orders following finding that it retained identification documents for too long

What happened: The Irish DPC announced its June 2023 determination of a complaint by an Airbnb host concerning Airbnb’s processing of her personal data for identity verification.

The DPC found in favour of Airbnb on most of the substantive issues raised by the complaint, including that Airbnb:

  • Could properly rely on legitimate interest as the lawful basis for processing copies of the host’s identification documents for the purposes of verifying her identity, given the real-world safety implications of host verification and the narrow tailoring of Airbnb’s identity checking process to that end.
  • Complied with the principle of data minimisation when requesting identity documents. In particular, the DPC noted Airbnb’s staggered approach to user verification, which only requested identification when other verification processes (such as comparing details on file against public sources) had proven unsuccessful.
  • Issued a privacy notice and other documentation which satisfied the GDPR’s transparency principle, including by highlighting that Airbnb may seek a copy of photographic identification to verify host identity.

Nevertheless, the DPC required Airbnb to revise its internal policies and procedures to ensure that: (i) previously submitted defective identification documents are deleted once a user is verified; and (ii) the period for which all identification documents (including valid documents) are stored is strictly limited. The DPC found that Airbnb’s existing policy of retaining such documents while an account remained open violated the GDPR.

What to do: Businesses should take some comfort in the DPC’s decision, which recognises the important interests associated with identity verification and the work Airbnb had done to limit the circumstances in which it collects identification documents. The decision nevertheless highlights the need for businesses to consider retention policies for copies of identification documents, including how long it is strictly necessary to retain such documents following successful verification.

Very large online platforms introduce measures to comply with new Digital Services Act obligations

What happened: 25 August 2023 marked another key date in the implementation of the EU’s Digital Services Act (“DSA”), as very large online platforms (“VLOPs”) became subject to the full range of the DSA’s obligations. We previously discussed the DSA and its implications for VLOPs, including prohibitions on certain targeted advertising practices, requirements that platforms monitor and remove illegal and harmful content, and stringent transparency requirements regarding advertising and content moderation practices.

In advance of the compliance deadline, TikTok announced new features for European users aimed at complying with its new obligations under the Act. In July, TikTok performed a voluntary stress test to assess what was needed for full compliance. In response, it unveiled key features for European users, including easier reporting of illegal content, an option to turn off personalised video recommendations, and a prohibition on targeted advertising for users aged 13 to 17.

What to do: Businesses should take note of the compliance obligations the DSA will create for them and the applicable implementation timeline. While the DSA currently applies only to VLOPs, other entities within its scope must comply by 17 February 2024. Businesses should consider the implications the DSA may have for their advertising platforms and their use of AI and automated decision-making tools, especially in the context of content moderation and targeted advertising.

French CNIL closes cookie consent injunction against Google

What happened: The French CNIL announced the closure of its December 2021 injunction against Google LLC and Google Ireland Ltd, which had ordered Google to allow users in France of the websites google.fr and youtube.com to refuse cookies as easily as they could accept them. In response, Google had placed a refusal button entitled “Only allow essential cookies” next to the acceptance button. On 13 July 2023, the CNIL’s restricted committee determined that Google had complied with the injunction and closed the procedure.

What to do: As the CNIL continues to be a leading enforcer on cookie practices, organisations, particularly those processing personal data of individuals in France, may wish to review their current practices against the CNIL’s latest expectations, including its guidance on the topic.

EDPB’s Art. 65 resolution leads to €345 million Irish DPC fine of TikTok over its handling of children’s data

What happened: The EDPB issued a binding decision to resolve disagreement among the concerned DPAs regarding the Irish DPC’s draft decision in its inquiry into TikTok’s processing of personal data of users aged 13 to 17. In particular, the DPC investigated whether TikTok had failed to ensure that its age verification processes sufficiently protected children’s privacy. Objections were lodged by other European DPAs regarding whether there had been: (i) an infringement of the principle of data protection by design regarding age verification; and (ii) an infringement of the principle of fairness regarding certain design practices.

The Irish DPC announced a €345 million penalty on 15 September, which we will analyse in next month’s blog post. TikTok was also fined £12.7 million in April 2023 by the UK ICO for illegally processing data of children under 13, one of the ICO’s largest ever penalties.

What to do: Businesses should take particular care to ensure compliance when processing children’s data, especially around: (i) age verification and parental consent collection mechanisms; (ii) the transparency and adequacy of privacy notices concerning the collection of children’s data; and (iii) any assessments necessary to establish a lawful basis for such processing activities.

UK ICO and CMA issue a joint position paper on deceptive online practices

What happened: The UK ICO and Competition and Markets Authority (“CMA”) issued a joint position paper on harmful design in digital markets, which sets out their perspectives on how choices in online architecture can undermine consumer choice and control over personal information. The paper highlights the importance of consumers being able to exercise meaningful choice and control over their data, and the practical ways that firms with a digital presence and UX designers can avoid potentially harmful practices when presenting choices about personal data processing: in particular, “harmful nudges and sludge”, “confirmshaming”, “biased framing”, “bundled consent”, and “default settings”.

What to do: Businesses with an online presence may want to review their interfaces against the potentially harmful tactics the authorities cite, using the actionable steps provided in the paper, in addition to reviewing the UK ICO’s other commentary on dark patterns and the EU Digital Services Act’s codification of a number of ‘dark pattern’ practices.

The cover art used in this blog post was generated by DALL-E.

Author

Robert Maddox is International Counsel and a member of Debevoise & Plimpton LLP’s Data Strategy & Security practice and White Collar & Regulatory Defense Group in London. His work focuses on cybersecurity incident preparation and response, data protection and strategy, internal investigations, compliance reviews, and regulatory defense. In 2021, Robert was named to Global Data Review’s “40 Under 40”. He is described as “a rising star” in cyber law by The Legal 500 US (2022). He can be reached at rmaddox@debevoise.com.

Author

Dr. Friedrich Popp is an international counsel in the Frankfurt office and a member of the firm’s Litigation Department. His practice focuses on arbitration, litigation, internal investigations, corporate law, data protection and anti-money laundering. In addition, he is experienced in Mergers & Acquisitions, private equity, banking and capital markets and has published various articles on banking law.

Author

Aisling Cowell is an associate in the Litigation Department based in the London office. She is a member of the firm’s White Collar & Regulatory Defense Group. She can be reached at acowell@debevoise.com.

Author

Stephanie D. Thomas is an associate in the Litigation Department and a member of the firm’s Data Strategy & Security Group and the White Collar & Regulatory Defense Group. She can be reached at sdthomas@debevoise.com.

Author

Tristan Lockwood is an associate in the firm’s Data Strategy & Security practice. He can be reached at tlockwood@debevoise.com.

Author

Melissa Muse is an associate in the Litigation Department based in the New York office. She is a member of the firm’s Data Strategy & Security Group, and the Intellectual Property practice. She can be reached at mmuse@debevoise.com.

Author

Melyssa Eigen is an associate in the Litigation Department. She can be reached at meigen@debevoise.com.

Author

Maria Epishkina is a corporate associate and a member of the Mergers & Acquisitions, Capital Markets and Private Equity Groups. She can be reached at mepishkina@debevoise.com.

Author

David Z. Rochelle is an associate in the Litigation Department. He can be reached at dzrochelle@debevoise.com.