On 23 February 2023, the UK ICO hosted the latest in a series of privacy forums aimed at helping product designers and managers incorporate “privacy by design” or “data protection by design and by default” principles into their work.

Presenters from a wide range of sectors, including from the ICO, offered practical guidance that may help companies better understand current market practice, the ICO’s expectations, and the direction of forthcoming regulatory guidance.

Key observations shared during the event included:

  • Privacy is not just for lawyers. Privacy should not be siloed away from product teams: that approach tends to result in merely protecting the product as designed, rather than designing the product to comply with privacy regulations. Speakers recommended that product design teams be directly involved in privacy considerations from start to finish and include compliance staff.
  • Create privacy feedback intervals. With privacy integrated at the product team level, it is also easier to implement processes to collect feedback on privacy issues throughout the product creation process. A framework of “design, test, repeat, reiterate” can help ensure compliance from the start.
  • Ditch dark patterns. Panelists suggested that the concept of “dark patterns” is coalescing around consistent terminology that may soon enable further regulation. For example, the Digital Services Act codified the term and listed a number of covered practices, which may spur further regulation inside and outside the European Union. At least some panelists felt regulation was necessary to steer corporate leadership in the right direction, as many product designers lack either the standing to push back on flawed privacy design or the training needed to spot the problems.
  • Prioritise training. Participants across panels emphasised the need to provide privacy training to full product teams. Such training should focus on the principles underlying the laws and identify the high-water-mark requirements across applicable regimes. Uninformed product team members may be unable to identify and escalate privacy concerns, and a general culture that leaves privacy “to the lawyers” or to another team may not foster an environment where junior designers feel empowered to raise concerns.
  • Design for safety, too. Discussion touched on the interplay between privacy and the risk of interpersonal harm arising from technology such as surveillance apps and shared bank accounts. Designers are accustomed to considering user goals and privacy risks arising from unauthorised access, but not necessarily how to counter the harmful goals of known users or how to provide actionable information to victim users. Designers may want to consider (i) power imbalances between users, (ii) whether users can easily identify other users, see what actions those users have taken, and remove them, and (iii) safeguards against non-consensual or low-efficacy surveillance.

Much of the discussion aligned with the ICO’s guidance on data protection by design and by default, its commentary on dark patterns, and its emphasis on design with respect to children’s data, all of which organisations may wish to review to ensure alignment with the ICO’s expectations. We are likely to see further guidance from the ICO on many of these topics, including finalised guidance on the use of privacy enhancing technologies (“PETs”), which we previously discussed here, by late spring 2023.

To subscribe to the Data Blog, please click here.

The cover art used in this blog post was generated by DALL-E.

Authors

Robert Maddox is International Counsel and a member of Debevoise & Plimpton LLP’s Data Strategy & Security practice and White Collar & Regulatory Defense Group in London. His work focuses on cybersecurity incident preparation and response, data protection and strategy, internal investigations, compliance reviews, and regulatory defense. In 2021, Robert was named to Global Data Review’s “40 Under 40”. He is described as “a rising star” in cyber law by The Legal 500 US (2022). He can be reached at rmaddox@debevoise.com.

Stephanie D. Thomas is an associate in the Litigation Department and a member of the firm’s Data Strategy & Security Group and the White Collar & Regulatory Defense Group. She can be reached at sdthomas@debevoise.com.

Maria Epishkina is a corporate associate and a member of the Mergers & Acquisitions, Capital Markets and Private Equity Groups. She can be reached at mepishkina@debevoise.com.

Maria Santos is a trainee associate in the Litigation Department.