On August 11, 2022, the Federal Trade Commission (the “FTC”) announced its Advance Notice of Proposed Rulemaking (the “ANPR”) seeking public comment on 95 questions focused on harms stemming from “commercial surveillance and lax data security practices” and whether new trade regulation rules under section 18 of the FTC Act are needed to protect people’s privacy and information.

In Part 1 of this Data Blog series, we provided an overview of the ANPR and the context for the FTC’s rulemaking process. In Part 2, we will explore how the privacy-focused components of the ANPR may offer actionable takeaways for businesses to consider now.

The ANPR provides a helpful guide to FTC enforcement in the area of privacy and data security, summarizing major FTC enforcement actions and recounting the FTC’s broader policy work in areas such as dark patterns and facial recognition.

The ANPR questions also shed light on the specific practices and potential harms the FTC views as particularly troubling and potentially ripe for enforcement. Indeed, among the ANPR’s stated goals are creating a “public record about prevalent commercial surveillance practices” that are unfair or deceptive to “help sharpen” the agency’s enforcement work, even if new trade rules are not promulgated.

Key Privacy Topics Addressed by the ANPR

Unsurprisingly, many of the ANPR’s questions relate to how commercial surveillance practices may impact consumer privacy. For example, the ANPR’s questions:

  • Indicate that the FTC may attempt to limit commercial surveillance practices that use or facilitate the use of facial recognition, fingerprinting or other biometric technologies, and ask how it should do so;
  • Raise the possibility the FTC may attempt to limit targeted advertising and ask whether the potential new rules should impose data minimization standards, data retention periods and purpose limitations;
  • Contemplate whether the FTC should require companies to certify that the companies’ commercial surveillance practices meet clear standards concerning data collection, use, retention, transfer or monetization and whether the FTC, a third-party organization or another entity should set those standards;
  • Address consumer consent as well as notice, transparency and disclosure principles, including whether commercial surveillance practices are or should be prohibited, irrespective of consumer consent; whether different consent standards should be required for different consumer groups (e.g., parents of teenagers, elderly individuals, individuals in crisis or otherwise especially vulnerable to deception); and whether the new rules should require companies to make information available about their commercial surveillance practices—and if so, what kinds of information and in what form—and in which contexts transparency or disclosure requirements are effective;
  • Emphasize there may be a need for rules to further address the privacy of children and teenagers, devoting 11 questions to the topic, including the types of practices to which children are particularly vulnerable, the extent of the business-to-business market for children's and teens' data, and the efficacy of parental consent as a way of ensuring children's online privacy. The ANPR's questions also consider under what circumstances a failure to provide privacy protections to children and teenagers (e.g., not providing privacy-protective settings by default) is an unfair practice, even if the site or service is not targeted to minors.

FTC Privacy Trends

Over the past 20 years, the FTC has brought privacy-related enforcement actions against companies in various industries (e.g., social media, ad tech and the mobile app ecosystem). Recently, these enforcement actions, along with the FTC's policy guidance, have focused on the importance of abiding by privacy promises, the use and sharing of highly sensitive data, alleged stealthy geolocation tracking and COPPA violations, education technology and children's privacy, deceptive automatic renewal practices, "dark patterns," the collection of health information and compliance with the Fair Credit Reporting Act. These actions by the FTC should be kept in mind as companies review the ANPR and consider potential areas to improve their privacy programs.

Mitigation Strategies to Address Potential Privacy Risks

Although the promulgation of new trade regulation rules related to the ANPR is likely several years away (if it proceeds at all), businesses can review the ANPR as a potential roadmap for risk-mitigation strategies that are informed by the FTC’s enforcement, initiatives, policy statements and guidance with respect to privacy issues. Although every company is different, and company-specific assessments must be made, these strategies may include the following:

  • Identify potential high-risk data uses and, where possible, consider limiting such uses, offering individuals more choices and transparency around those uses, and implementing safeguards for such data. The ANPR questions indicate that the FTC may attempt to limit practices such as: (a) those that use or facilitate the use of facial recognition, fingerprinting or other biometric technologies; and (b) those that involve children and teenagers, where the FTC appears to be contemplating that a failure to provide privacy protections to children and teenagers (e.g., not providing privacy-protective settings by default) may constitute an unfair practice.
  • Consider implementing “privacy by design” principles, including data minimization and retention periods. The ANPR strongly suggests that the FTC is considering data minimization or purpose limitation requirements similar to those found in the GDPR. Such requirements may include limiting “companies from collecting, retaining, using, or transferring consumer data beyond a certain predefined point” and requiring companies “to collect, retain, use, or transfer consumer data only to the extent necessary to deliver the specific service that a given individual consumer explicitly seeks or those that are compatible with that specific service.” Among the enforcement actions highlighted, the FTC cited one involving the collection and sharing of sensitive television-viewing information to target advertising contrary to purported reasonable expectations—suggesting that consumers’ reasonable expectations should be considered in assessing what might be appropriate data uses.
  • Consider whether the company should go beyond a traditional notice and implied consent regime. The ANPR devotes a number of questions to “consumer consent,” raising the possibility of imposing affirmative consent requirements or a right to withdraw consent to commercial surveillance practices. Companies may want to consider “privacy by design” strategies in new product developments, including how to incorporate best practices and capabilities to honor opt-outs and consents. The FTC’s October 2021 Enforcement Policy Statement Regarding Negative Option Marketing, which signals that the Commission intends to ramp up enforcement against “dark patterns,” provides key recommendations for businesses, including obtaining the consumer’s express informed consent before charging them for a product or service.
  • Take a fresh look at your company’s privacy practices and whether they are consistent with your privacy representations and promises. The ANPR suggests an increased focus on data privacy issues, and the list of enforcement actions serves as a reminder of the importance of accuracy and transparency of privacy disclosures.

Conclusion

To enhance readiness and privacy compliance programs in advance of any new regulations, businesses should review the ANPR, assess whether their privacy practices may be implicated by the ANPR’s key topics and questions and consider revisiting and enhancing their programs in light of the FTC’s guidance and enforcement actions.

As noted above, Part 1 of this Data Blog series provided background on the current ANPR and the context to the FTC’s approach to rulemaking under Section 18 of the FTC Act. Part 3 will focus on data security and Part 4 on artificial intelligence, algorithms and discrimination.


The authors would like to thank Debevoise law clerk Lily Coad for her work on this Debevoise Data Blog.

Author

Avi Gesser is Co-Chair of the Debevoise Data Strategy & Security Group. His practice focuses on advising major companies on a wide range of cybersecurity, privacy and artificial intelligence matters. He can be reached at agesser@debevoise.com.

Author

Erez Liebermann is a litigation partner and a member of the Debevoise Data Strategy & Security Group. His practice focuses on advising major businesses on a wide range of complex, high-impact cyber-incident response matters and on data-related regulatory requirements. He can be reached at eliebermann@debevoise.com.

Author

Paul D. Rubin is a corporate partner based in the Washington, D.C. office and is the Co-Chair of the firm’s Healthcare & Life Sciences Group and the Chair of the FDA Regulatory practice. His practice focuses on FDA/FTC regulatory matters. He can be reached at pdrubin@debevoise.com.

Author

Johanna Skrzypczyk (pronounced “Scrip-zik”) is a counsel in the Data Strategy and Security practice of Debevoise & Plimpton LLP. Her practice focuses on advising on AI matters and privacy-oriented work, particularly related to the California Consumer Privacy Act. She can be reached at jnskrzypczyk@debevoise.com.

Author

Michael R. Roberts is a senior associate in Debevoise & Plimpton’s global Data Strategy and Security Group and a member of the firm’s Litigation Department. His practice focuses on privacy, cybersecurity, data protection and emerging technology matters. He can be reached at mrroberts@debevoise.com.

Author

Melissa Runsten is a corporate associate and a member of the Healthcare & Life Sciences Group. Her practice focuses on FDA/FTC regulatory matters and includes the representation of drug, device, food, cosmetic and other consumer product companies. She can be reached at mrunsten@debevoise.com.

Author

Anna R. Gressel is an associate and a member of the firm’s Data Strategy & Security Group and its FinTech and Technology practices. Her practice focuses on representing clients in regulatory investigations, supervisory examinations, and civil litigation related to artificial intelligence and other emerging technologies. Ms. Gressel has a deep knowledge of regulations, supervisory expectations, and industry best practices with respect to AI governance and compliance. She regularly advises boards and senior legal executives on governance, risk, and liability issues relating to AI, privacy, and data governance. She can be reached at argressel@debevoise.com.

Author

Melissa Muse is an associate in the Litigation Department based in the New York office. She is a member of the firm’s Data Strategy & Security Group, and the Intellectual Property practice. She can be reached at mmuse@debevoise.com.