Many businesses use customer-tracking technology and other tools—such as pixels, session replay, software development kits (“SDKs”), and chatbots—to improve website user experiences, understand customer behavior, train their technology, and gauge the effectiveness of advertisements.  Increasingly, however, these technologies present litigation risks under the California Invasion of Privacy Act (“CIPA”).

In this blog post, we provide an overview of the technologies that plaintiffs most commonly target for CIPA lawsuits and measures that companies can take to mitigate their CIPA litigation risk.  We have separately addressed regulatory enforcement trends arising from businesses’ use of cookies in this blog post.

CIPA Overview and Background

CIPA is a criminal statute enacted in 1967, long before the internet and the technologies that are the subject of plaintiffs’ novel legal theories today.  Although the statute was originally understood to target eavesdropping on telephone calls, plaintiffs have repurposed it in recent years against companies using tracking technologies for routine business practices, including digital marketing and customer service call monitoring.

Plaintiffs can wield CIPA as a powerful cudgel because the statute provides statutory damages of $5,000 per violation.  Courts’ reception of these claims has varied.  Some have dismissed for lack of standing or failure to state a claim, but others have denied motions to dismiss, permitting plaintiffs to test their novel CIPA theories with the benefit of (costly) discovery.  This has been particularly true since the Ninth Circuit issued a decision in Javier v. Assurance IQ, LLC, 2022 WL 1744107 (9th Cir. May 31, 2022), holding that companies cannot absolve past CIPA sins through retroactive consent.

Tools and Technologies Targeted for CIPA Claims

Plaintiffs have used CIPA to target both technology providers and the companies that deploy their tools.  Often, plaintiffs allege that the technology provider is impermissibly intercepting the plaintiff’s communications with a company without consent and that the company is aiding and abetting that illegal interception.  More recently, plaintiffs have asserted that companies using tracking technologies are liable based on CIPA’s prohibition against unauthorized wiretapping or use of pen register and trap-and-trace devices.  The technologies typically targeted by plaintiffs under CIPA include:

  • Tracking Pixels. Tracking pixels are commonly embedded in websites to track consumer activity, such as IP addresses, referrer URLs, timestamps, user clicks, form submissions, and other user information.  Information collected by pixels may then be transmitted back to the company that created the pixel.
  • Session Replays. Session replay analytic tools allow website operators to play back visitors’ experiences on their website by creating a stylized recording of a user’s interactions with the website (such as clicks, page visits, mouse movements, etc.), and are often used to enhance website design and identify bugs.
  • SDKs. SDKs are sets of software development tools that assist software developers in building apps that can easily integrate into their platforms.  These tools include code libraries, debugging tools, documentation on how to use the SDK, and sample code.  Plaintiffs have alleged that SDKs are capable of identifying consumers, gathering data, and correlating that data using unique “fingerprinting” to identify specific users.
  • Chatbots. Chatbots are programs or software that automatically respond to messages sent to a website’s email, social media account, phone number (such as text messages), or the chatbots themselves.
  • Cloud Contact Centers. A cloud contact center is a web-based customer service center that uses the internet and cloud computing to manage customer communications.  Calls, emails, SMS, and other customer communications are hosted in a cloud environment, and the cloud syncs customer information so that agents (human and AI) can easily access customer data across various communication portals.  Here, CIPA claims may arise when technologies, such as AI, are used by a cloud contact center to record communications between the company’s business client and the end-customer.
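To make the mechanics of the first of these technologies concrete, a tracking pixel is typically just a tiny image request whose URL encodes visitor data as query parameters, which the pixel’s creator logs server-side.  The sketch below is a minimal, hypothetical illustration of that pattern—the endpoint and parameter names are invented for this example and do not reflect any particular vendor’s implementation.

```typescript
// Hypothetical sketch of how a tracking pixel encodes visitor data.
// The endpoint and field names below are invented for illustration.
interface PixelEvent {
  page: string;   // URL of the page the visitor is viewing
  event: string;  // e.g., "page_view", "form_submit"
  ts: number;     // timestamp in milliseconds
}

// Build the pixel URL: visitor data travels as query parameters
// attached to a 1x1 image request to the tracking server.
function buildPixelUrl(base: string, ev: PixelEvent): string {
  const params = new URLSearchParams({
    page: ev.page,
    event: ev.event,
    ts: String(ev.ts),
  });
  return `${base}?${params.toString()}`;
}

// In a browser, the request fires by setting this URL as the `src`
// of an invisible <img>; the tracking server records the parameters
// when it serves the image back.
```

The point for CIPA purposes is that each such request transmits the visitor’s activity to a third party in real time, which is what plaintiffs characterize as an interception.
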

Plaintiffs will likely continue to expand their CIPA-based claims to new technologies; based on recent trends, we expect them to begin aggressively targeting software and technologies that collect and use data to train AI.

CIPA Lawsuits Targeting AI

One recent trend is the emergence of CIPA suits targeting AI-powered software products that businesses use to record or otherwise collect their customers’ communications.  Under a recent line of cases, the purveyors of such software products could potentially be liable under CIPA for maintaining the mere capability to use the contents of the collected communications for their own purposes, such as training their AI, regardless of actual use.  See, e.g., Javier v. Assurance IQ, LLC, 649 F. Supp. 3d 891 (N.D. Cal. 2023).

For example, in Ambriz v. Google, LLC, 2025 WL 830450 (N.D. Cal. Feb. 10, 2025), plaintiffs brought CIPA claims against Google based on its Google Cloud Contact Center AI (“GCCCAI”), which businesses use to transcribe and analyze calls with human call agents.  Plaintiffs asserted Google was liable for intercepting their calls with businesses using GCCCAI.  On a motion to dismiss, Google argued that the alleged conduct was not actionable under CIPA because Google provided a software tool to its business clients for them to lawfully record and analyze their own customer calls and that Google was contractually unable to use any data from the calls for its own purposes.

The District Court rejected Google’s argument, finding that, for CIPA purposes, Google was an unauthorized third-party listener, and plaintiffs had adequately alleged that Google has the capability to use wiretapped data it collects to improve its AI models.  Alleged capability to use the call data—even in the absence of proof of actual use—was enough to survive Google’s motion to dismiss.  The District Court concluded that “Google does not dispute that it is technologically capable of using the call data for an independent purpose, and this is what the capability test measures.”  Id. at *3.

What’s Next?

Although Ambriz did not implicate aiding and abetting liability under CIPA, it is not difficult to imagine future plaintiffs making the claim against companies that utilize cloud contact centers or other AI products that have the mere capability of utilizing the captured data (i.e., the purportedly “wiretapped” data) for a separate and independent purpose.  To the extent Ambriz suggests that contractual limits alone may be insufficient as a defense to such claims, companies can augment their consent argument by proactively and clearly disclosing their use of third-party service providers, including AI-powered analytic tools.  Alternatively, companies may consider implementing technical measures (including end-to-end encryption and direct delivery of data without human intervention) to reduce the likelihood that a plaintiff can adequately allege the company has even the mere capability of using records of customer interactions for purposes unrelated to the vendor engagement.

Given the surge of claims brought under CIPA, the California legislature is contemplating a bill, SB 690, that would establish a commercial business exemption to CIPA’s prohibition on unauthorized wiretapping or use of pen register and trap-and-trace devices.  If passed, the bill would provide some relief to businesses on a going-forward basis.  On June 3, 2025, the bill was unanimously approved by the state Senate and referred to the Assembly.

Practical Considerations

Continuing to track CIPA-related developments will be critical for businesses that use customer tracking and engagement technologies.  Until there is greater clarity on how CIPA will apply to particular technologies, companies may want to take steps to mitigate their risk, including:

  • Inventory Use of Tracking Technologies. Companies should ensure they understand how their websites use pixels, cookies, and other tracking technologies so they can evaluate their CIPA risk, and should consider limiting or removing unnecessary tracking tools and ceasing collection of extraneous information.  Often, these tools can be configured to collect lower-risk data or to prevent the sharing of data with unauthorized third parties.  A consent management platform may assist in this task.
  • Consider Obtaining Affirmative Consent. Because consent can be a powerful and effective defense to CIPA claims, companies may consider requiring customers to affirmatively agree to the use of tracking and analytic technologies in a manner that is substantively and procedurally enforceable (such as by using cookie banners and proper clickwraps while avoiding so-called “dark patterns”).  Software providers whose products leverage AI should also consider the type of consent that businesses using their products obtain from end customers.  They may wish to adjust their license agreements to require that those businesses disclose their use of AI to end customers and/or obtain affirmative consent before allowing end customers to engage with the AI-powered systems.
  • Utilize Just-in-Time Disclosures. Other forms of disclosure that could mitigate risk include using pop-ups that inform website users that their interactions are being tracked and recorded, beginning chats with a disclosure informing consumers that the chat is being recorded through a particular third-party vendor, and affirmatively disclosing whether any AI is transcribing a consumer’s call into a cloud contact center.
  • Ensure Accurate Disclosures. Companies should continually review their terms of service and privacy policies to ensure that they are accurately disclosing use of relevant AI and tracking technologies.  These disclosures could also describe the limits the company places on the tools (for example, configuring them not to share information with third parties).  And for companies not using the tools at all, it may now be worth saying so explicitly in terms of service and privacy policies to potentially deter frivolous lawsuits.
  • Enable Opt-Out Mechanisms. Should a customer choose not to engage in an AI-related service or tracking tool, ensure that there is a seamless way for them to opt out, including by ensuring that buttons, options, or other mechanisms for opting out are clear, conspicuous, easy to use, and consistent with regulatory requirements.
  • Implement Technical Barriers to Prevent Data from Being Reused. If feasible, consider implementing technical barriers to prevent relevant vendors from independently utilizing data that they record from the company to mitigate CIPA aiding and abetting litigation risk.
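As one concrete illustration of a “technical barrier,” a company could redact common personal identifiers from customer communications before any record reaches a vendor, narrowing the data the vendor could even theoretically use for an independent purpose.  The sketch below is a simplified, hypothetical example—the redaction patterns shown are illustrative only and are far from an exhaustive personal-information filter; real implementations would need broader coverage and legal review.

```typescript
// Hypothetical sketch: strip common personal identifiers from a
// transcript before it is shared with a third-party vendor.
// These two patterns are illustrative, not an exhaustive PII filter.
const EMAIL_RE = /[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}/g;
const PHONE_RE = /\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b/g;

function redactTranscript(text: string): string {
  return text
    .replace(EMAIL_RE, "[EMAIL REDACTED]")
    .replace(PHONE_RE, "[PHONE REDACTED]");
}
```

Measures like this do not eliminate litigation risk, but they may make it harder for a plaintiff to plausibly allege that the vendor retains the capability to use customer data for its own purposes.
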

To subscribe to the Data Blog, please click here.

This blog post was made possible by contributions from former associate Jackie Dorward and summer associate Ella Han.

The cover art used in this blog post was generated by ChatGPT.

Author

Avi Gesser is Co-Chair of the Debevoise Data Strategy & Security Group. His practice focuses on advising major companies on a wide range of cybersecurity, privacy and artificial intelligence matters. He can be reached at agesser@debevoise.com.

Author

Jim Pastore is a Debevoise litigation partner and a member of the firm’s Data Strategy & Security practice and Intellectual Property Litigation Group. He can be reached at jjpastore@debevoise.com.

Author

Matthew Kelly is a litigation counsel based in the firm’s New York office and a member of the Data Strategy & Security Group. His practice focuses on advising the firm’s growing number of clients on matters related to AI governance, compliance and risk management, and on data privacy. He can be reached at makelly@debevoise.com.

Author

Johanna Skrzypczyk (pronounced “Scrip-zik”) is a counsel in the Data Strategy and Security practice of Debevoise & Plimpton LLP. Her practice focuses on advising on AI matters and privacy-oriented work, particularly related to the California Consumer Privacy Act. She can be reached at jnskrzypczyk@debevoise.com.

Author

Gabriel Kohan is a litigation associate at Debevoise and can be reached at gakohan@debevoise.com.

Author

Joshua Plastrik is an associate in the Litigation Department. He can be reached at jhplastrik@debevoise.com.

Author

Annabella Waszkiewicz is a law clerk in the Litigation Department.