On Tuesday, May 16, 2023, Andrew Bab, a member of the Mergers & Acquisitions and Private Equity Groups and Co-Chair of the Healthcare & Life Sciences Group; Avi Gesser of the Data Strategy & Security Group; Paul Rubin, Co-Chair of the Healthcare & Life Sciences Group and Chair of the FDA Regulatory practice; and Melissa Runsten, a corporate associate, published an article entitled “Artificial Intelligence and the Life Sciences Industry: FDA and FTC Regulatory Update.”

Key Takeaways:

  • The life sciences industry is embracing the use of artificial intelligence (“AI”) while the regulatory framework continues to evolve.
  • On May 10, 2023, FDA issued a discussion paper on the use of AI and machine learning (“ML”) in drug development. FDA also recently issued draft guidance to formalize its approach to regulating AI/ML medical devices. The agency proposed the use of predetermined change control plans (“PCCPs”) to address the dynamic nature of AI. Under the proposed framework, modifications to a medical device in accordance with a PCCP would be allowed after the initial FDA authorization (without the need for a subsequent filing).
  • The FTC is also paying close attention to the growing use of AI and has advised companies that the agency may pursue enforcement against those marketing AI products with false or unsubstantiated claims.


Artificial intelligence (“AI”) promises to transform the life sciences industry, as companies seek to incorporate AI into everything from drug discovery to clinical decision-making and diagnostic imaging. Although AI’s potential is now being recognized by the wider public, the Food and Drug Administration (“FDA”) has been developing a framework for regulating the use of AI in FDA-regulated products over the past several years.

FDA recently issued draft guidance to formalize its approach to regulating medical devices that incorporate AI (specifically, the subset of AI known as machine learning (“ML”)[1]).[2] FDA has already approved or cleared hundreds of AI medical devices, and this guidance promises to increase the rate of innovation (and number of approvals and clearances) by setting clear expectations for how the agency will handle its review of AI/ML medical devices in the future. In addition, last week, FDA released a discussion paper addressing the use of AI/ML in the development of drug and biological products.[3]

The FTC is also paying close attention to the growing use of AI and, among other things, has warned companies that claims about AI products must be truthful and not misleading. In light of the evolving regulatory approaches to AI, life sciences and healthcare companies must keep abreast of developments at multiple agencies to fully understand applicable opportunities and risks.

The Evolution of FDA’s Regulatory Framework for AI/ML Medical Devices

The dynamic nature of AI presents a significant obstacle for FDA as it develops an appropriate regulatory regime. FDA’s traditional regulatory framework is based on approval or clearance of a static, unchanging device. Yet the benefits of AI/ML derive from the ability to learn and adapt based on new data. The agency therefore must implement a regulatory approach that ensures safety and effectiveness while permitting the flexibility these devices need to perform as intended.

In 2019, FDA issued a discussion paper describing a potential “total product lifecycle” regulatory approach to premarket review for AI- and ML-driven software modifications.[4] As part of this framework, FDA contemplated the concept of a predetermined change control plan (“PCCP”). The PCCP would be included in premarket submissions, describing anticipated modifications and the methodology for implementing the modifications in a controlled manner that manages risks to patients. A PCCP would allow a manufacturer to modify its medical device in accordance with the PCCP after the initial FDA authorization without an additional submission to FDA.

In January 2021, FDA issued its Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD) Action Plan (the “Action Plan”) to respond to stakeholder feedback on the 2019 discussion paper.[5] In the Action Plan, FDA expressed its intent to issue draft guidance to address PCCPs. FDA claimed the framework “would enable FDA to provide a reasonable assurance of safety and effectiveness while embracing the iterative improvement power of artificial intelligence and machine learning-based software as a medical device.”[6]

FDA’s New PCCP Guidance for Companies Developing AI/ML Medical Devices

In the Food and Drug Omnibus Reform Act of 2022, Congress added section 515C to the Federal Food, Drug, and Cosmetic Act, providing FDA with express authority for PCCPs.[7] On March 30, 2023, FDA issued draft guidance, “Marketing Submission Recommendations for a Predetermined Change Control Plan for Artificial Intelligence/Machine Learning (AI/ML)-Enabled Device Software Functions.” This guidance provides important insight into how the agency intends to move forward with regulating AI in medical devices. FDA invites stakeholders to submit comments before July 3, 2023 (although comments can be submitted even after that date).

FDA’s new draft guidance document maintains the total product lifecycle approach proposed by FDA in the 2019 discussion paper and 2021 Action Plan. Under the guidance, FDA would allow medical device manufacturers to include a PCCP for the device as part of the device marketing application (e.g., 510(k) application) submitted to the agency. If the PCCP is authorized, the manufacturer may make modifications anticipated by the PCCP without additional marketing submissions to FDA; in other words, FDA would pre-authorize the modifications covered by the PCCP. FDA recommends that a PCCP include a limited number of modifications that can be verified and validated to ensure efficient review.

In the guidance, FDA recommends the PCCP include: (1) a detailed description of each planned modification to the device, including the rationale for the modification and anticipated changes to the device characteristics and performance; (2) a modification protocol describing the methods that will be followed when developing, validating, and implementing modifications, including how information on the modifications will be communicated to users and how real-world data on the impact of the modifications will be monitored after implementation; and (3) an assessment of the benefits and risks of implementing a PCCP.

A PCCP may address modifications that are implemented either manually on locked models or automatically by the software (i.e., continuous learning/adaptive models), although FDA acknowledges that its review of automatically implemented modifications will be more complex and involve a benefit-risk assessment. FDA also states that PCCPs may include modifications that are implemented on individual devices based on the unique characteristics of a specific clinical site or individual patients.

AI/ML Medical Devices Already on the Market

While stakeholders waited for the agency to finalize the regulatory framework, many companies pushed forward with developing AI/ML medical devices and obtaining FDA marketing authorization. In fact, FDA has already authorized more than 500 AI/ML-enabled medical devices, most in the last five years. The pace of AI/ML medical device authorizations is expected to increase due to the new draft guidance, FDA’s experience reviewing applications and the innovation from private industry as medical device companies develop new ways to incorporate AI/ML into their products.

The vast majority of authorized AI/ML devices are radiological devices authorized via the 510(k) clearance pathway. Over 95% of AI/ML devices have 510(k) clearances,[8] and the rest have been authorized through the De Novo and premarket approval (“PMA”) pathways. As of October 2022, approximately 75% of authorized AI/ML devices are radiological devices, and approximately 11% are cardiovascular devices. Many of these devices assist healthcare providers with clinical diagnoses (e.g., by assessing radiographs for fractures or detecting atrial fibrillation from ECG data) or patient monitoring (e.g., monitoring blood glucose levels by running test data through an algorithm). Certain clinical decision support tools may also incorporate AI and require FDA authorization as medical devices.[9]

Up to this point, the algorithms in most authorized AI/ML devices have remained static because any change would require an additional submission to FDA. An exception came on February 7, 2020, when FDA announced the marketing authorization, through the De Novo pathway, of Caption Guidance software, the first cardiac ultrasound software using AI to help users capture images of a patient’s heart for diagnostic purposes. The marketing application, submitted by Caption Health, a medical AI company, was notable because it included a PCCP three years before FDA issued the draft guidance addressing such plans. The De Novo Summary for the product states that the plan contained “a protocol . . . to mitigate the risk of algorithm changes leading to changes in the device’s technical specifications or negatively affecting clinical functionality or performance specifications directly associated with the intended use of the device,” including “[a]ssessment metrics, acceptance criteria, and statistical methods . . . for the performance testing of the proposed changes.”[10] Note that the Caption Guidance software incorporates a locked algorithm—i.e., the algorithm is not adaptive, and changes must be implemented manually in accordance with the PCCP.

The new FDA guidance is expected to significantly increase the use of PCCPs in future AI/ML device applications. Indeed, FDA acknowledged in the guidance that the agency “continues to receive an increasing number of marketing submissions and pre-submissions for devices leveraging ML technologies, and [] expects this to increase over time.”[11] FDA also stated that there has been strong interest from industry in utilizing PCCPs for AI/ML-enabled medical devices ever since FDA released its 2019 discussion paper introducing the concept of the PCCP.[12]

FDA Discussion Paper on Using AI/ML to Develop Drugs and Biologics

On May 10, 2023, FDA issued a discussion paper on the use of AI and ML in drug development, including the development of medical devices intended to be used with drugs. FDA is aiming to better understand where regulatory clarity may be helpful and how to ensure the responsible use of AI/ML in drug development. Notably, FDA acknowledges that companies are exploring or already deploying the use of AI/ML in drug discovery (e.g., drug target identification, selection, and prioritization), nonclinical research (e.g., in vivo predictive models), clinical research (e.g., data analysis and designing non-traditional trials such as decentralized clinical trials), post-market safety surveillance (e.g., identification and validation of adverse events) and pharmaceutical manufacturing (e.g., early detection of deviations).[13]

Although in the discussion paper, FDA is generally supportive of these emerging uses of AI/ML, the agency cautioned that AI/ML algorithms may have certain risks such as the potential to amplify errors and preexisting biases in the underlying data sources. FDA posed dozens of questions for stakeholders in the discussion paper and invited feedback that may inform future guidance and other regulatory action.

FTC Regulation of AI Claims

The Federal Trade Commission (“FTC”) has regulatory authority over advertising claims made for AI products (including claims for most medical devices containing AI), and the agency has indicated it intends to actively surveil these claims.[14] On April 25, 2023, FTC Chair Lina Khan and officials from three other federal agencies (including the Consumer Financial Protection Bureau[15]) issued a statement emphasizing the FTC’s commitment to monitoring the development and use of AI products.[16] Chair Khan warned companies that “[t]here is no AI exemption to the laws on the books, and the FTC will vigorously enforce the law to combat unfair or deceptive practices or unfair methods of competition.”[17]

This statement followed an FTC blog post on February 27, 2023, titled “Keep Your AI Claims in Check.”[18] The FTC advises companies that the agency may pursue enforcement against those marketing AI products with false or unsubstantiated claims about the products’ abilities or benefits. In the post, the FTC indicates it is aware that AI is a “hot marketing term” and cautions companies against “overusing and abusing” the term.

The guidance provided by the agency on avoiding violations of Section 5 of the FTC Act (which prohibits unfair or deceptive acts or practices) is consistent with principles used by the FTC in other contexts for decades. The FTC advises companies marketing AI products that: (1) performance claims must have scientific support, and companies should not claim a product can do something beyond the current capability of any AI technology; (2) comparative claims (e.g., claims that an AI product does something better than a non-AI product) must also have support; and (3) if a company claims a product is AI-enabled, it must actually employ AI (“merely using an AI tool in the development process is not the same as a product having AI in it”). Companies should carefully evaluate AI-related product claims prior to dissemination, as it is clear the FTC will enforce against violative claims.[19]

[1]       ML algorithms are data-driven AI systems that “learn” from examples in large datasets (i.e., training sets) without being explicitly programmed to reach a particular answer or conclusion. These algorithms can learn to decipher data patterns at scales unattainable by humans to identify relationships between the input (e.g., radiologic images) and output (e.g., diagnoses/clinical decision support). For example, a device may use an ML algorithm to identify signs of cancer in a CT scan much earlier than most doctors.

Learning can be supervised or unsupervised. In supervised learning, the data are labeled by humans, and the machine is programmed to identify specific patterns. Most ML applications submitted to FDA use supervised learning (which requires a training dataset for which the outcome variable (e.g., disease state) is known). Unsupervised learning does not have labeled data, so the algorithm can find patterns or trends that people are not explicitly looking for or aware of. Most ML algorithms use three datasets for training, validation and testing.
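As a generic illustration of the supervised workflow and the three-dataset split described above, the sketch below trains a toy nearest-centroid classifier on labeled data. The synthetic data and the classifier are illustrative assumptions only; they are not drawn from, and do not resemble, any FDA-authorized algorithm:

```python
import random

# Toy labeled dataset of (feature, label) pairs. In supervised learning,
# the label (e.g., disease state) is known for every example.
random.seed(0)
data = [(random.gauss(0.0, 1.0), 0) for _ in range(60)] + \
       [(random.gauss(3.0, 1.0), 1) for _ in range(60)]
random.shuffle(data)

# The three datasets typically used in ML development:
# training (fit the model), validation (tune it), test (held-out evaluation).
train, valid, test = data[:80], data[80:100], data[100:]

def fit(rows):
    # "Train" by computing the mean feature value (centroid) per class.
    means = {}
    for label in (0, 1):
        vals = [x for x, y in rows if y == label]
        means[label] = sum(vals) / len(vals)
    return means

def predict(means, x):
    # Assign each input to the class whose centroid is closest.
    return min(means, key=lambda label: abs(x - means[label]))

def accuracy(means, rows):
    return sum(predict(means, x) == y for x, y in rows) / len(rows)

model = fit(train)
print(f"validation accuracy: {accuracy(model, valid):.2f}")
print(f"test accuracy: {accuracy(model, test):.2f}")
```

A “locked” model corresponds to freezing `model` after authorization; an adaptive model would re-run `fit` as new data arrive, which is the kind of post-authorization change a PCCP is meant to govern.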

[2]       FDA, Draft Guidance: Marketing Submission Recommendations for a Predetermined Change Control Plan for Artificial Intelligence/Machine Learning (AI/ML)-Enabled Device Software Functions (Apr. 3, 2023) (“PCCP Guidance”), available here.

[3]       FDA, Using Artificial Intelligence & Machine Learning in the Development of Drug & Biological Products, Discussion Paper and Request for Feedback (2023), available here.

[4]       FDA, Proposed Regulatory Framework for Modifications to Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD), Discussion Paper and Request for Feedback, available here.

[5]       Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD) Action Plan (Jan. 2021), available here.

[6]     Id. at 1. Also in 2021, FDA issued Good Machine Learning Practice for Medical Device Development: Guiding Principles (available here) jointly with regulators in Canada and the United Kingdom.

[7]       Section 515C provides that supplemental applications or new premarket notifications are not required for a change to a device if the change is consistent with a PCCP previously approved or cleared by FDA.

[8]       A 510(k) application is a premarket submission made to FDA to demonstrate that a new device is substantially equivalent to a legally marketed predicate device.

[9]       FDA, Guidance for Industry and FDA Staff: Clinical Decision Support Software (Sept. 28, 2022), available here. Some clinical decision support tools may be exempt from FDA clearance or approval based in part upon statutory changes to the medical device definition in the 21st Century Cures Act of 2016.

[10]      De Novo Classification Request for Caption Guidance, available here.

[11]     PCCP Guidance at 4.

[12]     Id.

[13]     FDA issued a separate discussion paper addressing the use of AI in drug manufacturing on March 1, 2023. FDA, Artificial Intelligence in Drug Manufacturing (Mar. 2023), available here.

[14]     The FTC is also focused on AI’s potential for discrimination or bias and the use of AI to commit fraud or influence people’s beliefs, emotions and behavior.

[15]     For more information on the regulation of artificial intelligence in the financial sector, see Debevoise In Depth: Increased Focus by Federal Regulators on AI and Consumer Protection in the Financial Sector (Nov. 10, 2021), available here.

[16]     FTC Chair Khan and Officials from DOJ, CFPB and EEOC Release Joint Statement on AI (Apr. 25, 2023), available here.

[17]     Id.

[18]     FTC Business Blog, Keep Your AI Claims in Check (Feb. 27, 2023), available here.

[19]      For more information on FTC enforcement trends, see Debevoise In Depth: A New Era of Federal Trade Commission (“FTC”) Privacy and Cybersecurity Oversight: Top Ten Things Companies Should Know When Assessing FTC Compliance and Exposure (Jan. 12, 2022), available here.


For more information on the array of legal and regulatory issues associated with AI across all sectors, subscribe to the Debevoise Data Blog, here.



Andrew Bab is a corporate partner, member of the firm’s Mergers & Acquisitions and Private Equity Groups and Co-Chair of the Healthcare & Life Sciences Group. He can be reached at albab@debevoise.com.


Avi Gesser is Co-Chair of the Debevoise Data Strategy & Security Group. His practice focuses on advising major companies on a wide range of cybersecurity, privacy and artificial intelligence matters. He can be reached at agesser@debevoise.com.


Paul D. Rubin is a corporate partner based in the Washington, D.C. office and is the Co-Chair of the firm’s Healthcare & Life Sciences Group and the Chair of the FDA Regulatory practice. His practice focuses on FDA/FTC regulatory matters. He can be reached at pdrubin@debevoise.com.


Melissa Runsten is a corporate associate and a member of the Healthcare & Life Sciences Group. Her practice focuses on FDA/FTC regulatory matters and includes the representation of drug, device, food, cosmetic and other consumer product companies. She can be reached at mrunsten@debevoise.com.