While the SEC made an early foray into proposing rules to govern use of generative AI (Gen AI) by broker-dealers,[1] FINRA has been taking a more traditional approach to emergent technology: surveying members on uses, issuing white papers,[2] publishing observations from its examinations program,[3] and issuing guidance about the application of existing rules.[4] Consistent with this approach, on June 27, 2024, FINRA issued Regulatory Notice 24-09 (the “RN”), reminding members of their regulatory obligations when using Gen AI tools for compliance surveillance.

In the RN, FINRA reminds members that it is closely monitoring evolving Gen AI use and acknowledges that the technologies both present promising opportunities and create new risks. The RN also re-emphasizes that (i) the use of Gen AI tools implicates essentially all of the FINRA rulebook (with the specific rules at issue depending on the use case), and (ii) FINRA’s rules are intended to be technologically neutral and therefore apply equally to the use of Gen AI and other tools.

While the RN does not announce any new policies or interpretations, it notes in particular that use of Gen AI tools in the supervisory process will be evaluated in light of requirements under FINRA Rule 3110 for members to maintain a reasonably designed supervisory system. Specifically, the RN advises firms using such tools for tasks like reviewing electronic communications to maintain policies and procedures that address technology governance, attendant model risks, data privacy and integrity, and reliability and accuracy. The RN also highlights that these requirements apply to vendor tools as well as those developed in-house, and advises firms to evaluate such tools both prior to deployment and on an ongoing basis.

The RN also encourages firms to consult with FINRA and/or request interpretive guidance in cases of uncertainty.

We are available to discuss FINRA’s guidance and other considerations for use of Gen AI tools by broker-dealers.  Please do not hesitate to contact us with any questions.

The cover art used in this blog post was generated by Microsoft Copilot.

To subscribe to our Data Blog, please click here.

[1] See “SEC Proposes Rule to Eliminate or Neutralize Conflicts of Interest in the Use of ‘Predictive Data Analytics’ Technologies,” DSS Blog (August 14, 2023).

[2] See FINRA Report, Artificial Intelligence (AI) in the Securities Industry (June 2020).

[3] See 2024 FINRA Annual Regulatory Oversight Report (January 2024) at 10.

[4] See, e.g., Frequently Asked Questions About Advertising Regulation, Questions B.4 and D.8.


Charu A. Chandrasekhar is a litigation partner based in the New York office and a member of the firm’s White Collar & Regulatory Defense and Data Strategy & Security Groups. Her practice focuses on securities enforcement and government investigations defense, as well as cybersecurity regulatory counseling and defense.


Avi Gesser is Co-Chair of the Debevoise Data Strategy & Security Group. His practice focuses on advising major companies on a wide range of cybersecurity, privacy and artificial intelligence matters. He can be reached at agesser@debevoise.com.


Matthew Kelly is a litigation counsel based in the firm’s New York office and a member of the Data Strategy & Security Group. His practice focuses on advising the firm’s growing number of clients on matters related to AI governance, compliance and risk management, and on data privacy. He can be reached at makelly@debevoise.com.


Jeffrey L. Robins is a corporate partner and a member of the Debevoise Banking Group. His practice focuses on representing broker-dealers, swap dealers, banks, securities exchanges, industry associations and buy-side institutions in regulatory and transactional matters. He can be reached at jlrobins@debevoise.com.


Kristin Snyder is a litigation partner and member of the firm’s White Collar & Regulatory Defense Group. Her practice focuses on securities-related regulatory and enforcement matters, particularly for private investment firms and other asset managers.