MFDF Insights: ACA Benchmarking Report Highlights Use of AI in Compliance

MFDF recently hosted a webinar titled "Updates on the Role of AI in Compliance: Insights from ACA's 2025 AI Benchmarking Report." Carlo di Florio, President of ACA Group, joined MFDF to share highlights from the report.


The following are some of the survey questions discussed in the session.


Q:      What are some of the key use cases of AI in the fund industry currently?

A:       Survey responses indicate that AI is currently being used across several core functions in the fund industry, including investment research and diligence, compliance risk management, marketing review, and operations/IT. In compliance, AI is helping streamline time-intensive areas by supporting workflow management and executing preliminary regulatory reviews, with the added overlay of human oversight and signoff. AI is also being embedded in compliance technology solutions such as best execution and AML processes. In addition, AI is increasingly being used to enhance monitoring and connectivity across compliance systems, including trade and market surveillance, employee compliance areas such as personal trading, and electronic communications.


Q:      What are the key risks of AI use that participants identified?

A:       Participants identified several key risks associated with the use of AI, including privacy and information security concerns, hallucinations, regulatory compliance challenges, and cybersecurity threats. Core areas identified for continued improvement at respondent firms include human oversight processes, third-party oversight, and enhanced cybersecurity.

Q:      What has changed since your prior survey on AI use?  Are we at the point where directors should be concerned if firms are not using AI in compliance?

A:       We found that once firms started seeing positive results from AI, reliance on these tools increased. This underscores the importance of directors focusing on the appropriate balance between innovation and governance, including whether oversight frameworks and controls are keeping pace with expanded use. While firms are not required to adopt AI, there is a clear trend toward embedding AI into compliance processes, which could influence evolving expectations about what constitutes a reasonable compliance standard over time.


Q:      What questions should directors be asking about AI governance and oversight?

A:       Directors will want to develop an understanding of how fund advisers are approaching the governance and oversight of AI, including whether there are policies defining authorized use and decision-making processes and whether oversight roles and responsibilities are clear. Directors may also want to ask how models are tested and validated, what data inputs and outputs are being used, and how cybersecurity risks are managed. Another area directors may want to focus on is fund service providers' use of AI, including what contractual provisions govern data usage and model training and how these arrangements are monitored on an ongoing basis.


Click here to watch an archived video of ACA's presentation on AI use by compliance professionals.


Additional webinars presented as part of MFDF’s AI webinar series include:


As a reminder, webinar archive materials are available to MFDF members at no cost.