SEC examiners have opened a wide-ranging probe into how private fund advisers use artificial intelligence, Private Funds CFO has learned.

In at least two cases, regulators have asked private funds 15 sets of questions about how they are using AI, including a description of their “models and techniques,” a “list of algorithmic trading signals and associated models,” the source and providers of their data, and internal reports of “any incidents where AI use raised any regulatory, ethical, or legal issues,” according to a document request letter reviewed by Private Funds CFO.

At least one of the request letters came from an examiner on the SEC’s national exam team based in New York. The letters are part of a newly launched exam sweep of the industry, sources familiar with the probe say. The sources spoke on condition of anonymity to protect their clients’ privacy.

News of the sweep comes just as the commission is weighing broad and strict new rules governing how investment advisers use advanced technology – and how they protect sensitive data from cyber-hackers. A separate rulemaking notice on how fund advisers manage their outside vendors also looms.

The questions in the private fund document request letters suggest that regulators are looking for details in both areas. It’s possible that the information examiners gather will find its way into a risk alert, which could in turn be used to buttress any final AI/cybersecurity or outsourcing rules. In any case, it’s yet another entry point for regulators into a $25 trillion industry that already feels itself under siege from Washington.

Employee lists sought

Most of the questions in the document request letter focus on how the funds manage AI risks. Examiners ask for copies of the firms’ AI compliance policies and procedures (including any changes they have made to policy since the exam started); the firms’ contingency plans “in case of AI system failures or inaccuracies”; a sample of the firms’ “client profile documents used by the AI system to understand each client’s risk tolerance and investment objectives”; the firms’ security measures; a “list and description of all data acquisition errors and/or adjustments to algorithmic modifications due to data acquisition errors”; and samples “of any reports detailing the validation process and performance of the robo-advisory algorithms.”

Three other sets of questions focus on how the funds supervise staff on AI matters. Examiners ask the funds for a list of those “who develop, implement, operate, manage or supervise AI software systems.” Regulators also want a “list of all board, management or staff committees [with] specific AI-related responsibilities, meeting frequency, and a list of the members of each committee. Please also indicate whether the committee keeps written minutes,” the document requests state.

It is unlikely regulators are asking for employee lists idly: When the commission swept private fund advisers for compliance with books-and-records rules last year, it subpoenaed employee phone records by name.

At least two sets of questions in the document requests are grounded in the SEC’s marketing rule. Regulators ask the funds to provide all “disclosure and marketing documents to clients where the use of AI by the adviser is stated or referred to specifically in the disclosure,” including “all audio and video marketing in which the adviser’s use of AI is mentioned.”

The second set of questions asks for a list “of all media used to advertise, market or promote products and services” including “social media, chat forums, websites, DDQ responses, private placement memoranda, pitch books, presentations, newsletters, annual reports, and podcasts and/or other video or audio marketing.” Examiners also want two recent examples of each kind of ad, the document requests show.

‘Future crises’

SEC chairman Gary Gensler has led one of the most aggressive commissions in decades. He has opened more than four dozen rulemaking notices, and both the exams and enforcement divisions have stepped up their efforts. Cybersecurity has animated the chairman more than almost any other issue. A former MIT professor, Gensler has talked repeatedly about the “balloons and confetti” fund advisers use to attract and engage their investors. He also worries that technologies such as AI, while ratcheting up the industry’s productive capacity, ratchet up its systemic risks as well.

“This technology will be the center of future crises, future financial crises,” Gensler told The New York Times in early August.

‘Game changer’

Reading through the AI/cybersecurity rule proposal, K&L Gates attorney Matt Rogers says he wonders whether funds should consider having some kind of kill switch for advanced technologies. The 239-page proposal says, among other things, that it is no longer enough for funds to identify and disclose the conflicts of interest they find. They will have to eliminate those conflicts, too. If adopted as written, the rules would be “a game changer,” Rogers says.

“If I’m in a shop that uses those types of technologies, I’d be taking a hard look at them and making sure that there are some cutoffs in place,” he says. “Because you have to mitigate almost any conflicts. If you can’t control the technology, you’ll need to be able to find ways to turn it off.”

It is not the first time the commission has flirted with the idea of a disclose-and-cure standard. In 2019, under Republican chairman Jay Clayton, the commission proposed similar language in its Regulation Best Interest rules. The final rules stepped back from the proposal, but this time may be different. Among the harshest critics of the final Reg BI rules was Barbara Roper, then with the Consumer Federation of America. Roper is now one of Gensler’s senior advisers.

Cyber-‘hygiene’

Emilie Abate was a branch chief in the SEC’s exams division. Now a director with compliance consultant Iron Road Partners, she says she worries that too many private fund advisers view cyber-hacks as a cost of doing business, and therefore underinvest in cybersecurity policies and technology.

“Whether or not a sweep is underway, private funds should take AI security seriously, especially when investor information is being utilized as a data source or input into large language models,” she says.

Regulators have gotten a lot more sophisticated in how they analyze cyberattacks, Abate says. They may scan media reports or investor relations pages to identify registrants to follow up with.

“It’s incredibly important for advisers to understand the types of data they’re collecting, to be able to identify whether the data contains sensitive, personally identifiable information and know where the data is being stored and how it is being protected,” Abate says.

When they are talking about cyber-risk management, the last thing regulators want to hear from a fund adviser is “I don’t know,” she adds.

If examiners think the cybersecurity problems are big enough, they will refer the case to the SEC’s enforcement division, Abate says. “One of the first questions the enforcement team often asks is, ‘Where is the investor harm?’” she says. “Strong consideration would likely be given to attacks involving compromised social security numbers or particularly egregious failures in preventing or responding to an intrusion.”

Among the red flags examiners are likely to look for, Abate says, are poor cybersecurity hygiene, shortcomings in vendor oversight, and lack of compliance engagement with a fund’s IT teams.