As SEC launches AI Task Force, here's what advisors should know


The SEC's recently launched AI Task Force will most likely make its investigations more efficient.

It also means that advisors should take care to avoid anything that might be interpreted as a red flag as the machines sift through the data.

The AI Task Force will probably develop AI tools to analyze vast amounts of trading data, communications and filings more quickly than human reviewers can, surfacing suspicious patterns and compliance violations sooner, said William Trout, director of securities and investments at technology data firm Datos Insights.

However, this efficiency comes with risks of false positives, where AI algorithms might flag legitimate advisory activities as potentially problematic based on pattern recognition that lacks nuanced context, he said.

"The explainability challenge is particularly crucial for advisors, as they'll need to understand why AI flagged their activities to mount effective defenses," said Trout. "The SEC's emphasis on 'trustworthy' and 'responsible' AI suggests the regulator recognizes these concerns, but the practical implementation remains uncertain."

Document and examine everything internally

Advisors should expect more frequent inquiries as AI enables broader surveillance, while also preparing for potential algorithmic bias that might disproportionately target certain advisory practices or client demographics, said Trout.

Success will depend on whether the SEC can balance enhanced detection capabilities with fair, transparent processes that advisors can navigate effectively, he said.


"To prepare for potential algorithmic bias, financial advisors should proactively document their decision-making processes with detailed rationales for investment recommendations, client communications and trading strategies," said Trout. "They should maintain comprehensive records that demonstrate compliance with fiduciary duties and suitability requirements, particularly for clients in demographics that might trigger algorithmic scrutiny."

Advisors should also consider conducting internal audits of their practices to identify patterns that AI might misinterpret as suspicious, such as concentrated trading in specific sectors or unusual timing of transactions, said Trout.
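
As a rough illustration of the kind of internal audit Trout describes, the sketch below scans a firm's own trade log for two patterns an automated reviewer might question: heavy concentration in a single sector and trades placed outside regular market hours. It is a minimal example, not an SEC methodology; the column names, thresholds and the pandas-based approach are all assumptions for illustration.

```python
# Minimal internal-audit sketch: flag trade patterns an automated reviewer
# might question, such as heavy concentration in one sector or trades placed
# outside normal market hours. Column names and thresholds are illustrative.
import pandas as pd

def audit_trades(trades: pd.DataFrame,
                 concentration_limit: float = 0.40,
                 market_open: str = "09:30",
                 market_close: str = "16:00") -> dict:
    """Return simple red-flag summaries from a trade log.

    Expects columns: 'sector', 'notional', 'timestamp'.
    """
    findings = {}

    # 1. Sector concentration: each sector's share of total traded notional.
    by_sector = trades.groupby("sector")["notional"].sum()
    shares = by_sector / by_sector.sum()
    findings["concentrated_sectors"] = shares[shares > concentration_limit].to_dict()

    # 2. Unusual timing: trades stamped outside regular market hours.
    times = pd.to_datetime(trades["timestamp"]).dt.strftime("%H:%M")
    off_hours = trades[(times < market_open) | (times > market_close)]
    findings["off_hours_trades"] = len(off_hours)

    return findings

if __name__ == "__main__":
    sample = pd.DataFrame({
        "sector": ["tech", "tech", "energy", "tech"],
        "notional": [500_000, 300_000, 100_000, 200_000],
        "timestamp": ["2025-03-03 10:15", "2025-03-03 18:45",
                      "2025-03-04 11:00", "2025-03-04 09:00"],
    })
    print(audit_trades(sample))
```

A check like this does not prove anything is wrong; it simply tells the firm, before a regulator's model does, where its own data looks unusual and where documented rationales should already exist.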

"Additionally, advisors should stay informed about the SEC's AI implementation through industry publications and legal updates, while investing in their own compliance technology to help identify and address potential red flags before they attract regulatory attention," he said.


Because AI is built on pattern matching, firms need to take a more conservative risk posture with the SEC, said Jimmie Lee, founder of IT consultancy JLEE & Associates. As the task force becomes more efficient, he anticipates new policies and regulations for the market.

"Historically, such updates have always occurred, but if they utilize AI to identify necessary policies, the SEC will have unprecedented tools to enforce them," said Lee. "Consequently, the duration from policy announcement to enforcement might decrease, potentially accelerating regulatory actions."

Firms should monitor the AI Task Force's progress closely, seek opportunities to give feedback and build in extra time for appealing decisions, said Lee.

"They should consider adding more governance and auditing capabilities to their workflows before making decisions," he said. "Additionally, they should automate governance and auditing after implementation to ensure clients stay compliant with the SEC's faster pace."

AI is now considered infrastructure, so act accordingly

If the SEC makes use of advanced technologies in its own operations, it will become more familiar with the suitability and effectiveness of those technologies in financial services workflows, and it is more likely to question why firms are not using them themselves, said Robert Cruz, vice president of regulatory and information governance at Smarsh.

"We have already seen this in the SEC's adoption of surveillance technologies in their examination processes," he said. "Firms that continue to exclusively rely upon methods for communications inspection that are less effective than other commercially available approaches used by the SEC themselves, such as large language models, may face added scrutiny."

For advisors, this signals a shift, said Jack Fu, co-founder of AI-powered investment platform Draco Evolution and Draco Capital Partners. AI isn't optional; it's infrastructure, he said.

"Firms should begin auditing their current use of automation and data-driven tools, while also setting clear policies around transparency, bias mitigation and client communication," he said. "AI has the potential to reduce advisory costs and expand access, but only if advisors are equipped to explain and manage these tools responsibly. The opportunity here is to align smarter tech with smarter compliance. Advisors who prepare now will be better positioned to thrive in a more AI-regulated future."
