Before you buy gen AI tools, ask these 4 questions

I was on a call with my financial advisor recently discussing estate planning when they dropped a comment that stopped me cold: "We can share a copy of this meeting transcript. We just need to email our vendor's support team and they'll send it right over."

Paul Morville, product architect for Advisor360º

The tech vendor can access a transcript of my estate plan meeting with my advisor?

AI can be an extraordinary ally to financial planners, amplifying productivity, reducing administrative burdens and strengthening client relationships. Eighty-five percent of financial advisors believe generative AI will help their business, and 76% already reap its benefits. Nine out of 10 advisors use AI-enabled assistants daily, according to a recent survey conducted by our firm.

But there are trade-offs, particularly around data security. In the race to gain a competitive edge by adopting generative AI, the wealth management industry is leapfrogging data privacy and security concerns, especially where its technology vendors are involved. It is incumbent on advisory firms to ensure that the gen AI tools they provide their advisors are secure, vetted and compliant.

Here are four essential questions firms should ask prospective wealthtech vendors before adopting new AI tools.

Who owns our data?

Once data enters the vendor's software or application, the advisor and the firm may no longer own it. For instance, once transcription software captures an advisor-client conversation, the company that owns the software might own that meeting transcript. 

Make sure there's no ambiguity: You should own your data. 

Firms should require vendors to delete client data in compliance with SEC and FINRA cybersecurity and data privacy guidance. Confirm in writing that your client data remains your property and is not used to train or improve the vendor's AI models. If ownership isn't clearly stated, your data could be absorbed into the system and used in ways you never intended. Note that vendors may have dual pricing models: one rate if they can keep and use your data and a higher one if they're required to delete it.

Who has access to our data (and where is it stored)?

If a vendor's customer support or marketing team can pull up a client meeting, that's a problem. 

Ask for written confirmation of the vendor's access controls, audit protocols and data security protections. Keep in mind that from health and wellness to DNA testing, tech vendors in other industries monetize data by training their large language models on it or by packaging insights gleaned from the data to sell to others. It would be naïve to think financial data wouldn't be treated similarly.

How experienced is the data security team?

If a single endpoint is compromised, say by a support rep who clicks the wrong link, a cybercriminal could gain access to every client conversation stored on the platform. And even when the breach occurs on the vendor's end, the liability remains with the financial institution.

As both a founder and an advisor to numerous startups, I've seen firsthand that many young companies run lean and fast, prioritizing speed to market over data security. In a regulated industry like financial services, that's a dangerous trade-off. You're trusting AI tech vendors with sensitive data — insist on knowing how many security specialists they employ, along with their experience and credentials.

What did your last pen test cost?

A penetration or "pen" test simulates a cyberattack to uncover system vulnerabilities before cybercriminals do. 

Costs for such tests can range from $5,000 for a check-the-box exercise to six figures for an in-depth audit. A vendor that skimps on pen testing is cutting corners on security. A meaningful investment in cybersecurity is a strong indicator of how seriously AI vendors take data protection.

Data breaches can destroy reputations and drain client accounts, so advisory firms can't afford shortcuts. Hold AI to the same high-security standard as your CRMs, document vaults and e-signature platforms — and choose your partners wisely.
