Opinion  

'AI needs to prove it's working for clients' best interests'

Vijay Raghavan

The wide adoption of artificial intelligence in other industries has wealth management leaders working to identify optimal use cases that drive better digital experiences for customers, while also creating efficiencies for financial advisers and back-office operations.

Imagine the typical scenario where a prospect works with a financial adviser to become a customer at a wealth management firm. 

While onboarding, the prospect shares data through financial statements and identity documents, along with information on their investor self-directedness, preferences, and risk tolerance. 


During the risk assessment, AI systems at the wealth management firm automatically identify and assess anti-money-laundering and 'know your client' risks early in the process, and proactively alert the adviser to any issues. 

Once onboarded, AI systems identify connections and surface insights to the financial adviser about what the customer is doing with their investments across their various accounts (brokerage, retirement, college savings). 

The data that customers are now sharing with wealth management firms has not changed significantly compared to the past. However, what is changing is that forward-looking firms are adopting advanced AI algorithms to add more ‘intelligence’ to the financial adviser at every stage of the customer lifecycle.

These AI systems analyse large datasets and generate insights at a speed and scale humans cannot match. For example, if a client has an environmental, social or governance preference, or has a teenager about to enter university, proactive AI-generated alerts can help the adviser create hyper-personalised portfolios that reflect the customer's values, goals, and current life situation. 

So, while a financial adviser is not any ‘smarter’ because of the AI tools at their disposal, the AI-assisted insights help them to be more effective for their clients, driving loyalty while increasing assets under management.

Adding 'explainability' to AI

The potential of AI in wealth management is significant and can help advisers see correlations within a customer portfolio or uncover new insights that are not easily apparent. 

Yet in a highly regulated industry that depends on the quality of the advice given, potential AI deployments in wealth management suffer from a trust problem. 

Forrester’s data reveals that 25 per cent of data and analytics decision-makers say that lack of trust in AI systems is a major concern when it comes to using AI, and 21 per cent cite a lack of transparency with AI/machine learning systems and models.  

Explainable AI (XAI) provides the transparency that stakeholders need to be confident that an AI system arrives at its outputs in an open and traceable way. 

Forrester defines XAI as: "Techniques and software capabilities for ensuring that people understand how AI systems arrive at their outputs."
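To make that definition concrete, here is a minimal sketch of the idea in Python. It uses a toy linear suitability score whose output can be decomposed into per-feature contributions, so an adviser can see which client inputs drove a recommendation. The feature names and weights are hypothetical illustrations, not any firm's actual model; production XAI tooling typically applies comparable attribution techniques to far more complex models.

```python
# A toy "explainable" scoring model: the score is a weighted sum of
# client features, so each feature's contribution is directly traceable.
# All feature names and weights below are hypothetical.

def explain_score(features, weights):
    """Return the overall score and each feature's contribution to it."""
    contributions = {name: weights[name] * value
                     for name, value in features.items()}
    score = sum(contributions.values())
    # Rank features by absolute contribution, so the adviser sees
    # which inputs influenced the recommendation most.
    ranked = sorted(contributions.items(),
                    key=lambda kv: abs(kv[1]), reverse=True)
    return score, ranked

client = {"risk_tolerance": 0.8, "esg_preference": 1.0, "years_to_goal": 0.3}
weights = {"risk_tolerance": 0.5, "esg_preference": 0.3, "years_to_goal": 0.2}

score, ranked = explain_score(client, weights)
print(f"score = {score:.2f}")
for name, contribution in ranked:
    print(f"  {name}: {contribution:+.2f}")
```

Because the model is linear, the explanation is exact; for opaque models, attribution methods such as SHAP approximate the same kind of per-feature breakdown.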

XAI will help financial advisers to:

1. Improve the customer experience through trust and transparency. 

For AI-driven interactions during onboarding and servicing, XAI shows how the system derives its insights, giving both the customer and the adviser confidence in the recommendations.