Opinion  

'Chatbots pose a risk for financial services firms'

Zoe Morton


Scour the web and you will find an endless stream of AI-powered solutions on offer, aka chatbots, many of which are designed for financial services firms to support their customer service. With promises of accelerated communication, effective data gathering and a heightened awareness of consumer needs, is this all too good to be true? 

As someone who has spent more than 20 years working in financial services, I would say I am fairly financially savvy. However, I admit there have been times when I have found myself caught on the back foot by chatbots.

On the surface these can be made to look and feel very much like an interaction with a real human (Santander’s own chatbot is called ‘Sandi’ and HSBC’s is ‘Amy’). Often, it is only after a bit of questioning and information gathering by the bot that I am given an opportunity to be routed through to a real person to discuss my query.


When chatbots go wrong the fallout can be significant. One well-known parcel delivery firm had to disable part of its online support chatbot after it swore at a customer in early 2024.

In late December 2023, a customer in the US almost managed to purchase a new vehicle for $1 through a manufacturer’s chatbot after it was hacked.

Thankfully, there have been no such reports involving UK financial services firms, but the hacking incident highlights the need for stringent security and operational resilience, including oversight of any outsourcing or third-party suppliers.

When things go wrong the reputational damage can be significant too. 

Clearly, there is a balance to be struck in terms of keeping pace with modern technology while considering those customers who may be less financially literate and may need the human touch.

Indeed, research from Smart Money People highlighted that almost half (48 per cent) of clients complained about a lack of human support and 24 per cent claimed there was an over-reliance on chatbots.

Consumer protection

In its Plan for Financial Services, published in January this year, the Labour party outlined its vision for reinforcing consumer protection and financial inclusion, including becoming a "global standard setter for the use of AI in financial services" and "establishing a regulatory sandbox for financial products to reach underserved communities".

Indeed, leading consumer rights champion Martin Lewis and the City of London have joined forces to urge the UK chancellor to make financial inclusion a priority for the Financial Conduct Authority.

Only time will tell what this looks like in practice, but what is key is ensuring that as many consumers as possible have access to tailored support, specific to their individual circumstances, when they need it.

A chatbot is clearly not equipped or qualified to act as an IFA substitute, and in the case of financial services advice can only go as far as signposting an individual to the right place for further advice. 

The FCA’s consumer duty stresses the need for customers to be offered the right level of support and for consumer understanding to be ensured; it also requires that vulnerable customers are treated fairly and that their needs are understood.