
Can AI give you financial advice?


By Thomas Kidd

With the recent developments in AI technology like ChatGPT, there is speculation about how it could be applied across all kinds of industries, and the financial advice sector is no exception. The prospect of freely available, AI-generated advice is exciting, but it raises a plethora of issues, not just for advisers but for investors and regulators as well: can you trust that the advice you receive is accurate? And if it's not, how can regulators hold AI accountable?

First, we need to understand that AI tools like ChatGPT are not "intelligent" in the way we usually think about that word. GPT stands for "Generative Pre-trained Transformer": this kind of AI learns patterns from a large library of training data and uses them to generate new text that looks similar. Essentially, when you ask ChatGPT a question, it draws on the patterns it learned from its training data to produce a response that resembles the answers it has seen before. It does this well enough that it really seems to know what you're talking about, providing relevant and often accurate responses; but of course it doesn't always get it right. Here, for example, I asked ChatGPT a straightforward math problem:

Me: What is 293.67 / 36.98; round your answer to 5 decimal places

ChatGPT: The result of dividing 293.67 by 36.98 and rounding to 5 decimal places is approximately:

293.67 / 36.98 ≈ 7.94424

Looks good, right? Well, if you check this question on a calculator, you'll find ChatGPT only got the first two decimal places right (the actual answer is 7.94132). The thing to keep in mind here is that ChatGPT is not using a calculator; based on the patterns in its training data, it produced "7.94424" because it looks like the kind of answer I was expecting.
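
For anyone who wants to verify it themselves, here is a quick check in Python (my own verification, not part of the ChatGPT exchange above), using the decimal module so there are no floating-point surprises:

```python
from decimal import Decimal, ROUND_HALF_UP

# The same question I put to ChatGPT, answered deterministically.
result = Decimal("293.67") / Decimal("36.98")
rounded = result.quantize(Decimal("0.00001"), rounding=ROUND_HALF_UP)

print(rounded)  # 7.94132 -- so ChatGPT's 7.94424 is only right to two decimal places
```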

The problem here should be clear: how often do you double-check that the answer your calculator gives you is correct? Many people seem to assume that because GPT is made with a computer, it must use the same rigid logic as a calculator, and that is far from the truth.

But we can do better: many AI tool developers today are using GPT to understand user prompts and then connecting it to an existing application to do the busy work. For example, an AI connected to Microsoft Excel could be given my question above, but instead of trying to generate an answer itself, it would plug the calculation into Excel and come back to me with the result. In this form, the AI is simply driving Excel on my behalf, based on my requests.
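
As a rough sketch of that pattern (purely illustrative, not how moneyGPS or any real product works; the extract_calculation function below is a stand-in for whatever language model a developer might plug in), the model's only job is to translate my request into a structured calculation, and a deterministic tool does the arithmetic:

```python
from decimal import Decimal, ROUND_HALF_UP

def extract_calculation(prompt: str) -> tuple[str, int]:
    """Stand-in for the language-model step: turn a natural-language request
    into a structured calculation (an expression plus the decimal places wanted).
    In a real tool this would be a call out to an LLM."""
    # Hard-coded here purely for illustration.
    return "293.67 / 36.98", 5

def evaluate(expression: str, places: int) -> Decimal:
    """The deterministic 'Excel' step: real arithmetic, not text prediction."""
    left, operator, right = expression.split()
    a, b = Decimal(left), Decimal(right)
    result = {"+": a + b, "-": a - b, "*": a * b, "/": a / b}[operator]
    return result.quantize(Decimal(1).scaleb(-places), rounding=ROUND_HALF_UP)

prompt = "What is 293.67 / 36.98 ; round your answer to 5 decimal places"
expression, places = extract_calculation(prompt)
print(evaluate(expression, places))  # 7.94132
```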

Unfortunately, the same problem eventually arises: if I ask the AI for something more complex, how can I verify that it understood me correctly? I'm going to need to check the responses, and this time I can't just plug them into my calculator; there could be hundreds of data points that need to be confirmed.
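
One partial safeguard is spot-checking: recompute a random sample of the AI's figures with a tool you already trust and flag anything that disagrees. A minimal sketch of that idea (the function names and tolerance here are my own, invented for illustration):

```python
import random
from decimal import Decimal
from typing import Callable

def spot_check(ai_answers: dict[str, Decimal],
               trusted_recompute: Callable[[str], Decimal],
               sample_size: int = 10,
               tolerance: Decimal = Decimal("0.00001")) -> list[str]:
    """Recompute a random sample of the AI's answers with a trusted method
    and return the keys of any that disagree beyond the tolerance."""
    sample = random.sample(list(ai_answers), min(sample_size, len(ai_answers)))
    return [key for key in sample
            if abs(ai_answers[key] - trusted_recompute(key)) > tolerance]
```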

This is a problem that any artificial intelligence is going to face; at a certain point, you are taking the responses on faith, trusting that the AI understood you and is providing an accurate answer. We already do this with many digital tools; I trust my calculator to give correct answers, my phone to tell me the correct time, my banking app to tell me how much money I have. But when dealing with GPT, we have to be a bit more discerning about when we trust it.

At present there are already financial advice AI tools available, but they are being used exclusively by advice professionals to speed up the advice process. One such example, moneyGPS, is advanced enough that it can “deliver a fully compliant, single topic statement of advice”, and its developer hopes to make this tool available to the public within a year [1].

If this goes ahead, it will be interesting to see how regulators respond: when the AI produces inaccurate or incomplete advice, will its parent company accept liability, or will the burden be placed on consumers to use the tool at their own risk? Add to this the potential privacy issues that come with feeding sensitive personal information into a third-party app, and it looks like ASIC are going to have their work cut out for them. That said, there is plenty of optimism that AI tools may be instrumental in providing advice to more Australians who need it, with the recent Levy review highlighting the potential of robo-advice to help close the gap [2].

We are excited to see how AI is shaping the world, not just in the advice space but in all aspects of life. There is great potential for these tools to help us deliver quality advice in a timely manner. But at this stage, the tools that are available are not suitable for providing financial advice, so think twice before trusting them with your hard-earned money.

References

[1] https://www.smh.com.au/money/planning-and-budgeting/ai-powered-robo-advisers-could-plug-financial-advice-gap-20230413-p5d08b.html

[2] https://treasury.gov.au/sites/default/files/2023-01/p2023-358632.pdf

Thomas Kidd is an authorised representative of Alliance Wealth Pty Ltd (AR: 001292328)
