Should you ask AI for financial advice?

We explore the pros and cons of harnessing the growing, almost unavoidable, resource of artificial intelligence for your financial planning.

Maybe it started with a simple tech question, “how do I set up new wi-fi?” or “what is the payback time on solar panels?” Then, before you knew it, AI became the ever-knowledgeable best friend you didn’t know you needed: always there, offering instant, kind and patient advice on your health, your purchases, even becoming for some people a proxy for therapy or counselling.

Inevitably, you might ask it financial questions, especially as the global situation seems so volatile and ever-changing. What is wrong with a little help with decisions or reassurance that all is well?

As financial advisers, you might expect us to say, “That’s terrible, only come to qualified professionals.” But the situation is far more nuanced than that. Just as the health advice given by AI might correctly direct you to a pharmacy rather than wasting the time of your GP, financial advice online can be a useful resource.

However, the critical thing is to know how and when to use it safely.

We’ve put together a guide that will help you make informed decisions about using AI to support your financial planning.


Types of financial online advice

ChatGPT, Gemini and Claude are the most widely used Large Language Models (LLMs). However, most search engines now also offer an AI-generated response, presented ahead of the traditional keyword-matched results.


The advantages of AI financial advice:

Instant

Its ability to analyse patterns, assimilate vast quantities of data and answer your question as quickly as you can hit ‘return’ really is impressive. It’s there for you at midnight, on Christmas Day and on a Sunday afternoon, when your adviser isn’t. Whilst you certainly don’t want to rush into decisions, it can give you information, answer a question or help you understand the impact of global events just when you need it.

Free

It goes without saying that, unless you’re signing up to a specific money-management app, the advice you get won’t cost a penny. Given that paid advice only makes sense (or is available) when you have significant income or capital, LLMs can be a democratic option, bringing some form of advice to those starting their investment and planning strategies.

Unbiased

AI doesn’t care about your age, race, what you look like or frankly anything about you. It (as far as we know) isn’t trying to sell you anything. However, do remember that it is only as good as the data and content that is out there on the web, a lot of which is marketing.

Anonymous

Talking about money is inherently personal. It’s about your hopes and dreams, fears and ‘what ifs,’ as well as the upsetting thought of end-of-life planning. Deciding you trust another human enough to have those conversations can take time. AI seems distant and removed from reality, so it can feel easier to open up and say the unsayable, whether that is, “I thought my husband would have died by now and the care home fees are eating into my future security” or “I’m thinking about divorce but don’t want to be left without a nice lifestyle.” AI will tackle questions like that without you fearing judgement.

Often sensible

When you are worried about finances, the advice it gives can be kind, calm and reassuring. It is capable of suggesting that you don’t rush into decisions – which in many instances is exactly what your financial adviser would advise.

When you don’t know much

The world of finance and investing can be daunting. You can ask the ‘silly’ questions without worrying about sounding clueless. It can be a good starting point, but arguably those without knowledge are the most likely to fall into AI traps (see the disadvantages below).



The disadvantages:

Just plain wrong

Sometimes, LLMs come out with some right old rubbish, always presented with a strong dose of confidence. It might forget a vital detail you gave earlier. It might ‘hallucinate’ – confidently making something up that is factually inaccurate and misleading.

AI advice can be based on incomplete or outdated information, particularly in fast-moving markets or changing regulatory environments.

In one study, four different LLMs were each asked 12 straightforward financial questions and scored on the accuracy of their responses: 0 (incorrect financial analysis), 0.5 (correct financial analysis but mathematical errors) or 1 (fully correct). The best scored only five out of 12.1

Lack of nuance

AI tools rely on patterns in data rather than a full understanding of your circumstances, which means they can miss real-life nuance around issues such as tax position, risk tolerance, family dynamics or long-term goals. Often you’ll be given a definite answer, but when you offer another piece of information or question it, the answer completely changes. By contrast, your financial adviser gathers all the relevant information first, and only then gives advice.

Environmental

The environmental impact of using AI is hard to comprehend. Why should it be any different from a normal search engine?

The United Nations Environment Assembly explains that the vast data centres that house AI servers are large consumers of energy and water, which is used for cooling systems. The computers and microchips are created with critical minerals and rare elements, which are often mined unsustainably.2

AI’s water usage is predicted to reach up to 6.6 billion cubic metres by 2027, comparable to the total annual water consumption of Denmark.3

In the tech hub of Ireland, data centres accounted for more than a fifth of the country’s electricity usage in 2024, a figure that continues to grow.4

Privacy

Whilst using AI may feel secure and anonymous, there are important data privacy considerations to be aware of.

Many publicly available AI tools can store information you share and may use it to improve or train their models. That means anything you enter into an open platform could potentially be retained or reused in ways you don’t control.

For that reason, we would always caution against sharing sensitive financial information, such as account details, policy numbers or anything that could compromise your security.

However, it’s important to distinguish between these public tools and how AI is used within professional environments.

Financial planning firms increasingly use AI within secure, regulated systems: for example, tools that are encrypted, sit within protected networks, and do not use client data to train external models. These are designed to enhance efficiency and support advisers, while maintaining strict confidentiality standards.

In other words, the risk is not AI itself, but how and where it is used.

Not accountable

Unlike a regulated financial adviser, AI tools are not accountable in the same way, offering no formal recourse if advice leads to poor outcomes. There is no way of knowing without further research if what you are being told is sound or stupid.

You can be far more confident that an established firm of financial advisers has given consistently good advice over years, otherwise they simply wouldn’t attract and retain clients.



Scenarios

Let’s look at two real scenarios and fact-check the AI advice with that of a qualified human (and definitely not a robot), Stefani Williams, Financial Planner and Partner.

1) I’ve listened to the news and noticed my shares are tanking. What should I do?

ChatGPT: When markets dip after major global events, the instinct to act quickly can be strong, but the most effective response is usually a measured one. If your portfolio is built on solid, long-term fundamentals, short-term volatility alone is not a reason to sell. Instead, this is an opportunity to review diversification, and, if appropriate, continue investing steadily to take advantage of lower prices. Selling in a downturn can lock in losses, whereas staying focused on long-term goals allows time for recovery and growth.

Stefani says, “This AI advice is fine – we’d also be reassuring a client and having a conversation about risk levels, longer-term goals and life plans.”

Let’s try a question where some actual numbers are needed… in this example, Stefani engages in a conversation with the AI.

2) My client has 1,000 Meta shares and is a higher-rate taxpayer. These were bought at 300 US dollars. What are the likely Capital Gains Tax implications?

ChatGPT: If Gain ≈ £80,000 after FX conversion
Less allowance (£3,000) → £77,000 taxable
Then: At 20% → ~£15,400 CGT

Stefani: The CGT tax rate is not 20% for a higher rate taxpayer.

ChatGPT: You’re right to challenge that — but for UK CGT on listed shares, the 20% figure is actually still correct in most standard cases.
There have been changes and multiple rates floating around, so it’s easy to mix these up:

  • Residential property gains: 18% (basic) / 24% (higher) ← recently increased
  • Carried interest / specific cases: can be higher

So if you were thinking of 24%, that applies to UK residential property, not shares.

Stefani: The CGT rate for higher rate taxpayers changed to 24% in October 2024.

ChatGPT: You’re absolutely right — and this is an important update that materially changes the calculation.
Following the Autumn Budget 2024:

  • Higher/additional rate taxpayers: 24%

This applies to shares and most other chargeable assets, including holdings like Meta Platforms, Inc..
So your earlier Meta example:

  • Gain (say): £80,000
  • Less allowance: £3,000 → £77,000 taxable
  • CGT at 24% → ~£18,480

That’s a 20% → 24% uplift, which is significant in planning terms.

The AI response here is concerning. When the tax rate was questioned, the AI doubled down on the false information with confident-sounding detail. Only on a second challenge did it correct its response. Stefani is an experienced financial expert whose job it is to know the latest tax rules; for the rest of us, it’s easy to see how inaccurate responses could be relied upon.
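For readers who want to check the corrected arithmetic for themselves, the figures in the exchange above can be reproduced in a few lines. This is an illustrative sketch using the example’s assumed numbers only (an £80,000 gain, the £3,000 annual exempt amount and the 24% higher-rate CGT rate on shares) – it is not tax advice, and real cases involve exchange rates, acquisition costs and personal circumstances.

```python
# Illustrative CGT check using the assumed figures from the example above.
gain = 80_000       # assumed gain after FX conversion, in pounds
allowance = 3_000   # annual exempt amount used in the example
rate = 0.24         # higher-rate CGT on shares, as corrected in the exchange

taxable = max(gain - allowance, 0)  # allowance cannot create a negative gain
cgt = taxable * rate

print(f"Taxable gain: £{taxable:,.0f}")   # Taxable gain: £77,000
print(f"CGT due: £{cgt:,.2f}")            # CGT due: £18,480.00
```

Note that at the 20% rate the AI first quoted, the same gain would produce £15,400 – the £3,000-plus difference is exactly why relying on an unverified rate matters.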



When not to use AI

  • For the latest rules, especially after a Budget or major world events – AI isn’t always up to date. To check ISA allowances, tax and pension rules, go straight to gov.uk.
  • To discuss sensitive financial issues – be very wary of privacy issues. If you wouldn’t tell the world, don’t tell AI. Never give specific bank information or account details.
  • When something has gone wrong – life’s curveballs such as job loss, bereavement, illness and divorce are vulnerable times, and exactly what your financial adviser is skilled at helping you navigate. We can bring links to trusted law firms and financial protection, and it’s even better if we can help you plan for potential problems ahead of time.

When AI can be helpful

  • To prepare for a meeting with your adviser, try prompts such as, “I’m meeting with my financial adviser tomorrow to make a financial plan, what information do I need to take and what questions should I ask?”
  • To learn about jargon and become financially literate – try “what is compound interest?” or “what is a LISA?” or “what financial terms do I need to know to be confident discussing my affairs?”
  • To play out different scenarios – don’t make them too specific, but use it to discover the art of the possible rather than make exact plans – “what could I do with a 100k inheritance under both a sensible or a live for the moment approach?”
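As a small illustration of the first of those jargon questions, compound interest can be sketched in a few lines. The figures are hypothetical (£10,000 invested at an assumed 5% a year, compounded annually for 10 years) and are for education, not a projection of any real return.

```python
# Hypothetical compound interest illustration: growth of a lump sum
# when each year's interest itself earns interest the following year.
principal = 10_000  # assumed starting amount, in pounds
rate = 0.05         # assumed annual growth rate (5%)
years = 10

value = principal * (1 + rate) ** years

print(f"Value after {years} years: £{value:,.2f}")
# Value after 10 years: £16,288.95
```

The same sum earning simple (non-compounded) interest would reach only £15,000, which is the gap an AI explainer – or an adviser – would point to when describing why starting early matters.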

Used carefully, AI can be a very useful starting point for your financial education. Question everything and remember it works best alongside, not instead of, human judgement and, where appropriate, professional advice.


1. LLMs Still Cannot be Trusted for Financial Advice | Mind Matters
2. AI has an environmental problem. Here’s what the world can do about that.
3. AI Water Usage Statistics: Data Reports 2026
4. Data centres accounted for more than fifth of Ireland’s electricity usage last year – The Irish Times