
AI is confident. Your finances need more than that.


Why AI Should Be Your Finance Assistant, Not Your CFO


If you’re a small business owner using AI to help with your finances, this is for you.


The opportunity is real. So is the risk. Here is how to tell the difference.


The exciting bit first


In February 2026, Perplexity launched Computer — an AI agent that can connect directly to your bank accounts, credit cards and loans via Plaid's network of over 12,000 financial institutions. It pulls in data from your different accounts, and you can ask it to build a budget, a debt payoff plan or a net worth tracker using a simple text prompt.


Then at the start of April, they launched “Computer for Taxes”, adding US tax integration to Perplexity and allowing the agent to fill out IRS forms and review professionally prepared returns.


Is this the end of accountants?


This is not science fiction. This is what is available right now, and it is genuinely impressive. For time-poor small business owners who have long found financial admin draining and opaque, the appeal is obvious. Why spend an hour squinting at a spreadsheet when an AI can summarise your cash position in thirty seconds?


But this mindset is exactly where I’m starting to see people get into trouble.


I use AI in my own business every day. It helps me with planning and strategy, financial analysis, content, and thinking through decisions. The rules I have set up in my bookkeeping software save me significant time every week. I am not here to tell you that AI is something to be afraid of or avoided. I am a Chartered Accountant and a business coach, and I have seen first-hand the amazing things these tools can do, when they are used well.


But I have also seen — with increasing frequency — what happens when they are not used well. And in the specific context of business finance and accounting, the gap between 'used well' and 'used without enough understanding' can be costly, stressful, and sometimes very difficult to untangle.


This article is about that gap. Not to close the door on AI in your finances — but to help you understand where to prop it open carefully, and where to keep it firmly shut until you know what is on the other side.


Meet your new office junior


Here is the analogy I keep coming back to. Imagine you have taken on a new member of staff. Their CV is exceptional — they have read extensively across almost every subject, they can communicate clearly, they respond quickly, and they are endlessly helpful. On day one, they arrive full of confidence, get stuck straight in, and start producing work.


The problem is not that they are dishonest. It is that they do not know what they do not know. They have read a lot about accounting, but they’ve never had to defend a tax return to HMRC. They have encountered UK tax rules, but they have also encountered US tax rules, Australian tax rules, and guidance from a dozen other jurisdictions, and they cannot always tell which is which.


They are not doing this to mislead you. They are just filling the gaps in their knowledge with the nearest plausible thing they have encountered — and they are doing it with the same confident tone they use when they are completely right.


So they promptly deliver a comprehensive report solving every challenge in the business, and then they go for an early lunch.



This is AI as it currently stands in the context of business finance. Not a fraud. Not a failure. Just a very well-read, very confident junior who needs proper supervision, clear briefing, and, most importantly, someone with enough knowledge to review what they have produced and mentor them as they learn.


The question is not whether to hire the junior. It is whether you understand enough to manage and mentor them properly.


And that is where things start to get a bit more complicated.


The confidence gap — and why you need to know about it


If you’ve ever felt a sense of relief reading an AI answer that sounded confident – I promise you’re not alone.


The most common thing I hear from clients right now is some version of: 'ChatGPT said I can do this — is that right?'


Sometimes the answer is largely yes. Sometimes it is partly yes, but with important caveats that were missing from the AI's response.


Often it is “no, that’s not right for you”, and the reason it sounded plausible is that it was the right answer to a slightly different question — one the client did not realise they were asking.


I see the same thing in online forums. Someone posts a question about their expenses, or their VAT, or whether a particular cost is allowable. Several people reply. Some of those replies are clearly AI-generated — the structure, the hedging language, the confident summary all give it away. The person replying has asked an AI, received an answer, and posted it as if it were their own knowledge.


The problem is that the AI answered a general version of the question, without the specific context that would change the answer: the poster's business structure, their industry, whether they are VAT registered, whether they are a sole trader or a limited company, what HMRC's current guidance actually says.


The responding person is trying to be helpful and does not realise the AI's answer cannot be taken at face value. The person asking the question does not know enough to spot the difference between a well-contextualised answer and a plausible-sounding general one. And the AI, crucially, does not know it is missing the context.


This is what I think of as the confidence gap, and it is particularly dangerous for business owners who are already a little unsure about their numbers.


If you are someone who sometimes wakes at 3am wondering whether you have got it wrong — and many of the business owners I work with are exactly that person (I’ve been there too) — then a confident AI answer feels like relief. It feels as though someone has checked it for you and given you reassurance that all is fine.


But it has not been checked.

It has been generated.


Those are not the same thing.


There is also a subtler problem here: AI is prone to agreement. It is ultimately something of a people pleaser.


Ask it whether your business idea sounds good and it will generally find ways to validate it. Ask it whether a particular expense might be claimable and it will often lean towards yes.


It is not designed to push back, challenge your assumptions, or tell you something you do not want to hear. A good accountant will do all three. That friction is not a flaw in the accountant — it is part of the value.


Why accounting specifically is a poor fit for unguided AI


There are some tasks where AI's limitations are fairly easy to manage. If it gets a recipe slightly wrong, you notice when you taste it. If it writes a sentence you do not like, you edit it. But accounting is an area where you might not spot the error until it matters — and by then, the consequences are real.


The UK problem


Most large language models are trained predominantly on English-language content, and a significant proportion of that content is American. When you ask about expenses, tax deductions, or employment status without specifying your jurisdiction, you may well be getting an answer shaped by US tax law, US employment definitions, or US accounting practice. (US accounting rules are ‘rules based’, meaning clearer yes/no answers and easier application by automation, whereas UK accounting is ‘principles based’, so more judgement is required.)


It is not switching between jurisdictions — it is drawing on the nearest relevant pattern in its training data. An answer about what counts as a business expense might draw on IRS guidance without ever flagging that HMRC operates differently. The answer can sound entirely right (and be right in another country), and be entirely wrong for your circumstances.


The fix — specifying 'UK sole trader' or 'UK limited company under HMRC rules' in your prompt — does help, but it requires you to know to do that.


Most people do not.


And even then, I know accountants who have tried to research using AI, with source and legislation guardrails, and not got the correct answers. So even good prompting isn’t enough.


Legislation versus HMRC interpretation


When an AI is drawing on the right jurisdiction, there is still a layer it cannot reliably navigate. Tax law says one thing. HMRC's operational guidance interprets that law in a particular way. And then there is the practical reality of how HMRC actually behaves in practice — what it will and will not question, what a compliance officer is likely to look for, what defensible documentation looks like.


AI can find the legislation. It can summarise the guidance. What it cannot do is tell you where the gap between the technical legal position and HMRC's operational stance is likely to trip you up. That gap is exactly where we qualified accountants earn our fees.


Structure matters enormously


Sole trader and limited company are not just different labels for the same thing. They are fundamentally different legal and tax structures, with different rules on almost everything: what you can claim, how you pay yourself, what counts as income, how you handle VAT, what your obligations are to HMRC and Companies House. An AI answering a question without knowing which structure you operate under is working without critical context — and it may not know to ask.


The interchangeability of terms means AI may misinterpret the question.

Owners' drawings in a sole trader business work entirely differently from a director's salary or dividends in a limited company, but sole traders often talk about their ‘wages’. An accountant will spot that nuance and allow for it; AI probably won’t.


VAT rules, IR35 considerations, how you treat mixed-use assets — all of these change significantly depending on your structure. A general answer, however well-written, may simply not apply to your situation.


Grey areas require judgement, not information retrieval


Some of the most common accounting questions do not have single correct answers. 'Wholly and exclusively' — HMRC's test for whether an expense is deductible — is a judgement call that depends on intent, proportion, and context. Whether something is capital expenditure or a revenue expense requires interpretation of the specific circumstances. Employment status under IR35 is famously complex and fact-specific.


AI is very good at information retrieval. It is much less reliable at the kind of nuanced, context-specific judgement these questions require. It can tell you what usually happens, or what the general rule is. It cannot tell you what is correct for you, in your specific situation, with your particular business model and history.


The prompt quality problem


There’s a deeper issue here. To get a genuinely useful answer from AI in accounting, you need to include context most business owners don’t realise matters — your structure, your VAT status, your accounting basis, even the tax year.


If you don’t know those things, you can’t ask a precise question. And if the question isn’t precise, the answer won’t be either.


From my years of teaching, I recognise this pattern immediately. A student who searches for an answer to a question often finds something adjacent — something that looks like the right answer but does not quite address what was asked. They cannot always tell the difference, because they do not yet have enough understanding to recognise the gap. AI has the same problem in reverse: it cannot always tell when it is answering a slightly different question from the one that was meant.


The audit trail problem — and where the buck stops


Here is something that does not come up enough in my conversations about AI and finance, and it’s one that small business owners need to be aware of.


'ChatGPT told me' is not a defence with HMRC.


Financial decisions — particularly those with tax implications — need to be defensible. If HMRC queries something on your return, you need to be able to explain the reasoning behind it, point to the guidance you relied on, and demonstrate that you took a reasonable and considered approach. That requires an audit trail of thinking, not just an answer. This is why we like clear records and working papers.


AI does not produce that trail. It produces an output. The reasoning behind that output is not transparent, not verifiable, and not something you can point to in a compliance conversation (even with ‘show thinking’). You accepted the answer. You acted on it. That’s the moment it becomes your decision — not the AI’s. The responsibility for that action sits entirely with you.


This is not a theoretical concern. The AI companies are entirely clear about it. OpenAI's terms of service explicitly state that ChatGPT's outputs should not be relied upon as professional advice and that OpenAI accepts no liability for decisions made on the basis of those outputs. Anthropic — the maker of Claude — is similarly explicit. These are not buried disclaimers. They are clear statements of where responsibility lies.


Perplexity, despite its ambitious move into personal finance with its Computer agent, includes similar caveats. A tool that can connect to your bank accounts and generate a retirement dashboard is not the same as a tool that carries professional accountability for the financial decisions you make as a result.


This matters because the appeal of these tools is often framed around confidence and convenience. But confidence without accountability is not the same as reliability. The business owner who acted on an AI answer is the business owner who signed the tax return. The penalty, the interest, the correction — all of that lands on them.


We have already seen this in other professions. There have been widely reported cases of lawyers submitting AI-generated case citations to courts, only to discover the cases did not exist. The AI had generated plausible-sounding references with the confident tone it uses for everything else. The lawyers faced serious professional consequences. The AI faced none.


Automation — a brilliant servant, but a dangerous master


I want to talk about bookkeeping software automation specifically, because this is where I think the risk is most invisible — and where I have first-hand experience of both the power and the pitfalls.


The rules and automation features in tools like Xero, QuickBooks and FreeAgent are genuinely transformative when used correctly. I use them in my own bookkeeping, and they save me a considerable amount of time every week. A rule that automatically categorises a particular supplier's invoices, or recognises a recurring transaction and applies the right VAT treatment, removes a repetitive task from your plate entirely.

But there are three things these tools require that are easy to overlook.


First, you have to set them up correctly in the first place. If the initial categorisation is wrong — if you have applied the wrong VAT rate, misclassified an expense, or confused owners' drawings with a salary — the rule will apply that error automatically and repeatedly. One wrong manual entry is a mistake. Six months of the same wrong categorisation applied by a rule is a pattern that looks clean and consistent, which makes it harder to spot, not easier.


Second, you have to review the rules regularly. Your business changes. Suppliers change. Your VAT status might change. A rule that was perfectly appropriate when you set it up may be applying the wrong logic to a situation it was not designed for. Without regular review, you will not know.


Third, you have to check the outputs. Automation does not remove the need for review — it changes the nature of it. Instead of checking every transaction, you are checking whether the rules are still working as intended. That still requires you to understand what 'working as intended' looks like.


The deeper issue here is one of compounding errors. A single mistake is annoying. A consistently applied mistake, running quietly in the background for months, can create a picture of your finances that is coherent and wrong in equal measure — and coherent wrong is the hardest kind of wrong to find. Consistent errors don’t look like mistakes. They look like systems.
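To make that compounding concrete, here is a deliberately simplified sketch in Python. The supplier, amounts and rates are entirely invented for illustration — the point is only that a rule set up once with the wrong VAT treatment repeats the same error silently, month after month.

```python
# Hypothetical illustration: one miscoded bookkeeping rule, applied
# automatically every month. All figures are made up.

STANDARD_VAT_RATE = 0.20   # the rate that should have been applied
RULE_VAT_RATE = 0.00       # the rate the rule was mistakenly set up with

# Six months of identical £1,200 invoices from the same supplier
monthly_invoices = [1200.00] * 6

# VAT that should have been recorded vs what the rule actually recorded
correct_vat = sum(net * STANDARD_VAT_RATE for net in monthly_invoices)
recorded_vat = sum(net * RULE_VAT_RATE for net in monthly_invoices)

shortfall = correct_vat - recorded_vat
print(f"VAT understated by £{shortfall:,.2f}")  # → VAT understated by £1,440.00
```

One wrong manual entry would have been a £240 slip, spotted at the next reconciliation. The rule turns it into £1,440 of clean-looking, consistent records — which is precisely why it is harder to catch.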


Recent moves to bring in ‘auto-reconciliation’ in software like Xero may sound like a dream – it does it all for you – but are you confident it’s doing it right?


What AI is actually good at in your business finances


I want to be genuinely generous here, because the case for AI in business finance is real — it just needs to be made accurately, with a bit less of the shiny new hype.


AI is excellent at working with data you have already verified. Feed it a clean set of figures and ask it to identify trends, flag anomalies, or compare this quarter against the same period last year — it will do that quickly and well. Use it to summarise a financial report in plain English, or to generate a first draft of a cash flow projection based on numbers you have reviewed — also excellent.


It is useful as a thinking partner when you are working through a decision. Not as the decision-maker, but as something to bounce ideas off, to surface considerations you might not have thought of, to help you articulate a problem more clearly. I use it this way regularly in my own strategic planning work.


It is good at repetitive transactional work, once it’s been correctly configured. The bookkeeping automation I described above, when set up properly and reviewed regularly, is a genuine efficiency gain and allows you to act as ‘reviewer’ rather than just processor.


The distinction that matters is this: AI working as an analytical assistant on data you have already understood and verified is powerful and appropriate. AI making decisions about data it is processing itself, without you in the loop, is where things go wrong.

There is also something important about what AI cannot bring to financial analysis that a qualified accountant or experienced business owner can: the 'why.'


AI can tell you your revenue was down 15% in March. It cannot tell you that March was the month you took a week off to deal with a family situation, that you had already decided not to take on new clients that month, and that the figure is therefore not a cause for concern. Context is not just data — it is understanding, and that lives with you.


Different jobs require different people — and AI holds no qualifications


It is worth being clear about what we actually mean when we talk about 'finance' in a small business context, because the word covers several genuinely distinct disciplines.


Bookkeeping is the transactional recording of financial activity — money in, money out, categorised correctly, reconciled against bank statements. It is essential, it requires accuracy and consistency, and it is the area where AI automation offers the most legitimate help — with the caveats already noted.


Accounting requires interpretation and judgement. It is the process of taking that transactional data and making sense of it — determining profit, understanding what the figures mean, applying the relevant tax treatment, preparing compliant returns. This is where professional qualifications, ongoing training, and knowledge of current legislation and HMRC guidance become important. It is also where the nuances of business structure, industry, and individual circumstances make a significant difference.


Financial decision-making is something else again. It is using your financial understanding to make choices about your business — pricing, investment, whether to take on a particular client, what 'enough' looks like for you, how to balance profit against energy, how to build something sustainable. This requires knowing your numbers, yes, but it also requires knowing your business, your values, your capacity, and your definition of success. No AI can do this for you, because it does not know any of those things unless you tell it — and even then, it is not equipped to weigh them the way you can.


AI can assist with the first. It needs significant human support to do the second responsibly. It cannot do the third at all without you firmly in the driving seat.


Treating AI like a team member — how to actually make it work


Here is where I want to shift from caution to something more constructive, because the goal is not to make you afraid of AI — it is to help you use it intelligently.


The team member analogy holds here too. Even a CFO does not work alone. They sit within a wider structure — a board, a leadership team, a set of processes and accountability mechanisms. They are briefed regularly. Their work is reviewed. Their decisions are challenged. The quality of their output depends significantly on the quality of the information they are given and the governance around them.


AI in your finances works best the same way. Not as an autonomous decision-maker, but as a capable assistant within a structure. That means a few things in practice.


Brief it properly


When you use AI for anything finance-related, the quality of your output depends heavily on the quality of your context. That means being explicit about: your business structure (sole trader or limited company), your jurisdiction (UK, and ideally the relevant HMRC guidance), your VAT status, your accounting basis (cash or accruals), your industry where relevant, and the specific circumstances of what you are asking. The more precisely you can frame the question, the more useful the answer — but you need enough knowledge to provide that framing in the first place.
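If it helps to see that briefing discipline written down, here is a small illustrative sketch in Python — the function name and fields are my own invention, not part of any tool. The idea is simply that you cannot submit a finance question without supplying the context that changes the answer.

```python
# Hypothetical sketch of "briefing" an AI assistant properly.
# Field names are invented for illustration.

def build_finance_prompt(question: str, *, structure: str,
                         jurisdiction: str, vat_registered: bool,
                         accounting_basis: str, tax_year: str) -> str:
    """Attach the key context to a finance question before asking it."""
    context = (
        f"Business structure: {structure}. "
        f"Jurisdiction: {jurisdiction}. "
        f"VAT registered: {'yes' if vat_registered else 'no'}. "
        f"Accounting basis: {accounting_basis}. "
        f"Tax year: {tax_year}."
    )
    return f"{context}\nQuestion: {question}"

prompt = build_finance_prompt(
    "Can I claim my home office costs?",
    structure="UK sole trader",
    jurisdiction="UK (HMRC rules)",
    vat_registered=False,
    accounting_basis="cash basis",
    tax_year="2025/26",
)
print(prompt)
```

Because every field is a required keyword argument, forgetting to state your structure or VAT status is an error rather than a silent omission — which is the behaviour you want from your own prompting habits too.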


Know what you are checking


Using AI as an assistant only saves you time if you have enough understanding to review what it produces. If you are checking AI-generated financial analysis without knowing what correct looks like, you are not reviewing — you are trusting. Review requires knowledge. This is one of the reasons I am so committed to helping business owners understand their own numbers: not so they can replace their accountant, but so they can manage any assistant — human or AI — effectively.


Cross-check anything consequential


Any AI answer that will affect a tax return, a financial decision, or a compliance position should be verified against a primary source — the relevant HMRC guidance page, your accounting software's documentation, or a qualified accountant. HMRC's Business Income Manual, VAT manuals, and guidance on specific topics are all publicly available and searchable. If an AI answer and an HMRC page say different things, the HMRC page wins. Every time.


Maintain the human in the loop


The combination that creates real insight is AI plus human judgement — not one instead of the other. This is something I talk about a lot in my work: data and intuition are stronger together than either alone. The same is true of AI capability and human understanding. AI can process and pattern-match at a scale and speed no human can match. Humans bring context, nuance, accountability, and the kind of judgement that comes from actually running a business and knowing what it costs in ways that don’t appear on a balance sheet.


The business owners I see getting the most from AI are not the ones who have handed it the most responsibility. They are the ones who understand their business well enough to direct it clearly, review its work critically, and know when to bring in a qualified human instead.


Understanding your numbers is what makes any tool useful


The thread running through everything I have written here is this: AI is most powerful in your business finances when you already understand what you are looking at. Not because you need to do everything yourself — you do not — but because understanding is what allows you to direct, review, and make good decisions.

If you do not know the difference between turnover and profit, an AI summary of your revenue trends is potentially misleading. If you do not understand how money moves from your business income to your personal pocket — through drawings, salary, dividends, tax — then AI-generated cash flow projections are numbers without meaning. If you do not know what 'enough' looks like for your specific life and business, no tool can help you build towards it.


This is not about becoming an accountant. It is about becoming financially grounded — having enough understanding of your numbers to make confident, intentional decisions, and to use the tools available to you with appropriate judgement rather than blind trust.


That is exactly what I cover in Finance Fundamentals for Founders — a workshop designed specifically for business owners who want to feel more confident with their numbers, understand what they are actually looking at, and make better decisions as a result. We look at the key figures that matter, what they mean, how money flows through your business, and how to build a picture of your financial health that you can actually use.


Because the goal is not to replace your accountant, or to avoid AI, or to spend more time on your finances than you want to. The goal is to understand enough to be in charge — of your tools, your decisions, and your business.


Key Takeaways:

  • AI can produce confident answers without understanding context

  • In accounting, context determines whether something is correct

  • UK tax rules and interpretations are complex and often misapplied by AI

  • Automation can scale errors, not just efficiency

  • AI is useful for processing and drafting, but not for judgement or decision-making


Most people don’t need more tools. They need to understand what’s already happening in their business.


If you want to understand what your numbers are actually telling you, the workshop is a starting point.


If you want to apply it to your business — to see what’s really going on and what to do next — that’s exactly what the Financial Reality Audit is for.


Finance Fundamentals for Founders runs on 27th April. You can find out more and register at damgoodbusiness.com.


FAQ


Can AI replace an accountant for small businesses? AI can assist with tasks like categorising transactions or drafting explanations, but it cannot replace professional judgement, especially where tax rules depend on context and interpretation.


Is it safe to rely on ChatGPT for tax advice in the UK? No. AI tools are not regulated and do not take responsibility for accuracy. Business owners remain responsible for their own tax filings.


Where is AI useful in business finance? AI is useful for drafting, summarising, and processing data, but decisions should always involve human review and understanding.


 

About Davinia McGann

Davinia is a Chartered Accountant with 18 years of experience, a business coach, and the founder of DaM Good Business. She supports female and LGBT+ business owners to build financially and energetically sustainable businesses — by understanding their numbers and using them to make confident, intentional decisions.



Join the Club

Join the DaM Good Business® newsletter and get access to the free Club — resources, tools, and weekly thinking to help you build a calmer, more profitable business.
