What are the AI no-go areas?
While the majority of accountants are embracing AI, judgment, empathy and ethics are still areas where letting it loose is seen as a line not to be crossed.


Respondents to a special editorial report survey by AccountingWEB in association with Sage, State of the nation: AI in accountancy and bookkeeping, drew a distinct line between processing data and managing relationships. They were happy for artificial intelligence (AI) to handle the "grunt work", but fiercely protective of tasks requiring empathy, judgment or high-stakes responsibility.
While the data confirms that adoption is already the norm – 71% of firms are using external AI tools and confidence is growing – a deeper look reveals a significant divergence in how firms believe these tools should be applied.
When respondents were asked whether there were specific services that simply should not be delivered with the help of AI, the profession was almost perfectly divided. A slight majority (55%) indicated that all services are potentially suitable for AI intervention. For this group (often the "strategic adopters"), the technology is a tool that can be applied universally, provided the oversight is correct.
However, 45% of respondents drew a firm line. It would be easy to assume these are all sceptics, but they are more likely to be active adopters who have simply identified specific boundaries where they believe the technology creates risk or dilutes value.
Analysis of their free-text responses identifies three distinct pillars where the human in the loop is seen as non-negotiable: judgment, empathy and ethics.
The judgment threshold
The most frequent reservation concerns complex decision-making. While there is broad enthusiasm for automating data entry and reconciliation (58% said this was the top potential use case), respondents consistently flagged high-level audit and tax planning as areas requiring human cognition.
Respondents noted that “audit tasks that require professional judgment” and “subjective tax matters” rely on nuance that large language models (LLMs) currently lack. The sentiment here is that AI can process the what and the when, but is ill-equipped to determine the why. For these professionals, the risk of hallucination or lack of context in statutory work is a red line.
The empathy requirement
The second theme is relational, underlining that the human touch is valued not just as a soft skill, but as an essential part of service.
Respondents specifically cited face-to-face conversations and sensitive financial discussions, such as chasing late payments, as tasks that should remain human-led. The data suggests a fear that automating client communication, however efficient, risks commoditising the trusted relationship. For 45% of the profession, efficiency at the cost of empathy is a bad trade.
Ethical and moral reasoning
Finally, there is a strong protective instinct regarding ethics. Respondents cited “human critical thinking” and “moral judgments” as capabilities that algorithms cannot replicate. In a regulated profession built on trust, the idea of outsourcing ethical reasoning to a “black box” remains a significant barrier.
This 55/45 split challenges the assumption that the end goal of AI adoption is total automation. Instead, it suggests the profession is heading towards a hybrid model.
For the 55%, the strategy is likely to be automate and supervise – using AI across the board with rigorous checking. For the 45%, the strategy is automate and protect – using AI for the heavy lifting, but ring-fencing specific high-value tasks as exclusively human domains.
As we look towards 2026, the firms that succeed will likely be those that can articulate this distinction clearly to their clients, using AI to drive efficiency, while highlighting the judgment, empathy and ethics that remain the unique superpower of the accountant.
