Over-reliance on automation: a cautionary tale from Plato

Author: ICAEW Insights

Published: 27 May 2024

The dangers of too much tech are not a new concern – Konrad Bukowski-Kruszyna explains why accountants can take a sceptical cue from the ancient Greeks when it comes to AI.

In the realm of accountancy and auditing, blindly trusting technology without exercising due diligence and professional scepticism can have severe consequences. It brings to mind Plato’s oft-quoted yet frequently misunderstood passage from the Phaedrus:

“For this invention will produce forgetfulness in the minds of those who learn to use it, because they will not practise their memory. Their trust in writing, produced by external characters which are no part of themselves, will discourage the use of their own memory within them. You have invented an elixir not of memory, but of reminding; and you offer your pupils the appearance of wisdom, not true wisdom, for they will read many things without instruction and will therefore seem to know many things, when they are for the most part ignorant and hard to get along with, since they are not wise, but only appear wise.”

This quote is often used by technologists to mock those who express caution about new technologies. They suggest that Socrates (the speaker in this passage) is raging against the idea of writing and books – and look how ubiquitous those became! 

But it is actually a profound warning that resonates strongly in the modern age of automation and artificial intelligence (AI).

The allure of automation

The accounting and auditing professions have embraced technological tools and techniques ranging from automated analytical procedures to data analytics and AI-powered assistants. These advancements promise increased efficiency, accuracy and insight. However, as the International Auditing and Assurance Standards Board (IAASB) cautioned in 2021, the use of those technologies can also give rise to unintended consequences, such as automation bias and the risk of overreliance.

The rise of generative AI and large language models (LLMs) such as ChatGPT has amplified these concerns. Auditors may shift from actively analysing data to passively evaluating AI-generated analyses. This could reduce their situational awareness and understanding of the full context. 

The illusion of omniscience

One of the most insidious risks of over-reliance on technology is the illusion of omniscience it can create. As Plato forewarned, those who rely too heavily on external aids "will therefore seem to know many things, when they are for the most part ignorant". In the context of auditing and accounting, this manifests as a false sense of confidence in the completeness and accuracy of the information provided by automated systems.

Auditors and accountants must remain vigilant and maintain a healthy degree of professional scepticism, even in the face of seemingly flawless outputs from automated systems. Failure to do so can lead to costly oversights and potentially disastrous consequences.

The introduction of generative AI tools into the auditing workflow may also inadvertently disrupt established processes and practices. Auditors may find themselves grappling with new tasks, such as crafting appropriate prompts for the AI, while simultaneously losing access to valuable feedback and context that traditionally guided their evaluations. This disruption to familiar workflows can hinder the auditor's ability to exercise professional judgement and critically evaluate the outputs of automated systems.

The Dunning-Kruger effect and illusory superiority

The dangers of over-reliance on technology are further exacerbated by the Dunning-Kruger effect, a cognitive bias that causes unskilled individuals to suffer from illusory superiority, mistakenly rating their ability as much higher than it actually is. In the realm of auditing and accounting, this can manifest as a false sense of mastery over the automated tools and techniques being employed.

The IAASB gives an example of an auditor using a tool to analyse financial information to identify potential risks of material misstatement: “Professional judgement is required to determine whether a risk of material misstatement exists because there may be other circumstances that the tool had not considered (such as ineffective controls or high turnover in the accounting department) which may suggest additional risks of material misstatement.”

This illustrates that even the most sophisticated tools and techniques are not infallible and require the application of professional judgement and critical thinking. Overconfidence in one's mastery of these tools can lead to a dangerous complacency and a failure to exercise the necessary due diligence.

The constant flow of suggestions and analyses generated by generative AI systems could prove disruptive to the auditor's thought processes and workflow. The barrage of information could increase cognitive load, making auditors more prone to missing crucial details or making errors. 

While generative AI may simplify certain routine audit tasks, it may struggle to provide accurate and reliable analyses when faced with more complex scenarios that require nuanced judgement calls. This polarisation of task complexity could lead auditors to over-rely on AI for simple tasks while still shouldering a high cognitive burden for intricate analyses.

Maintaining a balance

It is important to note that this is not a call to reject technology, automation and AI outright. Rather, it is a plea for a balanced approach that recognises the immense potential of technological advancements while remaining vigilant against the dangers of over-reliance (driven by some very convincing sales pitches from vendors).

By fostering a culture of continuous learning and professional development, firms can equip their auditors and accountants with the necessary skills and mindset to leverage technology effectively while maintaining a healthy degree of professional scepticism.

To mitigate the risks associated with the adoption of generative AI in auditing, a human-centred design approach is essential. This approach should incorporate principles such as continuous feedback loops, system personalisation, maintaining familiar task flows, supporting human judgement for complex tasks, and a clear division of labour between auditors and AI/automated systems. 

By embracing these human-centred design principles, the auditing profession can harness the power of generative AI while mitigating the risks of over-reliance, cognitive overload and disrupted workflows.

If we remain vigilant against the risks of automation bias, the Dunning-Kruger effect, illusory superiority and the other potential downsides of generative AI, we can strike a balance that allows us to reap the benefits of technological progress while avoiding the pitfalls of blind faith in these advancements.

Through a commitment to continuous learning, professional scepticism and the application of sound judgement, we can unlock the full potential of technology in the accounting and auditing professions.

Konrad Bukowski-Kruszyna, Audit Data Analytics Director and Alteryx Innovator.

A version of this article originally appeared on LinkedIn.
