According to Riaz Shah, Professor of Practice at Hult International Business School, audit and accountancy practices make a common mistake when it comes to adopting technology. Often, it’s driven by senior partners, and the focus is on the technology itself rather than the problems it could potentially solve.
“Wanting to cut audit prep time by 30%, for example, is a much clearer starting point than saying, ‘Oh, there’s something going on in artificial intelligence (AI) – we’ve got to get involved,’” he says.
Shah is discussing how to lead in transformative times at ICAEW’s Annual Conference on 17 October, including how to deal with relentless technological disruption. He believes that rooting the AI adoption conversation in a business problem is vital for getting the management team on board. While partners and directors may be excited about AI’s potential, managers need to see how the technology will apply to their own work. That will stop them viewing the project as ‘AI for its own sake’, or as something happening only among a small group of senior people.
Proven systems
In Shah’s view, the whole notion of ‘choosing a tool’ in the AI age has a sense of finality to it that belongs more to the era of Enterprise Resource Planning (ERP) software than the state of the art in 2025.
“I had clients that spent around nine months choosing an ERP solution, and a year or two implementing it,” he recalls. “With AI, those timescales simply aren’t relevant, because everything’s changing so quickly.”
As a starting point, Shah urges practices to focus on AI tools with an established track record and client testimonials. Practice leadership should also address fundamental questions, such as: will the tool keep your firm’s data secure? Will it integrate with the systems you are already using? And crucially, can you explain it to regulators and clients?
“You’ve got to start with proven systems that will cover those areas and deliver rapid wins, whether in invoice capture, reconciliation or other routine tasks,” Shah says. “When you come to explore more cutting-edge AI, run small experiments until you’re confident that you can adopt those tools more widely. It’s a case of constantly updating your approach.”
Automate and reskill
Shah, as an educator, is always eager to stress that AI takes away tasks – not jobs. That maxim, he says, is an essential springboard for practice leaders to bring employees with them on the AI journey.
“We spend around 82,000 hours of our lives at work,” he says. He points to a Gallup survey revealing that last year, employee engagement fell to just 21%. In the UK alone, the figure was 10%. “If AI can remove some of the drudgery that prevents people from feeling engaged at work, that can only be a positive thing.”
Securing staff buy-in requires managers to view AI’s benefits through a people-centric lens, Shah notes. It is tempting to assume that if a tool can automate 40% of a role’s tasks, the firm will need 40% fewer people. Amid a chronic skills shortage, that would be a mistake. Practices should instead break each job role down into its component tasks.
“Automate all the boring ones, then reskill staff so they can tackle more interesting and challenging tasks that require AI knowhow. Get them to do the free AI learning modules from Google and Microsoft, and direct them to ICAEW’s Gen-AI Accelerator Programme.”
Practice leaders should take a similar approach to AI adoption with external stakeholders. Before talking to them, carry out a quiet risk-management exercise to determine who needs winning round, from regulators to clients and suppliers.
“Ask in each case: ‘What’s in it for them?’ Think about what may excite or worry them, and approach them accordingly. And when you’re underway with piloting and experimentation, be transparent. Show them the results.”
Crafting inputs
Amid widespread enthusiasm for AI adoption, there has been much debate around how unknowns in the software could lead to skewed or distorted outputs. Scrutiny of those quirks is likely to intensify as AI becomes ever more independent, or ‘agentic’. For Peter Beard ACA, Founder and Director of GenFinance.AI, adopters must understand that with greater AI autonomy comes tighter accountability.
“Deferring work to an agentic AI system does not relieve your accountability,” he says. “In and of itself, the tool is neither effective nor ineffective – your audit paperwork is still signed off by people, and that’s not going to change. Bear in mind that someone owns that person-agent relationship, and any resulting workflow. Someone’s responsible for creating that agent, and someone’s responsible for using it and acting on its answers.”
Beard, who will be sharing real-world examples of AI adoption in accountancy practices at the Annual Conference, believes crafting inputs is both an art and a science. “There’s an effective way of talking to these tools, and it should always involve applying the five fundamental principles, including objectivity, integrity and professional competence.”
Thorough, extensive pilots should mean you rarely need to pivot when something goes wrong, says Shah. “If you do bet the entire practice on a novel tool too quickly, that can be very dangerous.”
Given the pace of change, it’s wise not to overinvest, he adds. “The solution you’ve put all your stock in could be rendered obsolete in just a few months’ time.”
With all that in mind, Shah stresses that constant iteration is the name of the game. “AI adoption is never ‘one and done,’” he says. “There are two traps. One is overconfidence, where you’re convinced a certain tool is the answer to all your problems and think it could replace all your people, and you end up in a big mess. The other is under-confidence, where you do far too much analysis and end up freezing and just thinking that something better’s going to come along in three weeks’ time. In the middle is the best path: curiosity, experimentation and iterative, continuous learning.”
Real-world AI Insights
ICAEW's Annual Conference 2025 includes sessions covering how AI is already being used and how to address the challenges of implementation.