The Financial Reporting Council’s (FRC) inaugural guidance on the use of artificial intelligence (AI) in audit has been welcomed as a first step towards regulatory clarity on AI adoption – but further detail is needed, ICAEW says.
As use of AI tools in audit grows, the FRC's new guidance outlines a coherent approach to implementing a hypothetical AI-enabled tool and offers insights into FRC documentation requirements, which the regulator says are designed to support innovation across the audit profession.
ISQM (UK) 1 requires firms to obtain or develop, implement, maintain and use appropriate technological and intellectual resources to enable the operation of the system of quality management and the performance of engagements, and document how they have addressed the risks that they might not meet these objectives.
Supporting auditors
This guidance sets out to support auditors and central teams at audit firms as they develop and use AI tools in their work, while also giving third-party technology providers insight into the regulatory expectations that apply to their customer base.
The intended scope of the guidance spans traditional machine learning techniques and deep learning models, including generative AI. It was developed collaboratively with the FRC’s Technology Working Group, which draws on technical experts across the audit profession.
Key features of the guidance include:
- Two-part structure – illustrative example of one potential way AI can be used in an audit, and principles intended to support proportionate and robust documentation of tools that use AI.
- Sophisticated view on appropriate explainability – acknowledges that appropriate levels of explainability vary according to context and usage.
- Alignment with government AI principles – documentation guidance reflects the UK government's five AI principles.
- Relevant across market – the guidance contains material that clarifies how expectations translate into contexts where a tool is obtained from a third party.
Mark Babington, FRC Executive Director of Regulatory Standards, says: “AI tools are now moving beyond experimentation to becoming a reality in certain audit scenarios. When deployed responsibly, they have significant potential to enhance audit quality, support market confidence, drive innovation and ultimately contribute to UK economic growth.
“This guidance aims to illustrate how AI can enhance audit work as well as clarify FRC expectations around proportionate, appropriate documentation of tools that use AI. We recognise that this field is moving quickly and will continue to engage across the profession, both in the UK and internationally, to support innovation and the appropriate use of AI.”
Uncertainty as a barrier to AI adoption
Esther Mallowah, ICAEW’s Head of Tech Policy, says that for many businesses regulatory uncertainty is one of the barriers to adopting AI meaningfully and at scale, prompting ICAEW to call for regulatory clarity in its response to the Department for Science, Innovation and Technology’s consultation on its white paper A pro-innovation approach to AI regulation. This approach advocates for sector-led regulation of AI, with existing regulators laying out requirements for the use of AI within their domains.
Ian Pay, ICAEW’s Head of Data Analytics and Tech, says this latest FRC guidance is a positive first step in providing the clarity regulated firms need around the regulator’s expectations for AI adoption in audit. However, more needs to be done.
“We value the FRC’s contribution in this space and know, from talking to members and firms, that guidance on the use of AI in audit will be appreciated. That being said, we also know that many firms are grappling with the specifics of AI implementation, and the appetite for practical examples and use cases for AI is ever-growing.
“We hope to work with the FRC in the coming months to develop and clarify this further. Keeping the conversation going on AI in audit is going to be critical in supporting firms of all sizes as they strike a careful balance between innovation, risk and ethical adoption,” Pay said.
Supporting innovation
The FRC guidance also demonstrates support for innovation across the audit sector, something that the regulator has publicly stated as an ambition in line with the government’s mandate for UK regulators to support growth and innovation across the UK economy.
Alongside the guidance, the FRC has also published a thematic review of the six largest firms’ processes to certify new technology used in audits. It includes insights and examples of good practice in their processes and controls to certify automated tools and techniques for use in audits.
Firms outside the six covered by the review will find much of the content interesting, but may feel that what those firms are doing is beyond their reach. However, the thematic review demonstrates that many of the principles relating to documenting AI tools apply more widely, so even firms not currently exploring AI tools per se can take learnings from the publications.
Throughout the guidance there are mentions of the possibility of independent assurance of third-party AI models. Mallowah says that obtaining this assurance could be challenging in a relatively nascent sector.
“There are many types of activities labelled as AI assurance, and audit firms would need to be sure that any assurance they obtain over an AI model covers the relevant risks and achieves the intended purpose. In addition, many firms using generative AI to support audit testing are building on top of foundation models for which access to information about their development and testing is limited, and independent assurance reports are not yet available.”
She continues: “Firms will also need to understand where the responsibilities of developers of third-party models end and where their own responsibilities as implementers and/or users begin. ICAEW is actively engaging in tackling these challenges to help facilitate the development of an effective AI assurance ecosystem that helps both our members and wider society embrace AI responsibly.”
Franki Hackett, Head of Audit AI at Grant Thornton, says the firm was delighted to have contributed to the FRC’s regulatory guidance on the use of AI in audit. “The guidance brings much-needed clarity on how to respond to the current state of AI without constraining innovation as technology develops. Audit quality depends on good technology, which is why we are pleased to be incorporating this guidance into our audit AI quality framework to ensure our AI research and development delivers the best outcomes for our clients.”
ICAEW recently held its inaugural AI Assurance conference, bringing together key industry players and experts to discuss topics including ‘What is AI Assurance in practice?’ and ‘Getting Assurance over foundation models’.
- Read the guidance on the use of artificial intelligence (AI) in audit.
- Read the thematic review of the six largest firms’ processes to certify new technology used in audits.