Artificial intelligence (AI) is no longer a future concept for charity finance teams. As the audience poll at the ICAEW Charity Conference showed, most professionals are already experimenting with AI tools, even if only for drafting text or generating ideas. What is less clear is how confidently and consistently AI is being used, and whether organisations have the right safeguards in place.
As Esther Mallowah, ICAEW’s Head of Tech Policy, explained at the Charity Conference’s panel discussion on AI, part of the challenge is defining what we mean by artificial intelligence. It is not just generative tools like ChatGPT. Many finance systems already use “traditional AI” behind the scenes for tasks such as invoice processing, expense coding, forecasting, and anomaly detection. For many charities, AI is already embedded in their finance software, whether they actively think of it that way or not.
The opportunity now is to move from informal, individual use to more deliberate and well-governed adoption that genuinely supports better decision-making.
Where AI is adding value in charity finance
Grant Gevers, Senior Solutions Consultant at Sage Intacct, highlighted how AI is becoming part of the everyday fabric of finance systems, rather than a bolt-on extra. In practical terms, this is delivering three clear benefits for charities.
Firstly, AI is reducing manual effort. Machine-learning tools can read invoices, extract data, and suggest coding, saving hours of processing time each week for organisations dealing with high volumes of transactions. Crucially, these tools are designed to make suggestions, not decisions; humans still need to review and approve the outcomes.
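The "suggestions, not decisions" pattern can be sketched in a few lines. The keyword rules, nominal codes, and confidence threshold below are purely illustrative assumptions, not any vendor's actual model: the tool proposes a code with a confidence score, and anything below the threshold is queued for a human to review.

```python
# Hypothetical keyword rules mapping invoice text to a nominal code
# and a confidence score. Real tools use machine learning, not rules.
RULES = {
    "stationery": ("5200-Office Supplies", 0.9),
    "train": ("5400-Travel", 0.85),
    "venue hire": ("5600-Events", 0.8),
}

def suggest_coding(description: str):
    """Return a (suggested_code, confidence) pair, or (None, 0.0)."""
    text = description.lower()
    for keyword, (code, confidence) in RULES.items():
        if keyword in text:
            return code, confidence
    return None, 0.0

def needs_human_review(confidence: float, threshold: float = 0.9) -> bool:
    # Low-confidence suggestions are routed to a person for approval.
    return confidence < threshold

code, conf = suggest_coding("Train tickets to Manchester")
print(code, needs_human_review(conf))  # → 5400-Travel True
```

The key design point is the final function: the system only ever proposes, and a person signs off anything it is unsure about.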
Secondly, AI is improving data quality and internal controls. Even where data is imperfect, AI can spot unusual transactions, inconsistencies, or potential risks earlier than traditional manual reviews. This allows smaller finance teams to strengthen controls without needing extra headcount.
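As a rough illustration of that kind of anomaly spotting (real finance systems learn far richer patterns than this), a robust median-based rule can flag transactions that sit far outside the usual range. The payment figures and threshold here are made up for the example:

```python
from statistics import median

def flag_unusual(amounts, threshold=3.5):
    """Return indices of amounts far from the median, using the
    modified z-score (median absolute deviation), which stays
    robust even when the data contains the outliers it seeks."""
    med = median(amounts)
    mad = median(abs(a - med) for a in amounts)
    if mad == 0:
        return []  # no spread in the data, nothing to flag
    return [i for i, a in enumerate(amounts)
            if 0.6745 * abs(a - med) / mad > threshold]

payments = [120, 135, 110, 150, 125, 4800, 140]
print(flag_unusual(payments))  # → [5]
```

A median-based rule is used here rather than a mean-based one because a single large outlier would otherwise inflate the average and hide itself.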
Thirdly, AI can act as a finance assistant. Instead of simply flagging that a budget has been overspent, AI tools can help explain why, supported by underlying transaction data. This enables quicker responses and more informed conversations with budget holders and trustees.
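A minimal sketch of that "explain why" step, assuming hypothetical transaction data and category names: total the spend per category and rank the largest drivers of the variance so the conversation can start with them.

```python
# Hypothetical transactions behind a single overspent budget line.
transactions = [
    {"category": "Venue hire", "amount": 3200},
    {"category": "Catering", "amount": 1800},
    {"category": "Travel", "amount": 450},
    {"category": "Catering", "amount": 2100},
]
budget = 5000

def explain_overspend(transactions, budget):
    """Return (variance, drivers): total variance against budget and
    per-category totals sorted largest first."""
    totals = {}
    for t in transactions:
        totals[t["category"]] = totals.get(t["category"], 0) + t["amount"]
    variance = sum(totals.values()) - budget
    drivers = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
    return variance, drivers

variance, drivers = explain_overspend(transactions, budget)
# variance is 2550 over budget; Catering (3900) is the largest driver
```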
Moving beyond “shadow AI” to a strategic approach
Zoe Amar, Founder & Director at Zoe Amar Digital, emphasised that many charities already use AI more widely than leaders realise. The 2025 Charity Digital Skills Report shows that over three-quarters of charities are now using some form of AI. However, this use is often hidden in “dark corners” of the organisation, creating risks as well as missed opportunities. She suggested three practical steps for leaders.
The first is to understand the baseline: how staff and volunteers are already using AI, for what purposes, and with which tools. Bringing this activity into the open allows organisations to learn from what is working and identify risks early.
The second step is to run small, purposeful pilots aligned to organisational strategy. Rather than endless experimentation, charities should test a small number of use cases that clearly address business needs, assess the results, and then decide what to scale.
The third step is leadership role-modelling. When senior leaders openly learn, experiment, and discuss both benefits and risks, it gives others permission to engage thoughtfully with AI rather than avoiding it altogether.
Guardrails, governance, and professional responsibility
From a governance and privacy perspective, Nina Barakzai, Specialist Counsel for Global Privacy and Information Governance at HPE, reminded delegates to start with some basic but critical questions: whose data is being used, do you have permission to use it, and does the tool really need personal or sensitive information to do its job? Simple techniques such as anonymising data, grouping information, and avoiding unnecessary uploads of personal data can significantly reduce risk. She added that finance professionals are already skilled at structured data handling, and the same skills apply when working with AI.
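Those data-minimisation techniques can be sketched briefly. The record layout, salt, and banding thresholds below are illustrative assumptions; note too that a salted hash is pseudonymisation rather than full anonymisation, so the output still needs handling under data protection rules.

```python
import hashlib

# Hypothetical donor records: names are pseudonymised and exact
# amounts are grouped into coarse bands before anything is shared
# with an external AI tool.
donations = [
    {"donor": "A. Patel", "amount": 25},
    {"donor": "J. Smith", "amount": 250},
    {"donor": "A. Patel", "amount": 40},
]

def pseudonymise(name: str, salt: str = "rotate-me") -> str:
    # One-way salted hash so the tool never sees the real name but
    # records for the same donor can still be linked.
    return hashlib.sha256((salt + name).encode()).hexdigest()[:10]

def band(amount: float) -> str:
    # Replace exact amounts with coarse bands.
    if amount < 50:
        return "under 50"
    if amount < 500:
        return "50-499"
    return "500+"

safe = [{"donor": pseudonymise(d["donor"]), "band": band(d["amount"])}
        for d in donations]
```

The exact amount field never appears in `safe`, and the same donor hashes to the same token, so trends remain analysable without exposing identities.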
An AI policy does not need to be complex, but it should clearly define scope, authorised use, transparency requirements, and review processes. Importantly, policies should evolve as tools and risks change. Esther Mallowah also reinforced that the ICAEW Code of Ethics still applies: accountants remain accountable for AI-assisted outputs. Integrity, transparency, and professional scepticism are essential, particularly where AI outputs influence decisions or public-interest reporting. The risk lies not in using AI, but in over-relying on it without proper review.
A sensible next step: how to get started
For more tips and hands-on demonstrations of how AI tools can make your work more efficient, these courses will guide you on your journey:
- ICAEW GenAI Accelerator Programme – our flexible e-learning programme features eight courses with bitesize modules designed to fit your schedule and learning needs. The introductory course and AI fundamentals (8hrs CPD) are free for ICAEW members and tailored to finance tasks. A charity-specific module will be added to the Accelerator training in due course.
- Zoe Amar Digital’s AI resources, including the Charity AI Leadership Accelerator in partnership with Microsoft, which distils lessons from 100+ charity leaders into 5-minute videos with strategic prompts and tailored tips for small and large charities
- Charity Excellence’s free AI training is not specific to finance tasks but it is tailored to charities, covering AI risks and safety, finding funders, writing funding bids, and overseeing AI ethics at board level