
How to prepare for the EU AI Act’s conformity assessments

Author: ICAEW Insights

Published: 08 Dec 2025

Conformity assessments are a central pillar of the EU AI Act and apply to key types of high-risk AI systems. A panel of experts explains how the assessment process will work.

With 2026 fast approaching, businesses that work with artificial intelligence (AI) and trade in Europe are preparing for a major requirement of the EU AI Act.

Set to take effect within the next two years, conformity assessments were the subject of a special panel session at the AI Assurance Conference held earlier this year at Chartered Accountants’ Hall.

In the talk, experts from the legal profession, tech industry and certification field shed light on which types of technologies the assessments will apply to. The speakers also set out some background details on why the assessments are such an important part of the Act.

Pyramid of risks

Ranked by Legal 500 as a leading expert on AI law, Simmons & Simmons Managing Associate William Dunning explained that conformity assessments are designed to regulate AI systems, that is, AI models integrated into a broader setup. That may mean an application in a web browser, for example, or a piece of hardware, such as a robot arm in a factory.

Dunning noted that conformity assessments concern one area of the Act's risk-based approach, which resembles a pyramid. At the top, the EU places prohibited AI systems, such as certain biometric and facial-recognition tools, or systems that are used to exploit people's vulnerabilities. As those tools are deemed to pose unacceptable risks, they are banned. At the pyramid's base are lower-risk systems, which are subject only to lighter-weight transparency requirements.

Conformity assessments apply in the middle of the pyramid, comprising the systems the EU considers high risk. “There are two categories here,” Dunning explained. “First, there are AI systems that play a safety role in regulated products, such as cars, medical devices or machinery. Second, there are systems used in particular societal contexts, such as recruitment and HR or education.”

In terms of the compliance practicalities that businesses face, Dunning noted: “There’s a wide range of both substantive and procedural requirements for high-risk systems. Under the former, companies must put in place risk management, data governance, documentation, human oversight, cyber security, accuracy, robustness and quality management measures, among other things.”

The conformity assessment comes in on the procedural side, he explained, evaluating systems against those substantive requirements. If a system passes, it receives a Conformité Européenne (CE) mark.

Finally, Dunning noted, the system must also be formally registered in an EU-wide database. “All told,” he said, “it’s quite a big bundle of requirements.”

Gold standard

Tim McGarr, AI Market Development Lead (Regulatory Services) at certification provider BSI, explained that the conformity assessment’s evaluation procedure derives largely from product testing regulation, which is rooted in principles of impartiality. 

In that fashion, it feeds into one of the EU’s main ambitions behind the Act: to create a pioneering piece of legislation that will serve as a benchmark for other jurisdictions.

“Just as GDPR provided a gold standard for how to legislate for data protection, the Act sets out to do the same for AI,” he said. “If we look at how the conformity assessment could help raise awareness of the need to apply best practices, manufacturers of products such as medical devices have been used to this level of scrutiny for many years and simply understand that that’s how regulation works. However, people with more pure-technology backgrounds are not used to being regulated in this way at all – so the Act represents newer territory for them.”

Pauline Norstrom – CEO of AI strategy and risk advisory firm Anekanta® AI – explored further reasons behind the breadth of the assessment categories. In particular, she highlighted the child welfare scandal that hit the Netherlands in 2018, when it emerged that authorities had deployed an AI system to detect benefits fraud. Working from a set of risk indicators, the system, which had not been adequately tested for bias, had overwhelmingly targeted minority-ethnic families.

As a result, tens of thousands of low-income parents and caregivers were wrongly accused of fraud, and more than 1,000 children were taken into foster care. In 2022, the scandal was discussed in the European Parliament. Although progress was already underway on the EU AI Act prohibitions, the scandal served to underline the need for such safeguards.

“Look at Amnesty International’s report on the scandal, Xenophobic Machines,” Norstrom said. “It’s essentially a playbook on AI innovation that also illustrates the consequences of inadequate planning, risk assessment and governance at the outset. Despite a lack of explainability, monitoring and review, the system – which wasn’t ChatGPT, just a regular machine-learning algorithm – was allowed to carry on for years before its negative impacts were corrected.”

Dealing with complexity

Turning to how companies should get in shape for the effective date of conformity assessments, Norstrom said: “Organisations should look at their AI setup – not just what’s in play, but what they’re planning to buy, too – and ask key questions about the capabilities of each system to gauge its risk level. For example, is it autonomous? Can it adapt? And can you pin down an explanation for the decisions it makes?”

Dunning recommended putting together an inventory of all the AI tools in operation across the organisation, an exercise his firm has already seen large organisations begin. “Once that’s in hand, you can examine your systems in the context of the Act and start to deal with the complexity.”
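For readers who maintain such an inventory in software, the triage the speakers describe – list each system, then ask capability questions to gauge its risk level – could be sketched as follows. This is a minimal illustration only: the field names, context list and risk labels are hypothetical and do not constitute an official EU AI Act classification.

```python
from dataclasses import dataclass

@dataclass
class AISystem:
    name: str
    use_case: str         # e.g. "recruitment", "customer support"
    is_autonomous: bool   # does it act without human sign-off?
    can_adapt: bool       # does it change behaviour after deployment?
    is_explainable: bool  # can its decisions be accounted for?

# Simplified stand-ins for the societal contexts the Act treats as high risk
HIGH_RISK_CONTEXTS = {"recruitment", "education", "credit scoring"}

def triage(system: AISystem) -> str:
    """Rough first-pass flag used to prioritise a fuller legal review."""
    if system.use_case in HIGH_RISK_CONTEXTS:
        return "review as potentially high risk"
    if system.is_autonomous and not system.is_explainable:
        return "flag for closer assessment"
    return "likely lower risk: check transparency duties"

inventory = [
    AISystem("CV screener", "recruitment", True, True, False),
    AISystem("FAQ chatbot", "customer support", False, False, True),
]

for system in inventory:
    print(f"{system.name}: {triage(system)}")
```

A screening pass like this cannot replace legal analysis, but it gives an organisation a starting list of which systems to examine first against the Act's substantive requirements.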

However, when the Act’s rules around high-risk systems were developed, ChatGPT wasn’t around, he cautioned. The regulation is aimed at the types of systems used for automating simple tasks, which do not necessarily represent the current AI market.

“For example, what happens when you’ve got a system based on ChatGPT or an equivalent that can do literally thousands of different things? If a business decides to use ChatGPT to quickly sort a batch of CVs, is that tool subject to the whole suite of high-risk requirements under the Act and who is responsible? That’s currently quite a big area of uncertainty as to how the Act is going to work.”

Recent developments suggest that the EU is acknowledging the difficulties of implementing the high-risk regime under the Act. Challenges identified include “a lack of harmonised standards for the AI Act’s high-risk requirements, guidance, and compliance tools”. Proposals to delay the rollout of high-risk requirements until 2027 or 2028 – depending on the type of system at issue – are currently making their way through the EU’s legislative process. However, that still leaves businesses with much to do to achieve compliance.

Get ready for the 2026 conference

Register your interest for next year’s edition of ICAEW’s AI Assurance Conference on 6 July 2026, so we can notify you when bookings open.
