
How do you audit a robot?

Guidance for internal auditors on providing assurance over robotic process automation (RPA). ICAEW's Internal Audit Panel outlines the key risks organisations face when implementing RPA and how internal audit can help mitigate them. ICAEW members can download more detailed guidance.

Robotic process automation (RPA) is helping companies to automate routine tasks, freeing up resources to focus on higher-value activities. Its key strength is consistency of compliance: the robot never deviates from the algorithms programmed into the software.

However, implementing RPA also exposes an organisation to new risks.

ICAEW's Internal Audit Panel has produced the following guidance to support internal auditors in understanding the key risks associated with robotics, how to mitigate them, and how to maximise the value the business obtains from robotic software.

What is RPA?

RPA is software that can be easily programmed to autonomously mimic and perform basic and repetitive tasks usually performed by human beings interacting with a computer system.

Examples include: data entry; data search and retrieval; sending standard emails; and matching reconciliations.

Robotics has been a fundamental part of business for decades, with robots performing high-volume, repetitive tasks to improve productivity and shrink costs within the manufacturing sectors.


The full guide

Audit and Assurance Faculty members and subscribers to Faculties Online can view the full-length guide on how to provide assurance over robotic process automation.


Governance of RPA

KEY RISK 1: Too many robots for an organisation to manage

When multiple departments create and maintain robots, standards of risk and control will vary, and not every department will know the correct procedures.

Mitigation: Internal audit should look for a strong governance framework supporting each robot delivery model. This should include: clear standards around organisational supervision and oversight; business justification; and development standards defined by a centre of excellence.

Selection

KEY RISK 2: Process unsuitable for automation

There is a risk that some robotic process automation could actually diminish the control effectiveness of a process. The benefits of RPA may not outweigh the cost of investment, and the proliferation of robots could lead to a more vulnerable and fragmented technological environment.

Mitigation: Business management should have a structured way of assessing which processes are suitable for automation. This should hinge on predetermined factors, such as:

  • the inherent risk of the underlying process;
  • the degree of process complexity;
  • the degree of subjectivity and variability in decision outcomes; and
  • the stability of the IT environment.

These factors should then be weighed against expected benefits, for example the delivery of greater efficiency and cost reduction.
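As a sketch of how such an assessment could be formalised, the following Python example scores a process against these factors and weighs the result against the expected benefit. The factor names, weights, rating scale and threshold are illustrative assumptions, not taken from the guidance:

```python
# Hypothetical weighted scoring of a process's suitability for automation.
# Negative weights penalise automation; positive weights favour it.
FACTORS = {
    "inherent_risk": -0.3,   # high-risk processes count against automation
    "complexity": -0.2,      # complex processes are harder to automate
    "subjectivity": -0.3,    # subjective, variable decisions suit humans
    "it_stability": 0.2,     # a stable IT environment helps
}

def suitability_score(ratings: dict) -> float:
    """Combine 0-10 factor ratings into a single weighted score."""
    return sum(FACTORS[name] * ratings[name] for name in FACTORS)

def is_candidate(ratings: dict, benefit: float, threshold: float = 0.0) -> bool:
    """Weigh the factor score against the expected benefit (0-10 scale)."""
    return suitability_score(ratings) + benefit > threshold
```

For example, a low-risk, low-complexity process in a stable environment with a clear efficiency benefit would pass the screen, while a highly subjective process would not.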

Design

KEY RISK 3: Robots may not do what we need and cannot think like humans (yet!)

Robots may not accomplish the stated objectives. Furthermore, as they are not sentient and cannot naturally assess anomalies and irregularities as a human would, they are capable of generating inappropriate outputs.

Mitigation: Data quality is critical to the effectiveness of a robot, so data governance controls will be key in the design, even for simple robots. These will include controls over the maintenance of source data, to ensure data integrity, and to deliver the security and confidentiality that are critical for public confidence. Internal audit should check that process mapping, which should cover the data lifecycle and data quality, is thorough.
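By way of illustration, a simple robot's data-quality gate could look like the sketch below; the field names and validation rules are assumptions for the example, not part of the guidance:

```python
# Illustrative data-quality gate a simple robot might run before processing
# a source record. A non-empty result means the record should be quarantined.
def validate_record(record: dict) -> list:
    """Return a list of data-quality issues; empty means the record is clean."""
    issues = []
    if not record.get("invoice_id"):
        issues.append("missing invoice_id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        issues.append("amount missing or negative")
    if record.get("currency") not in {"GBP", "EUR", "USD"}:
        issues.append("unrecognised currency")
    return issues
```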

Development and testing

KEY RISK 4: Robot may not work as designed due to coding errors

A robot may not accomplish its stated objectives because it has been badly coded, and testing environments may not be readily available. This could result in errors occurring in a live environment.

Mitigation: Robot development should comply with standard organisational principles for IT development, including appropriate testing prior to implementation. This should include a proof of concept and a transition phase of parallel runs of the automated and non-automated process to compare outputs. Monitoring and error-handling routines should be built into the processing, so that operational issues and anomalies are detected early and responded to automatically where possible.
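The transition phase of parallel runs can be sketched as a simple output comparison over the same inputs; the process functions passed in are hypothetical stand-ins for the real automated and manual implementations:

```python
# Sketch of a parallel-run check during transition: run the automated and
# manual versions of the process over the same inputs and report divergences.
def parallel_run(inputs, automated, manual):
    """Return the inputs where the robot's output diverges from the human's."""
    mismatches = []
    for item in inputs:
        robot_out = automated(item)
        human_out = manual(item)
        if robot_out != human_out:
            mismatches.append((item, robot_out, human_out))
    return mismatches
```

An empty mismatch list over a representative sample is evidence the robot can go live; any divergence is investigated before cut-over.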

Live monitoring

KEY RISK 5: Robot may go wrong after it has been put into use

Robots may misbehave without the business realising, meaning the robot has time to execute a high volume of errors and cause damage.

Mitigation: Depending on the risk, robots will require varying levels of human supervision by the process owner, based on dedicated metrics. Internal audit can review whether the list of available key risk indicators (KRIs) is complete in relation to the underlying process, and:

  • provide assurance on the accuracy of management's metrics and identify new insights into the process operation;
  • test the effectiveness of robot alerts and kill-switches to instantly curtail the robot in the event of a problem;
  • review quarantined exceptions to ensure timely action by a human monitor; and
  • analyse the issues and complaints logs to assess if themes can be attributed to robot performance.
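A minimal sketch of live monitoring with an automatic kill-switch might look as follows; the error-rate KRI, threshold and minimum sample size are illustrative assumptions:

```python
# Minimal sketch of live monitoring: the robot is halted automatically when
# an error-rate KRI breaches its threshold, pending human review.
class RobotMonitor:
    def __init__(self, error_rate_threshold: float = 0.05):
        self.threshold = error_rate_threshold
        self.processed = 0
        self.errors = 0
        self.active = True            # kill-switch state

    def record(self, success: bool) -> None:
        """Log one transaction and trip the kill-switch if the KRI breaches."""
        self.processed += 1
        if not success:
            self.errors += 1
        if self.processed >= 20:      # minimum sample before judging the KRI
            if self.errors / self.processed > self.threshold:
                self.active = False   # halt the robot for human review
```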

Change management

KEY RISK 6: Changes to the robot may damage its effectiveness

A robot may fail to respond correctly if updates to its rules, required by a changing business model or operating environment, are not implemented. Changes made to the robot's rules may also have unforeseen and damaging impacts.

Mitigation: Internal audit should assess whether there is clear accountability for determining when changes to the robot are required and for reassessing the risk. They will need to review whether there is a structured change management process and whether it includes restrictions on: who can execute changes; testing and approvals; change logs; backups of prior versions; and notifications to impacted users.

Process continuity

KEY RISK 7: Robot breaks down

Widespread use of robots can create a dependency that makes the business vulnerable if the robots become disabled, particularly if the business has lost key process knowledge. A second risk is that, if a robot fails, there may not be sufficient staff to operate the process manually in its absence.

Mitigation: Internal audit should look for documentation that articulates a clear business continuity plan that captures back-up procedures and the sources of data required to complete the work. Internal audit should review whether the business continuity plan sets out how activities will be resumed, whether there is regular testing of system capacity and whether there is a process to ensure that areas impacted downstream are notified of the robot's incapacity. Robots will evolve over time and periodic risk assessments will also be necessary to determine whether the robot should be retired.

For more information on the key risks and mitigating them, as well as discussion on how robots are transforming internal audit functions, download our full guide.

Further reading

Read about Robotic Process Automation (RPA) in finance and get involved in the debate.

ICAEW's assurance resource

This page is part of ICAEW’s online assurance resource, which replaces the Assurance Sourcebook.
