When Robert Wigley became the chairman of UK Finance, he did not expect to be spending more time on economic crime than any other topic. “Fraud in the UK is about 40% of all reported crime. That makes it the largest form of crime in the UK.”
Economic crime is so prevalent because, for criminals, it is relatively low risk and potentially high reward. Only around 1% of police spending goes to fraud, for example, and sentencing is relatively light. But the economic damage is immense: fraud and other economic crimes cost industry and consumers hundreds of millions of pounds a year. The banking and finance sector does a great deal of work to police it, and that work prevents around two out of every three pounds of attempted fraud, but what gets through is extremely costly. “It's an important thing for the financial services industry to focus on, and preferably to prevent rather than to cure and reimburse.”
A lot of the work that UK Finance and its members are doing is around the use of technology in tackling financial crime. This involves a lot of collaboration between the banks and financial services providers and law enforcement. “When we cooperate with each other and with law enforcement, our rate of success in prevention goes up dramatically,” says Wigley. “We've created a number of institutions, committees and groups which focus on particular types of issue, to the extent permitted by law.”
This work feeds into the Economic Crime Strategic Board, which Wigley sits on. The board is co-chaired by the Chancellor Rishi Sunak and the Home Secretary Priti Patel, and is developing an overall anti-fraud plan for the UK. “This is genuinely a world leading cooperation between the financial services industry, and law enforcement.”
UK Finance is deploying AI and machine learning to track and detect economic crime in the UK, specifically fraud, money laundering and mule accounts. These tools are used in many ways: when onboarding new clients, for example, natural language processing is applied to high volumes of data to flag potentially suspicious names and addresses. Flagged cases are then checked by fraud and money laundering detection teams to see whether they are worth following up.
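The kind of name screening described above can be sketched in a few lines. This is a minimal illustration, not UK Finance's actual system: the watchlist entries are made up, and real deployments use far more sophisticated matching than simple string similarity.

```python
from difflib import SequenceMatcher

# Hypothetical watchlist of names flagged in prior investigations (illustrative only).
WATCHLIST = ["Ivan Petrov", "Acme Shell Holdings Ltd"]

def screen_name(name: str, threshold: float = 0.85) -> list[str]:
    """Return watchlist entries whose similarity to `name` meets the threshold.

    Fuzzy matching catches near-misses (typos, abbreviated forms) that an
    exact lookup would miss.
    """
    hits = []
    for entry in WATCHLIST:
        score = SequenceMatcher(None, name.lower(), entry.lower()).ratio()
        if score >= threshold:
            hits.append(entry)
    return hits

print(screen_name("Ivan Petrov"))  # exact match is flagged
print(screen_name("Jane Doe"))     # unrelated name passes clean
```

A hit here would not block the customer; as the article notes, it only queues the case for a human detection team to review.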
The industry’s systems generated millions of alerts last year, which resulted in around 400,000 Suspicious Activity Report (SAR) forms being submitted to the National Crime Agency. Processing these is time-consuming, and because many concern very low-level activity, it saps time that could be put into tackling larger operations.
Through the Economic Crime Strategic Board, Wigley and UK Finance are pushing for the government to allow financial services firms to shift attention away from low-level activity and focus on the bigger transactions. “We want to tackle the serious organised crime.”
Detecting mule accounts is another big area in which AI can play a part, looking at patterns of activity in accounts to spot suspicious activity. Typically, this includes unusual spikes and dips in the individual’s bank balance. Mastercard’s VocaLink operates the Mule Insights Tactical Solution (MITS) which, with member banks’ permission, monitors a large proportion of UK accounts for the telltale patterns of a mule account.
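The spike-and-dip signature described above can be illustrated with a simple rule over a balance series. This is a hedged sketch, not how MITS works: the threshold ratio and the one-step window are assumptions chosen for readability, and production systems combine many signals across the payments network.

```python
# Minimal sketch of spike-and-dip detection on an account balance series.
# The spike_ratio threshold is an illustrative assumption, not a MITS parameter.

def flag_mule_pattern(balances: list[float], spike_ratio: float = 5.0) -> bool:
    """Flag an account whose balance jumps sharply and then drains just as fast --
    the in-and-out pattern typical of a mule account."""
    for i in range(1, len(balances) - 1):
        prev, cur, nxt = balances[i - 1], balances[i], balances[i + 1]
        spiked = prev > 0 and cur / prev >= spike_ratio      # sudden large deposit
        drained = cur > 0 and nxt / cur <= 1 / spike_ratio   # swept out almost at once
        if spiked and drained:
            return True
    return False

normal = [500, 520, 480, 510]     # everyday fluctuation
mule = [50, 5000, 40, 45]         # deposit in, money out almost immediately
print(flag_mule_pattern(normal))  # False
print(flag_mule_pattern(mule))    # True
```

The value of a network-level system like MITS is that it can follow the same funds hopping across accounts at different banks, which no single bank's version of this check could see on its own.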
The money often leaves the mule accounts in nanoseconds, facilitated by bots, making the money difficult to trace. Collaboration between banks is hampered by data protection laws. UK Finance is seeking new powers from the government that will enable banks to share information in situations where there is reasonable suspicion of illegal activity. “I’m hoping that the government later this year will bring in legislation to give us those powers,” says Wigley.
There is also a limit on what banks can share with the government, hence the need for wider information and intelligence sharing to bring down the rates of successful fraud, money laundering and other economic crimes.
One big challenge for most financial services firms is to ensure there is no bias in the algorithms. False positives must be kept to a minimum, says Wigley, both for the efficiency of processes and to address regulators’ concerns about treating customers fairly.
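The false-positive problem can be made concrete with the standard precision and recall metrics used to evaluate alerting systems. The figures below are invented for illustration; they are not industry statistics.

```python
# Illustrative alert-quality metrics; the alert counts are made up for the example.

def alert_metrics(true_pos: int, false_pos: int, false_neg: int) -> tuple[float, float]:
    """Precision: share of alerts that were genuine. Recall: share of genuine
    cases that were caught. Both matter when tuning a detection system."""
    precision = true_pos / (true_pos + false_pos)
    recall = true_pos / (true_pos + false_neg)
    return precision, recall

# Hypothetical year: 10,000 alerts raised, of which 800 were genuine,
# while 200 genuine cases slipped through unalerted.
p, r = alert_metrics(true_pos=800, false_pos=9200, false_neg=200)
print(f"precision={p:.0%} recall={r:.0%}")  # prints "precision=8% recall=80%"
```

Even with decent recall, a precision this low means investigators spend most of their time clearing innocent customers, which is exactly the efficiency and fairness concern Wigley raises.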
“When it comes to, for example, applying for loan or mortgage applications, you need to be very careful that we don't have any kind of bias in the algorithms,” says Wigley. “We have produced a set of data ethics principles, specifically for use in financial services. We recently set up a Data Ethics Working Group that will aim to replicate, across the industry, some of the best practice we've seen in companies such as Visa.”
It is inevitable that regulators will introduce greater controls on how businesses use data. We need a legal framework that sets out how data can be used fairly and without exploiting customers, while preserving the ability to act when the evidence of wrongdoing is compelling, says Wigley.
“We have very regular engagement with the FCA and the Bank of England on the topic of AI. The FCA and the Bank of England are working with an AI public private forum to develop their views on AI regulation and there's a report due at the end of 2021. We are exploring with the ICO ways to collaborate more closely on this topic because ours is one of the industries where the thinking is most advanced.”
This is all necessary, says Wigley, to ensure that the financial services sector can effectively tackle economic crime. “I feel that the industry is fighting fraud with one arm tied behind it at the moment, because we can't share all the information we would like with other firms and banks, law enforcement or the regulators. The government knows exactly what we need; they just need to find space in the legislative programme this year, preferably to get it into law. Then we'd be better equipped to deal with these issues.”