
Data Management

Five years after data compliance requirement BCBS 239 was introduced, Rob Konowalchuk investigates what’s changed and where banks are heading.

Just over five years ago, a document from the Basel Committee on Banking Supervision (BCBS) hit our reading lists and triggered a renewed wave of data management programmes in banks. By that point, we were used to reading new rules overhauling banks’ capital and liquidity requirements.

But this one was different. It began: “One of the most significant lessons learned from the global financial crisis that began in 2007 was that banks’ information technology (IT) and data architectures were inadequate to support the broad management of financial risks.” The document then defined 11 principles aimed at banks’ governance and infrastructure, risk data aggregation capabilities and risk reporting practices. Five years is typically sufficient to implement a new regulation, so it is reasonable to ask: are banks complying, and has BCBS 239 been a success? This is difficult to answer.

A different approach

First, compliance with principles is harder to judge than compliance with more concrete rules. Success can also be judged on multiple levels. If the question is whether all global systemically important banks (G-SIBs) meet the requirements fully, then the answer is no.

As of the BCBS’s last update (albeit a year ago now), only one G-SIB had complied by the January 2016 deadline, and many banks reported a good two or three years still to run before becoming fully compliant.

But perhaps we should also ask the following questions: has BCBS 239 caused banks to dramatically rethink their approach to data management? Have the principles become the de facto standard for managing data across the whole bank, not just the risk organisation? Have banks begun to see more engagement with their data, and between functional areas, between the business and IT, and with their regulators? Has BCBS 239 triggered a more prominent role for the chief data officer in banks? The answer is a resounding yes.

No other missive from the Basel Committee has become so universally known by its code name as BCBS 239 has. People outside the arcane world of banking regulation throw the term around in many different contexts. This is not just because it remains an area where banks are battling to comply, or because of the ever-heightening risk that weak IT resilience poses to banks (as cited by the European Banking Authority in its latest roundup of risks facing the sector in the EU). It is because it has become a trigger for large-scale, strategic investment in data architecture as a valuable asset to banks.

On its own, data is useless. So all the investment in high quality data (and data models) is futile if it is not organised correctly, understood and put to good use. By the same token, without getting the hygiene factors right, banks would not stand a chance of creating timely and insightful information with which to make important decisions.

Becoming compliant

One of the trends that BCBS 239 sparked was the rise (or creation) of the chief data officer within banks. This C-level prominence reflects the strategic nature of the issues at hand. Most large banks have a well-defined data strategy. Executing such a strategy is not a one-off project, but an ongoing job. As the business changes (new products, new rules, new processes, new technology), data management must adapt. Data strategies break down silos, as data integration is a key aspect of the target architecture and therefore of how disparate people and teams interact with data. The classic risk and finance divide features heavily here, but so does the integration of operational and market data, for example.

Beneath this, efforts are still ongoing to strengthen data governance. Data stewards are clearly identified all the way back along the data lineage, data quality tooling is improving in robustness and usability, and upstream data owners have much more business context for their data. Another finding of the BCBS report a year ago, which still rings true today, is an incomplete integration of data management with banks’ wider business architecture and strategy. In this context, architecture refers to all of the systems, processes, policies and standards, as well as the people and culture that support business capabilities. Data is the lifeblood of the business but cannot flow smoothly without all these other elements.
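
To make these governance ideas concrete, the sketch below is a minimal, hypothetical illustration (not drawn from any particular bank’s tooling) of how a data element might carry its owner, steward and upstream lineage alongside a simple quality rule; names such as DataElement and completeness are illustrative assumptions.

```python
# Hypothetical sketch: attaching ownership, lineage and a basic quality
# rule to a data element, in the spirit of the governance practices above.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class DataElement:
    name: str                      # business name of the data element
    owner: str                     # accountable upstream data owner
    steward: str                   # data steward identified along the lineage
    lineage: List[str]             # ordered list of upstream sources/systems
    quality_rules: List[Callable[[list], float]] = field(default_factory=list)

    def quality_score(self, values: list) -> float:
        """Average pass rate across all quality rules for a sample of values."""
        if not self.quality_rules:
            return 1.0
        return sum(rule(values) for rule in self.quality_rules) / len(self.quality_rules)

def completeness(values: list) -> float:
    """Share of records that are populated (a simple hygiene-factor check)."""
    return sum(v is not None for v in values) / len(values) if values else 0.0

exposure = DataElement(
    name="counterparty_exposure",
    owner="Front Office",
    steward="Credit Risk Data Office",
    lineage=["trade_capture", "risk_engine", "risk_data_warehouse"],
    quality_rules=[completeness],
)

print(exposure.quality_score([1_000_000, None, 2_500_000]))  # approx. 0.67
```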

Banks are acutely aware of this gap, and are investing in tooling to form a more holistic view of how data management fits in with this wider context. The last item in the list above – people and culture – must not be overlooked. Systems and models can only do so much with data. It is people who engage with it. Banks that have adopted a ‘know your data’ culture are maximising value from their investment in data management.

Game changing

As with most game-changing regulations, the first struggle is to comply on time, in whatever tactical way is feasible. Once that is in a steady state, one can think about a more sustainable and strategic approach. At least, that has been the case for the vast volume of regulation unleashed in the past decade. While many banks may not be officially classified as “fully BCBS 239 compliant”, many have pressed ahead with more strategic and innovative methods of managing data.

This does not always involve new technology still at the proof-of-concept stage; often it means better use of tried and tested technologies. Enterprise-wide information management programmes have been inspired by BCBS 239 concepts, involving the cataloguing and definition of key reporting metrics, their aggregation and calculation logic, their component data elements and corresponding lineage and ownership. Similarly, while BCBS 239 is ostensibly about risk data (and some finance data by extension), other critical banking functional areas are now investing in more robust and business-centric approaches to data. Treasury functions are redefining their logical data models and business architecture for critical aspects of liquidity and interest rate management.
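
As a purely illustrative sketch of what such a catalogue entry might capture (the field names and the liquidity coverage ratio example are assumptions, not a prescribed standard), a key reporting metric can be defined together with its calculation logic, component data elements, lineage and owner:

```python
# Hypothetical catalogue entry for a key reporting metric, as described above.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class ReportingMetric:
    name: str                               # metric as it appears in reports
    owner: str                              # accountable function
    components: List[str]                   # component data elements
    lineage: Dict[str, str]                 # component -> authoritative source
    calculation: Callable[[Dict[str, float]], float]  # aggregation/calculation logic

lcr = ReportingMetric(
    name="Liquidity Coverage Ratio",
    owner="Treasury",
    components=["hqla", "net_outflows_30d"],
    lineage={"hqla": "treasury_ledger", "net_outflows_30d": "liquidity_risk_engine"},
    calculation=lambda d: d["hqla"] / d["net_outflows_30d"],
)

print(lcr.calculation({"hqla": 120.0, "net_outflows_30d": 100.0}))  # 1.2
```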

Consolidated data layers for risk, finance and regulatory data are nothing new, and many banks have had this in development or in place for a while. However, many of these banks are now developing new, more automated reporting platforms to realise the benefits of having all their cleansed and standardised data in one place.
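
As a minimal sketch of that idea, assuming a hypothetical standardised exposures table (the column names below are purely illustrative), report production becomes a simple, repeatable aggregation over the consolidated layer rather than a manual exercise:

```python
# Minimal sketch: producing a regulatory-style aggregation from a single,
# standardised data layer (column names are assumptions for illustration).
import pandas as pd

# A cleansed, standardised slice of the consolidated risk/finance data layer.
exposures = pd.DataFrame({
    "legal_entity": ["Bank plc", "Bank plc", "Bank AG"],
    "asset_class":  ["corporate", "retail", "corporate"],
    "exposure":     [150.0, 80.0, 60.0],
    "rwa":          [120.0, 40.0, 55.0],
})

# With all data in one standardised place, the report is a repeatable aggregation.
report = (
    exposures.groupby(["legal_entity", "asset_class"])[["exposure", "rwa"]]
    .sum()
    .reset_index()
)
print(report)
```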

In parallel to this, there is large-scale investment in more exploratory and innovative ways of doing things. The BCBS’s 2017 progress report cites “over-reliance on manual processes” as a main finding. Process excellence teams have a refreshed mandate to drive efficiency and cut costs. Robotics solutions are emerging as a highly effective yet practical approach. Robotic process automation enables repetitive tasks to be automated, while machine-learning-enabled intelligent process automation allows robots to take over complex and highly skilled tasks. These approaches will affect data management.

Finding solutions

Robotics is already a proven concept and widely put to use in banks – more in operational areas than in risk and finance. More experimental are the data innovation labs and data science teams in place at most large banks. These teams are identifying potential use cases for machine learning and natural language processing, for example, including performing critical risk data aggregation and reporting tasks.

Above all, banks are in business to serve customers. Banks’ annual reports detail large investments in mandatory regulatory change and remediation, and in improving the customer experience, typically through new digital channels. This creates a risk that investment in risk and finance infrastructure may be de-emphasised. Leading banks recognise that, come budget time, these decisions are not mutually exclusive.

BCBS 239 has undoubtedly driven many demonstrable improvements in banks. But there is much more to do: integrating data, engaging the business around data, prioritising infrastructure investment amid firefighting and headline-grabbing initiatives, and investing in innovative solutions to future-proof the business.

Returning to the question of whether BCBS 239 has been a success: it is clear that managing data in banks is an evolving journey, bringing new challenges and innovation opportunities. Perhaps the question to address should be: when banks’ use of data is undergoing such profound change, is ‘fully compliant’ really an end-state that will be realised anytime soon? Or should application of the principles and adapting practices to business and technological change be viewed as an ongoing process, with material benefits realised along the way?

Rob Konowalchuk,
Partner, Avantage Reply

This article was taken from the April 2018 edition of FS Focus, published by the Financial Services Faculty at ICAEW.