Nightmare on spreadsheet: take Excel use seriously
8 October 2020: The Government’s recent run-in with Excel shows that the lessons of disasters past have yet to be learned, writes ICAEW Tech Faculty’s David Lyford-Smith.
For many years, the academic organisation EuSpRIG (the European Spreadsheet Risks Interest Group) has met to discuss why spreadsheets go wrong, and why they go wrong so often. There are plenty of studies into human error, poor planning, common formula mishaps and so on. But at the heart of it, their aptly named horror stories page continues to grow not because Excel and its contemporaries are inherently problematic, but because organisations continue to avoid taking spreadsheet risk seriously.
When it became clear that an Excel error was at the bottom of this week’s under-reporting of coronavirus cases, the ubiquitous spreadsheet package was roundly criticised on social media. Excel, according to the Twitterati, isn’t a “serious” package suitable for something so important.
But really, the issue isn't that Excel isn't up to the job – it hasn't remained a cornerstone of business and scientific analysis for so long by being dangerous and unusable. The program has its flaws, but the problem, as ever, lies with people and process.
While the details remain unclear on exactly how Excel was involved in the reporting issue, the most likely culprit is that, at some stage, a simple comma-separated values (CSV) file was exported into an old-format .xls file – the Excel 97-2003 format – which allows for only 256 columns and 65,536 rows. Excel nerds will of course know that the Excel 2007+ .xlsx format can handle 16,384 columns and 1,048,576 rows – but opening the data in the obsolete format would have silently cut it short, something that apparently went unnoticed downstream for several days.
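To make that failure mode concrete, here is a minimal Python sketch of the kind of guard that would catch it – checking whether a CSV fits within the legacy .xls limits before anyone saves it in that format. The function name and file path are purely illustrative, not part of any real reporting pipeline:

```python
import csv

# Hard limits of the legacy .xls (Excel 97-2003) format
XLS_MAX_ROWS = 65_536
XLS_MAX_COLS = 256

def check_fits_in_xls(csv_path):
    """Fail loudly if this CSV would be silently truncated as .xls."""
    rows = 0
    widest = 0
    with open(csv_path, newline="") as f:
        for record in csv.reader(f):
            rows += 1
            widest = max(widest, len(record))
    if rows > XLS_MAX_ROWS or widest > XLS_MAX_COLS:
        raise ValueError(
            f"{csv_path}: {rows} rows x {widest} columns will not fit in "
            f".xls ({XLS_MAX_ROWS} rows x {XLS_MAX_COLS} columns max)"
        )
    return rows, widest
```

The point is not the particular tool: any step that converts data between formats with different capacities needs an explicit check like this, because the truncation itself happens without any error being raised.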
So while Excel might have been the scene of the crime, the real culprit is poor systems integration: data passed from one system to another should never be accepted uncritically, but should arrive with some basic integrity checks, such as confirming that the number of records has stayed the same. Whether or not Excel is an appropriate part of a multi-system reporting process like this isn't really what's at issue here – the problem is that any multi-system process was allowed to run without the checks needed to catch these kinds of common failures.
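A record-count check of the sort described above could be as simple as the following sketch, where load_fn is a hypothetical stand-in for whatever routine loads the data into the receiving system:

```python
def transfer_with_count_check(records, load_fn):
    """Hand records to the next system and verify none were dropped.

    load_fn is a hypothetical loader that returns how many records
    the receiving system actually accepted.
    """
    expected = len(records)
    accepted = load_fn(records)
    if accepted != expected:
        raise RuntimeError(
            f"Integrity check failed: sent {expected} records, "
            f"but the receiving system reports {accepted}"
        )
    return accepted
```

A check this crude would not catch every integration problem, but it would have flagged thousands of missing rows the moment the handoff happened, rather than days later.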
This story will sound familiar to anyone who recalls the debacle over the 2012 West Coast Main Line franchise award, in which incumbent Virgin Rail successfully challenged the bidding process that would have handed the contract to FirstGroup – ultimately because the financial models used to assess the bids were flawed. The flaws weren't unique to spreadsheets then, either – they involved a lack of challenge on key passenger growth figures and an overall lack of control and review. But the mistake is now most commonly remembered as yet another "dodgy spreadsheet".
It's plain enough that the cycle will repeat. Poorly controlled or managed business and government processes will go wrong, a spreadsheet will be fingered as the culprit, everyone will write the issue off as spreadsheets being a "non-serious" tool, and then go right back to using their own mission-critical spreadsheets none the wiser.
Until spreadsheets are taken seriously as a tool that should be controlled, reviewed, and understood, another horror story is an inevitability.
You can read ICAEW’s guidance on good spreadsheet practice for free – check out our Twenty Principles for Good Spreadsheet Practice or our Financial Modelling Code. Or join the Excel Community for more tips and advice.