In a column for IFoA’s Inclusive Insurance Bulletin, Kelly outlined that the collection and use of data are intrinsic to the financial services sector. The essential and deeply personal nature of financial services means there is a conspicuous need for ethics among businesses applying data science to personal data to develop financial products.
Data challenges are borne out in the extension of payment holidays to retail, mortgage and other loan customers without banks needing to seek information about the reason for the holiday. Banks then have to look for new and alternative data that can give a clearer picture of which customers simply have a temporary liquidity problem and which are genuinely distressed. They must approach this carefully to avoid creating additional conduct risk, and to ensure customers are treated fairly in unprecedented circumstances.
At the same time, the reduction in the use of cash due to coronavirus concerns has pushed many more consumers into digital transactions where they may previously not have had a footprint, providing firms with more data to analyse. This has helped economists, regulators and governments understand how COVID-19 and lockdown are affecting behaviour, and has sped up the shift to digital payments. In parallel, the Black Lives Matter movement has brought systemic racism to the forefront of minds and up the political agenda. Differentiating between customers on the basis of calculated risk is inherent in the business model of banks and insurers, and is the reason people are charged different premiums and interest rates for the same or similar products.
Data demonstrates where business models are problematic, such as the analysis accompanying the Financial Conduct Authority’s (FCA) 2018 consultation on overdrafts. Mick McAteer, Co-Director of the Financial Inclusion Centre sums it up: “People in the most deprived areas are 70% more likely to have to use an unarranged overdraft than those in the least deprived areas; they tend to be from Black, Asian and minority ethnic (BAME) communities and more likely to be financially vulnerable due to poor health or disability. The harm caused here is not a by-product of normal market competition, but an institutionalised form of financial discrimination against vulnerable consumers. Discrimination faced by many BAME households in other parts of their lives (education, housing and the labour market) means that they are more likely to be financially vulnerable.”
As financial services businesses bring big data and algorithms into the decision-making process, the potential for harm increases. While this risk has been present in insurance for some time, as practices become more sophisticated and opacity grows faster than the understanding of those responsible for governance, it becomes more acute. Discrimination on the basis of protected characteristics, or of uncontrollable factors explicitly or implicitly captured in the data, may occur, whether illegally or simply unfairly.
So what should banks and insurers do when data and societal challenges collide?
In its 2018 publication Ethical use of big data in financial services, ICAEW looked at actions organisations can take, both at senior level and in practical terms.
Boards need to lead their businesses on the ‘right thing to do’ in terms of the law, their regulatory responsibilities (ie the FCA’s six consumer outcomes that firms should strive to achieve to ensure fair treatment of consumers) and ethics. Actions should be rooted in the social purpose of the business. This may mean that products need to be reviewed in terms of inclusivity and with regard to whether there are barriers to inclusion inherent in the design or distribution of insurance products.
When shifting towards greater use of technology, banks, insurers and investment managers need to be conscious of the value of experience and emotion in decision making (the value of a local bank manager or a person’s financial planner) compared to data and algorithms. Board members, in particular non-executive directors, must be vocal when they do not understand, and take a long-term view on whether outcomes are acceptable. They should ask themselves whether they would be comfortable with the decision over time, as it becomes clear which current trends in technology endure and which are passing fads. In his book Digital Minimalism, Cal Newport illustrates the need to anticipate changing norms: “You’re gonna look at allowing a 13-year-old to have a smartphone the same way that you would look at allowing your 13-year-old to smoke a cigarette.”
Invest in and foster a diverse organisation. Banking, insurance and investment management should serve all sections of society. As well as its intrinsic benefits for the business as a whole, diversity helps combat implicit bias in data and how it is used. This is essential at board level, but also among technical specialists, such as data scientists, and in the second and third lines of defence. Diversity in all its forms (gender, background, specialism, experience, etc) should be a board level objective. Diversity needs to be supported by competence, and organisations need to attract, train, develop and retain the right talent. In doing so, they will help to create and sustain an inclusive organisation that benefits the business, customers and society at large.
ICAEW Know-How from the Financial Services Faculty
This guidance is created by the Financial Services Faculty, recognised internationally as a leading source of expertise and know-how on banking, insurance and investment management issues. Join the Faculty to gain digital access to practical guidance, expert analysis and professional development support across the financial services industry.