Ethical use of big data in financial services

There will be winners and losers from the fundamental changes to banking, insurance and investment management brought about by big data. More competition, the destruction of traditional concepts like risk pooling, and disaggregation all threaten the businesses we see today.

Customers may benefit from improved services and more suitable products, and companies can better manage risks and be more efficient. However, customers may be unfairly excluded or priced out of vital markets, and companies will experience new risks. 

ICAEW explores big data

In part one of this series ICAEW explored how banking, insurance, investment management and payments have all been changing because of the availability of, and ability to process, higher volumes and more types of data.

Unlike workers in many other industries, those in financial services will find that big data is not just a marketing opportunity. Consumers can choose to ignore certain companies and decide where, when and how to make purchases, but financial services are essential to everyday life, and the collection and use of data is intrinsic to the sector, meaning there is a sharper need for ethics.

Use of big data creates a self-reinforcing cycle for the industry. Increasing reliance on data allows consumer behaviour to be shaped by digital applications, which in turn increases the supply of data, giving a winner-takes-all urgency to the debate.

Decisions financial services companies and their boards make today about how, when and what big data is used will have a social impact in ways that will not apply to other industries. Regulation is developing in this area, but it is currently inconsistent and patchy. Financial services providers must be responsible as consumers grow wiser about the realities of data-driven business. Regulation is part of this evolution. The industry, regulators and consumers must work together to balance the tensions between a fair market and an optimised market.

Part three of this series investigates what customers can do to be more responsible consumers in the digital age. 

Helpful principles

The following three groups of principles can help board members, management, and those working in and with banks, insurers and investment management identify, prevent and resolve ethical tensions.

They may also add to regulatory thinking and debate. We have provided illustrative examples for each principle.


Being accountable for big data

Senior management are accountable for big data and algorithms


Make Chief Data Officer a Senior Management Function

Big data and algorithms are increasingly important to businesses. Boards do not currently have the skills or experience to take responsibility for big data and algorithms, and the industry lacks "unicorns": individuals who can understand the science and statistics, as well as explain outcomes simply and clearly. A Chief Data Officer (CDO) or equivalent should become a Senior Management Function under the Senior Managers and Certification Regime and equivalent regimes around the globe, to ensure clear accountability within firms and with regulators.

CDOs should ensure boards and companies have the appropriate skills and knowledge to support their use of big data and algorithms. The level of knowledge boards need, and what processes they are able to rely on, is a matter for debate, but there needs to be a reliable interpretation of results by a trusted communicator. Where there is a skills lag or gap between the first and third lines of defence, this should be monitored and remedied.

Black boxes are not excuses for unfair outcomes

Complexity and the use of proprietary third party technology do not justify a lack of understanding of big data or algorithm use by senior management. Board members use this understanding to determine strategy and uphold company values and culture. Regulators have clearly articulated the view that accountability lies with senior management.

Lead on the company view of doing the right thing

Keep social purpose front of mind

Data helps businesses target, persuade, reward and penalise customers, but data can also exploit human biases. This has occurred for years in financial services (eg, cheaper mortgage rates for current account customers), but its potential, and the potential for harm, is amplified by the amount of data now available and the use of algorithms and machine learning. When increasingly individualised products and services result in adverse social outcomes, banks and insurers must reassess their products to ensure they are fulfilling the social purpose on which their existence is based.


A human perspective

Discrimination is inherent in the business model of banks and insurers and reflects the risks different customers present. This is why we are charged different premiums and interest rates for the same or similar products. Bringing big data and algorithms into the decision making process increases the potential for harm. Discrimination on the basis of protected characteristics or uncontrollable factors which are explicitly or implicitly captured in the data used may occur illegally or unfairly.

Boards need to lead their businesses on the “right thing to do”, in terms of the law, their regulatory responsibilities (eg, the six treating customers fairly outcomes) and ethics. Actions should be anchored in the social purpose of the business.

When shifting towards greater use of technology, banks, insurers and investment managers need to be pragmatic about the value of experience and emotion in decision making (the value of the local bank manager or your financial planner) compared to data and algorithms. They must be vocal when they do not understand, and take a long term view on whether outcomes are acceptable. They should ask themselves if they would be comfortable with the decision over time, as we see which current trends in technology endure and which end up as passing fads. Cal Newport illustrates the need to anticipate changing norms in his book Digital Minimalism: "You're gonna look at allowing a 13-year-old to have a smartphone the same way that you would look at allowing your 13-year-old to smoke a cigarette."

Take the customer view

A bird’s eye view and a questioning mindset are needed to ensure that big data and algorithms are being used strategically and purposefully. For example, how do customer outcomes compare to traditional underwriting, which is statistically and actuarially robust? Are there different outcomes for people who would otherwise be similarly situated, but for a protected factor? Is the business making more suitable products and services?
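One way to make the second of these questions concrete is a simple outcome-comparison check. The sketch below uses entirely synthetic decisions and borrows the widely cited "four-fifths" rule of thumb as an assumed review threshold; it is an illustration of the kind of comparison boards might ask for, not a compliance test.

```python
# Minimal sketch of an outcome-comparison check (hypothetical data and
# threshold): compare approval rates between two otherwise similarly
# situated customer groups using the "four-fifths" rule of thumb.

def approval_rate(decisions):
    """Fraction of decisions that were approvals (1 = approved)."""
    return sum(decisions) / len(decisions)

def four_fifths_check(group_a, group_b, threshold=0.8):
    """Flag when the lower approval rate falls below `threshold`
    times the higher approval rate."""
    rate_a, rate_b = approval_rate(group_a), approval_rate(group_b)
    ratio = min(rate_a, rate_b) / max(rate_a, rate_b)
    return ratio, ratio < threshold

# 1 = approved, 0 = declined (illustrative only)
group_a = [1, 1, 1, 1, 0, 1, 1, 1, 1, 1]   # 90% approved
group_b = [1, 0, 1, 0, 1, 0, 1, 0, 1, 0]   # 50% approved

ratio, flagged = four_fifths_check(group_a, group_b)
print(round(ratio, 2), flagged)  # 0.56 True -> escalate for human review
```

A flag here does not prove unfairness; it identifies where the business needs to explain why similarly situated customers receive different outcomes.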


Invest in and foster a diverse organisation

Diversity in all its forms

Banking, insurance and investment management serve all sections of society. As well as its intrinsic benefits for the business as a whole, diversity helps combat implicit bias in data and how it is used. This is vital at board level, but also among technical specialists like data scientists and in the second and third lines of defence. Diversity in all its forms (gender, background, specialism, experience) should be a board level objective. Diversity needs to be supported by competence, and organisations need to attract, train, develop and retain the right talent.

Budget for being ethical

Diversity is not only required in recruitment and promotion; it is also needed in decision making groups. Where organisations are becoming more resource constrained, efficiency drives lead to fewer decision makers, and therefore less diversity of views. 

Boards need to ensure businesses have the resources needed to be diverse and inclusive, in order to support effective decision making and risk management. This includes the budget for training and support (like a technical subcommittee) for those charged with governance so they can engage with, and scrutinise how, big data and algorithms are being used.


Managing in a big data world

Financial services businesses are frequently complex, operating from a range of different, often legacy, systems. Managing business as usual in a big data world will present challenges that will require time, investment and resources to help businesses thrive. 

Adopt one organisation-wide definition of what constitutes big data

Big data doesn’t always mean new data

Despite its recent adoption into mainstream vocabulary, big data is not new for financial institutions. Part of their big data landscape will be existing data they are able to use in new ways due to increased computing power and storage capacity.

Manage your glossary

Wider and more varied data sets mean a typical glossary becomes a disparate and eclectic set of terms that is complicated to manage and understand. A coordinated ownership model for data needs to sit at the right level for oversight. It should be regularly reviewed to avoid duplication and overlap between different parts of the business. The various locations where data resides, and its dynamic nature and varied forms, make reviewing more challenging.

Have a holistic view of where and how your company is using big data

A clear data architecture and a comprehensive inventory of where algorithms are being used is vital for boards and management to understand what data they have collected or acquired, how it is being used, and how the outcomes from use are traceable. Management need to be able to step back and consider if data and algorithm use is taking the business in the right direction strategically and is congruent with its social purpose.

A systems and data architecture will include data sources; how data is ingested and stored; how it is processed, streamed and published; and where it is used in analysis, proofs of concept and development use cases. Any weaknesses in algorithms and data held should be recorded, along with where the algorithms and data are in their lifecycle.


Have a common framework for big data governance and risk management

Capable and justifiable use of big data 

Organisations must be confident that they have the analytical ability, statistical validity, sufficient processing speed, appropriate methods of capture, suitable volumes of data storage, and the streaming and querying ability to use big data legitimately. Without these, they will be at a competitive disadvantage and at regulatory risk from inappropriate conclusions that could lead to unfair treatment of customers and mis-selling. Holding higher volumes and more types of data also increases the potential for attacks, another risk for the board to manage.

Know how much is enough

Unlike with risk or financial data, there may never be a complete set of big data derived from alternative sources (for example, location data or social media data), as it is always evolving and growing. Those using data need to be satisfied they have enough for that particular purpose and that it is statistically robust. Multivariate analysis will be needed to avoid drawing spurious two-factor conclusions.
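The risk of spurious two-factor conclusions can be shown with a small sketch. The data below is entirely synthetic and the variables ("app usage", "region", default rates) are hypothetical assumptions: a strong overall correlation between a big data signal and defaults disappears once the confounding variable is held constant.

```python
# Synthetic illustration: a two-factor correlation can be an artefact of a
# confounding variable, which is why multivariate analysis is needed.

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical records: (region, app_usage_score, default_rate).
# Region drives BOTH usage and defaults; within a region they are unrelated.
records = [
    ("urban", 8, 0.02), ("urban", 9, 0.02), ("urban", 7, 0.01), ("urban", 10, 0.01),
    ("rural", 2, 0.08), ("rural", 3, 0.08), ("rural", 1, 0.07), ("rural", 4, 0.07),
]

usage = [r[1] for r in records]
defaults = [r[2] for r in records]
print(round(pearson(usage, defaults), 2))  # -0.92: strong apparent link

for region in ("urban", "rural"):
    xs = [r[1] for r in records if r[0] == region]
    ys = [r[2] for r in records if r[0] == region]
    print(region, round(pearson(xs, ys), 2))  # ~0 within each region
```

A two-variable analysis would suggest app usage predicts default; conditioning on region shows the signal carries no information of its own.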

Correct use of data is not correct data

Even if correctly manipulated and processed, the use of big data may deliver unacceptable outcomes. It needs to be accurate for its use case, which includes being collected in a timely fashion and sufficiently comprehensive. Data provided in real time to an insurer from a telematics box would meet these criteria, but social and internet usage data collected at a point in time to aid in credit scoring may not. This is particularly important as approximations drawn from big data can result in discriminatory outcomes.

Auditor algorithms

Management should consider the use of auditor algorithms that can monitor activity at a scale beyond human observation and flag areas that should be investigated. These algorithms may be required to ensure internal governance develops at the right pace to keep up with business changes. While algorithms upon algorithms may not seem transparent, a mechanical, and therefore transparent and explainable, algorithm can be a risk identification and management tool for those charged with governance.
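The idea can be sketched as follows. This is a minimal, hypothetical example (the segment names, baselines and tolerance are all illustrative assumptions): a deliberately simple monitor watches the decisions produced by a more complex model and flags segments whose approval rate drifts from an agreed baseline.

```python
# Hypothetical "auditor algorithm" sketch: a transparent, mechanical monitor
# over the outputs of a more complex decision model. Baselines and the
# tolerance are illustrative values a governance body might agree.

BASELINE = {"segment_a": 0.75, "segment_b": 0.65}  # expected approval rates
TOLERANCE = 0.10                                   # acceptable absolute drift

def audit(decisions_by_segment):
    """Return (segment, observed_rate) pairs that drift beyond TOLERANCE."""
    flags = []
    for segment, decisions in decisions_by_segment.items():
        rate = sum(decisions) / len(decisions)
        if abs(rate - BASELINE[segment]) > TOLERANCE:
            flags.append((segment, round(rate, 2)))
    return flags

observed = {
    "segment_a": [1, 1, 1, 0, 1, 1, 1, 0, 1, 1],  # 80% approved: within tolerance
    "segment_b": [1, 0, 0, 0, 1, 0, 0, 1, 0, 0],  # 30% approved: well adrift
}
print(audit(observed))  # only segment_b is flagged for investigation
```

The auditor does not explain the underlying model; it gives those charged with governance an explainable trigger for asking why outcomes have shifted.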


Understand the conduct and prudential risks of using big data and algorithms

Intersection with risk and regulatory data

New information about customers and the macro environment gained from big data may allow more accurate calculation of risk levels. This information has the potential to feed into regulatory calculations about risk and capital requirements. Boards must be confident about the data being used in this way.

Understand the real value of your data business models 

Holding a concentration of data can become a competitive advantage for wholesale businesses. Data sets are sold to hedge funds and asset managers via trading platforms, creating a new revenue stream in a tough market. Sophisticated clients may be able to de-anonymise data, presenting another risk. Cloud storage has made this more affordable for investment banks. These revenue streams can become self-reinforcing lines of business, as more clients lead to more data. Ethical tensions exist between the objective of best customer outcomes from data versus trying to get as much data as possible to further develop a product.

Understand your workforce’s vulnerabilities 

Forty percent of jobs in financial services are back office jobs that are likely to be disrupted by more effective use of data. The resultant effects of achievements such as a “single golden view of data” can be far reaching as staff requirements change. For example, there could be a need for senior staff and some junior staff, with less need for those in the middle, like many back office jobs. Given the large proportion of individuals working in financial services who could be affected, employers have to consider the ethical implications of these changes, like redundancy, retraining and redeployment.


Treating customers fairly 

All financial services firms must be able to show consistently that fair treatment of customers is at the heart of their business model. As business models change with the increasing use of big data, firms will have to critically assess how they understand fairness, and how to keep customers at the centre of business.

Be accountable for the fact that data and algorithms may lead to illegal and unfair outcomes

Identify information containing protected characteristics and their proxies 

Protected classes are more likely to be adversely affected by big data and algorithmic decision making. Personal information can be used legitimately by financial services organisations to assess customer risks and to offer new products and services. It is not always clear where big data contains protected characteristics or proxies for that information. Firms also need to consider the validity of, and whether they can justifiably extrapolate from, historic data sets, due to changing social norms and demographics as well as macro conditions.
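A basic screen for proxies can be sketched as below. The data is synthetic and the feature names, group coding and 0.7 threshold are hypothetical assumptions; the point is simply that measuring how strongly each candidate feature is associated with a protected characteristic can surface likely proxies for human review before the feature enters a model.

```python
# Illustrative proxy screen (synthetic data, hypothetical feature names):
# flag candidate features that are strongly associated with a protected
# characteristic, so they can be reviewed before use in decision making.

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# 1 = member of a protected group, 0 = not (synthetic records)
protected = [1, 1, 1, 1, 0, 0, 0, 0]
features = {
    "postcode_band": [9, 8, 9, 7, 2, 1, 3, 2],  # tracks the protected group
    "tenure_years":  [3, 5, 2, 4, 4, 3, 5, 2],  # unrelated to it
}

proxies = [
    (name, round(abs(pearson(values, protected)), 2))
    for name, values in features.items()
    if abs(pearson(values, protected)) > 0.7  # illustrative threshold
]
print(proxies)  # only postcode_band is flagged as a likely proxy
```

A flagged feature is not automatically unusable, but the firm should be able to justify its inclusion and demonstrate it is not standing in for a protected characteristic.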

Root out euphemisms which disguise value extraction

Organisations need to be straightforward about how they are using data to enable proper governance and clear customer communication. Terms like product development, price optimisation, price efficiency and business needs can all mask activity where unfair discrimination or value extraction has the potential to occur.

Accept that old data may be biased and unable to be used fairly

If a business is using historic data, it may be embedded with hidden biases from the past due to the way in which it was collected or acquired. Stereotypes can be amplified and reinforced where there are large gaps in data for some groups, and where some groups have more negative data collected about them. If so, it may be illegal or unethical to use this data for customer decision making and product design, and businesses will have to start from scratch.

Own difficult conversations and understand when the board is not up to the task

To deal with these matters, boards and those working across financial services will have to have conversations about race, gender, sexual orientation, disability and other sensitive topics. These topics are difficult to discuss. A combination of training, space for open discussion and continuous feedback and education will be needed, alongside diverse representation of different groups of society.


Make IT systems fit for purpose so customers can control their data

Create feedback mechanisms 

Customers need to understand what options they have in the event of an adverse big data decision against them, such as being denied insurance coverage or access to credit, or an unfavourable change in terms and conditions. Where the criteria for decision making are objective and shared across the industry, customers declined by one firm may also struggle to access financial services elsewhere.

Businesses need to be able to let the customer know why the decision was made, and the relevant criteria. The customer can assess whether the information is accurate or not, and offer correction. Feedback mechanisms exist with credit referencing agencies where people can provide additional data or ask for data to be corrected to improve their credit score.

Create a customer data dashboard 

A data dashboard could help institutions achieve fit for purpose IT systems. This would require a single customer view, which many institutions are working toward. Where data is held as part of an aggregated population, or derived from proxies, firms should provide a clear explanation to customers, and inform them of their options should they want to update or change data, seek recourse or complain.


Don’t sacrifice compliance for customer centricity

Digital interfaces shouldn’t cut corners on regulatory compliance

Financial services businesses are experienced in complying with regulations designed to protect customers and prevent financial crime. Examples include the requirement to Know Your Customer, Anti Money Laundering requirements, and the duty to Treat Customers Fairly and fulfil fiduciary duties where appropriate. These duties are business as usual, and heavily invested in by incumbents, particularly following the deluge of conduct-related fines after the financial crisis.

New entrants to the market may be focussed on more customer centric ways of doing things, making their services more attractive. The rigour of some of the new processes being adopted (like video identification and photographs of key documents) has yet to be tested.