
Social media can now affect your credit score

13 October 2020: Writing for the Financial Services Faculty, journalist Pippa Stephens looks at how lending decisions are being made on more than just bank records.

In Kenya, how often consumers call their mothers, and the contents of their text messages to them, have emerged as factors used by financial services apps to make lending decisions.

The apps Tala and Branch used data from the widely adopted M-Pesa mobile money transfer service. They also took into account whether a person’s contact book recorded both the first names and surnames of their acquaintances, showing just how granular the personal data now available can be.

This data poses an important challenge for the financial services industry, which increasingly makes use of statistical patterns and the technology that teases them out.

Attempting to mine social media for financial insights can be risky. 

British insurer Admiral developed an app designed to analyse users’ Facebook posts for personality traits and use the results to help set car insurance premiums. The trial was pulled at launch, however, when Facebook refused Admiral access, citing the privacy of its users.

Facebook has a platform policy that prevents its data from being used to make judgments about eligibility, including for credit. All prospective apps go through a testing stage in which Facebook checks that they comply with this policy.

Admiral’s failed attempt notwithstanding, a broad range of personal data remains available, together with the algorithms that turn it into useful information for businesses. These algorithms are not perfect, however, and carry prejudices that many fear will deepen inequality in society.

“Algorithms don't know right from wrong; they just turn numbers into numbers,” said technologist Vivienne Ming. Because the formulae are trained on historical data, they encode historical bias, and they can actually exaggerate it, she added.

Ming said this was especially problematic in financial services due to the amount of data used across the industry and the fact it often contained racial and gender attributes.
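Ming’s point can be made concrete with a small sketch. The example below is not from the article: the data is synthetic and the features (an ability score and a postcode-like proxy) are illustrative assumptions. A simple approval model is trained on historical decisions that disadvantaged one group; even with the protected attribute excluded, a correlated proxy feature lets the model reproduce that bias.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Two groups with identically distributed true repayment ability.
group = rng.integers(0, 2, n)            # protected attribute (hypothetical)
ability = rng.normal(0, 1, n)            # same distribution for both groups

# Historical approvals were biased: group 1 needed higher ability to pass.
hist_approved = (ability - 0.8 * group + rng.normal(0, 0.3, n)) > 0

# A proxy feature (think postcode) correlates with group membership, so
# excluding the protected attribute itself does not remove the bias.
proxy = group + rng.normal(0, 0.5, n)

X = np.column_stack([ability, proxy])    # protected attribute excluded
model = LogisticRegression().fit(X, hist_approved)

# Equally able applicants (ability = 0) from each group receive very
# different predicted approval rates, because the model has learned the
# historical pattern through the proxy.
test_ability = np.zeros(1_000)
for g in (0, 1):
    test_proxy = g + rng.normal(0, 0.5, 1_000)
    rate = model.predict(np.column_stack([test_ability, test_proxy])).mean()
    print(f"group {g}: predicted approval rate {rate:.2%}")

Run as written, the model approves equally able applicants from the two groups at markedly different rates, which is the pattern Ming describes.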

You can read the full article here.