https://rtau.blog.gov.uk/2020/05/14/public-attitudes-on-the-fair-use-of-data-and-algorithms-in-finance-collaborating-with-the-behavioural-insights-team-bit/

Public attitudes on the fair use of data and algorithms in finance

Categories: Algorithms, Bias, Data collection, Decision-making

Financial companies are increasingly using complex algorithms to make decisions about loans and insurance: algorithms that look for patterns in data associated with the risk of default or of high insurance claims. This raises risks of bias and discrimination, because the algorithms can identify risk factors that are linked to protected characteristics such as sex or ethnicity. For example, an algorithm might offer less credit to people who work part-time, which would affect women more than men.
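To make the proxy problem concrete, here is a minimal, purely illustrative Python sketch (not the CDEI/BIT experiment, and with entirely made-up numbers): a lending model that is never shown an applicant’s sex can still approve men and women at different rates when a feature it does see, such as part-time status, correlates with sex.

```python
# Illustrative sketch only: a model that never sees a protected
# characteristic can still treat groups differently when a feature it
# does see (part-time status) correlates with that characteristic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Synthetic population; sex is recorded here only so we can audit outcomes.
sex = rng.integers(0, 2, n)                            # 0 = male, 1 = female
# Assumption for illustration: part-time work is more common among women.
part_time = rng.random(n) < np.where(sex == 1, 0.40, 0.15)
income = rng.normal(30.0, 8.0, n) - 8.0 * part_time    # income in £k
# Default risk depends on income and part-time status, not on sex itself.
default = rng.random(n) < 1 / (1 + np.exp((income - 25.0) / 5.0))

X = np.column_stack([part_time, income])               # no sex column
model = LogisticRegression().fit(X, default)

approved = model.predict_proba(X)[:, 1] < 0.5          # approve low predicted risk
print(f"approval rate, men:   {approved[sex == 0].mean():.2%}")
print(f"approval rate, women: {approved[sex == 1].mean():.2%}")
```

Nothing in the model is labelled ‘sex’, yet the two approval rates diverge: the correlation between part-time work and sex carries the information in through the back door.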

The CDEI is currently working on recommendations on how organisations can identify and mitigate algorithmic bias. To begin this process, we wanted to learn more about public attitudes, most notably how concerned the public are about these risks. To do this, we ran an online experiment with the Behavioural Insights Team (BIT) in which participants could choose to reward different (imaginary) banks, or move money away from them, according to how each bank used data to decide how much to lend to people.

What did we find out?

We found that, on average, people moved twice as much money away from banks that use algorithms in loan application decisions when told that those algorithms draw on proxy data for protected characteristics or on social media data.

Unsurprisingly, those most at risk of discrimination felt most strongly that it is unfair for a bank to use proxy information for protected characteristics, although this finding is not conclusive. For example, women penalised the bank that used information which could act as a proxy for sex more heavily than men did, though the difference was only directional.

When participants believed that an algorithm was more accurate, however, they viewed it as fairer. This raises the question of whether there are legitimate proxies, such as salary, which, although they can function as proxies for sex and ethnicity, may also genuinely help a bank make accurate decisions about loan eligibility.

People’s views of social media use in algorithmic decision-making are also interesting. Although the difference was not statistically significant, people appeared less concerned about the use of social media data than about data relating to sex and ethnicity. Heavy social media users were, however, no more concerned than lighter users about the use of social media data to inform loan decisions.

The experiment also highlighted inherent tensions that banks face when using data and algorithms. For example, it is illegal to use protected characteristics as the basis for less favourable treatment, so banks avoid collecting sensitive data about their customers and feeding it into algorithms. Without collecting this data, however, banks cannot test whether their algorithms indirectly discriminate against customers. This tension was well illustrated by the recent investigation by New York’s Department of Financial Services into Goldman Sachs for potential credit discrimination by sex.
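To illustrate the kind of check that becomes impossible without the sensitive data, here is a small sketch of a basic outcome audit: comparing approval rates across groups and computing their ratio. The function name, example data and the four-fifths warning threshold are illustrative assumptions, not drawn from the investigation or the report.

```python
# A bank cannot run this audit if it never collects the protected
# characteristic: it needs group labels to compare outcome rates.
from collections import defaultdict

def approval_rates(decisions, groups):
    """decisions, groups: parallel iterables of approval booleans and labels.
    Returns the approval rate per group and the min/max rate ratio."""
    approved, total = defaultdict(int), defaultdict(int)
    for ok, group in zip(decisions, groups):
        total[group] += 1
        approved[group] += ok
    rates = {g: approved[g] / total[g] for g in total}
    return rates, min(rates.values()) / max(rates.values())

rates, ratio = approval_rates(
    decisions=[True, False, True, True, False, False, True, False],
    groups=["M", "M", "M", "M", "F", "F", "F", "F"],
)
print(rates)                         # approval rate by group
print(f"impact ratio: {ratio:.2f}")  # below ~0.8 is a common warning sign
```

The point is not the arithmetic, which is trivial, but the input: the `groups` column is exactly the data a bank avoids collecting, which is why it cannot test its own algorithms for indirect discrimination.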

The full details of the experiment and our findings can be found in the report published by BIT.

Next steps

The report sets out ideas for further work by the CDEI, policymakers and financial services. One interesting area would be public attitudes towards newer forms of data being explored in finance and insurance, such as social media, wearables and telematics, with the aim of reaching an industry-wide consensus on what constitutes responsible use of this data. A similar suggestion was made in the CDEI’s Snapshot on AI and Personal Insurance.

This work formed part of a year-long review into bias in algorithmic decision-making, originally due for completion at the end of March. The publication of the report has been delayed due to COVID-19; however, we will be sharing further details of our findings in the coming weeks, leading up to publication in the summer.
