This report draws together the findings and recommendations from a broad range of work. We have focused on the use of algorithms in significant decisions about individuals, looking across four sectors (recruitment, financial services, policing and local government), and making cross-cutting recommendations that aim to help build the right systems so that algorithms improve, rather than worsen, decision-making.
Almost all (13 of 16) of this month’s entries were related to healthcare, with the majority of those specifically looking at use-cases in hospitals. Given that the UK faces an ongoing public health crisis and is entering a second wave of coronavirus infections, it is not surprising that these use-cases are the most prevalent at this time.
The number of brand-new use-cases we are seeing each month has declined since we began compiling the COVID-19 repository, although we continue to find further examples of the entries we have been tracking, indicating that existing use-cases are being adopted more widely.
Although the majority (70%) of the use-cases we have added over the last month are still related to health and social care, the focus has moved away from managing the immediate public health crisis and towards building future resilience.
What is the AI Barometer? The CDEI’s AI Barometer is a major analysis of the most pressing opportunities, risks, and governance challenges associated with AI and data use in the UK. The first iteration of the AI Barometer covers five …
Although there are a number of entirely new uses of data and AI, approximately half of our April database relates to extensions or pivots of existing activity to a new context, or a new synthesis of existing data sources. We would expect to see an increase in novel use-cases over time, once projects have had further development and testing.
Financial companies are increasingly using complex algorithms to make decisions about loans or insurance: algorithms that look for patterns in data associated with risk of default or high insurance claims. This raises risks of bias and discrimination …
Recent reports suggest 9 out of 10 people are biased against women in some way. We wanted to mark International Women’s Day this year by talking about bias in a world of data-driven technology and artificial intelligence, and our forthcoming report on bias in algorithmic decision-making.
Who gets to decide where the boundary lies between national security concerns, and an individual's right to privacy? Are the right guidelines in place for policing's increasing use of technology and data? These were some of the issues discussed at this week's lecture hosted by RUSI.
If 2020 is going to be the year that we get serious about ethical principles, as many commentators in the AI and ethics community claim, I can think of few better places to begin than with personal insurance.