I have long believed in the power of data to make a meaningful difference to people's lives. Working in the field for over 30 years, I have seen the ways in which data can be a powerful force for good: from relevant, money-saving information at the touch of a button, to transforming lives with the promise of AI-enabled drug discoveries to cure life-threatening illnesses.
I have also seen that we can only realise these benefits by using data wisely and responsibly, thereby earning the trust of the public. It was this commitment to trustworthy innovation in data that motivated me to join the Centre for Data Ethics and Innovation’s Advisory Board in November 2018, and why I was delighted to be asked to take a role chairing the Advisory Board in September 2021.
Now that my time with the CDEI is coming to a close, I want to celebrate what the organisation has achieved in that time, as well as look ahead to the Centre’s future. From its beginnings, delivering policy reviews that shaped the Government’s thinking on algorithmic bias and online targeting, to our current position as a key part of DCMS, the CDEI continues to deliver on trustworthy innovation in data and AI.
The past year has been a particularly exciting one, with the launch of our three programmes - AI Assurance, Algorithmic Transparency, and Responsible Data Access - dealing with some of the most pressing issues in the field. My priorities for the CDEI’s work on these programmes have been: an increased focus on private sector innovation, alongside the CDEI’s vital public sector work; enabling access to better quality data; and ensuring that diversity and inclusion are always embedded in our approach.
Our AI Assurance programme is one area in particular where we’ve been working closely with the private sector to help deliver a pro-innovation environment for AI. Projects in this area include our wonderfully futuristic work with the Centre for Connected and Autonomous Vehicles on self-driving vehicles and our partnership with the Recruitment and Employment Confederation to develop guidance on the responsible use of AI in recruitment.
The recruitment project is an example of where AI use has the potential to be transformational, but where we need to make sure that we limit any potential bias. I have long been committed to improving diversity, particularly focusing on women’s empowerment and equal access to careers in the tech sector, and I was especially pleased to support this work. As the founder of the Female Lead, I believe it is vital to ensure that the technologies of the future are not shaped by today’s biases.
In the next year, I am excited to see how the CDEI works closely with industry to examine the barriers and enablers to AI assurance, so it can continue to support this ecosystem (which could be worth billions of pounds to the UK economy!).
The CDEI’s approach to the Algorithmic Transparency programme has demonstrated our commitment to the inclusive engagement of the general public. The programme is helping to build trust in the public sector’s use of algorithmic tools, data and AI. Working with the Cabinet Office, the CDEI has developed one of the world’s first standards for algorithmic transparency in the public sector. The Standard helps organisations to inform citizens about the algorithmic tools they use, and why they’re using them. This work has been shaped by deep engagement with the public - helping us to understand what it takes to ensure that transparency is meaningful to citizens - and designing the Standard based on these insights. The programme is exploring how the Standard might be used in other contexts and I am pleased to see that some private sector organisations are already considering how to replicate it for their own purposes.
Finally, our Responsible Data Access programme will help ensure innovators have access to the high quality data they need, while building public trust. Many innovators who could do great work with data simply lack the confidence that they can access and use it in a way that is trustworthy and legal. The CDEI's Responsible Data Access programme is helping to develop practical solutions to address these barriers.
For instance, the CDEI launched privacy-enhancing technologies (PETs) prize challenges with the US Government to drive forward innovation in these promising techniques. PETs allow innovators to realise the benefit of training AI models on sensitive data while preserving the privacy of individuals, giving them access to high quality data. These emerging techniques offer up the promise of using data to tackle some of our biggest societal challenges, and seeing the CDEI work closely with some of the biggest names in the private sector, including SWIFT and BNY Mellon, is something that I hope will continue. I’m looking forward to the announcement of the prize challenge winners later this year.
The CDEI’s commitment to collaboration and getting the very best expertise into the organisation is one of the things that has impressed me most about the Centre. I am proud of the role the Advisory Board has played in developing the CDEI’s projects, whether that’s been linking the CDEI team with contacts that have unlocked new opportunities or providing (sometimes fairly robust!) feedback from their own experience. Neil Lawrence, one of the existing Advisory Board members and DeepMind Professor of Machine Learning at the University of Cambridge, will step up to chair the Advisory Board on an interim basis until September 2023. I wish Neil, and all of our expert colleagues on the Advisory Board, the very best for 2023 and beyond.
Although my formal engagement with the CDEI is coming to an end, I will follow their work with interest, confident that the team will continue to engage and collaborate with the private sector and to explore and embrace some of the newest technologies and applications using an increasingly rich flow of data. The pace of change is only increasing, and the CDEI’s mission is more important than ever. It's less about ‘can we’ and more about ‘should we’… and if so, ‘how’?