
https://cdei.blog.gov.uk/2020/07/19/addressing-trust-in-public-sector-data-use/

Driving forward trustworthy data sharing

Categories: Covid-19, Data-sharing, Ethical innovation

Set up by the government in 2018, the CDEI has a unique remit: to help the UK navigate the ethical challenges presented by AI and data-driven technology. We are led by an independent board of experts from across industry, civil society, academia and government. CDEI publications do not represent government policy or advice.

Today the CDEI publishes its first report on public sector data sharing. It explores the barriers to data sharing, focusing on citizen trust, which the report argues must be addressed if we are to maximise the value of the data the public sector holds.

The last few months have shown more than ever the importance of access to data for the running of responsive and effective public services. Coronavirus response teams from across the public sector have needed urgent access to information to understand how the situation is changing day-by-day and act accordingly.

The need to respond to the crisis has affected every part of government and unified the public sector around a common objective. At the beginning of the pandemic, the Information Commissioner's Office (ICO) and the Secretary of State for Health and Social Care both highlighted the flexibility available under GDPR to allow data to be shared in the name of public health. This helped to avoid misconceptions about data protection legislation getting in the way of what was necessary to save lives. 

This environment has seen data shared and used to confront different challenges posed by coronavirus. Data has been shared from NHS trusts around the country to understand the pressure on hospitals in real-time. The private sector has shared data to support the tracking and forecasting of the spread of the virus. And data, such as the shielded patients lists, has also been shared to support local public service delivery.

That is not to say there have not been difficulties. But in a time of crisis, when the immediate reasons and benefits are clear, it is evident that data can be shared quickly. The report we publish today highlights that sharing data has traditionally been more challenging. It also argues that public trust is crucial. Data sharing across the public sector has for a long time been characterised as resource intensive and bureaucratic, held back by technical, legal, and cultural barriers. We must now reflect on what can be learnt from the crisis to support trustworthy data sharing for the benefit of society.

Trustworthy data sharing

Our report was largely written before the start of the current crisis. But the response to COVID-19 has brought the issues into sharp focus.

We have looked at case studies from across the public sector to understand how data can be shared successfully, and how teams have overcome the technical, legal and cultural barriers highlighted by many other reports. 

Across the case studies we found that inconsistent approaches to these solutions, especially the use of different legal gateways and governance mechanisms, create a complex environment that hinders transparency and accountability. Only limited efforts were made to address how this could negatively affect public trust.

Most of these projects exist in a high-risk environment of “tenuous trust”, in which citizens are not particularly aware of how data about them is used and shared. In surveys, citizens express a willingness to allow data to be used by the public sector for publicly beneficial purposes. However, when citizens discover - often as a result of a data breach or a media investigation - data uses they knew little or nothing about, the unwelcome surprise can further damage trust and undermine future efforts to use data for public good.

The end result is often a profound uncertainty among public servants about what types of data use are considered acceptable by citizens. Such uncertainty may provide a perverse incentive to restrict public awareness of how data is used and shared in attempts to prevent a public backlash. But it is perhaps more likely to hinder beneficial initiatives that could spur innovation. Data holders may not share data despite there being a compelling public benefit. 

Trust matrix

In our report we lay out an initial framework designed to help those seeking to use data to do so in a trustworthy way. The framework aims to start a conversation about creating consistency and clarity in how the public sector approaches data sharing projects. The questions listed in the matrix are ones that any well-run project would address; what underpins a trustworthy approach is being able to demonstrate that they are being addressed. In particular, it is important to be clear about when data is being used to explore the feasibility or potential impact of a policy, and when it is being used directly in service delivery, as the risks and governance considerations differ. The relevant questions may also vary between projects: an important step in the process, which this matrix is designed to support, is identifying the specific questions that need to be answered.

Key questions to consider
Value (and impact)

Data use should provide a benefit to individuals or society that is measured and evidenced.

  • Who benefits from the data being shared?
  • Who has to take on any risk?
  • Is there a clear statement of the expected benefits? 
  • How are different groups (and individuals) in society affected?
  • Does the benefits statement distinguish between benefits from ‘anonymous’ data (to produce statistics, test hypotheses, model impacts, develop potential products) and use of personal data (to deliver products to individuals or make decisions about individuals)?
  • Does the benefits statement clearly state how benefits, potential harms and biases will be measured?
Security

What is in place to ensure data is used securely and protects individual privacy?

  • What measures are in place to prevent misuse and to control for extensions to original purposes?
  • Is there appropriate use of data minimisation, de-identification and privacy enhancing technology? Is the extent of data used justified by the benefits statement? 
  • If data is being used anonymously is there a clear definition of what this means (e.g. ONS Five Safes) and how it is applied? Given the differing interpretations of this term, it can help to use alternative language stating explicitly how privacy is being preserved.
Accountability (over and above compliance with the Data Protection Act)

Who is responsible for decisions about data use? 

  • How are decisions made about acceptable levels of efficacy and safety; the trade-offs between benefits and risks, including risks of privacy invasion or bias; levels of transparency and user control? Are the decisions and their rationale documented?
  • What mechanisms are in place to ensure accountability for decisions? 
  • If individual subjects do not give explicit consent, what mechanisms are in place to ensure broader societal consent?
Transparency

To what extent is the rationale and operation of the project open to public scrutiny?

  • Are the answers to the issues raised in this framework in the public domain, including the rationale for any trade-offs between privacy and efficacy?
  • Are an appropriate budget and resources in place to communicate the rationale for the project to those affected?
  • To what extent is the evidence of efficacy and privacy open to independent scrutiny through open source code and scientific evaluation? 
Control

What role do individuals have in the decision to use data about them?

  • To what extent does the project result in a product or service that delivers a benefit to individuals, who can choose whether or not to use it?
  • To what extent does the project enable individuals to see or use the data generated about them, for example through data portability mechanisms or personal data stores?
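For teams who want to track their answers systematically, the matrix above could be encoded as a simple checklist. The sketch below is purely illustrative and not from the report: the data structure, the `outstanding_questions` helper, and the abbreviated question set are all assumptions, showing one minimal way a project team might record which questions have a documented answer.

```python
# Illustrative sketch only: encoding a subset of the trust matrix as a
# checklist so a project team can see which questions remain unanswered.
# The structure and helper function are hypothetical, not from the report.

TRUST_MATRIX = {
    "Value (and impact)": [
        "Who benefits from the data being shared?",
        "Who has to take on any risk?",
        "Is there a clear statement of the expected benefits?",
    ],
    "Security": [
        "What measures are in place to prevent misuse?",
        "Is data minimisation and de-identification applied appropriately?",
    ],
    "Accountability": [
        "Who is responsible for decisions about data use?",
    ],
    "Transparency": [
        "Are the answers to this framework in the public domain?",
    ],
    "Control": [
        "Can individuals choose whether or not to use the product or service?",
    ],
}


def outstanding_questions(answers):
    """Return (category, question) pairs without a documented answer.

    `answers` maps question text to free-text answers; empty or missing
    answers count as outstanding.
    """
    return [
        (category, question)
        for category, questions in TRUST_MATRIX.items()
        for question in questions
        if not answers.get(question, "").strip()
    ]


# Example: a project that has so far only documented its benefits statement.
answers = {"Is there a clear statement of the expected benefits?": "Published."}
todo = outstanding_questions(answers)
```

Recording answers against each question in this way would also make it straightforward to publish them, supporting the transparency questions in the matrix itself.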

CDEI is working with other organisations to apply, test and revise the framework in different contexts. We are keen to hear from other organisations who may be interested in working with us.
