Since March, we have been working to understand how changing the way technology is designed can enable and encourage users to make more “active” choices about their online experience; that is, choices that reflect users’ wishes, made without obstruction and based on an understanding of their likely consequences.
Industry, regulators, government, and civil society have all been thinking about how to empower users online. The CMA called for a “fairness by design” duty in its market study of online platforms and digital advertising. This project is designed to strengthen the evidence base and identify effective ways to enable users to make active choices online.
What we’ve done so far
We have spoken to representatives from industry, regulators, and civil society to draw out important user design principles. We have also reviewed a collection of prominent data-driven services to identify barriers to people making active choices. These barriers include poorly explained settings, hidden controls, and a lack of feedback on how changes will affect the user experience.
Informed by this research, we ran workshops with participants from industry, regulators, and civil society, to create prototype designs that we think will help users to make active choices. These designs cover a range of typical online experiences, as well as a number of the different types of controls users are offered online.
Our typical relationship with internet services
Even for those who are aware that some companies are collecting, using, and sharing more data about them than feels comfortable, finding and changing privacy settings is a struggle. And when settings are changed, it is not always clear what impact this has.
Our relationship with websites
On accessing a site for the first time, a user is typically greeted by a banner asking them to accept “cookies”. Often, the easiest way to remove that banner is to click “accept all”. Sometimes users are instead taken through a long list of options, but it is rarely clear what these mean, so many click accept without fully understanding the implications of that choice.
Our relationship with social media
People don’t always know how the content they see on social media feeds is organised or how the content can be controlled with different settings. This can lead to them seeing false information without knowing where it comes from, why they are seeing it or how they can block it.
Our relationship with our smartphones
Newly installed smartphone apps will often ask permission to access a user’s camera, microphone, location or other functions, but it can be unclear whether or not these permissions are necessary for the app to function properly. Users may therefore accept these permissions with little understanding of what data the new app can access and how it’s used.
These examples reflect the relationship many of us have with our data collection and privacy settings when using online technology.
Why user design matters
Data-driven services affect so many aspects of our lives, yet people cannot easily shape them to reflect their own values and preferences. Research has shown that, when asked how they feel about the way digital platforms use their data, the public finds the controls companies currently offer over their data hard to use, or even to understand. During the CDEI’s own public engagement research, conducted last year, most participants who tried to change their settings and preferences online found doing so challenging and questioned whether the settings offered meaningful control.
This has left many people feeling resigned to an imbalanced relationship between themselves and the technology they use. Doteveryone’s research found that 89% of the public say it is important to be able to choose how much data they share with companies, but only 25% are able to find out how much they are sharing. Nearly half (47%) feel they have no choice but to sign up to services despite their concerns.
The design of user interfaces developed by digital platforms is an increasingly prominent issue. Growing attention is paid to the power of “defaults” (preset options selected by the service, which the user can later change) and the use of so-called “dark patterns” (designs in which the settings that benefit the company running the online service are made more prominent than those that do not) to influence decision-making.
Digital platforms have acknowledged this by starting to make changes to their services. Google has changed its data retention practices to make auto-delete the default for core activity settings, Apple now allows developers to detail their app’s privacy practices in the App Store for user review, and Facebook’s Privacy Basics now guides users through privacy options. More work is needed to build on this progress and move the industry towards engaging users in active choices.
Over the next six months, we will run user testing of these prototypes with thousands of participants and measure their impact on users’ ability to make active choices. In this live experimental setting, we will look at ways to make decisions easier and more informed, testing various principles around design and behavioural science that have been put forward.
The CDEI will publish the results of these experiments in a report that will also present further ideas for ways active choices could be better facilitated by firms and regulators.
Our published progress update summarises the research used to inform the experiments and sets out the detail of how they will run.
About the CDEI
The CDEI was set up by government in 2018 to advise on the governance of AI and data-driven technology. We are led by an independent Board of experts from across industry, civil society, academia and government. Publications from the CDEI do not represent government policy or advice.