The world’s leading digital media and regulatory policy journal

Improving trust in digital services

With continuing concerns over the safety of online interactions, CHRIS TAYLOR and JOHNATHAN CHARLES see forthcoming regulation as an opportunity to improve engagement among less confident citizens

Digital services are already a central feature of our everyday lives. However, we may be just at the start of a journey on which more and more transactions and experiences will migrate from the physical to the digital world. Development of digital services in the coming decade has the potential to deliver innovative use cases of great benefit in important sectors like healthcare and education, as well as further transforming media and entertainment.

Adoption of new technology and engagement with digital services is not homogeneous. Digital engagement can be constrained by a number of factors, including lack of infrastructure or poor quality connectivity, low digital skills, and low trust and confidence arising from concerns about online harms. In this paper, we look at the last of these factors and explain how new legislation and regulatory reforms present important opportunities to tackle barriers to confidence and trust.

Confidence to use digital services

Evidence shows that, whilst the digital economy continues to grow and deliver benefits, exposure to online harms, or fear of them, may affect trust and confidence in digital services, creating an impediment to engagement and take-up.

It is important to give this the right context. Research has found that most people are generally comfortable online and confident that they can identify and navigate away from harmful material.1 However, there is evidence that a significant proportion of users have been exposed to or experienced online harm, and that people have reported high levels of concern about this.2 3 A number of studies have identified areas in which people feel uneasy or distrust some digital services.4 5

Risks to consumers and citizens in the digital world should not be seen as static. Just as the online ecosystem evolves and the benefits it brings change, so will risks of harm and perceptions of them. Data from the UK Office for National Statistics demonstrate this. These data show that, between 2010 and 2019 (the latest year for which data are available), the steepest rise in reasons for not being online in Great Britain related to privacy or security concerns.

Figure 1: Percentage of households in Great Britain which do not have internet access by reason given

Reason given                            2010 (%)   2019 (%)
Don’t need internet                        39         61
Lack of skills                             21         34
Privacy or security concerns                4         33
Access costs too high                      15         29
Equipment costs too high                   18         28
Other reason                               13         25
Have access to the internet elsewhere       8         16

What are online harms?

In the digital world, as in any setting where large numbers of people meet and transact, things can go wrong. Causes can be legal (e.g. exposure of individuals to content which they find distressing or addictive) or illegal (e.g. fraud, or misuse of personal data). Most of these harms are not unique to digital environments, but they have found new manifestations there. Problems may arise which are particular to online experiences, both as a result of the ease with which individuals can access material from any location at any time, and the amount of time they spend online.6 It follows that, as the reach and importance of the digital world grows, so does the potential for harm there. Harms may arise individually or cumulatively (for example, increased exposure to other risks can be a consequence of online addiction).

Examples of online harms

1. Harmful content

Harmful content covers a very broad range of problems, including illegal material (such as incitement to violence), legal but harmful material (such as content which might give rise to eating disorders), targeting of individuals (such as cyber-bullying, trolling, cyber-stalking and harassment) and misinformation (such as ‘fake news’).

2. Risks to health

For some individuals, online services can risk exposure to material or behaviours which become addictive, or can exacerbate pre-existing health risks. Examples include addiction to gambling or pornography, where online channels also raise risks of age-inappropriate behaviour. Excessive screen time can lead to addictive behaviour in relation to social media or other online facilities.

3. Fraud and scams

Scammers have used online channels to target consumers, including particularly vulnerable groups, with misleading or false information (e.g. fake reviews). This can include investment or pensions fraud.

4. Dark patterns

The presentation of information in ways that manipulate consumers into making a choice which favours the content provider, for example messages telling consumers that there is limited stock of an item and/or that it is in high demand.

5. Unfair price discrimination

Data on individual preferences or behaviours can be used to target individuals unfairly with higher prices for goods and services, based on factors such as location and browsing history.

Benefits of digital services and digital engagement

Overall, digital services deliver private benefits to individuals and public benefits to us all. Some examples of this are presented in Figure 2.

Figure 2: Examples of the private and public benefits of digital services

Service type         Type of benefit   Examples of benefits
Digital healthcare   Private           Easier and quicker access to services, including 24/7.
                     Public            Improved outcomes for public health, education, and employment. Lower cost of delivery.
Digital education    Private           Distance learning. Learn at your own pace. Learn when you like, with 24/7 access.
                     Public            Improved outcomes for public health, education, and employment. Lower cost of delivery.
AI                   Private           Faster and cheaper access to some services and facilities. More reliable results for some interactions.
                     Public            Improving reliability and accuracy for some public services, e.g. road traffic management. Lower cost of delivery.

Factors that reduce engagement dilute benefits. Improving engagement through proportionate and targeted interventions can therefore benefit everyone.7

The opportunity to address harms and build trust and confidence

It is in the interests of all stakeholders to improve the digital landscape by addressing online harms.

2023 and 2024 will be landmark years for regulation of digital services. Globally, the United Nations has a programme of work on digital development, and is preparing for its Summit of the Future in 2024 which includes digital engagement as an area of potential action.8

Meanwhile, reform is underway or planned in a number of jurisdictions, including:

  • Across the EU through the Digital Services Act, Digital Markets Act and Artificial Intelligence Act.
  • In the UK, the Online Safety Bill and the Digital Markets, Competition and Consumers Bill are progressing through parliament.9 The Competition and Markets Authority has established a Digital Markets Unit to expand its capability and capacity to regulate digital markets and, in July 2022, issued a joint statement with Ofcom on the approach to regulation of online safety and competition.10
  • The Australian Competition and Consumer Commission is undertaking a major review of markets for digital services and platforms.11

Developing regulation to address online harms is breaking new ground in an environment where regulation has until now been ‘light touch’ (for example, compared to regulation of broadcast content or telecommunications services). Hence, there is a need for innovative thinking by policymakers to strike an appropriate balance between effective safeguarding of consumers and competition, and the benefits of innovation in the provision of online services, which regulatory failure could constrain. Simple transference of regulatory methods from other regulated sectors to the digital economy is unlikely to be effective, and may be damaging.

Policymakers will find it helpful to consider the following points as they build regulatory frameworks for digital services:

  • Common measures and approaches to effectively and proportionately address harms. These might include:
    • clear identification and codification of harmful content, giving examples of material which is illegal and that which is legal but may be harmful to groups or individuals
    • measures for effective prevention, detection and enforcement against harms
    • effective, transparent and navigable systems for consumer redress when things go wrong and/or consumers are dissatisfied, including independent sources of help and advice where there are disputes between a consumer and their provider12
    • clear and straightforward measures to ensure children are not exposed to age-inappropriate content
    • information and support for consumer self-help in protecting themselves and their families from harmful content
    • targeted measures to reach and support individuals with low digital skills, particularly those who are isolated without access to help.
  • The coordination of consumer protection in digital markets with other aspects of regulation, such as measures to safeguard fair competition, and the prevention of harmful advertising so that the regulatory framework is coherent and holistic.
  • The measuring and tracking of outcomes, identifying risks of harm as well as benefits.
  • Future proofing through analysis of changes to the consumer experience as digital markets evolve and new services emerge, and flexibility for protections to adapt to new sources of harm.

A need for international coordination

Digital markets are rarely constrained geographically, and online harms do not stop at national borders. There is therefore a need for national authorities to cooperate across jurisdictions to address international or global harms.

Inter-agency dialogue and sharing of information between jurisdictions can help in developing effective regulatory measures. Regulation is not consistent between countries and regulatory capacity is uneven. For example, the United Nations reports that 80 per cent of developed countries in Europe have laws for online consumer protection, compared to 41 per cent of least developed countries.13 This does not mean that regulation should be identical in all countries, especially in the area of content which may have unique characteristics and may be culturally significant in some countries whilst not in others. However, benefit can be gained from shared learning, including through the exchange of research and analysis, knowledge transfer and capacity building between countries.


Chris Taylor

Chris Taylor is a partner at Plum Consulting. He is a specialist in communications and media regulation and was formerly director of consumer policy at Ofcom.

Johnathan Charles

Johnathan Charles is an analyst at Plum Consulting, focusing on the telecoms, media and technology sector.

1 Ofcom research found that 69 per cent of users are confident about their ability to stay safe online. bit.ly/44AXtkQ

2 Ofcom’s Online Nation 2022 Report states that 63 per cent of users have been exposed to at least one potential harm online in the last four weeks. bit.ly/3O4Xxm4

3 Doteveryone 2020 research included findings that 84 per cent of people are concerned about children accessing inappropriate content, and 83 per cent are concerned about online scams. bit.ly/44RVKaO

4 For example, research for the Aviva Fraud Report 2021 found that 53 per cent of internet users do not trust advertisements on search engines. bit.ly/3Q8rCE0

5 For example, research by the Centre for International Governance Innovation, Ipsos, UNCTAD and the Internet Society reports a variety of factors which affect trust. bit.ly/3Y3I0qY

6 Ofcom’s Online Nation 2022 Report states that UK adults spent on average four hours a day online. bit.ly/3O4Xxm4

7 In a previous article for InterMedia, Sam Wood of Plum carried out analysis that indicated that the benefits of online harms regulation could significantly outweigh the costs. bit.ly/44zsZ2Q

8 bit.ly/3Q7pGvl

9 bit.ly/43BvAry

10 bit.ly/3rA9bh8

11 bit.ly/3O4hRDU

12 Research by Doteveryone found that people do not have access to adequate redress and Doteveryone have made recommendations to rectify this. bit.ly/44RVKaO

13 bit.ly/3O4hIQS