Tech News: Disinformation or Misinformation?

The new Online Media Literacy Strategy from the Department for Digital, Culture, Media and Sport (DCMS) is aimed at supporting 170+ organisations to improve media literacy rates in the UK, and thereby helping young people to spot disinformation.

As an aside, misinformation is information that is simply wrong, inaccurate or misleading (without necessarily any intention to deceive), whereas disinformation is a subset of it, i.e. information that is deliberately wrong, inaccurate or misleading.

As an aside to the aside, mistrust and distrust are roughly the same in meaning (i.e. not to trust someone or something), although, according to Dictionary.com, distrust implies having evidence to support that feeling.

Disinformation Problem

The Strategy, which was promised in the government’s online harms white paper, is intended to help tackle the problem that many young people in the UK are not able to distinguish between disinformation/misinformation and truth in what they read online.  For example:

Ofcom figures show that 4 out of 10 UK adult internet users don’t possess the skills to critically assess content online.

National Literacy Trust research figures show that only 2 percent of children have the skills they need to identify misinformation, just over half of teachers (53.5 percent) think that the national curriculum doesn’t equip children with the literacy skills they need to identify fake news, and 2 in 5 parents (39 percent) don’t watch, listen to, or read news with their child at home.

Pandemic Highlighted Problem

The fact that many young people may have been deterred from accepting the COVID-19 vaccine and/or have believed misinformation and conspiracy theories about the origins and causes of the pandemic has highlighted the problem. For example, popular stories believed by some in the UK, highlighted in University of Cambridge research (Oct 2020), include that:

– COVID-19 was engineered in a Wuhan laboratory (22 percent believed it).

– The pandemic is “part of a plot to enforce global vaccination” (13 percent).

– 5G telecommunication towers worsen COVID-19 symptoms (8 percent).

Who and Why?

Back in October 2020, Cambridge’s Winton Centre for Risk and Evidence Communication and the UK Cabinet Office (the ‘Go Viral!’ project) studied correlations between certain beliefs, demographic categories, and the perceived reliability of misinformation. They discovered that:

– High levels of trust in science equate to low levels of susceptibility to false information (across all nations).

– Better numeracy skills are a predictor of greater resistance to misinformation.

– Being older is linked to lower susceptibility to COVID-19 misinformation.

– Identifying as more right-wing/politically conservative is associated with a higher likelihood of believing COVID-19 conspiracies.

– With COVID-19, a small (one-seventh) increase in how reliable misinformation is perceived to be is associated with a much bigger (23 percent) drop in the likelihood that a person will agree to get vaccinated.

Ultimately, as summarised last week by the Minister for Digital and Culture, Caroline Dinenage: “False or confused information spread online could threaten public safety and undermine our democracy.”

Training Trainers

The newly announced strategy aims to train a wide variety of UK organisations to teach others to better understand the online world and to critically analyse the content they see, thereby helping them to spot misinformation.

Criticism and Challenges

Criticism of the strategy includes that:

– It is possibly a missed opportunity: less a strategy and more a shopping list of useful actions that mirror what has gone before rather than charting new directions (says LSE’s Professor Sonia Livingstone).

– The strategy appears to blame the user for the problems of the digital world.

– The strategy may be weaker than it could be because it is linked to the Online Safety Bill, so it focuses on reducing consumer harms rather than addressing the breadth and depth of the media literacy agenda.

Challenges for the strategy include:

– Exposure to misinformation and disinformation can be influenced by changes to algorithm design and content feeds, which means that tech companies also have a part to play.

– Motivations for believing (and wanting to spread) misinformation are varied and can be complicated; anti-vaxxer mentalities and ‘cult’-type attitudes are therefore difficult to break down and challenge, even with well-meaning teaching.

What Does This Mean For Your Business?

In terms of tackling health emergencies effectively, education and tackling misinformation are vital. Many young people rely on social media as their main source of news, so giving a wide range of organisations the means to teach young people how to critically evaluate what they read is well-meaning and could have value for young people and society as a whole going forward, which in turn will have value for businesses. However, social media and other platforms use algorithms that influence what is presented to young people, which means that tech companies have an important role and responsibility in tackling the problem. Misinformation is already being tackled to a degree on social media, e.g. through fact-checking and curated news services, but the issue is a wide one, and it is debatable how much of an effect the new strategy will have upon it. One of the strengths of the new strategy, however, is that it leverages the power of many other trusted organisations to help deliver it.