How government communicators should tackle misinformation
About the author
Harjinder Randhawa prepared this article as part of a CIPR Professional PR Diploma assignment while studying with PR Academy.
Conspiracy theories, misinformation, and “fake news” (or whatever term is used for the spread of inaccurate information) are gaining acceptance more widely and more rapidly than ever. The adoption of social media and online channels as sources of news has fuelled this acceptance. The public, specifically internet users (referred to as publics throughout), are absorbed into online communities, where what they see, share and believe is shaped by algorithms.
The way we consume information – from search engines and social networks to news outlets – is controlled by algorithms, which continually learn from our preferences and serve us customised content we will find congenial.
Publishers deploy these algorithms to increase footfall to their channels and boost revenues. With artificial intelligence (AI) playing a part in a growing number of our interactions, our existence is hyper-personalised[1]: bubbles and chambers are created as systems second-guess what we want to see based on the information they have collected about us.
Filter bubbles and echo chambers are not new, but the increasing use of technology and online communications, combined with our thirst for constant information, is driving what has been coined the ‘infodemic’ – a wide and rapid spread of misinformation – and these incidents are on the rise.
Although segmenting audiences into smaller communities can be useful to communicators who want to target messages, for example for commercial gain, those communities also represent a threat when misinformation is amplified within them.
Communities themselves have a complex structure, characterised by overlaps, hierarchy and multiplexity[2]. Homophily – most closely associated with the proverb ‘birds of a feather flock together’ – has been found to establish itself in online environments and to affect the transmission of information[3], because it makes it easier and quicker to find like-minded people. Similarity breeds connection: individuals in these relationships share characteristics such as beliefs, values, education, religion, social class and occupation[4], and beyond that features such as gender, race and age.
In addition to compartmentalising and retaining groups of relative strangers, online communities can overlap with each other and even nest within one another. This is how filter bubbles and echo chambers develop: we engage with content that conforms to our beliefs and values, it is then delivered to us time and time again, reaffirming our ideology, and few or no opposing views are presented. Researchers have also argued that reduced tolerance of opposing views could lead to more polarised and fragmented political opinions.
There has always been some personalisation in the news we see and hear: we read a particular newspaper, or watch a certain news channel, because it presents the news in line with our views and reflects the values and beliefs we hold. A study of causal factors found that these are the very factors which influence the sharing and acceptance of conspiracy theories[5].
Social media has allowed news – including misinformation – to travel further and faster[6]. It remains ever popular among younger audiences and is increasingly adopted across other age groups: 53.6% of the world’s population uses social media, with average daily usage of 2 hours and 25 minutes[7]. Around 60 per cent of millennials are said to use Facebook as their main source of news, with the remainder using sources such as TV news networks, newspapers and mobile news apps[8].
Social media is appealing because it gives audiences convenient access to information and lets them instantly confirm or reject views. As well as getting their own voices heard, they can listen to what influencers have to say or become influencers themselves. It provides almost immediate reaction to commentary, and validation through the number of ‘likes’ and subscriptions a channel gains. We should also note that the changing face of news has enabled once-marginalised groups to build vocal and powerful social organisations.
Many researchers have pointed to the Brexit referendum and the 2016 US election[9] as events which polarised communities; the campaigns, it is said, were used as tools to divide and conquer people’s hearts and minds. One survey found that 75% of Americans who saw fake news headlines during that election believed them[10], and we have seen much discrediting of the Brexit bus and the figure quoted on its side – £350 million per week.
Conspiracy theories usually work by dividing the world into ‘us vs. them’. There is evidence[11] that far-right groups have used the Covid-19 pandemic as an opportunity to instil ‘fear and uncertainty’ and promote their hateful politics, although those who create and spread conspiracy theories can come from any cross-section of society – anyone, anywhere, can voice an opinion at no cost.
The UK government and its communications function, the Government Communication Service, recognise AI, rapid change in social media, the growing presence of online influencers and disinformation as major challenges[12] – challenges which face governments across the world[13].
A coincidental increase in the number of 5G phone masts in Wuhan was linked to the outbreak of Covid-19 in the region. The story originated in a Belgian regional publication, where a doctor had correlated the two separate facts; within an hour of publication, anti-5G groups had posted about it on Facebook and vloggers were presenting it on YouTube as ‘the truth’. Celebrity endorsement pushed the story further across communities, like-minded people propelled it out of control, and phone masts were set alight. Even though the original story was taken down, the videos continued to gather thousands of views over the days and weeks after the story broke, boosted by engagement algorithms that were not designed to recognise poor content[14].
Eventually, UK news outlets reported on the theory in a bid to quash the claims and halt the vandalism, which had damaged several communication networks – some serving key medical services at a critical time in the pandemic. The owners of the social platforms should have taken some responsibility, and the episode begs the question of how their operations are scrutinised. The government eventually called on social media companies to take down or block posts, supported by a senior NHS medical expert who discredited the claims through an evidence-based narrative.
The theory has since developed various strands, and its origination is claimed by several sources; however, it initially circulated in small online communities of several hundred people before becoming a trending story on Twitter, which subsequently led to the activists’ actions.
A speedy formal denunciation of the theory would have steered No10 clearly through the misinformation and given it early, proactive opportunities to reiterate its calls to action. I believe this would have elevated its position of trust and earned it recognition among publics at a time when the media was describing its handling of the pandemic as chaotic. The story could have been cut short before it spread across Europe, where the theory also took hold in France and Germany.
In contrast to the celebrity advocacy of misinformation in this example, there is growing evidence that social media influencers can help rebut theories, as trust in their voices and their appeal rise across communities[15].
Most recently, we have seen the UK government use celebrities to convey positive vaccination messages to the adults at greatest risk of serious complications or death from Covid-19 infection – a sensible, proactive measure anticipating the anti-vaxxer response.
So how do we as communicators work collaboratively with policy makers to create strategies that tackle misinformation – ready to act quickly, accurately and robustly, backed by data, and using strong, recognised and trusted voices to amplify accurate information?
- Step 1: Start now and start early. Establish a working group of experts who can come together to tackle misinformation – don’t wait until it happens. This group should complement any crisis management team, and its creation must include a period of learning to establish a set of scenarios and handling plans.
As government communicators, remember there is an established approach to communications planning: the OASIS model (objectives, audience/insight, strategy/ideas, implementation and scoring). The stages of the model allow a comprehensive, data-driven strategy to be developed.
In my experience, the most successful and valuable campaigns are those which invest time in setting clear and relevant objectives. Value is also derived from the audience and insight stage – this is where data plays a key role in targeting, recognising early signs of potentially viral narratives online and understanding audiences. One improvement to the model would be to expand this stage into a full stakeholder plan, considering the beliefs and values important to target audiences, so that a targeted approach to monitoring and communicating can be taken forward.
- Step 2: Ensure the team is recognised across the organisation as the experts who can feed into campaign planning, and engage them for monitoring and consultation.
- Step 3: Ensure that policy colleagues produce a set of facts and data to support any reaction needed should a particular scenario arise. This is key to demonstrating authority, trust and confidence.
- Step 4: Challenge scenarios regularly, so that a proactive stance is considered as part of planning wherever one can be taken, and so that where a reactive approach is still required you are prepared to react quickly. How quickly an organisation should respond will depend on the scenario, which is why the working group is key to forming a judgement on these factors and on what has happened in the past. Develop a set of questions to support consistent reassessment.
In the example of the 5G phone masts, communicators could have taken an earlier stance using the data they were already presenting at press conferences, refocusing the public’s attention on the original call to action – reaffirming the objectives to stay home and stay safe.
- Step 5: Regularly assess third-party support for contentious or priority policy areas. Identify stakeholders in positions of authority who can act as advocates and will speak out in support of your objectives. It would also be prudent to assess less supportive voices, who may advocate the misinformation.
With the increasing number of platforms giving people a voice, communications will only become more difficult and complex to monitor, and responses will inevitably be needed faster. We must not forget that much of the responsibility still lies with social media companies, whose actions have enabled the propagation of hate speech, misogyny and more – at times translating into actual crimes, as with ISIS supporters.
Trust in governments is at an all-time low[16], and government relationships with the media have become more combative[17] as traditional media has shrunk and the fight for audiences has intensified. We therefore need to be armed with the facts – but facts can be ignored if publics don’t trust the government, and that is where scenario planning and advocacy are essential.
References
[1] Chang, S. and Buchanan, G. (2020). On Birthing Dancing Stars: The Need for Bounded Chaos in Information Interaction. Paper presented at the ACM SIGIR Conference on Human Information Interaction and Retrieval (CHIIR), 14 – 18 March 2020, Vancouver, Canada.
[2] Palla, G., Derényi, I., Farkas, I. & Vicsek, T. (2005) Uncovering the overlapping community structure of complex networks in nature and society. Nature 435, 814.
[3] Murase, Y., Jo, H., Török, J. et al. (2019) Structural transition in social networks: The role of homophily. Sci Rep 9, 4310.
[4] McPherson, M., Smith-Lovin, L., & Cook, J. (2001). Birds of a Feather: Homophily in Social Networks. Annual Review of Sociology, 27, 415-444.
[5] Halpern, D., Valenzuela, S., Katz, J. & Miranda, J.P. (2019). From Belief in Conspiracy Theories to Trust in Others: Which Factors Influence Exposure, Believing and Sharing Fake News. Social Computing and Social Media. Design, Human Behaviour and Analytics, pp. 217-232.
[6] Dizikes, P. (2018), Study: On Twitter, false news travels faster than true stories [online], MIT news.
[7] Chaffey, D. (2021). Global social media research summary 2021 [online], Smart Insights.
[8] Ingram, M. (2015). Why Facebook’s algorithm matters: because 60% of millennials get news there, Fortune.com.
[9] Gorodnichenko, Y., Pham, T. & Talavera, O. (n.d.). Social Media, Sentiment and Public Opinions: Evidence From #BREXIT and #USELECTION. Berkeley.edu.
[10] Silverman, C. & Singer-Vine, J. (2016). Most Americans Who See Fake News Believe It, New Survey Says [online], BuzzFeed News.
[11] Ahmed, W., Downing, J., Tuters, M. & Knight, P. (2020). Four experts investigate how the 5G coronavirus conspiracy theory began [online], The Conversation.
[12] Government Communications Service plan, available here: https://communication-plan.gcs.civilservice.gov.uk/
[13] WPP Government and Public Sector Practice: The Leaders’ Report available here: https://govtpracticewpp.com/report/the-leaders-report-the-future-of-government-communication-2/
[14] Temperton.J. (2020), How the 5G coronavirus conspiracy theory tore through the internet [online], Wired.
[15] Hutchinson, A. (2018). Why Influencer Marketing is on the Rise, and How to Maximize Your Campaigns [online], Social Media Today.
[16] Edelman Trust Barometer Spring Update available here: https://www.edelman.com/research/trust-2020-spring-update
[17] WPP Government and Public Sector Practice: The Leaders’ Report available here: https://govtpracticewpp.com/report/the-leaders-report-the-future-of-government-communication-2/