Communicators and data protection
About the author
Ranga Nasho is a marketing executive for an automation company. He studied for the CIPR Professional PR Diploma with PR Academy, and this piece was written for one of the assignments.
It’s been called the ‘oil of the digital era’, and seven of the ten most valuable listed companies in the world (Amazon, Facebook, Apple, Microsoft, Alphabet, Tencent, Alibaba) are built on this commodity – data! To put a figure on it, Amazon, Facebook, Apple, Microsoft and Alphabet collectively earned a staggering net profit of $25 billion in the first quarter of 2017, according to a report published in The Economist.
Today, the algorithms that e-commerce brands use to analyse online consumer behaviour, based on our search and order histories, are constantly adjusted to build a tailored shopping profile for each individual. This information is crunched together and redirected to us in the form of personalised pop-up adverts, all with the intent of simplifying our shopping experience and, more importantly, of boosting sales for these merchants in a highly competitive online marketplace.
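As a rough illustration of what such profiling can look like under the hood, here is a minimal sketch – the function names, categories and sample data are entirely hypothetical, not any retailer’s actual system. It tallies the categories in a shopper’s order history, then surfaces catalogue items from their favourite categories:

```python
from collections import Counter

def build_profile(order_history):
    """Count how often each product category appears in a shopper's history."""
    return Counter(category for _, category in order_history)

def recommend(profile, catalogue, top_n=2):
    """Suggest catalogue items drawn from the shopper's most frequent categories."""
    favourite = [cat for cat, _ in profile.most_common(top_n)]
    return [item for item, cat in catalogue if cat in favourite]

# Hypothetical shopper history and catalogue
orders = [("running shoes", "sport"), ("yoga mat", "sport"), ("novel", "books")]
catalogue = [("tennis racket", "sport"), ("blender", "kitchen"), ("cookbook", "books")]

profile = build_profile(orders)
print(recommend(profile, catalogue))  # ['tennis racket', 'cookbook']
```

Real systems are of course far richer – collaborative filtering, click-stream signals, real-time bidding – but the core loop of profile-then-target is the same.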
In essence, the agreement to exchange our data to access or purchase online services provided by these tech giants has completely transformed our lives.
YouTube videos can teach you how to cook a complicated recipe; with just a few clicks, Amazon Prime can deliver an order to your door on the very same day; and Facebook can help you communicate with loved ones across the world via video, audio, photo and message.
So, what’s the catch or is there one?
Let’s rewind to the EU referendum in 2016, where the Leave campaign unexpectedly won by a narrow 51.9% to the Remain campaign’s 48.1%. How did this happen, we all wondered? Vote Leave campaign director Dominic Cummings later revealed, in his speech at the Nudgestock Behavioural Science Conference in 2017, how his team used targeted Facebook ads to swing around seven million people sitting on the fence towards their side.
Cummings explained how the data scientists hired by the campaign used machine learning tools to profile people based on data gathered from their Facebook pages and to group them into demographic pools: for example, women aged 35 to 45 in a particular UK geographical area, with or without degrees, and so on. Once the Leave campaign had identified its chosen demographics, a pool of roughly seven million people was targeted with pro-Brexit political ads.
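The demographic pooling Cummings describes can be pictured as simple attribute filtering over profile data. A minimal sketch, with entirely hypothetical field names and records (not the campaign’s actual tooling):

```python
from dataclasses import dataclass

@dataclass
class User:
    age: int
    gender: str
    region: str
    has_degree: bool

def build_pool(users, gender, min_age, max_age, region):
    """Select users matching a chosen demographic, ready for ad targeting."""
    return [u for u in users
            if u.gender == gender
            and min_age <= u.age <= max_age
            and u.region == region]

# Hypothetical profile records
users = [
    User(38, "female", "South East", True),
    User(42, "female", "South East", False),
    User(29, "male", "South East", True),
]

pool = build_pool(users, gender="female", min_age=35, max_age=45, region="South East")
print(len(pool))  # 2
```

In practice the profiling was driven by machine-learned models over far more attributes, but the output is the same kind of object: a list of people who fit a chosen demographic, to be served a chosen message.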
Fast forward a few months to the 2016 US presidential election, where Donald Trump defeated Hillary Clinton to become the 45th President of the United States. Once again, personal data harvested from Facebook profiles played a huge role.
Cambridge Analytica (CA), a now-defunct British data analytics company linked to Donald Trump’s key adviser Steve Bannon, was hired to gather the personal information of around 50 million American Facebook users in order to profile voters and target them with tailored political ads.
Christopher Wylie, a whistle-blower who worked as a data scientist for CA, told The Observer, and later the 2019 Netflix documentary The Great Hack: “We exploited Facebook to harvest millions of people’s profiles. And built digital models to exploit what we knew about them and target their inner demons.”
Brittany Kaiser, CA’s business development director, called this process ‘the boomerang’ in The Great Hack: individuals upload their data into the digital sphere (i.e. Facebook) and have it returned as targeted messages designed to alter their behaviour.
In their journal article ‘Social Media and Fake News in the 2016 Election’, Allcott and Gentzkow define ‘fake news’ as articles that are intentionally and verifiably false and could mislead readers. They cite a story published by the self-described fantasy news website WTOE 5 News claiming that ‘the Pope has endorsed Donald Trump’s candidacy’; it was shared an alarming one million times across Facebook during the 2016 election.
Video evidence in The Great Hack of Brittany Kaiser attending the Vote Leave launch event suggests that CA, or data scientists linked to the firm, also assisted the Vote Leave campaign during the EU referendum. The evidence highlighted so far, then, links CA to the election wins of both the Vote Leave and Trump campaigns.
According to the Reuters Institute Digital News Report published in 2019, social media is the leading platform for viewing news in the US, reaching 34% of the population. CA and its data science tools took advantage of that audience, targeting a fraction of those individuals with fake news and fuelling fear and hatred.
That process illustrates Witte’s extended parallel process model (EPPM). Witte states that ‘a fear appeal must be truly frightening to the target audience and accompanied by arguments showing how the target audience are themselves susceptible’. He adds that the message must include a response to the source of fear, which he named ‘response efficacy’. In CA’s case, the call to action rested with the ‘fake news’ content encouraging individuals to vote.
The investigative work of journalist Carole Cadwalladr, among others, together with testimony to British Members of Parliament from CA whistle-blowers Christopher Wylie and Brittany Kaiser, meant the Information Commissioner’s Office (ICO) had to carry out a full investigation. In its report published in November 2018, the ICO concluded that, under the Data Protection Act 1998 (DPA 1998), people had not provided valid consent for CA to process their personal data for political campaigning.
Facebook itself received a fine of £500,000 from the ICO, the maximum penalty available under the DPA 1998. The decision was based on Facebook’s repeated failures to protect its users’ personal information: the company had been alerted to the misuse of people’s data as far back as December 2015 but decided not to investigate the issue further.
Following the CA/Facebook crisis, the genie is out of the bottle: now more than ever, people demand transparency from social media platforms and web-based service providers about how their data is handled.
The academic Karl Weick defines a ‘crisis’ as a critical and intense issue that threatens the very existence of an organization in terms of its basic assumptions, values and ways of operating. According to Cornelissen, an example would be Shell’s attempt to dispose of the Brent Spar oil platform in the North Sea. This led to a public boycott and legislative change that not only damaged Shell’s reputation but forced the company to change its values and its approach to environmental impact.
Cornelissen also argues that organizations can apply ‘the life cycle of an issue’ framework to identify and neutralise an issue before it brews into a crisis. The framework identifies four stages of an issue (emergence, debate, codification, enforcement) which organizations can track in turn.
Applying this model to Facebook: the data breach affecting millions of user profiles first ‘emerged’ via a warning from whistle-blower and former CA employee Christopher Wylie. When asked to comment, Facebook’s lawyers instead denied the accusations and threatened Wylie and The Observer newspaper for making ‘false and defamatory’ allegations.
By trying to suppress the issue, Facebook missed the opportunity to ‘debate’ it publicly and gain some control early on. Doing so would have allowed Facebook to appear ethical and responsible and to influence public opinion in a favourable direction. With that lack of transparency, the issue became ‘codified’: the general public had already decided to run with the story, which had gained traction across media outlets under the direction of The Observer.
The ‘enforcement’ stage came into effect when Facebook CEO Mark Zuckerberg was summoned to give evidence on the crisis to the UK’s Digital, Culture, Media and Sport Committee, a request he declined on numerous occasions. This inflicted further harm on Zuckerberg’s and Facebook’s reputation with the general public, given the perceived lack of accountability from the company’s leader.
But why are we surprised about the CA/Facebook data breach?
Technology writer Daniel Starkey of Geek.com puts it bluntly: “It has been said countless times – if a product is free, then you are the product. We’ve known this”. Starkey adds that few web-based platforms or apps are as dominant as Facebook, because it lets us connect to so many other platforms for easy sign-in.
This, of course, poses a challenge: the process leaves our digital trace on a great many apps and websites, so we don’t actually know how secure our personal data is or, in some cases, who has access to it, given the complexity of the digital ecosystem. Even with Facebook’s security infrastructure and expertise, how can it possibly police how an individual’s data is handled by thousands of third parties and app builders?
Since the CA crisis, Facebook has continued to focus heavily on improving privacy settings for users. The company plans to introduce end-to-end encryption on its Facebook and Instagram services; it is already available on WhatsApp. Nevertheless, the British government opposes Facebook’s plans for fear of empowering criminals, with Home Secretary Priti Patel recently sending a letter to the social network raising her concerns.
But is there an element of double standards from the UK government?
Facebook breached the trust of millions of its users and deserved a tougher penalty, but world governments cannot dictate how the company addresses its privacy and encryption processes – that is not how a free market works. With tougher data legislation now in place in the form of the GDPR, governments will need to trust social media companies to learn from their mistakes and do the right thing with consumers’ data.
Implications for communicators?
Senior communication professionals now have a duty of care when it comes to handling consumers’ data. People demand transparency in how their data is handled, as the Senior Vice President of digital research and analytics at communications consultancy Ketchum told PR Week:
“Twenty-page terms and conditions statements with data usage hidden for a single app download don’t cut it anymore. We need to be open with consumers on how that data can potentially be used. The bigger challenge for communication professionals will be connecting the dots between new innovations and the standard methods that we leverage today.”
The General Data Protection Regulation (GDPR), passed by the EU and in force since May 2018, replaced the DPA 1998 in the UK. The GDPR expects the senior management team of every organization that handles personal and business data to ‘ensure that they have watertight consent management processes in place, as well as data rights management systems that protect their most valuable asset – data’.
Failure to comply with the GDPR can mean a fine of up to €20 million or 4% of the business’s annual global turnover, whichever is greater.
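To make the scale of that penalty concrete, the GDPR’s upper fine tier (Article 83(5)) caps fines at the greater of €20 million or 4% of annual worldwide turnover. A minimal sketch, using illustrative turnover figures:

```python
def max_gdpr_fine(annual_global_turnover_eur):
    """Upper-tier GDPR cap: the greater of EUR 20 million or 4% of turnover."""
    return max(20_000_000, 0.04 * annual_global_turnover_eur)

# For a large multinational, the 4% term dominates
print(max_gdpr_fine(10_000_000_000))  # 400000000.0 (EUR 400m)

# For a smaller firm, the EUR 20m floor applies
print(max_gdpr_fine(100_000_000))     # 20000000 (EUR 20m)
```

In other words, the exposure scales with the size of the business – which is exactly why data protection is now a board-level concern.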
It is vital for senior communication professionals to carry out due diligence checks on all suppliers and third parties with whom they share data, in order to protect the individuals who have placed their trust in them.
Not only will organizations be protecting their corporate reputation; they will also avoid heavy penalties that could have damaging financial implications.
Since the GDPR came into effect in 2018, the ICO has already announced financial penalties against the multinationals British Airways (£183.39 million) and Marriott International (£99 million) for website security breaches that led to hundreds of thousands of their customers’ records being stolen.
Senior communication professionals need to ensure adequate data handling and security training is provided for staff and embedded in the induction process for new starters. As digital transformation continues to evolve, hackers, machine learning tools and data harvesting algorithms will continue to pose a threat to organizations. Communication professionals should also demonstrate transparency through short but clear data privacy notices on their websites, constantly reminding consumers how valuable – and how private – their personal data is.