Good grief!

How we learned to love measurement and evaluation

About the author

Richard Bailey FCIPR MPRCA is editor of PR Academy's PR Place Insights. He teaches and assesses undergraduate, postgraduate and professional students.

Image by Shutterbug75 from Pixabay

Dr Strangelove, Stanley Kubrick’s 1964 film about the Cold War, has a subtitle: ‘How I Learned to Stop Worrying and Love the Bomb.’ The prospect of nuclear annihilation around the time of the Berlin Wall and the Cuban Missile Crisis forced a psychological adjustment and created a darkly comic classic.

Nothing so stark and dramatic has happened in our profession, but a summary of the past three decades suggests an existential crisis as a largely unmeasured and unmeasurable activity has been forced to adjust to the new reality of a world full of data. Perhaps we’ve all learned to stop worrying and love counting.

The adjustment has taken some time. Here’s how we’ve responded to this slow-burn crisis as we’ve negotiated the five stages of grief outlined by Elisabeth Kübler-Ross. This is based on personal experience, many conversations with practitioners, and on my reading of academic and practitioner textbooks and toolkits.

The good news is that this is one of those areas where the body of academic and practitioner work is invaluable. Read on for a summary of the insights.

Stage One – Denial

We instinctively know the value of public relations and comms. It’s self-evident. But we don’t want to let daylight in on magic and don’t see any value in trying to quantify it.

Stage Two – Anger

You still want me to count it and justify its value? Well, here’s a fat cuttings book! Just feel its weight. Imagine what this would have cost through advertising. (I know it’s a dumb measure, but you did insist, and you get what you deserve.)

Stage Three – Bargaining

This isn’t going away, so can I turn this to my advantage? See those tens of thousands I’ve helped you save in advertising? All I’m asking is a few thousand more for public relations.

Stage Four – Depression

This isn’t going well. If I achieve X results this year, I’ll only be expected to produce X+Y next year. I can’t win! I should never have started counting.

Stage Five – Acceptance

In a world of digital communication, so much more can be counted. Public relations and communication is still as much an art as a science (who ever knows for certain which messages will resonate and which will take on a life of their own?) but the ease of tracking and counting makes testing and proving so much easier. Measurement is no longer an expensive – and largely pointless – exercise conducted at the end of an activity that merely proves what we already know. It’s now an iterative process that can inform our planning and help us to justify our salaries and budgets by focusing on what matters.

So here’s a summary of how we’ve (mostly) reached this stage of acceptance.

Practitioner models and toolkits

AMEC says it all. Literally. That a trade body that began as the Association of Media Evaluation Companies could have morphed into the International Association for the Measurement and Evaluation of Communication has to be one of the most ingenious rebranding exercises in our industry.

It’s the same narrative as I’ve described above: how a trade body for clippings companies became an association of software companies adept at turning data into insight. And there’s a secondary narrative within the three iterations of AMEC’s Barcelona Principles. In 2010, they were describing an activity called public relations. By 2015 they had renamed this activity communication.

This wasn’t some academic attempt to redefine public relations; it was simply a recognition that in the age of the PESO media model (as presented by Gini Dietrich in 2014) we need integrated rather than siloed thinking across channels.

So let’s review the Barcelona Principles 3.0 as articulated this year.

  1. Setting goals is an absolute prerequisite to communications planning, measurement, and evaluation.
  2. Measurement and evaluation should identify outputs, outcomes, and potential impact.
  3. Outcomes and impact should be identified for stakeholders, society and the organization.
  4. Communication measurement and evaluation should include both qualitative and quantitative analysis.
  5. AVEs are not the value of communication.
  6. Holistic communication measurement and evaluation includes all relevant online and offline channels.
  7. Communication measurement and evaluation are rooted in integrity and transparency to drive learning and insights.

We now have a process. It starts with planning and goal setting and takes us through outputs (the messages we send and share) to outcomes (the reaction to those messages) and impact (the longer-term effects of our comms campaigns). 

We have also broadened our focus. We’re not just seeking a positive impact on our organisations through comms; we’re also taking note of the effects of our activities on our stakeholders and on wider society.

In summary, the process is not linear, but rather circular. It does not end with evaluation, but we use evaluation ‘to drive learning and insights’ and so inform the process of planning and goal setting.

The word holistic is used, and this is indeed a holistic model for public relations and communication management.

I’ve focused on AMEC because it has a leadership role, but let’s acknowledge the many contributions that have led up to Barcelona Principles 3.0 and AMEC’s Integrated Evaluation Framework. Somewhere on my bookshelf I have Mike Fairchild’s Planning, Research and Evaluation toolkit published in 1999 for the then Institute of Public Relations. Note the emphasis on planning over twenty years ago, long before it became the focus of AMEC’s educational campaign this year.

I also use Katie Paine’s Measure What Matters from 2011. This book adapted measurement and evaluation for the social media age where earlier texts had remained rooted in traditional media evaluation.

More recently, books by Sam Knowles and Cole Nussbaumer Knaflic provide valuable resources for insight and data storytelling.

Let’s also acknowledge the imitators and followers, such as the Government Communication Service’s Evaluation Framework 2.0, which is based on the AMEC framework.

Academic insight

So what contribution have academics made to these discussions?

Prolific Australian scholar Jim Macnamara discussed progress on measurement and evaluation in a 2015 academic paper. He noted that despite the existence of various best practice models and frameworks, many practitioners were slow to adopt them and the lament made by US scholar James Grunig in the 1980s remained true: that ‘PR people keep on sinning’.

Macnamara reviewed 50 years of academic literature in the field of measurement and evaluation. Highlights include the Cutlip, Center and Broom Planning, Implementation, Impact (PII) model from 1985 and Macnamara’s own Macro model of evaluation from 1992, which evolved into his Pyramid Model of PR research (2002). He also cited further models from Lindenmann and from Watson and Noble, and noted that ‘all of these models identify key stages of [measurement and evaluation], such as inputs, outputs, outtakes and impact.’

In other words, decades of academic thought had led to a convergence around the key concepts underlying AMEC’s measurement principles.

Tom Watson and Paul Noble’s Evaluating Public Relations: A Best Practice Guide to Public Relations Planning, Research and Measurement (third edition 2014) is a milestone text written by academics and used by students and practitioners in the UK.

The authors place their work in its historical context, and also emphasise the role of research and planning: measurement and evaluation do not and cannot take place in isolation.

They noted the scale of the challenge by citing John Pavlik’s observation in 1987 that ‘measuring the effectiveness of public relations has proved almost as elusive as finding the Holy Grail’.

The debate continues. We know what we should be doing, but for various reasons we’re not all doing it. Some may still be navigating the stages of grief. Let’s hope that for the late adopters it’s not already terminal.

Isn’t it time we all learned to stop worrying – and love counting?

For those needing help understanding the core principles and vocabulary around research, measurement and evaluation, we’ve also produced a free PR Academy / PR Place Guide to PR and Communication Measurement.

  • Macnamara, J. (2015), “Breaking the measurement and evaluation deadlock: a new approach and model”, Journal of Communication Management, Vol. 19 No. 4, pp. 371-387.
