#FUTUREPROOFING COMMUNICATIONS EVALUATION Richard Bagnall
The #FuturePRoof project clearly shows how far the world of PR and communications has changed in recent years. But how best to measure effectiveness in this dynamic and complex environment?
• PR and communications strategy and tactics have changed significantly, but measurement frequently lags behind
• Modern communicators have embraced the PESO model but struggle to measure integrated communications effectively
• Future proofing communications measurement requires a change of mindset and a new approach if we are to prove value and demonstrate effect successfully
A complex world
The communications industry has undergone significant change in the last 10 years driven by seismic changes in the media. Audiences have fragmented, seeking out news and information on their own terms, in their own time, and on the platforms of their choice.
To adapt to these changes many modern communicators have embraced the PESO model. Working across Paid, Earned, Shared and Owned media, communication strategies are focussed on engaging with audiences wherever they might be.
Communications strategy and tactics have changed, but what about their measurement?
Stuck in the past
Many organisations are failing to evolve their communications measurement to the same degree as their communication activity. They continue to rely on measurement techniques that served them well in the past but are proving increasingly inadequate in this new environment.
Traditional media content analysis continues to be the dominant form of communication measurement despite its limitations: it is slow, expensive, hard to scale and backward looking, offering little genuine insight. Additionally, on its own it only offers a measure of ‘output’ – it fails to connect the effect of communications with the objectives and outcomes that organisations seek.
Embracing the digital era, many vendors now offer online, real-time and largely automated portal-based evaluation solutions.
With their dynamic dashboards and flashy charts, it can be easy to forget that many of these tools are not measuring what matters, but are instead just counting what is easy to count. They share the ‘output-only’ limitations of traditional media analysis and, in addition, offer little to no tailoring of measurement against specific organisational objectives. As a result they frequently leave the user frustrated, inundated with a host of numbers that fail to answer the critical ‘so what?’ questions needed to prove value.
Media metrics losing relevance
Many of the metrics that have been relied upon for so long are starting to lose their relevance. Thanks to the Barcelona Principles, AVEs have been denounced as a flawed metric and largely consigned to the dustbin of best practice history.
Beyond AVEs, other metrics are being questioned too. For example, what is the relevance of volume of content as a metric in an environment where the universe of publisher sources and contributors is ever increasing? Reach and impressions are also being questioned: many of these numbers are impossible to measure accurately and are all too easy to game.
They also fail to answer the all-important ‘so what?’ question – it doesn’t matter who may or may not have seen your content; organisations need to know what happened as a result of it.
AMEC integrated evaluation framework
Into these challenges stepped global measurement trade organisation AMEC (the International Association for the Evaluation of Communications).
AMEC’s view was that what is needed is not new metrics or new tools but a credible, meaningful and consistent approach that all organisations can use. Such an approach needs to take into account all of the new channels available to communications professionals, while also showing how to link organisational objectives to outputs, outtakes and ultimately to outcomes and organisational impact.
To answer these challenges, AMEC put together a project team that included academics, PR agency heads, global measurement agencies and in-house communication professionals to create its new integrated evaluation framework. The framework is non-proprietary, free to use, and designed for the benefit of organisations of any size working with any measurement partner.
It is not a measurement ‘tool’ but a best practice workflow that supports the user through every step of the process to create their own campaign plan and measurement report.
The framework links clear objective setting to alignment with organisational goals, and shows how to tell the measurement story with a blend of output, outtake and outcome metrics that together support the organisational impact of the work. It shows how to include the right metrics from each of the Paid, Earned, Shared and Owned channels and suggests which metrics might be used in different scenarios.
Importantly, AMEC’s integrated evaluation framework comes with a comprehensive resource centre including supporting materials, case studies, thought leader opinion pieces, a dictionary of terms, a comprehensive measurement taxonomy and a wealth of further information to help answer even the most demanding evaluation challenges.
The framework and resource centre were launched to global acclaim at AMEC’s international summit in June 2016, and are endorsed, supported and used by the PRCA, CIPR, ICCO, GCS (The Government Communication Service) and many leading PR agencies.
The framework is free for all interested parties to use and is available right now. Head over to www.amecorg.com/amecframework and start using it today. We are sure you too will find it useful.
Richard Bagnall is CEO of PRIME Research UK, a global communications measurement specialist consulting firm. Originally working in PR, Richard has specialised in communications measurement for the last 20 years, having been a founding director of Metrica and a board director of Gorkana Group before joining PRIME. Richard is chair-elect of AMEC and led the AMEC team that created the integrated evaluation framework.