FiftyOneZeroOne

Archive for March 2007

The focus for public relations evaluators has long been on message delivery. One of the key trends over the last 25 years has been an increasing focus on the quality of coverage, not just its volume, and the prime indicator of that quality has been the presence or absence, strength or weakness, and positive or negative reporting of key corporate messages. That was easy enough when messages took a straight (and signposted) path via traditional media to their intended audience. Now the message can be changed, developed, added to, hijacked and contradicted along the way. This presents peculiar challenges to communicators, challenges that can be met by the technology that begat them.

The tests for public relations evaluators in the 21st century are these. The first is that formative monitoring of “who is saying what about you” will become essential in order to enable the rapid intervention and rebuttal needed to influence the online conversation before it is set in stone. The Kryptonite bike lock case in 2004 and the more recent Dell Hell episode show how control over online messaging is lost forever without rapid and early intervention.

The second is how to divine the nature of the relationships (planned and unplanned) that exist through social media. Despite the sound and fury around both Kryptonite and Dell Hell, which were badly handled, the two brands continue to operate and prosper. Was the damage to reputation as severe as it might have been in a traditional offline media “storm”? And how do organisations weigh credibility in this environment? The credibility of offline media is well documented. Many would argue that (in its media relations guise) the supposed “killer benefit” of public relations is the credibility afforded by the media’s third-party endorsement. But with many now bypassing the journalist/media interface and transmitting messages directly, how can “credibility” be weighted from online media coverage and social media commentary?

The answer at this stage is a very indirect one. By tracking traffic, the tonality of comments and responses, the use of unique links, and the weighting of blog responses and cross-links, a very loose correlation of quality factors can be created. But without a precise “call to action”, this evaluation is about output measurement (message distribution) rather than outcomes. That is the state of play for much supposed online PR measurement at present: it is just an online variation of the media measurement that has been delivered for decades.
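As an illustration only, the sketch below shows how such a loose quality score might be stitched together in Python. The field names, weights and normalisation caps are all assumptions made for this example, not an established industry formula, and the resulting number remains an output measure rather than an outcome.

# Hypothetical sketch: the weights, field names and normalisation caps below are
# assumptions for illustration, not an established industry formula.
from dataclasses import dataclass

@dataclass
class CoverageItem:
    visits: int              # traffic driven to the item (from web analytics)
    tone: float              # tonality of comments/responses, -1 negative to +1 positive
    unique_link_uses: int    # click-throughs on a unique tracking link
    inbound_blog_links: int  # cross-links from other blogs

def quality_score(item: CoverageItem) -> float:
    # Normalise each factor to a rough 0..1 range; the caps are arbitrary assumptions.
    traffic = min(item.visits / 1000, 1.0)
    tone = (item.tone + 1) / 2  # map -1..+1 onto 0..1
    links = min(item.unique_link_uses / 50, 1.0)
    crosslinks = min(item.inbound_blog_links / 20, 1.0)
    # Weighted blend of the four loose quality factors, scaled to 0..100.
    return 100 * (0.3 * traffic + 0.3 * tone + 0.2 * links + 0.2 * crosslinks)

post = CoverageItem(visits=420, tone=0.4, unique_link_uses=12, inbound_blog_links=5)
print(f"Quality score: {quality_score(post):.1f}")  # an output measure, not an outcome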

 

Do we need new models of communication for online media and social media that use “out-takes” – the audience reaction to and processing of messages – as the ultimate valid measurement of effectiveness? For both formative monitoring and relationship measurement, out-takes may be the most effective route ahead.

One of the case studies for the second edition of Evaluating Public Relations, written by Paul Noble and me and which will be published later this year, is of the use of blogs and wikis to measure the impact of a conference upon its audience. It has been prepared by Geneva-based Glenn O’Neil of the Benchpoint organisation and points the way in which social media will become an important measurement tool in public relations and corporate communications. These tools are interactive and give immediate feedback without a time lag, and so play both a formative and summative role in collecting data.

The methodology for measuring the impact of a conference or event has long relied on a post-event survey tool, typically inserted in the conference or event pack. These simple questionnaires give a snapshot of participants’ views on the quality and relevance of the event and of the speakers or production that has been witnessed.

The limitations of this approach are a low level of response (unless there is an incentive or a strong push by conference organisers to extract the survey from departing delegates) and little depth in the responses beyond approval/disapproval comments. At the LIFT06 IT conference in Geneva last year, an experiment was undertaken using wikis, blogs and mash-ups to evaluate the event. Glenn O’Neil wanted greater depth of response during the event, not just an ex post facto survey. He also sought to identify the manner in which LIFT06 influenced the knowledge, attitudes and behaviour of conference delegates. The aim of LIFT06 was to “connect people who are passionate about new applications of technology and propel the conversations into the broader world to improve”. The research methodology combined qualitative and quantitative methods.

All delegates were sent an online survey with questions focusing on the key measures, which achieved a 60% response rate. During the conference, 10 randomly selected participants were interviewed for 15-20 minutes each. There was a wiki for the conference programme, in which each speaker had a one-page site on which both delegates and speakers could leave comments. It was estimated that 20-30% of conference delegates had laptops in use during conference sessions and were thus able to comment during and after speakers’ presentations. During the conference, more than 20 delegates actively posted comments on their own blogs, producing 680 postings, mostly during and immediately after the event. These postings were fed into a mash-up report from which 50 posts were randomly selected and analysed. The results of the evaluation, using conventional and new technology research methods, were:

·        The range of methods gave immediate feedback on delegates’ views and attitudes both during the conference (formative data that enabled immediate change) and afterwards (summative data for future planning).

·        Based on self-assessment measures, 82% of delegates indicated that their IT knowledge had changed, and 70% that their attitudes had changed, as a result of the conference.

·        The participant survey also showed that 93% would attend the next LIFT conference and 96% would recommend it to others.

·        The monitoring of a random sample of 50 blog postings showed 62% positive, 30% neutral and 8% negative comments, often as reactions to speakers (a sketch of this sampling and tallying step appears after this list). O’Neil also noted that 26% of blog posts came from non-delegates, indicating that LIFT06 had generated discussion outside its halls and the immediate circle of participants. (This is data not normally gained through end-of-event questionnaires.)

·        Overall, 94% of delegates met new contacts, with 57% meeting between one and five new people.
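To make the blog-monitoring step concrete, here is a minimal Python sketch of how a random sample of postings might be drawn and the tonality split tallied. The data structures and the hand-coded “tone” field are assumptions for illustration; this is not O’Neil’s actual method or tooling.

# A minimal sketch, with made-up data, of the sampling and tonality tally described
# above: draw 50 posts at random from the full set and report the tone split.
import random
from collections import Counter

def tonality_split(postings, sample_size=50, seed=1):
    # Each posting is assumed to be a dict with a 'tone' key already coded by a
    # human analyst as 'positive', 'neutral' or 'negative'.
    random.seed(seed)
    sample = random.sample(postings, min(sample_size, len(postings)))
    counts = Counter(post["tone"] for post in sample)
    return {tone: round(100 * n / len(sample)) for tone, n in counts.items()}

# Toy stand-in for the 680 postings gathered into the mash-up report.
fake_postings = ([{"tone": "positive"}] * 420
                 + [{"tone": "neutral"}] * 200
                 + [{"tone": "negative"}] * 60)
print(tonality_split(fake_postings))  # roughly mirrors the 62/30/8 split reported above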

In addition to new methods of collecting quantitative data, O’Neil commented that the use of blogs is akin to the use of “learning logs” in the education system. He says that this is a rich new area of evaluation research as it gives an immediate “insight into participants’ changes in attitudes, concerns and practices.”

The full case study can be found at: O’Neil, G (2006) Blogs, mash-ups and wikis – new tools for evaluating event objectives: A case study on the LIFT06 conference in Geneva. It is available from PRism 4 (2), the online academic PR journal, at http://praxis.massey.ac.nz/fileadmin/Praxis/Files/Journal_Files/Evaluation_Issue/ONEIL_CASE_STUDY.pdf

Picture a sushi bar with small packages of food moving around on a circular conveyor, or a tapas bar (without alcohol) with bowls of olives and plates of meat and cheese to graze on, and you have an idea of the style of the International Public Relations Research Conference (IPRRC) held recently in Miami. Some also refer to it as “speed-dating for public relations research”. Starting at 8am and continuing, apart from coffee breaks and a 90-minute lunch break, until 4.30pm for three days, IPRRC moves at a cracking pace of five speakers per hour, each allotted 15 minutes to present their paper and engage in a short discussion. No PowerPoints are allowed. Delegates choose four sessions to attend per hour and “rotate” every 15 minutes from speaker to speaker. By the end of the hour, a speaker will have presented four times to round-table audiences and collected a wallet or bag full of business cards and requests for the paper.

The first morning, to the uninitiated delegate, passes in a blur of concepts, issues, research methods and types of presenter. But you gather speed and get to enjoy the rapid engagement with new and old ideas, especially from those who get to the heart of their argument quickly and allow time (5 to 7 minutes) for discussion. The array of papers is bewildering. Broadly, the topics covered were public relations theory; the internet, blogging and new media; ethics; corporate social responsibility; evaluation; issues management; corporate communication; lobbying; media analysis; and research methodology. Some of the big names in US academic public relations were there, including Carl Botan, David Dozier, Vince Hazleton, Dean Kruckeberg, Douglas Ann Newsom, Don Stacks and Judy Turk. As well, Krishnamurthy Sriramesh came in from Singapore, although he is a former PhD student of Jim Grunig at the University of Maryland. But there were also doctoral students pitching early stages of their research, academic staff members, and practitioners presenting solely or with academic colleagues. The key presentations came from:

·        Dean Kruckeberg and colleagues on an ‘organic theory’ as a social theory of public relations. This rejects segmentation and reinstates the concept of the “general public” as exemplifying society as a whole. It seems that US PR academics have discovered Habermas and the “public sphere” rather late, as this is a concept that has arisen on other occasions, too.

·        Brad Rawlins on measuring the relationship between organisational transparency and trust, which moves the Hon & Grunig work on measuring relationships between organisations and publics into a more sophisticated area. After some years’ delay, this is developing as a new area of research, especially as the Bruning & Ledingham paradigm of public relations as relationship management moves from being a comfortable homily to being tested in practice.

There was a wide range of research into the impact of blogs and wikis on communication, including a paper from Donald Wright and Michelle Hinson on their effect on traditional mass communication models, which they expect to be profound, and one from Bill Sledzik on how blogs are expanding the role of public relations practitioners. Overall, IPRRC is a valuable survey of public relations research from a North American perspective. It’s also an event at which everyone is welcomed and discussion goes on for many hours into the evening.

Frank Luntz’s article in The Guardian (March 16, p.39) is a research-based demonstration of why “spin” dressed up as public relations and political communications is bound to fail. [http://www.guardian.co.uk/comment/story/0,,2035405,00.html]. 

When authentic voices are lost, voters switch off. Luntz reported on research amongst a panel of voters in Birmingham, England’s second city, which found deep disenchantment with soundbite politics and “PR stunts”. Most of the panel believed that this had been characteristic of the Blair years (since 1997) and was now found amongst all parties.

Taking the example of Opposition leader David Cameron (himself a former public relations practitioner), the panel switched off when shown a web video of life in the Cameron household. The reaction, says Luntz, was “predictably negative” as it was seen as a PR stunt. “Voters crave something real.” When Cameron spoke from the heart, saying the policy changes he was proposing would include “pain and sacrifice”, they warmed to him: this was an authentic voice making statements the voter panel accepted as realistic.

The lesson for Cameron and other UK politicians is that after a decade of soundbite culture, “voters are more savvy and wary of anybody who sounds too good to be true”. Being aspirational and visionary is acceptable, as long as it is balanced with realism about how change and progress will be delivered.

The external view of public relations is that it is based on spin and publicity, a cocktail of one-way communication and deception. But the really effective public relations programmes are those which engage with stakeholders in their many and varied forms and build relationships based on mutual interests and an authentic voice. Political parties (and public relations practitioners) should take note of Luntz’s small-scale research, which endorses the best-practice model.

Welcome to DummySpit – an academic/practitioner’s view of current public relations research and best practice.  

Academic research into public relations measurement and evaluation is overlooking industry studies and initiatives. This can be seen in two areas of communication activity. The first is the introduction and use of scorecards to plan, monitor and measure communication, with a strong emphasis on the linkage between public relations activities and corporate or organisational imperatives.

The second is the use of internet tools, such as blogs and wikis, to measure the impact of events and communication activity almost immediately. There has been some research by practitioner-academics who have noted scorecards as they have been introduced from management theory, but no robust research programme has been undertaken to determine the validity and reliability of the scorecard’s performance as a measurement and evaluation tool. Given that some public relations activities can now be monitored through almost immediate techniques and tools, such as blogs and wikis, do new theories and approaches need to be developed to conceptualise this capability? Most models of communication imply a period of gestation in which the recipient of messages processes them before acting, but with immediate response and debate now available, does this need revisiting?

It’s a great opportunity for academic researchers to re-engage with practitioners who, let’s face it, are running way ahead of them on the use of new technologies and social media.

TOM WATSON

Why “DummySpit”? Think about it – “spitting the dummy” is having a blast on important matters.

