Evaluating events with online tools

Posted on: March 21, 2007

One of the case studies for the second edition of Evaluating Public Relations, written by Paul Noble and me and due to be published later this year, concerns the use of blogs and wikis to measure the impact of a conference on its audience. It has been prepared by Geneva-based Glenn O'Neil of the Benchpoint organisation and points to the way in which social media will become an important measurement tool in public relations and corporate communications. These tools are interactive and give immediate feedback without a time lag, so they play both a formative and a summative role in collecting data.

The methodology for measuring the impact of a conference or event has long relied on a post-event survey, typically inserted in the conference or event pack. These simple questionnaires give a snapshot of participants' views on the quality and relevance of the event, its speakers and the production that had been witnessed.

The limitations of this approach are a low response rate, unless there is an incentive or a strong push by conference organisers to extract the survey from departing delegates, and little depth of response beyond approval/disapproval comments. At the LIFT06 IT conference in Geneva last year, an experiment was undertaken using wikis, blogs and mash-ups to evaluate the event. Glenn O'Neil wanted greater depth of response during the event, not just an ex post facto survey. He also sought to identify how LIFT06 influenced the knowledge, attitudes and behaviour of conference delegates. The aim of LIFT06 was to "connect people who are passionate about new applications of technology and propel the conversations into the broader world to improve". The research methodology combined qualitative and quantitative methods.

All delegates were sent an online survey with questions focusing on the key measures, which achieved a 60% response rate. During the conference, 10 randomly selected participants were interviewed for 15-20 minutes each. There was a wiki for the conference programme, in which each speaker had a one-page website set up for them on to which both delegates and speakers could leave comments. An estimated 20-30% of conference delegates had laptops in use during conference sessions and were thus able to comment during and after speakers' presentations. During the conference, more than 20 delegates actively posted comments on their own blogs, producing 680 postings mostly during and immediately after the event. These postings were fed into a mash-up report from which 50 posts were randomly selected and analysed. The results of the evaluation using conventional and new technology research methods were:

         The range of methods gave both immediate feedback on delegates' views and attitudes during the conference (formative data that enabled immediate change) and afterwards (summative data for future planning).

         Based on self-assessment measures, 82% of delegates indicated that their IT knowledge, and 70% that their attitudes, had changed as a result of the conference.

         The participant survey also showed that 93% would attend the next LIFT conference and 96% would recommend it to others.

         The monitoring of a random sample of 50 blog postings showed 62% positive, 30% neutral and 8% negative, often as reactions to speakers. O’Neil also noted that 26% of blog posts came from non-delegates indicating that LIFT06 had generated discussions outside its halls and the immediate circle of participants. (This is data not normally gained through end-of-event questionnaires).

         Overall, 94% of delegates met new contacts, with 57% meeting between one and five new people. 
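The blog-monitoring step described above — drawing a random sample from the collected postings and tallying hand-coded sentiment — can be sketched in a few lines of Python. This is purely illustrative: the case study does not publish its procedure, and the function names and the coded sample below are invented here to match the reported 62/30/8 split.

```python
import random

def sample_posts(posts, sample_size=50, seed=None):
    """Draw a simple random sample of posts for manual sentiment coding."""
    rng = random.Random(seed)
    return rng.sample(posts, min(sample_size, len(posts)))

def tally_sentiment(coded_posts):
    """Convert sentiment labels ('positive'/'neutral'/'negative') to percentages."""
    counts = {"positive": 0, "neutral": 0, "negative": 0}
    for sentiment in coded_posts:
        counts[sentiment] += 1
    total = len(coded_posts) or 1
    return {label: round(100 * n / total) for label, n in counts.items()}

# Hypothetical hand-coded sample of 50 posts, chosen to mirror the case study's figures
coded = ["positive"] * 31 + ["neutral"] * 15 + ["negative"] * 4
print(tally_sentiment(coded))  # {'positive': 62, 'neutral': 30, 'negative': 8}
```

The key design point is the random draw: sampling 50 of the 680 postings keeps the manual coding workload manageable while still giving a defensible estimate of overall sentiment.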

In addition to new methods of collecting quantitative data, O'Neil commented that the use of blogs is akin to the use of "learning logs" in the education system. He says that this is a rich new area of evaluation research as it gives an immediate "insight into participants' changes in attitudes, concerns and practices."

The full case study can be found at: O'Neil, G. (2006). Blogs, mash-ups and wikis – new tools for evaluating event objectives: A case study on the LIFT06 conference in Geneva. It is available from PRism 4(2), the online academic PR journal.



Tom Watson
