
Public Engagement with Research Online Project


PERO: Public Engagement with Research Online

Lead institution: University of Warwick
Project Lead: Dr Eric Jensen
Research Group (Case Study): Centre for Competitive Advantage in the Global Economy (CAGE)
Project Manager: Monae Verbeke
Project partners/collaborators: Dr Trevor Collins, Open University; Nicola Buckley, Cambridge University; Prof. David Ritchie, Portland State University; Prof. Andrew Oswald, Warwick University; Prof. Sascha Becker, Warwick University; Dr Sophie Staniszewska, Warwick University
Project Duration: 1 June 2012 through 30 November 2012, with ongoing dissemination.

PERO Project Working Papers
  1. 'Theorising online public engagement with research impacts' (by Dr Eric Jensen, University of Warwick),
  2. 'Comparing traditional forms of patient and public involvement in health and social care research with online forms of involvement and engagement' (by Dr Sophie Staniszewska, University of Warwick / Royal College of Nursing),
  3. 'Evaluating the impact of research online with Google Analytics' (by Dr Trevor Collins, Open University/Knowledge Media Institute) and
  4. 'Practical Framework for Analysing Impacts of Online Engagement' (by Monae Verbeke, MSc(R), University of Warwick)
Final Project Report

Submitted 30 November 2013: Public Engagement with Research Online - Final Report

Project Description

Researchers are increasingly engaging with publics online and through social networking. Yet effective approaches for capturing and analysing impacts of public engagement through these media are not fully developed. This project will develop a framework and example case study for analysing the reach and significance of online public engagement with research, incorporating assessments of opportunities for involving public perspectives in research as part of long-term impact generation.
The case study will include quantitative and qualitative analysis using web-based public discussion and social media-based responses to research, framed within a theoretically and methodologically robust framework for articulating such impacts. The possibilities for deploying an integrated web-based solution for automatically capturing, analysing and generating reports on the reach and significance of particular researchers’ engagement using existing tools will also be considered as part of this process.

Specifically, we will evaluate the options for connecting existing web crawler / screen scraper technology with linguistic corpus analysis software and website analytics software (e.g. Google Analytics) to leverage the relative strengths of these existing technologies for the new purpose of online impact analysis.
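To make the kind of pipeline described above concrete, here is a minimal sketch, not the project's actual tooling, of how scraped page text could feed a simple keyword-frequency step (a stand-in for full linguistic corpus software; the Google Analytics integration is omitted). The example page and keywords are invented for illustration.

```python
# Minimal sketch: "scraped" HTML text passed to a keyword-frequency count.
# This stands in for the crawler -> corpus-analysis chain; it is not the
# project's actual implementation.
from collections import Counter
from html.parser import HTMLParser
import re

class TextExtractor(HTMLParser):
    """Collects the visible text from an HTML page (a stand-in for a
    web crawler / screen scraper)."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = False

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def keyword_frequencies(html, keywords):
    """Count how often each research-related keyword appears in a page."""
    parser = TextExtractor()
    parser.feed(html)
    words = re.findall(r"[a-z]+", " ".join(parser.chunks).lower())
    counts = Counter(words)
    return {kw: counts[kw] for kw in keywords}

page = "<html><body><p>New CAGE research on wellbeing and the economy.</p></body></html>"
print(keyword_frequencies(page, ["wellbeing", "economy", "impact"]))
# {'wellbeing': 1, 'economy': 1, 'impact': 0}
```

In practice the HTML would come from a crawler rather than a string, and the frequency step would be replaced by proper corpus analysis software; the point is only the shape of the chain.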

Project Updates (blog posts)

Part 1: PERO Updates - Where we are now.
Posted by Monae Verbeke on July 22, 2012

The last few weeks have been busy for the PERO team, and we are excited to share what we have been doing. We held our first workshop with communications staff and research students from the Centre for Competitive Advantage in the Global Economy (CAGE) on 27 June. The workshop gave us a great opportunity to meet one of our target audiences. The communications staff on the Warwick campus handle a large majority of research communication, including digital communication through a variety of networks. Through the workshop we were able to disseminate the project’s objectives, ascertain what tools were already being used by the communications team, and learn how our research could best be utilised. The discussion that resulted from the workshop has been incorporated into our framework for online engagement tool development.

On 28 June, the PERO partners Eric Jensen and Trevor Collins met with another JISC-funded project in this call (TDI), represented by Kent McClymont, to discuss possible collaborations. We identified a few common activities relevant to both projects. Both teams need to conduct a survey of frameworks for articulating online engagement impacts; however, the TDI team is focusing on practitioner-oriented frameworks, while PERO will be exploring how relevant academic social theory (for example, theories of social change and the public sphere) can be applied to demonstrate engagement and impact. Another area of common activity is surveying the tools available for online impact analysis. Currently, TDI is identifying examples of analytic tool types, and PERO will identify the costs and benefits of specific tools. We plan to meet again on 10 or 12 September to share notes, discuss possibilities for co-authoring papers, and develop our end-of-project workshops together.

On 2 July, the PERO team held a team meeting, which included project partner David Ritchie, a Professor of Communication visiting from the United States. We decided to focus primarily on clarifying the outcomes of our objectives.

How would we know we had achieved what we wanted from the project? What is it that we truly wish to achieve? We began by running through what we have accomplished so far and then tackling our objectives. The team decided that the impacts we are interested in measuring include changes in emotion, cognition, behaviour, and social factors. One key area of discussion was the idea that if researchers knew when their findings would be released publicly (including online), it would be ideal to conduct pre-, during-, and post-release assessments of how the resulting online engagement diffuses around the internet.

We know it will be necessary for researchers interested in evaluating their impacts to have a method for self-assessment, in order to identify where to begin with an impact analysis. Such self-assessment would focus on an identified target audience.

The researcher would need to know where to find this audience and what they want to achieve with it. These particulars would need to be outlined prior to the release of any report or engagement project. Questions that need to be asked of each project or report include: what data will it produce, and what is the evaluation question? The evaluation question should sit at the intersection of the target audience, the objectives, and the methods. What data is needed to answer these questions? It seems that the most important data is the discussion that occurs online around the research topic. It is now important for the team to determine a method of identifying keywords for analysing this online discourse. These keywords should be ones the target audience would find relevant and useful. We have therefore developed a multi-tiered keyword assessment method: beginning with the primary audience (known stakeholders), possibly in a focus group setting; then adding basic keywords (name of the researcher, title words, etc.); and then adding very general terms for tertiary (less known or unknown) stakeholders.
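The multi-tiered keyword method above could be represented as something like the following sketch. The tier names, helper function, and example keywords are all hypothetical illustrations, not part of the project's materials.

```python
# Hypothetical sketch of a multi-tiered keyword structure: a focus-group
# tier for the primary audience, a basic tier (researcher name, title
# words), and a generic tier for less known or unknown stakeholders.
def build_keyword_tiers(focus_group_terms, researcher, title_words, generic_terms):
    """Combine keyword tiers, ordered from most to least audience-specific."""
    return {
        "primary": sorted(set(t.lower() for t in focus_group_terms)),
        "secondary": sorted({researcher.lower(), *[w.lower() for w in title_words]}),
        "tertiary": sorted(set(t.lower() for t in generic_terms)),
    }

tiers = build_keyword_tiers(
    focus_group_terms=["wellbeing", "life satisfaction"],
    researcher="Oswald",
    title_words=["happiness", "economics"],
    generic_terms=["research", "study"],
)
print(tiers["primary"])   # ['life satisfaction', 'wellbeing']
```

An analysis could then start from the primary tier and progressively widen to the tertiary tier as broader online discussion is surveyed.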

The conclusion of the extended meeting was the need for the development of a simple framework for researchers to employ.

  • The focus should be an automated pre- and post-evaluation based on a keyword search.
  • We stepped away from complex linguistics software for the purpose of helping researchers evaluate their online public engagement impacts.
  • We adopted keyword analysis as a possible solution for researchers.
  • These methods can feed into positive developments in online public engagement practice, as the analysis they require provides useful information researchers can use to target their future communications more effectively.
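A pre- and post-evaluation of the kind described above could, in its simplest form, compare keyword mention counts in text collected before and after a research release. The sketch below uses invented data and function names; it is a toy illustration of the idea, not the framework itself.

```python
# Toy sketch of an automated pre-/post-release keyword comparison:
# count keyword mentions in text gathered before and after a release,
# then report the change per keyword. All data here is invented.
import re
from collections import Counter

def mention_counts(texts, keywords):
    """Total mentions of each keyword across a collection of texts."""
    counts = Counter()
    for text in texts:
        words = re.findall(r"\w+", text.lower())
        for kw in keywords:
            counts[kw] += words.count(kw)
    return counts

def pre_post_change(pre_texts, post_texts, keywords):
    """Difference in mentions (post minus pre) for each keyword."""
    pre = mention_counts(pre_texts, keywords)
    post = mention_counts(post_texts, keywords)
    return {kw: post[kw] - pre[kw] for kw in keywords}

pre = ["general chat about the economy"]
post = ["new wellbeing study out", "wellbeing findings discussed widely"]
print(pre_post_change(pre, post, ["wellbeing", "economy"]))
# {'wellbeing': 2, 'economy': -1}
```

A rise in mentions of the primary-tier keywords after release would be one simple, automatable signal of engagement reach.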
Discussing 'Embedding Impact in Research Online'

Posted by Monae Verbeke on September 25, 2012

Following from the NCCPE workshop in London, PERO has had the opportunity to reflect on the wonderful discussion surrounding our research project. We came to the workshop with a developed model of how it might be possible to demonstrate impacts of research online [illustrated below]. The model was presented to the group for feedback - and we received great feedback! Five of the key points or questions developed by the group include:
  • Researchers/academics still need more technical instruction, including what Google Analytics is and how to implement it.
  • A guide needs to be developed on what researchers can expect to achieve with online dissemination and how to potentially achieve their objectives.
  • Group participants questioned what online engagement is: how do we (as a research community) define online engagement?
  • We need a cultural change within academia, so that academics anticipate the research they will disseminate online and develop a plan for analysing its impact.
  • How do we validate the keywords in our model? Where do these keywords come from?

Many of these questions will lead the discussion at our team meeting this week. How do we meet as many of researchers' needs as possible while at the same time developing best practice for evaluating online impacts? One particular need that can be met is the training of all academics in methods of impact analysis, particularly qualitative methods. We have developed a series of workshops that have been useful to other academics in developing their skills.

As the project quickly progresses, we would enjoy hearing your thoughts and opinions on online engagement and impacts. What do you hope to achieve with online public engagement? Are there certain aspects of online engagement impact evaluation that you are struggling with? Do you have funders that would like you to engage online and demonstrate the impacts?