How Should You Present Your Findings?

Once the evaluation data has been analysed, you will need to assess the implications of your findings: what does the data mean? A first step is to revisit your original aims for the evaluation; these may have been modified as the data was collated and analysed, or in the light of unexpected findings. Evaluations are ultimately judged by how well they help you or others to act, so your methods for presenting findings should build in further consultation and discussion of the evaluation. Depending on your stakeholders, it may also be important to consider the ways in which dissemination is itself part of making meaning from the findings.

Checking validity of the data

“Validity” is a scientific notion that relies on the assumption that you are representing an objective reality. Education, however, is a social, situated practice, so beyond reviewing whether your questions are meaningful and relevant, checking the validity of your data can be problematic. It may therefore be better to consider whether or not the data are ‘credible’, a qualitative alternative to validity favoured in social science research. In this way, the evaluation findings are presented as a set of subjective value judgements grounded in a particular context, rather than as objective quantitative evidence. This framing also helpfully emphasises that there is an audience to persuade at the end of it.

Key (1999) discusses reliability in qualitative studies and suggests triangulation or corroboration methods to improve understanding and/or the credibility of a study. Procedures include “review and respond” exercises for further consultation, focus groups with key stakeholders, research or inquiry audits, peer debriefing, and seeking out negative cases in the field that might disconfirm interpretations.

Dissemination as part of evaluation

There are two ways in which you might use dissemination as part of evaluation. Firstly, your data-gathering strategy might include plans to review your approaches or materials with your designated target audience (probably the students), with peers, or with other stakeholders. This might simply be part of your summative methods; the boundaries between evaluation and dissemination can be fairly blurred. For example, you may wish to conduct follow-up interviews or focus groups with a sample of students, or you may track who has used particular areas of a website.

Secondly, and in relation to the notion of professional practice raised at the outset of this guide, it is important to document and discuss your evaluation efforts, your findings, and any “next steps” that seem appropriate. This might take the form of a report to your department or to a committee, a consultation/review or staff development event, or a reflective account of the experience and outcomes written for publication. Keeping a regular “diary”, such as a weblog (“blog” for short) with entries at key points, can also be a good (self-)evaluation method and form the basis of a fuller article.

Publishing your evaluation

At Warwick, the Interactions web journal for sharing local innovation in educational technology is a useful vehicle for publishing your teaching developments, experiences and lessons learned; these are of enormous value to others treading the path behind you. For in-depth and rigorous “research” studies, you might consider peer-reviewed academic journals such as Studies in Higher Education (edited by Malcolm Tight of the Institute of Education), the International Journal for Academic Development, and the Association for Learning Technology Journal (ALT-J), as well as discipline-related educational and educational technology publications (see the subject centres for guidance here).