Case study: Module Evaluation
In 2017, WIHEA Fellows formed a Learning Circle of colleagues interested in moving towards a more learning-centred approach to module evaluation, one that would help inform students when choosing their modules and allow staff to enhance modules where appropriate. The Fellows, who included Student Fellows, investigated existing practices in departments and peer institutions and developed good practice guidance for module evaluation.
The policy proposals that followed focused on evaluating modules using a range of inputs, including staff evaluation, attainment data, peer input where available, and student feedback. For the student feedback element, a common set of questions was developed, and funding was secured to develop a Moodle-based module survey mechanism and a data storage arrangement that allows departments to gather and store data in an accessible way, thus supporting effective use of the data over time.
Department(s) / colleagues involved
Warwick International Higher Education Academy (WIHEA) Fellows representing a wide range of academic departments and professional services teams.
Our aim was to …
Our aim was to ensure that a common set of questions was established for use by as many departments as possible, so that students and staff would have comparable and reliable data to inform module evaluation and module choices. We wanted to draw on good practice elsewhere and at Warwick to develop a pedagogically sound set of questions. The questions were only part of the overall aim, which was to ensure that module evaluations were (re)focused on a range of data and inputs, and were geared towards enhancement and information provision.
We were also aware that greater external accountability through the Teaching Excellence Framework (TEF), which requires more collective use of data (across combined subjects), would become an important consideration, and that data storage and survey collection therefore needed to be effective.
What we did …
The Learning Circle met every few weeks for about five months. We asked departments what they already did, what they would like to do, and how they fed module evaluation findings back to their students. We spoke with other institutions to find out what they did and why, and we reviewed a large body of peer-reviewed research on the subject.
Several debates followed about what would constitute good practice in the Warwick context. We felt strongly that good guidance was needed on how to use and interpret feedback data, and what risks there might be. In particular, we discussed whether to take a teaching performance approach (how was the teaching delivered?) or to focus on student learning (were students challenged to learn?), and decided on the latter. We also had particular discussions about the distinct nature of a Warwick education: intellectual challenge is key. This is why we included a question on intellectual challenge as well.
Most importantly, we wanted the focus to be on enhancement and on taking a constructive approach. This is why, unusually, the first question became an open one, asking students to note the one thing that had had the most impact on their learning.
We worked in three groups: one on the pedagogical matters, one on the policy approach and how to embed the desired change, and one on the operational side, specifically the design of the feedback survey in Moodle and the data storage arrangements.
We then formulated the policy and worked together on consulting with departments and taking the policy through the relevant committees. This included bidding for funding for the online and in-class survey system. By July 2018 the proposed policy had been accepted by the Faculty Education Committees, SLEEC, AQSC and the Education Committee, and was finally supported by Senate.
The outcome has been …
The many debates in the WIHEA Learning Circle have developed a greater understanding and knowledge of module evaluations, student survey questions and the sound use of student feedback data. The majority of departments will be using the common questions from 2018/19 onwards. Some departments have signed up to use the Moodle system, others are using paper versions which are machine-read, and still others are using their own mechanisms. We expect that the paper versions will be phased out over time.
Importantly, the common questions are a starting point: departments are still encouraged to add their own questions where these are relevant to the specific interests of the department and its students.
The benefit/impact has been …
We do not yet know what the impact of the new approach will be, as it has only just come into effect. What is already clear is that developing the policy from the classroom up (Fellows are actively teaching or directly supporting students) has made it credible and relevant to academic practice. That has given us confidence to address future policy needs in a similar way.
This supports the Education Strategy by …
This activity supports the Education Strategy by contributing to the ‘Continuous Development of Teaching Excellence’. This states that ‘we will … Engage in continuous enhancement of teaching and learning through partnership with students and staff and informed by peers and teaching-related research’. Module evaluations contribute to the enhancement of learning and teaching by gathering and evaluating data in a coherent and purposeful manner.
Our next steps will be …
Our next step is to gather departmental feedback on the use of the common questions and the Moodle system that supports them. We will also evaluate how students have interpreted the questions and propose any changes needed.
We are also drafting full guidance and a set of related resources to support departments, students and individual staff wishing to explore how to make best use of module evaluation and associated data.
To find out more, you can contact …
Gwen van der Velden, Deputy Pro-Vice-Chancellor (Student Learning Experience)
G dot Van-der-Velden at warwick dot ac dot uk