
Sharing expertise in digital education

Expertise in digital education is devolved: students, staff, professional services colleagues, and academics all have a stake in it as users, and can help us to understand the affordances and limitations of our digital approaches.

The PGA Digital Education programme invites colleagues to critically interrogate an area of professional practice or policy with the student experience in mind, through an evaluative inquiry. More than an exploratory exercise, the inquiry asks colleagues to make recommendations for improvement and to develop an enhanced understanding of what the digital means across the disciplines. Explore some of the inquiry work completed by our first cohort of colleagues below!

Visual communication in asynchronous STEM learning

Synchronous online modes of interaction

Peer adjustment policy as digital artefact

Re-designing Moodle pages with students

AI 'gap' on assignment briefs and rubrics


Negar Riazifar (WMG)

AI and assessment rubrics


What was your evaluative inquiry all about?

"My PGA Digital Evaluative Inquiry examined the gap between generative-AI levels on assignment briefs and pre-AI rubrics. It evaluated the effects on clarity, inclusion and wellbeing, then proposed alignment: judgement and traceability rubric lines, a short declaration, VLE wording, and moderation support."


Negar is an Assistant Professor (Teaching-Focussed) in Engineering Mathematics

Why did you want to explore this area of digital education?
I explored this area because declared AI levels were visible on briefs but absent from rubrics, leaving students and markers guessing. In my UG and PG modules, students using MATLAB or Python with AI code assistants asked what was permitted, and departmental colleagues worried about fair, consistent marking. The mismatch was driving uncertainty, appeals and uneven feedback. I wanted a practical, low-workload fix that supports learning, inclusion and wellbeing.

What did you discover?

I discovered that the five-level AI policy improved visibility and reduced guesswork, but rubrics staying silent created a fairness gap. Students could not tell whether judgement, verification and transparency mattered, so they focused on tool fluency and rule compliance. Markers reported inconsistent interpretations, slower moderation, and feedback drifting towards policing rather than learning. Across coding and quantitative tasks, the mismatch increased cognitive load and uncertainty, with likely knock-on effects for inclusion and wellbeing.

What change would you hope this inquiry could make?
I hope this investigation shifts AI from a compliance problem to a learning signal. The change I want is simple: when an assignment declares an AI level, the rubric should make the expectations assessable. A short add-on would reward judgement and traceability, not polished output alone, and a concise declaration would normalise transparency without turning assessment into surveillance.


Kim Lockwood Clough (Liberal Arts)

Affective factors in Moodle

What was your evaluative inquiry all about?

"My PGA Digital Evaluative Inquiry focused on how the design and organisation of Moodle pages can support accessible, inclusive, and engaging digital sites for learning. The Inquiry drew on my experiences of collaboratively redesigning a departmental Moodle page with student co-creators."


Kim is an Assistant Professor (Teaching-Focussed) in Liberal Arts

Why did you want to explore this area of digital education?
Virtual Learning Environments (VLEs) like Moodle are at the heart of teaching and learning in UK higher education; they’re not the flashiest or fanciest of digital technologies, but they are spaces that teachers and learners both interact with frequently. Inclusive, accessible VLE design has also been linked to more equitable student outcomes as well as improved engagement with learning. I wanted to use the opportunity of the Evaluative Inquiry to explore how digital pedagogical practice can be developed and improved through engagement with User Experience (UX) principles and by drawing on theories of emotional design for learning.

What did you discover?
Working with the student co-creators, we identified key attributes that the Moodle page should have: materials should be navigable, accessible, relevant, and engaging. The student co-creators were particularly emphatic about the importance of materials being engaging, which surprised me. This led me to focus my research on the role of emotion in learning engagement, which can often be overlooked. Given that we often ask students to engage with Moodle as part of their independent learning around in-person teaching events, it can be important to consider how emotional engagement affects motivation for sustaining independent learning.

What change would you hope this inquiry would make?
I hoped this investigation would identify key attributes within digital design to support student learning, which could be aligned with design features to create a set of transferable digital design principles to support student engagement with learning.
The difficult next step is moving from principles to practice: research over the past decade has consistently found that teaching staff feel they lack time, training, support, and recognition for their work in VLEs.
As a first step, I'm aiming to translate the findings of the project into a VLE template for use within my department, while still maintaining ease of use and flexibility within the format.

My wider research and teaching centres on emotion, embodiment, and material culture, and the PGA Digital Evaluative Inquiry provided an exciting opportunity to draw these ideas into dialogue with digital design and pedagogical theory.

Kim Watts (WMG)

Visual communication in asynchronous STEM learning

What was your evaluative inquiry all about?

"My inquiry evaluated how visual communication can enhance asynchronous learning in STEM higher education. It explored whether consistent, well-designed visual structures could reduce cognitive overload, improve clarity and accessibility, and create a more engaging, learner-centred digital experience within the virtual learning environment."


Kim is an E-Learning Multimedia Developer in WMG

Why did you want to explore this area of digital education?

This inquiry was inspired by a combination of student feedback, sector-wide observations, and my professional experience. These sources repeatedly highlighted that asynchronous learning spaces, such as Moodle, can result in fragmented content, inconsistent design, and increased cognitive overload when learning experience design principles are not applied. Alongside this, my appreciation for the impact of effective visual communication strongly influenced my decision to explore this area.
With a significant proportion of STEM learning now delivered in a hybrid format, poorly structured digital environments risk disengagement and surface learning. As a visual communication specialist working closely with academics, I saw an opportunity to apply evidence-based design principles to improve learner autonomy, motivation, accessibility, and wellbeing. The transition at WMG to a four-week delivery block highlighted the growing importance of asynchronous delivery and reinforced the need for digital learning materials that are not only informative, but coherent, navigable, and pedagogically intentional.

What did you discover?
The inquiry revealed that many asynchronous learning materials were presented as disconnected, file-based resources with limited visual structure or multimodal integration. While pockets of good practice existed, inconsistency across modules contributed to confusion and increased extraneous cognitive load. The literature consistently showed that simply adding technology or media does not enhance learning; impact depends on clarity, segmentation, signalling, and alignment with learning outcomes. I also found strong evidence that intentional visual design, including visual overviews, multimodal content, and consistent structures, can improve engagement, reduce overload, and support deeper understanding, particularly in complex STEM subjects.

What change would you hope this inquiry would make?

This investigation seeks to reposition visual communication and learning design as core to effective digital education, rather than optional enhancements. It demonstrates how small, intentional design interventions can meaningfully improve understanding, navigation, accessibility, and student confidence in asynchronous environments. The work advocates a move away from dense, disconnected resources towards structured, visually led learning pathways embedded at programme level. Its outcomes have already influenced my professional practice through the development of reusable design frameworks, coherent learning journeys, and the strategic use of authoring tools to support academic teams. The inquiry has also established a clear trajectory for further research into design-led approaches to engagement and learning in STEM contexts.

Amy Stickels (WBS)

Synchronous online modes of interaction

What was your evaluative inquiry all about?

"My inquiry focused on the use of online, live, synchronous sessions, specifically evaluating how interaction is fostered in three areas: learner-content, learner-learner, and learner-teacher."


Amy is a Teaching and Learning Consultant in WBS

Why did you want to explore this area of digital education?

Having been a participant in a number of programmes and courses with online features, I was interested in exploring the transactional distance that is experienced in online classrooms. I also have an avid interest in active learning pedagogies, so I wanted to combine the two elements, online delivery and interactivity, to see how interaction can bridge that distance.

What did you discover?
Interactions in online classrooms are largely learner-content or learner-teacher: most interaction is mediated by the teacher, who delivers the content and asks and responds to questions (usually via the chat function). Transactional distance is reduced by simple acts, such as using students' names when reading out questions from the chat and being present on screen, but also by planning more interaction, such as activities for students to do together.

What change would you hope this inquiry would make?

This inquiry formed part of a larger piece of work looking at online classrooms. From this, good practice has been identified and a self-review tool has been created for teachers to use to help them to evaluate the effectiveness of their own online classes.

Amir Cheshmehzangi (WMG)

Peer adjustment policy as digital artefact

What was your evaluative inquiry all about?

"An evaluative inquiry into WMG’s Peer Adjustment policy for group work, treated as a digital education artefact. I analysed how the Moodle-based peer scoring process shapes fairness, student voice, inclusion, and wellbeing, and how it holds up as AI-enabled collaboration makes “individual contribution” increasingly difficult to define."


Amir is a Postgraduate Researcher and Associate Tutor in WMG

Why did you want to explore this area of digital education?

I wanted to explore this because peer adjustment sits at the intersection of assessment integrity and student experience. At WMG, the policy is widely used across MSc and apprenticeship group projects, yet I have seen confusion, anxiety, and reluctance to report problems. The policy is procedurally clear but primarily written for staff, which may lead students to perceive it as opaque or punitive. I was concerned that collaboration practices, including AI tools such as ChatGPT, GitHub and Copilot, blur authorship and effort in ways the policy does not anticipate. An evaluative lens helps shift the conversation from compliance to care and inclusion.

What did you discover?

I found a consistent gap between procedural clarity and pedagogical transparency. Students can follow the steps but lack scaffolding to understand how scores are interpreted, what thresholds mean, or what “fair contribution” entails. Despite the language of student empowerment, the process concentrates authority in staff and provides limited guidance on moderation, which risks inconsistent decisions. The design can generate emotional strain; students may hesitate to report peers for fear of conflict or retaliation. Finally, the policy is not future-proofed for AI-enabled collaboration, in which shared cognitive labour and tool use complicate individual attribution.

What change would you hope this inquiry would make?
I hope this work helps WMG evolve peer adjustment from a back-end compliance mechanism into a learning-oriented, inclusive process. The most immediate change would be a student-facing companion resource in Moodle: plain-language guidance, FAQs, a short explainer video, and worked scenarios, to ensure expectations and consequences are transparent. Next, I would introduce a low-stakes, mid-project formative peer feedback point to build feedback literacy and reduce end-point pressure, alongside clearer staff-moderation guidance to ensure consistent decisions. Finally, the policy needs explicit guidance for AI and digital collaboration, recognising shared cognitive labour and acceptable tool use. The inquiry includes an implementation roadmap; Phase 1 is to pilot the student-facing resources in selected modules and gather feedback from students and tutors.

I would be happy to share the draft student-facing companion resource outline (FAQs, scenarios, explainer structure) and discuss inclusive assessment design for group work and digital policy.
Contact me: amir.cheshmehzangi.1@warwick.ac.uk
