From authoring to reporting - the eAssessment lifecycle
Understanding the eAssessment lifecycle
The eAssessment lifecycle outlines the essential steps involved in designing, delivering, and reviewing effective assessments online, ensuring they align with learning outcomes and provide meaningful feedback.
While this framework is not exclusive to Questionmark Perception (QMP) and can be adapted to other tools, we have highlighted specific QMP functionality where applicable to help users make the most of its features.
By following this structured approach, from planning and creating question banks to delivering assessments and analysing results, educators can enhance the reliability, validity, and impact of their assessments, fostering better learning outcomes and an improved student experience.
Define the purpose of the assessment
- Determine whether it will be a diagnostic test, formative weekly assessment, or summative end-of-term exam.
- Align the purpose with learning outcomes and course objectives.
Select appropriate question types
- Consider multiple-choice, essay, true/false, matching, or hotspot questions based on learning outcomes.
- Ensure a balance of question types to assess different cognitive levels (e.g., recall, application, critical thinking).
Design feedback and links to learning
- Decide the type of feedback: immediate vs. delayed, generic vs. personalised.
- Link feedback to additional resources for further study, such as reading materials, videos, or related modules.
Determine time limits and conditions
- Set appropriate time constraints based on the complexity and length of the assessment.
- Plan for accessibility accommodations and any conditional assessment pathways (e.g., unlocking additional sections based on performance), as sketched below.
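As an illustration only, the following Python sketch shows the kind of gating rule behind a conditional pathway; the 60% threshold and section names are assumptions for the example, not QMP settings.

```python
def unlocked_sections(score_pct, threshold_pct=60):
    """Everyone sees the core section; an extension section unlocks
    once the candidate's score reaches the (hypothetical) threshold."""
    sections = ["Core section"]
    if score_pct >= threshold_pct:
        sections.append("Extension section")
    return sections

print(unlocked_sections(72))  # ['Core section', 'Extension section']
print(unlocked_sections(45))  # ['Core section']
```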
Online Exams
Some tips for applying the lifecycle when delivering an online exam.
Planning an online exam
Define objectives clearly: ensure the purpose of the exam is specific (diagnostic, formative, or summative) and aligns with course learning outcomes.
Plan for the online format: consider how the online medium might affect question design. For example, avoid overly complex diagrams that may not display well on small screens.
Anticipate technical needs: identify technical requirements early, such as secure exam delivery tools, stable internet access, and device compatibility.
Creating the question bank
Categorise questions by difficulty: tag questions as basic, intermediate, or advanced to ensure a balanced exam.
Prepare for randomisation: write multiple variations of key questions to enable randomisation while maintaining consistency in difficulty.
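A minimal sketch of how a balanced random draw could work is shown below; the bank layout, difficulty tags, and quotas are illustrative assumptions rather than QMP's internal format.

```python
import random

# Hypothetical question bank: each entry is (question_id, difficulty_tag).
bank = [
    ("Q01", "basic"), ("Q02", "basic"), ("Q03", "basic"),
    ("Q04", "intermediate"), ("Q05", "intermediate"), ("Q06", "intermediate"),
    ("Q07", "advanced"), ("Q08", "advanced"),
]

def draw_balanced(bank, quota):
    """Randomly pick quota[tag] questions per difficulty tag,
    so every generated paper has the same difficulty profile."""
    paper = []
    for tag, n in quota.items():
        pool = [q for q, t in bank if t == tag]
        paper.extend(random.sample(pool, n))
    random.shuffle(paper)
    return paper

print(draw_balanced(bank, {"basic": 2, "intermediate": 2, "advanced": 1}))
```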
Designing the assessment
Incorporate real-world scenarios: use application-based questions with contextual prompts that require students to justify their answers and build in decision-making.
Optimise time allocation: allocate time per question based on its complexity, using a rule of thumb (e.g., 1–5 minutes for multiple-choice questions, 10–30 minutes for essays).
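To make the rule of thumb concrete, a quick calculation like the one below can sanity-check the overall time limit; the question counts and per-type minutes are assumed values within the ranges above.

```python
# Assumed minutes per question type, drawn from the rule of thumb above.
minutes_per_type = {"mcq": 2, "essay": 20}

# Hypothetical paper: 25 MCQs and 2 essays.
paper = {"mcq": 25, "essay": 2}

total = sum(minutes_per_type[q] * n for q, n in paper.items())
print(f"Estimated working time: {total} minutes")  # 90 minutes
```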
Verification and peer review
Run usability checks: ensure all multimedia (e.g., images, videos) load quickly and correctly on various devices.
Test navigation: simulate the student experience to check if the assessment flows logically and instructions are clear.
Scheduling exams
Stagger start times: if students are in different time zones, or you are concerned about system stability, provide multiple start times to ensure fairness.
Set a buffer window: allow a grace period at the end of the exam to account for minor technical disruptions.
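The snippet below sketches one way to generate staggered exam windows that include a grace buffer; the 30-minute stagger, 90-minute duration, and 10-minute buffer are illustrative choices, not platform defaults.

```python
from datetime import datetime, timedelta

def exam_windows(first_start, cohorts, duration_min=90,
                 stagger_min=30, buffer_min=10):
    """For each cohort, compute a staggered start time and an end time
    that includes a grace buffer for minor technical disruptions."""
    windows = {}
    for i, cohort in enumerate(cohorts):
        start = first_start + timedelta(minutes=i * stagger_min)
        end = start + timedelta(minutes=duration_min + buffer_min)
        windows[cohort] = (start, end)
    return windows

for cohort, (start, end) in exam_windows(
        datetime(2025, 6, 2, 9, 0), ["Group A", "Group B", "Group C"]).items():
    print(cohort, start.strftime("%H:%M"), "-", end.strftime("%H:%M"))
```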
Delivering exams
Offer a practice exam: provide a short, timed mock assessment to help students understand the platform and question formats.
Communicate expectations clearly: include rules about permitted materials, behaviour, and how to handle technical issues during the exam.
Analysing reports
Identify question-level insights: use item analysis reports to pinpoint questions with low discrimination or ambiguous wording.
Evaluate timing: review time-on-question data to determine if the time limit was appropriate.
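For a quick check outside the platform's built-in reports, a classical upper-lower discrimination index can be computed from raw item scores; the sketch below assumes simple 0/1 item scoring and hypothetical data, and does not use any QMP API.

```python
def discrimination_index(item_scores, total_scores, group_frac=0.27):
    """Classical upper-lower discrimination index D:
    proportion correct in the top 27% of candidates (by total score)
    minus the proportion correct in the bottom 27%.
    Items with D below roughly 0.2 are worth reviewing."""
    n = max(1, round(len(total_scores) * group_frac))
    ranked = sorted(range(len(total_scores)),
                    key=lambda i: total_scores[i], reverse=True)
    upper, lower = ranked[:n], ranked[-n:]
    p_upper = sum(item_scores[i] for i in upper) / n
    p_lower = sum(item_scores[i] for i in lower) / n
    return p_upper - p_lower

# Hypothetical data: 1 = correct on this item, alongside each candidate's exam total.
item = [1, 1, 1, 0, 1, 0, 0, 0, 1, 0]
totals = [95, 90, 88, 84, 80, 60, 55, 50, 45, 40]
print(round(discrimination_index(item, totals), 2))  # 0.67
```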
Reviewing and improving
Capture student feedback: ask students for feedback on the exam experience, focusing on clarity, fairness, and technical functionality.
Document lessons learned: record insights and adjustments made to inform future online exams.