
Supporting Computer-based Assessment

Joanna Bull, Teaching and Learning Directorate, University of Luton 

Computer-based assessment (CBA) has been used at the University of Luton for the last five years to deliver both summative and formative assessments in a wide range of subjects. Central support is provided for academic staff who wish to use the system, which is based on Question Mark Designer for Windows software. The system evolved from a pilot study in 1993 and now encompasses approximately 30 modules each semester. The majority of use is for the delivery of end-of-module examinations, but increasing use of formative and self-assessment tests is being made as staff become aware of the benefits of the system (Zakrzewski and Bull, 1998). 

Role of Central Support Staff

The Unit for Learning, Technology Research and Assessment (ULTRA) is responsible for co-ordinating the delivery of examinations and other CBA, working with staff in the Information Services Division and the Examinations Office. Academic staff are given support and guidance in writing objective test questions. The questions are submitted as a Word or ASCII file, which is then converted into Question Mark tests by staff in ULTRA. Emphasis is placed firmly on academic staff developing question-writing skills, and efforts are made to encourage collaboration between colleagues in similar discipline areas and to support staff in investigating existing question banks. The time-consuming nature of writing questions for objective tests means that it is important that questions can be re-used. Policies are in place which require academic staff to incorporate a percentage of new questions each subsequent time they use an examination. This helps staff to build up a bank of questions which they can amend and add to. Supporting staff in managing the cultural shift towards investing time before the examination period, rather than after it, is also important. For summative examinations a series of deadlines and documented requirements ensures that staff submit tests in line with existing examination regulations and procedures and allows a process of proof-reading and revision to take place. 
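
As a rough illustration of the kind of conversion involved, the sketch below parses a simple plain-text multiple-choice format into structured question records. Both the file format and the parser are hypothetical; at Luton the actual conversion into Question Mark tests is carried out by ULTRA staff using the Question Mark software itself.

```python
# Hypothetical sketch only: an invented plain-text question format and a
# small parser that turns it into structured records. The real conversion
# at Luton used Question Mark Designer for Windows, not this code.

def parse_questions(text):
    """Parse blocks separated by blank lines, e.g.

    Q: Which package is used to deliver the tests?
    A: Question Mark Designer for Windows *
    A: A spreadsheet

    A trailing asterisk marks the correct option.
    """
    questions = []
    for block in text.strip().split("\n\n"):
        stem, options, correct = None, [], None
        for line in block.splitlines():
            line = line.strip()
            if line.startswith("Q:"):
                stem = line[2:].strip()
            elif line.startswith("A:"):
                option = line[2:].strip()
                if option.endswith("*"):
                    correct = len(options)          # index of the correct option
                    option = option.rstrip("* ").strip()
                options.append(option)
        if stem and options:
            questions.append({"stem": stem, "options": options, "correct": correct})
    return questions

sample = """Q: Which software package is used to deliver the tests?
A: Question Mark Designer for Windows *
A: A spreadsheet
A: A word processor"""

for q in parse_questions(sample):
    print(q["stem"], "->", q["options"][q["correct"]])
```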

There is no requirement for staff to use the computerised assessment system and, following the initial piloting, the number of staff making use of the system has grown steadily. Academic staff using the system are often its best promoters and, as a result, little internal marketing has taken place. 

In implementing CBA it is important to recognise the limitations of the technology. Technological innovation in teaching, learning and assessment can be hindered by unrealistically high expectations. Should technology fail to meet these expectations some academic staff may be discouraged from using any type of technology to support teaching, learning and assessment in the future. Support staff need to be aware of the limitations of the technology and be willing and able to suggest alternative approaches, whether they be technology-based or not. 

Educational Rationale

The majority of tests delivered are for Level One modules, with a predominance of subjects in Science, Technology, Psychology and Business Studies. Objective testing is seen to be suitable for testing some subject areas at some levels, and is always used in conjunction with other assessment methods. Where a broad base of knowledge is to be tested with large groups of students, CBA is particularly beneficial in terms of automatic marking and speed of feedback. 

While it is possible to write objective test questions which test higher order skills, such items are difficult to construct and take practice and time (Brown, Bull and Pendlebury, 1997). As staff develop their skills and become familiar with the process they use a wider range of question types and increase the level of sophistication of their questions. However, it is important to recognise that in using objective test questions it is the marking which is objective - the questions are only as objective as the test writer makes them. 

Academic staff are encouraged to discuss their requirements with central support staff and to consider the learning outcomes and existing assessment methods used on the module. 

The nature of writing objective test questions requires academic staff to think carefully about exactly what knowledge and which skills they are testing. This sometimes results in amendments to existing assessment methods in order to enhance the balance of skills and knowledge being assessed on the module as a whole. 

The key to effective implementation of CBA is its use as an appropriate assessment tool for the particular module, its learning outcomes and students. It should be seen as part of a wider strategy which enhances the effectiveness of assessment methods employed across the whole degree programme. Central to this is the provision of a balance of assessment methods which test the range of skills, abilities and knowledge required by the learning outcomes of the module. This is clearly a difficult task and one which is the responsibility of both academic and support staff. 

Following each examination academic staff are provided with a statistical report detailing the facility and discrimination of the test questions, which allows them to evaluate their questions. Unsuccessful distractors and questions can then be amended or replaced for subsequent years' examinations. Staff are encouraged to compare results from CBA with those of previous cohorts and with previous and alternative assessment methods. Student evaluation of the experience of computerised examinations was sought during the piloting of the system and their responses were in the main very positive. 
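
Facility (the proportion of candidates answering a question correctly) and discrimination (how well a question separates stronger from weaker candidates) are standard item-analysis statistics. The sketch below is a minimal illustration of how such figures might be computed from a matrix of question scores; it is not the report produced by the Question Mark system, and the point-biserial correlation used here is only one of several possible discrimination measures.

```python
# Minimal item-analysis sketch: facility and discrimination per question.
# Assumes a 0/1 score matrix (rows = candidates, columns = questions).
# Illustration only - not the statistical report generated at Luton.

def item_analysis(scores):
    """Return a list of (facility, discrimination) tuples, one per question."""
    n_candidates = len(scores)
    n_items = len(scores[0])
    totals = [sum(row) for row in scores]
    results = []
    for j in range(n_items):
        item = [row[j] for row in scores]
        facility = sum(item) / n_candidates
        # Correlate the item with the candidate's score on the rest of the
        # test (total minus the item) to avoid inflating the correlation.
        rest = [totals[i] - item[i] for i in range(n_candidates)]
        results.append((facility, _correlation(item, rest)))
    return results

def _correlation(x, y):
    """Pearson correlation, returning 0.0 when either variable is constant."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

# Example: four candidates, three questions.
scores = [[1, 0, 1],
          [1, 1, 1],
          [0, 0, 1],
          [1, 0, 0]]
for q, (fac, disc) in enumerate(item_analysis(scores), start=1):
    print(f"Q{q}: facility={fac:.2f}, discrimination={disc:.2f}")
```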

Approaches Across the Disciplines 

The majority of CBA at the University of Luton takes place in the following subject areas: Biology, Mathematics, Psychology, Computing, Information Technology, Politics and Public Policy, Leisure Studies, Marketing and Accounting. The approach taken by staff varies depending on the content and context of the test they are creating. Formative and self-assessment tests often include detailed, tutor-defined, question-specific feedback which supports student learning throughout the module. Subjects such as Biology and Leisure use graphics to support a range of question types, while Mathematics and Accounting naturally make greater use of numeric questions. 

At the University of Luton, as in the rest of the higher education sector, the use of CBA in the arts and humanities is less well developed. There has been some use of CBA in Media Studies and Languages for both summative and formative assessments. In the coming academic year additional modules in Media Studies will include an element of CBA and staff from History also intend to develop CBA material for their modules. 

While there is potential to develop appropriate CBA material within the humanities, the lack of relevant existing material often deters academic staff from becoming involved - somewhat of a chicken and egg situation! Staff within the arts and humanities may need to take a different approach to the use of CBA - it may be more appropriate for formative or self-assessment, particularly to provide timely and effective feedback. The impetus to use CBA may come from their exploration of computer-aided learning (CAL) materials. For staff in other disciplines, larger student numbers have often acted as the impetus for developing CBA. In the arts and humanities CAL is perhaps a better starting point, allowing staff to explore the use of technology in teaching and learning on a more general level before venturing into the potentially more problematic realm of CBA. 

Conclusions 

The development of the CBA system at the University of Luton has required excellent communication between support and academic staff as well as co-ordination between Faculties and central support. Pedagogic issues are key and it is important that the technology does not drive the assessment. A university-wide strategic approach is highly desirable, but within departments and for individuals, identification of key individuals who can assist the process can be most valuable (Stephens, Bull and Wade, 1998). Piloting on a small scale enables evaluation of the process and the development of appropriate strategies to implement on a wider scale. 


References

Brown, G, Bull, J, and Pendlebury, M (1997) Assessing Student Learning in Higher Education, Routledge: London. 

Stephens, D, Bull, J and Wade, W (1998) Computer-assisted Assessment: suggested guidelines for an institutional strategy, Assessment and Evaluation in Higher Education, 23 (3), pp 283-294. 

Zakrzewski, S and Bull, J (1998) The Mass Implementation and Evaluation of Computer-based Assessments, Assessment and Evaluation in Higher Education, 23 (2), pp 141-152.


Dr Joanna Bull
ULTRA
University of Luton
Email: joanna.bull@luton.ac.uk


