Jay Dempster, Academic Staff Development, University of Warwick
While a growing array of software tools has been developed to enable non-programmers to author computer-based tests and surveys, there is now a visible market for delivering such tests over the internet, predominantly via the web. This article reviews the software package Perception, the latest offering from the world leader in this area of software development, Question Mark Computing, and makes some comparisons with alternative tools. The decision of which tools to invest in (in both purchase cost and time/effort) is discussed in relation to the technological requirements for running the software and, more importantly, to fitting the tools appropriately to your educational objectives.
Perception is the new suite of software from Question Mark Computing - now regarded as the world leader in secure and versatile computerised assessment software. Perception will be of interest to all organisations wishing to conduct surveys, tests, and other assessments across the web or an intranet. Suited to both high- and low-volume needs, Perception is a flexible and easy-to-use application. It harnesses the power of web technology to deliver tests or surveys on-line. Questions are created using templates in a Windows 95/98/NT environment, then stored on a web server. Students (or survey respondents) log in, with a user name and password if desired, and answer the questions, delivered in order or randomly, using standard web browsers. The server software then allows instructors to see reports as the questions are completed, live on the web.
Those of you familiar with Question Mark for Windows (QM Designer), QM Web and Network Guardian might wonder what the difference is between those products and this new suite, Perception. While Perception incorporates the functionality of all of them in one neat package, the real answer is that Perception can only be used with 32-bit Windows operating systems, in terms of both the authoring component and the server software. The QM Web server software (On-line Scorer), on the other hand, will run on any web server. Since our web server software here at Warwick, Apache, is not ISAPI compliant, as is the case at most other UK universities and colleges, we cannot use Perception, only QM Designer with the QM Web add-on.
One of the most attractive features of Question Mark software over other assessment packages has been the ability to create questions in a wide range of formats beyond simple multiple choice and multiple response types, to deliver them in random order, to include jumps (especially useful in evaluations), and to incorporate more intuitive feedback on student responses. The advantage of Perception is that it retains this functionality, whereas QM Web cannot reproduce some of the features of its originator, QM Designer. Unsupported features include hot spot and fill-in-the-blank questions, control page settings, external tests, supplementary questions and retrying. Jumps and multimedia calls are not converted from Windows, and need to be re-specified for the web using HTML tags. Tests created in QM Designer can be imported into Perception, although libraries require a little copying and pasting. If you only wish to incorporate straightforward question types and have a non-ISAPI-compliant web server, such as the UNIX-based Apache, you will need to stick to QM Designer/QM Web. Multiple choice questions are used predominantly in the sciences, and large banks of questions are often available in a variety of disciplines (cf. CTI Centre resources information).
For those new to computer-based assessment, you might consider using the CASTLE toolkit. This is an extremely easy-to-use means of creating multiple choice and multiple response tests, authored and delivered on the web. And more to the point, it's free! Weighed against the high investment required for any of the Question Mark software, this will surely influence how far beyond the multiple choice/multiple response type of assessment you wish to proceed. It is widely acknowledged, however, that the difficulty of designing effective objective tests is the main deterrent to using this method at all for assessing student learning. The availability of more flexible question formats and forms of feedback is a strong incentive to rethink computer-based assessment, since well-designed tests can probe students' knowledge and understanding of concepts more deeply.
My indulgence in comparing at length some of the different tools currently available is, I believe, justified in that this information is vital to those considering going down the track of web-based assessment. The array of choices and add-ons can be daunting and confusing. The decision of which software to invest in (and the price tags for Question Mark software are high) might depend on whether you are an individual lecturer looking for an innovative way to deal with the enormous task of student assessment, or an information systems manager mainstreaming a department-wide or university-wide approach to assessment. The one might influence the other in terms of strategy development in this area.
All these web-based assessment tools can harness the power of the web by enabling full use of graphics and other multimedia components. The ability to dabble in HTML gives you more control over the way the tests look on the web page and lets you build in links to other web resources (perhaps within the feedback).
So, on to the virtues of Perception!
This suite offers the same versatility in producing, delivering and reporting computer-based tests/surveys for which Question Mark is well-known. It is probably the first truly secure solution for web-based assessments. The suite comprises two main components, one for your individual desktop author(s) and one for your web server. Installation from the single CD-ROM is extremely straightforward and the structure of the "Getting Started" documentation for creating questions and tests is easy to follow, although the lack of a detailed index was irritating. The on-line Help was quite useful.
The terminology in Perception is slightly different from that of previous Question Mark software: questions are managed under Topic headings and tests under Sessions. While you create questions within a Question Manager program, sessions are constructed using a separate Session Manager, which allows you to pull in questions from various topics individually, en masse, or at random. Using the Question Wizard (a step-by-step process of filling in boxes based on a pre-defined format), it was fast and easy to get started: I created a series of ten questions in two topics and one session in about an hour, using previously created questions (typed in, or copied and pasted from other documents). A full Question Editor offers greater control over questions and is necessary for inserting additional text blocks, graphics, additional HTML elements, and sophisticated "outcomes". Outcomes are a fabulous means of awarding scores for specific combinations of answer choices, e.g. in multiple response or text matching question types. This is essential for delivering some of that "intuitive" feedback I mentioned above, tied to a student's specific combination of answers.
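To illustrate the idea behind outcomes (this is a hypothetical sketch of the concept, not Perception's actual syntax or scoring engine), an outcome can be thought of as a rule that matches a particular combination of selected answer choices and awards a score together with tailored feedback. The question and rules below are invented for illustration:

```python
# Hypothetical sketch of "outcome"-style scoring for a multiple response
# question: each outcome matches a combination of selected choices and
# awards a score plus tailored feedback. Not Perception's actual syntax.

def evaluate_outcomes(selected, outcomes):
    """Return (score, feedback) for the first outcome whose condition
    matches the set of selected answer choices."""
    chosen = frozenset(selected)
    for condition, score, feedback in outcomes:
        if condition(chosen):
            return score, feedback
    return 0, "No feedback defined for this combination."

# Invented question: "Which of A, B, C, D are prime?" (correct: B and C)
outcomes = [
    (lambda s: s == {"B", "C"}, 2, "Correct: both primes identified."),
    (lambda s: s and s <= {"B", "C"}, 1, "Partly right: you missed a prime."),
    (lambda s: "A" in s or "D" in s, 0, "A and D are not prime; try again."),
]

print(evaluate_outcomes(["B", "C"], outcomes))
print(evaluate_outcomes(["B"], outcomes))
```

Because the rules are checked in order, the fully correct combination is caught before the partial-credit rule, which is how combination-specific feedback stays unambiguous.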
Once questions and sessions have been created, the Session Manager allows the author to publish the tests by writing them to the allotted area on the web server. (This is similar to the way web authoring software such as NetObjects Fusion and Microsoft FrontPage works, and avoids the need for separate FTP uploading.) Perception Server's security manager allows the author (i.e. administrator or tutor) to specify names and passwords for each participant (those taking the test, e.g. students) and the schedule for releasing tests. There is also control at this point over the number of attempts allowed and the duration of the test. Participants log in and see a list of the tests they are eligible to take at that time, along with the test information.
As participants submit their answers, you can immediately start analysing the results using your own customised reports, or Perception's two standard reports:
- a Session Overview report, listing general statistics for all sessions on the server, such as the number of sessions started and finished, low and high scores, the mean score of all completed sessions, and the standard deviation;
- a Question Analysis report, generating statistics such as difficulty, discrimination and standard deviation, as well as the frequency of each answer choice within questions.
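As a rough illustration of the item statistics such a report produces, the sketch below uses the standard classical-test-theory definitions (difficulty as the proportion of participants answering an item correctly; discrimination as the point-biserial correlation between the item score and the total score). These are the conventional formulas, not necessarily the exact ones Perception implements, and the score data is invented:

```python
# Classical item statistics: difficulty (proportion correct) and
# discrimination (point-biserial correlation of item vs. total score).
# Illustrative only; Perception may compute these differently.
from statistics import mean, pstdev

def item_statistics(responses):
    """responses: one list of 0/1 item scores per participant.
    Returns a list of (difficulty, discrimination), one pair per item."""
    n_items = len(responses[0])
    totals = [sum(r) for r in responses]
    stats = []
    for i in range(n_items):
        item = [r[i] for r in responses]
        difficulty = mean(item)
        sd_item, sd_total = pstdev(item), pstdev(totals)
        if sd_item == 0 or sd_total == 0:
            discrimination = 0.0  # undefined if everyone scored the same
        else:
            cov = mean(x * t for x, t in zip(item, totals)) - mean(item) * mean(totals)
            discrimination = cov / (sd_item * sd_total)
        stats.append((round(difficulty, 2), round(discrimination, 2)))
    return stats

# Invented data: five participants, three items (1 = correct, 0 = incorrect)
scores = [[1, 1, 1], [1, 0, 1], [1, 1, 0], [0, 0, 1], [0, 0, 0]]
print(item_statistics(scores))
```

A high positive discrimination means the item tends to be answered correctly by the stronger participants, which is exactly what such a report lets an instructor check question by question.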
Perception stores information on questions, sessions, participants, participant responses, and reports in MS-Access database format, readable by a large number of professional reporting and analysis applications.
In conclusion, Perception is an excellent package. It includes the best of QM Designer/QM Web/Network Guardian in a neater and more manageable form. The high price might put many people off experimenting with web-based assessment in pilot projects, and implementation requires a departmental or institutional strategic commitment. Since a large majority of UK universities and colleges currently run institutional web servers that are not compatible with the Perception server software, its use in UK higher education might be limited to departments running their own servers. It is quite within reason to consider investing in a Windows NT server (departmental or institutional) solely for the purpose of assessment, since a suitable machine would cost less than the Perception software that runs on it. If you consider not only formative and summative student assessment but also automated student evaluations for course feedback, a versatile and secure web-based solution such as Perception looks most appealing!
Comprehensive information can be found in the Perception area of the Question Mark website, including screenshots of the program components, examples of tests and surveys created using Perception, an evaluation version for downloading, and tutorials and FAQs (customer support site).
Dr Jay Dempster
Centre for Academic Practice
University of Warwick