
Developing and Evaluating Courses to Meet Learning Outcomes

Martin Oliver, Teaching & Learning Technology Centre, University of North London

Contextual support

The competing demands on staff time mean that only a handful of motivated staff will engage with curriculum redevelopment out of interest. In order to reach the mainstream as well as the enthusiasts, it is important to provide incentives and requirements that encourage participation (Oliver et al., 1999). One valuable incentive is recognition; for this reason, the EFFECTS project was set up to provide a framework for national accreditation of professional development in the use of learning technology (Bailey et al., 1998). 

Part of this process of seeking accreditation included the creation of ‘generic learning outcomes’ that could be implemented within professional development programmes. These could be achieved through different routes, responding to the local context within each institution. To ensure the relevance and usefulness of these outcomes, they were ‘reality tested’ through consultation with stakeholders and revised (Phelps et al., 1998). The current list includes: 

  1. Conducted a review of C&IT in learning and teaching and shown an understanding of the underlying educational processes
  2. Selected appropriate C&IT with an understanding of the underlying educational processes
  3. Planned the integration of appropriate C&IT
  4. Managed and implemented a developed strategy
  5. Evaluated impact of the interventions
  6. Disseminated the findings of the evaluation
  7. Reviewed and planned personal development needs in relation to embedding C&IT and made an appropriate CPD action plan to meet individual requirements

It is important to recognise that, at this stage, the outcomes do not represent standards. This shift will only take place once common levels of attainment are agreed, which is planned to take place in conjunction with national accrediting bodies later in the project. Instead, they can be viewed as an "agenda for change"; in many ways they act as values that outline what staff engaging in this process ‘ought’ to be able to demonstrate. As institutional programmes receive internal validation and eventually national accreditation, it is this level of good practice that will form the basis of certification. 

Importantly, at the heart of these outcomes is the notion of reflective practitioners. For a participant to successfully complete an EFFECTS course, they have to think critically about how their use of Learning Technology will help students to meet the learning outcomes for the course. They are also required to share what they learn about this process with other practitioners. In this way, EFFECTS has attempted to make the improvement of course design an issue for discussion amongst communities of practitioners. 

Designing and developing courses

Even when staff choose to engage in course design, it is important to recognise that this activity requires specialist skills and experience. Few staff have received formal training in curriculum development or instructional design, and many will have received only a brief and informal introduction to pedagogy. It is important that this specialist knowledge is made available to staff in a format that supports, rather than challenges, their expertise and professionalism. 

The Evaluation of Learning Technology (ELT) project has developed two toolkits to support staff: one for course design and one for evaluation (covered in the following section). These toolkits consist of a model of the design process, with activities and customisable knowledge bases used to support each key decision point (Oliver & Conole, 1999). 

Media Advisor is a software implementation of the course design toolkit developed within ELT (Kewell et al., 1998). The software includes three key components: a tool for describing teaching techniques, a tool for modelling the educational processes within a course, and a resource that supports the planning and costing of the development process. The first of these tools uses a simple Likert scale to describe how good different teaching techniques are at supporting the delivery of information, discussion, activities that allow learners to try out concepts or skills, and feedback to the learner on their progress (Figure 1). An important aspect of this tool is that it does not use pre-defined values; users are required to describe what they do (or plan to do) to ensure that the model has local relevance and to avoid unjustified assumptions. It also allows staff to define their own terminology; this ensures that innovative or specialist teaching techniques can be included as easily as traditional methods such as lecturing. 

Figure 1: The Media Rater Tool

The second tool, the modeller, is linked to the rater. By entering the number of hours that students are expected to spend involved in each form of teaching, a ‘course profile’ is created. This combines the allocation of time with the information in the rating tool in order to show the types of educational interaction the course is good or bad at supporting (Figure 2). This information then forms the basis of a professional, qualitative judgement about whether or not this profile is appropriate, given the aims (and learning outcomes) of the course. If it is not appropriate, potential changes can then be modelled and compared, allowing the user to assess the relative merits of different approaches. 

Figure 2: The Course Modeller Tool
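The rating and modelling steps described above can be sketched in code. This is an illustrative reconstruction of the logic, not Media Advisor's actual implementation; the technique names, Likert ratings and hour allocations are invented examples.

```python
# Likert ratings (1 = poor, 5 = good) describing how well each teaching
# technique supports four types of educational interaction. In Media
# Advisor these values are supplied by the user, not pre-defined;
# the figures below are hypothetical.
DIMENSIONS = ["delivery", "discussion", "activity", "feedback"]

ratings = {
    "lecture":  {"delivery": 5, "discussion": 1, "activity": 1, "feedback": 2},
    "seminar":  {"delivery": 2, "discussion": 5, "activity": 3, "feedback": 4},
    "lab_work": {"delivery": 1, "discussion": 2, "activity": 5, "feedback": 3},
}

# Hours students are expected to spend in each form of teaching.
hours = {"lecture": 20, "seminar": 10, "lab_work": 12}

def course_profile(ratings, hours):
    """Weight each technique's ratings by its allocated hours to give an
    overall profile of the interactions the course supports well or badly."""
    total = sum(hours.values())
    return {
        dim: sum(ratings[t][dim] * h for t, h in hours.items()) / total
        for dim in DIMENSIONS
    }

print(course_profile(ratings, hours))
```

Alternative course designs can then be compared by re-running the profile with different hour allocations, supporting the kind of qualitative judgement the modeller is intended to inform.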

The third key component of Media Advisor concerns the costs, resources and skills required to implement changes (Figure 3). The primary purpose of this section is to assess whether the course models under consideration are achievable, or if users would be better advised to consider a compromise, at least in the short term. Unlike the previous sections, this resource contains suggested data. This was included to reflect the fact that many practitioners find it hard to estimate development times, particularly for media such as the Internet with which they have little prior experience. Nonetheless, users are encouraged to customise this data to reflect their own skills and abilities. 
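The pattern of suggested-but-customisable data could look something like the following sketch. The effort figures and media names are invented for illustration and are not Media Advisor's built-in data.

```python
# Hypothetical suggested development times: hours of staff effort per
# hour of student contact, for a few kinds of teaching media.
SUGGESTED_EFFORT = {
    "lecture": 5,
    "web_pages": 40,
    "interactive_tutorial": 100,
}

def estimate_effort(plan, overrides=None):
    """Total development effort (in hours) for a planned course.

    Uses the suggested figures unless the user supplies their own
    estimates, reflecting their particular skills and experience."""
    effort = dict(SUGGESTED_EFFORT, **(overrides or {}))
    return sum(effort[medium] * contact_hours
               for medium, contact_hours in plan.items())

# A user experienced with the web halves the suggested figure for
# web pages before costing a plan.
plan = {"lecture": 10, "web_pages": 6}
print(estimate_effort(plan, overrides={"web_pages": 20}))  # 10*5 + 6*20 = 170
```

An estimate like this makes it easy to see when a proposed course model exceeds the available development resource, prompting the short-term compromises mentioned above.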

Figure 3: The Media Selection Tool

In summary, then, Media Advisor supports staff in analysing the appropriateness of their current course, comparing alternative approaches and assessing the resource implications of any proposed development work. 

Evaluating effectiveness

The complement to Media Advisor, which is essentially a planning tool, is the evaluation toolkit also produced as part of the ELT project (Oliver, 1999). This supports staff in the design of an evaluation plan that will enable them to judge how effectively their course supports particular learning outcomes. As with the previous toolkit, this resource comprises a model of the evaluation design process, supported by activities and information about the steps in the decision-making process. Briefly, these decisions cover: 

  • Stakeholder analysis
  • Refining the evaluation question
  • Selecting a methodology
  • Selecting data capture methods
  • Selecting data analysis methods
  • Presenting the findings

This process has been designed to recognise the social and political value of evaluation in presenting a persuasive argument, as well as the importance of ensuring validity and reliability. The toolkit also works well alongside the LTDI Evaluation Cookbook (Harvey, 1998), which gives advice on the implementation of evaluation methods once they have been selected. 


Using Learning Technology to support the achievement of learning outcomes is a complex process. In order to help staff achieve this, the University of North London has engaged in research and development work on three key areas: creating a context in which reflective practice is valued, supporting curriculum development and encouraging evaluation. This work provides a holistic framework that places individual expertise and professional judgement at the centre of the process. It also recognises the need for expertise in new areas and techniques to be made available to staff in a simple, accessible format. This has been achieved through the introduction of a structured programme of professional development, drawing on resources that are also available to staff to use on an individual basis. In this way, the institution seeks to make the process as accessible as possible, but also goes one step further by providing incentives (such as a validated course, to form part of a postgraduate award) that encourage staff to engage in the process at all. 

Dr Martin Oliver
Senior Research Fellow
Teaching & Learning Technology Centre
University of North London


Bailey, P, Jenkins, A., Oliver, M., Maier, P. & Young, C. (1998) ALTering EFFECTS: rewarding teaching using C&IT. Active Learning, 9, pp. 69-70.

Harvey, J. (Ed) (1998) LTDI Evaluation Cookbook. Edinburgh: Learning Technology Dissemination Initiative.

Kewell, B., Oliver, M., Beaumont, D., Standen, R., Hopkinson, S. & Conole, G. (1998) The creation of Media Advisor: from theoretical framework to multimedia resource. ELT report no. 5, University of North London.

Oliver, M. & Conole, G. (1999) From theory to practice: A model and project structure for toolkit development. BP ELT report no. 12, University of North London.

Oliver, M., Phelps, J., Beetham, H., Bailey, P. & Conole, G. (1999) Evaluating EFFECTS: identifying issues and assessing the evaluation framework. EFFECTS report no. 3, University of North London.

Phelps, J., Oliver, M., Bailey, P. & Jenkins, A. (1998) The development of a generic framework for accrediting professional development in C&IT. EFFECTS report no. 2, University of North London.
