
Measuring outcomes

What effect does guidance have? How can we measure it? Why do we need to measure it? These questions are explored through papers, briefings, reviews and discussions, drawing on a range of viewpoints, including those of policy makers, practitioners and users of guidance services.

The importance of measuring outcomes is widely recognised. We need to know whether we are doing a good job: for our own professional satisfaction, to ensure we are meeting the needs of our clients, and to provide evidence to those who fund such services that they are getting good value. A CfBT (2009) research project aimed to inform the evidence base for careers education and guidance; one outcome has been the production of a set of resources: a synthesis report, an online interactive resource and a literature review, all of which are downloadable from: http://www.cfbt.com/evidenceforeducation/whatweoffer/resources/guidance/careersandguidance.aspx

Key themes related to Measuring Outcomes

This paper, 'Measuring Outcomes' by Edwin Herr, identifies six key sets of questions about measuring the outcomes of guidance, which are given below:

What outcomes should policy-makers expect from career guidance services?
In assessing outcomes for policy purposes, what balance should be struck between different types of outcomes?
What strategies are needed to improve the collection of data that can be used to evaluate outcomes? Who should data be gathered from? What types of data should be gathered? How should it be gathered?
Do individual practitioners, service managers and policy makers have different interests in the outcomes of career guidance? Do they need different types of data? Can all of their needs be accommodated?
Should evidence on outcomes be linked to the allocation of resources and to priorities for service delivery? If so, how?
How can the costs of career guidance be assessed in relation to its benefits? How should this data be used to influence policy and resource allocation?
Measuring Outcomes paper by Edwin Herr
These ideas were presented by Edwin Herr at the OECD-sponsored conference in Toronto in October 2003.
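The last of these questions, how costs relate to benefits, can be made concrete with a simple benefit-cost ratio. The sketch below is purely illustrative: every figure in it is hypothetical, chosen only to show the arithmetic, not to describe any real service.

```python
# Hypothetical benefit-cost calculation for a guidance service.
# All figures below are invented for illustration only.

programme_cost = 250_000.0              # assumed annual cost of delivering the service
clients_served = 2_000                  # assumed number of clients seen per year
benefit_per_positive_outcome = 1_500.0  # assumed value attached to e.g. entry into learning
positive_outcome_rate = 0.30            # assumed share of clients achieving such an outcome

# Total monetised benefit: clients x outcome rate x value per outcome.
total_benefit = clients_served * positive_outcome_rate * benefit_per_positive_outcome

# Benefit-cost ratio: how many units of benefit per unit of cost.
bc_ratio = total_benefit / programme_cost

print(total_benefit)        # 900000.0
print(round(bc_ratio, 2))   # 3.6
```

The arithmetic is trivial; the hard part, as the questions above make clear, is agreeing what counts as a "positive outcome" and what monetary value, if any, to attach to it.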

Impact Analysis: a survey of the literature

This literature review is intended to help you make an informed judgement about whether or not to seek out the full texts for the items listed.

These resources are drawn from research material and theory on the provision of careers education and guidance (CEG) for the 11-16 age group. Specifically, they consider the impact of CEG on students in compulsory education.


Armstrong 1998
Armstrong, D. (1998) Careers Guidance, Psychometric Testing and Unemployment Amongst Young People: An Empirical Analysis for Northern Ireland. Applied Economics. Vol. 30 pp. 1203-1218

This paper presents a quantitative analysis of the effect of testing in Northern Ireland on young people’s subsequent experiences of unemployment.

Bagnall 2000
Bagnall, N.F. (2000). The Balance Between Vocational Secondary And General Secondary Schooling In France And Australia. Comparative Education. Vol. 36. pp. 459-475.

The article makes a comparison of school-based vocational programmes in the two countries.

Bingham 1998
Bingham, W.C. (1998) A Perspective on Career Guidance in the Twenty-First Century. International Journal for the Advancement of Counselling. Vol. 20 pp. 61-69.

The content of the article touches upon many concerns, which are likely to confront operators of career guidance programs through the first quarter or so of the next century.

Bullock and Wikeley 1999
Bullock, K., Wikeley, F. (1999). Improving Learning In Year 9: Making Use Of Personal Learning Plans. Educational Studies. Vol. 25. pp. 19-33.

This paper takes an evaluative look at an action-planning initiative, which aimed to improve pupils' learning in Year 9.

Carlo 2003
Carlo, R. (2003). Disaffected Young People and the Work-related Curriculum at Key Stage 4: issues of social capital development and learning as a form of cultural practice. Journal of Education and Work. Vol. 16. pp. 69-86.

Government concern with underachievement in certain school environments and the relatively poor staying-on rates into post-compulsory education and training in particular urban contexts has led to the development of work-related/work-based alternative curricula as a way of re-motivating and re-engaging those young people classified as being ‘at risk’ or disaffected.

Colley 2000
Colley, H. (2000). Mind The Gap: Policy Goals And Young People's Resistance In A Mentoring Programme.

This paper draws on research evidence from a study of mentoring relationships within a pre-vocational training scheme for 16-19 year olds identified as "disaffected", based on principles underpinning current policies on social exclusion.

Cornwall and Devon Careers 2000
Cornwall and Devon Careers (2000, Nov) Connecting with Young Offenders: A Case Study of Careers Work with YJT Clients. DfES. CSNU/DY0004.

This project, managed by Cornwall and Devon Careers, investigated how a dedicated careers adviser, working in partnership with a Youth Justice Team (YJT), could provide enhanced guidance, support and mentoring to young people for whom the team was responsible.

Creed and Patton 2002
Creed, P.A. and Patton, W. (2002) Differences in Career Attitude and Career Knowledge for High School Students With and Without Paid Work Experience. International Journal for Educational and Vocational Guidance. Vol. 3:1. pp. 21-33.

A large sample (N = 1,279) of high school students was assessed using the Career Development Inventory – Australia (CDI-A; Lokan, 1984). The two composite scales of Career Development Attitude and Career Development Knowledge were examined in relation to age, gender and whether the students had engaged in paid work experience.

Davies 1991
Davies, P. (1991). Careers Work with Younger Pupils. Newscheck with Careers Service Bulletin. Vol. 1:7 Feb 1991.

The author puts the case for introducing careers education at an earlier stage and outlines the results of some recent evaluation studies in this field.

Dewhirst et al 1994
Dewhirst, S., Lines, S., Martin, D. (1994) Careers Guidance Interviewing: What Should it Achieve? Newscheck, Vol. 4:5 March 1994.

The authors seek to open a debate on how best to evaluate the effectiveness of the one-to-one careers guidance interview.

DfES Mar 27 2003
DfES (Mar 27 2003) Study of young people permanently excluded from school. Report No. 405.

This study tracked, over a two-year period, the careers of 193 young people after their permanent exclusion from school during Year 9, Year 10 or Year 11 (13 to 16 years of age) in a representative sample of 10 LEAs.

Drier 2000
Drier, H.N. (2000). Special Issue Introduction: Career And Life Planning Key Feature Within Comprehensive Guidance Programs. Journal Of Career Development. Vol. 27. pp. 73-80.

This edition of the journal focuses upon career planning within schools.

Edwards et al 1999
Edwards, A., Barnes, A., Killeen, J., Watts, A. (1999) The Real Game: Evaluation of the UK National Pilot. CRAC/NICEC.

The policy of the National Life/Work centre (which manages the development of The Real Game) is to reach national agreements with other countries wishing to promote and distribute the game.

Fardig 1992
Fardig, D. (1992) Career Education: Program Evaluation Report. Orange County Public Schools, Orlando, FL. Program Evaluation Office.

In the Orange County (Florida) Public Schools, an evaluation was conducted of career education provided in elementary schools and secondary schools.

Flouri and Buchanan 2002
Flouri, E., Buchanan, A. (2002) The Role of Work-Related Skills and Career Role Models in Adolescent Career Maturity. The Career Development Quarterly. Vol. 51. pp. 36-44.

The authors used data from 2,722 British adolescents, ages 14-18 years, to explore whether work-related skills and career role models are associated with career maturity when sociodemographic characteristics (age, socio-economic status, gender, family structure), family support (mother/father involvement), and personal characteristics (self-confidence, academic motivation) are controlled.

Gfroerer 2000
Gfroerer, M. (2000). Career Guidance On The Cutting Edge Of Competency-Based Assessment. Journal Of Career Development. Vol. 27. pp. 119-131

This article describes New Hampshire's efforts in developing and implementing a Competency-Based Transcript for secondary education.

Gitterman et al 1995
Gitterman, A. et al. (1995) Outcomes of School Career Development. Office of Educational Research and Improvement (ED), Washington, DC. EDO-CG-95-58.

This publication shows how successful programs of school career development in the US can help all students experience an enriched education.

Golden et al 2002
Golden, S., Spielhofer, T., Sims, D., Aiston, S. and O'Donnell, L. (2002) Re-Engaging the Hardest-to-Help Young People: The Role of the Neighbourhood Support Fund. DfES Research Report 366.

The Neighbourhood Support Fund (NSF), which was launched in September 1999, aims to re-engage disaffected and disengaged young people aged 13 to 19 into education, training or employment.

Golden et al 2003
Golden, S., Nelson, J., O'Donnell, L. and Morris, M. (2003) National Evaluation of the Increased Flexibility for 14-16 Year Olds Programme. Research Brief RBX 11-03. DfES.

The Increased Flexibility for 14-16 Year Olds Programme (IFP) was introduced by the Department for Education and Skills (DfES). This £120 million programme aims to ‘create enhanced vocational and work-related learning opportunities for 14-16 year olds of all abilities who can benefit most.’

OHMCI 1997
Great Britain Office of Her Majesty's Chief Inspector of Schools in Wales (1997) A Survey of Careers Education and Guidance in the Secondary Schools of Wales. OHMCI. pp. 1-16.

The study aims to explore the nature of CEG provision in secondary schools in Wales, to analyse the standards and quality of teaching and learning in CEG, and to appraise the factors that influence it.

Hillage et al 1996
Hillage, J., Honey, S., Kodz, J., Pike, G. (1996) Pre-16 Work Experience in England and Wales. The Institute for Employment Studies Executive Summary Report 319.

This summary presents the main findings of a year-long study for the Department for Education and Employment (DfEE) evaluating the scope and quality of pre-16 work experience in England and Wales.

Howieson and Semple 1996
Howieson, C., Semple, S. (1996) Guidance in Secondary Schools: The Pupil Perspective, CES Briefing, Centre for Educational Sociology (A Research Centre of the ESRC)

Against a background of major changes in schools and in pupils’ post-school opportunities, a recent SOEID funded project, Guidance in Secondary Schools, explored pupils’ guidance needs and the effectiveness of provision.

Hudson 1996
Hudson, G. (1996) Dealing with Work: Secondary Students' Work Experience and the Curriculum. Journal of Vocational Education and Training. Vol. 48:3.

The Technical and Vocational Education Initiative (TVEI) was introduced as a major programme in the United Kingdom that would relate secondary schools' curricula to the 'world of work'.

Huiling 2001
Huiling, P. (2001). Comparing The Effectiveness Of Two Different Career Education Courses On Career Decidedness For College Freshmen: An Exploratory Study. Journal Of Career Development. Vol. 28. pp. 29-41.

An exploratory study was conducted to compare the effectiveness of two different career education courses on career decision making for college freshmen in Taiwan.

Huteau 2001
Huteau, M. (2001). The Evaluation Of Methods In Career Education Interventions. International Journal For Educational And Vocational Guidance. Vol. 1. pp. 177-196.

This article initially examines the conditions in which group methods in career intervention have emerged, reviewing their general traits and the criteria which evaluation research must meet.

Hutton 1994
Hutton, D. (1994) Action Plans. Careers Guidance Today 1994 Part 2.

This describes a single case where the most positive achievement had been to give ownership of the Careers Guidance Action Plan to the pupils themselves.

Jahnukainen 2001
Jahnukainen, M. (2001). Two models for preventing students with special needs from dropping out of education in Finland. European Journal of Special Needs Education. Vol. 16. pp. 245-258.

Numerous studies have shown that pupils with special needs are at high risk of dropping out and leaving school without finishing their studies, in particular, in post compulsory schooling.

Kucker 2000
Kucker, M. (2000). South Dakota's Model For Career And Life Planning. Journal Of Career Development. Vol. 27. pp. 133-148.

This article provides information about how a rural state manages its resources to deliver a well-coordinated career and life planning system for students in grades 9–12.

Lui 1989
Lui, H.W.E. (1989) The Effectiveness of Career Guidance Approaches. Research Papers EUR/15/89.

The Educational Research Unit project ITL2 in Singapore was a quasi-experimental study designed to measure the effectiveness of a non-traditional personnel and resource material package in developmental career guidance.

Luzzo and Pierce 1996
Luzzo, D.A., Pierce, G. (1996) Effects of DISCOVER on the career maturity of middle school students. Career Development Quarterly. Vol. 45:2 pp. 170-172.

This study evaluates the effect of a computer-assisted career guidance system on the career maturity of 38 students in a rural middle school.

MORI 1998
MORI (1998) The Value of Information Types and Sources for Year 11 Decision Making (commissioned by DfEE).

This document presents the findings of a qualitative research study carried out by MORI’s Social Research Unit on behalf of the DfEE.

Morris et al 2000
Morris, M., Rudd, P., Nelson, J., Davies, D. (2000) The Contribution of Careers Education and Guidance to School Effectiveness in Partnership Schools. National Foundation for Educational Research, DfEE.

This study was commissioned by the DfEE in order to gain a clearer understanding of the impact that careers education and guidance may have upon the overall effectiveness of schools.

NFER 1998
NFER (1998) The Impact of Careers Education and Guidance on Young People in Years 9 and 10: A Follow up Study.

This study is a follow-up to the NFER (1996) baseline study of careers education and guidance provision for students in Years 9 and 10.

NFER 1996
NFER (1996). Careers Education and Guidance Provision for 13 and 14-Year Olds, (commissioned by DfEE).

This report is an outcome of a detailed study commissioned by the Department for Education and Employment in 1995 to provide baseline information on the careers education and guidance provision made by schools and careers services for students in Years 9 and 10.

OECD 2000
Organisation for Economic Co-operation and Development (OECD) (2000) From Initial Education to Working Life: Making Transitions Work. OECD Education & Skills. May 2000. pp. 1-199.

This publication outlines how experiences of the transition from compulsory education to work in 14 OECD countries changed during the 1990s and examines which types of transition policies worked best.

Office for Standards in Education 1995
Office For Standards In Education (1995) A Survey Of Careers Education And Guidance In Schools, Ofsted.

This report outlines the findings of a survey of careers education and guidance (CEG) in schools, conducted between May 1994 and June 1995.

Patton and Creed 2002
Patton, W., Creed, P.A. (2002) The Relationship Between Career Maturity and Work Commitment in a Sample of Australian High School Students, Journal of Career Development. Vol. 29:2. pp. 69-85

This article reports on a study conducted with 377 Australian students enrolled in grades 9 through 12.

Peterson et al 1999
Peterson, G.W. Long, K.L. Billups, A. (1999) The Effect of Three Career Interventions On Educational Choices of Eighth Grade Students. ASCA Professional School Counselling. Vol. 3:1 October 1999.

This study investigates the formulation of four-year high school programs of study designed to help US students realise their career aspirations.

Reid 1999
Reid, H.L. (1999). Barriers To Inclusion For The Disaffected: Implications For 'Preventive' Careers Guidance Work With The Under-16 Age Group. British Journal Of Guidance & Counselling. Vol. 27:4. pp. 539-554.

This article suggests guidance is ineffective if it fails to consider social difference and pay adequate attention to the social context within which individuals operate.

Reid 2002
Reid, K. (2002). Mentoring With Disaffected Pupils. Mentoring And Tutoring. Vol. 10. pp. 153-169.

This article focuses on a critical evaluation of progress being made by a number of major innovative schemes, which are aimed at raising standards in school and combating disaffected behaviour.

Robinson 2002
Robinson, S. (2002). Influences On The Career Decision-Making Of Ethnic Minority Pupils At Key Stage 4 - Implications For Careers Education And Guidance In A Multi-Ethnic School. Careers Education And Guidance. February 2002. pp. 2-5.

This study examines the influences on the career decision-making of ethnic minority pupils at the end of key stage 4 in a large co-educational comprehensive school in an urban city area.

Roger and Duffield 2000
Roger, A., Duffield, J. (2000) Factors Underlying Persistent Gendered Option Choices in School Science and Technology in Scotland. Gender and Education. Vol. 12. pp. 367-383.

Drawing on a survey and meta-analysis, the article highlights various school initiatives and estimates the likelihood of their success in addressing the underlying influences that steer girls' choices away from science and technology.

Saunders et al 1997
Saunders, L., Stoney, S. and Weston, P. (1997) The Impact of the Work-Related Curriculum on 14- to 16-Year-Olds. Journal of Education & Work. Vol. 10. pp. 151-167.

The Department for Education and Employment commissioned a research team at the National Foundation for Educational Research in 1996 to undertake a review of research into the impact of the work-related curriculum on the motivation and achievements of young people aged 14-16.

Smith 2002
Smith, J.P. (2002) Training Young People Through A School/Enterprise Partnership: A Longitudinal Study. Education + Training. Vol. 44. pp. 281-289.

This paper reports a longitudinal study of 58 students who undertook an engineering traineeship concurrent with their final two years of secondary school.

Stuart et al 2000
Stuart, N., Tyers, C. and Crowder, M. (2000) Outcomes from Careers Education and Guidance (Phase II): A Tracking Study. DfES. RBX9.

This document provides a summary of the main findings from the second phase of a research study into the outcomes from Careers Education and Guidance (CEG) that was carried out between September 1997 and October 1999.

Thomson et al 1995
Thomson, G.O.B., Latter, J., Ward, K. (1995) Planning for Transition: Canadian and Scottish Experience in Comparative Perspective. European Journal of Special Needs Education. Vol. 10:3 pp. 199-209.

This paper reports on a comparative approach to planning the transition to adulthood of young people with physical disabilities.

Watts 2001
Watts, A.G. (2001). Career Guidance And Social Exclusion: A Cautionary Tale. British Journal Of Guidance And Counselling. Vol. 29. pp. 157-176.

The author concludes that while career guidance has an important contribution to make in addressing social exclusion, this should be secondary to supporting individual progression and development.

Wentling and Waight 2001
Wentling, R-M., Waight, C.L. (2001). Initiatives That Assist And Barriers That Hinder The Successful Transition Of Minority Youth Into The Workplace In The USA. Journal Of Education And Work. Vol. 14. pp. 71-89.

This article reports the results of a study on the initiatives that assist and barriers that hinder the successful transition of minority youth into the workplace.

Wessel 2003
Wessel, R.D. (2003). Enhancing Career Development Through The Career Success Club. Journal Of Career Development. Vol. 29. pp. 265-276.

This article traces the success of a career management plan in enhancing the career development of US undergraduates.

Yip 1990
Yip, K. (1990) Excellence in education for Singapore: The role of Career Guidance. International Journal for the Advancement of Counselling. Vol. 13. pp. 185-191.

This paper begins with a brief history of career guidance in Singapore and a description of provisions made in the past for the educational and vocational needs of weaker students.

Outcomes from career information and guidance services
As part of a study of guidance systems in Member States, within a lifelong learning perspective, which the European Commission is undertaking in co-operation with the OECD, the National Institute for Careers Education and Counselling (NICEC) was commissioned to prepare a paper on "evaluating outcomes in guidance service delivery". The aim was to consider different approaches to measuring the outcomes of career information and guidance service delivery, to provide exemplars of different approaches, to summarise reviews of the available evidence using these approaches, and to outline policy priorities for the collection of such evidence in future and for its effective sharing and dissemination.

Theoretical Models of Impact Analysis

This section contains the results of a search for theoretical models on impact analysis conducted by the Centre for Guidance Studies, University of Derby. It provides abstracts of research into impact analysis on career education and guidance, as well as research on measuring of outcomes in other disciplines.

The search process involved:

the electronic searching of databases (ERIC, BEI, EBSCO, PsychInfo, Ingenta, Taylor & Francis, SOSIG) using keywords (models, frameworks, strategy, assessing, measuring, outcomes, impact, benefits, service, delivery, career guidance);
a hand search of the National Library Resource for Guidance (NLRG);
searches of websites such as the Organisation for Economic Co-operation and Development (OECD), International Association for Educational and Vocational Guidance (IAEVG), National Association of Career Guidance Teachers (NACGT), Department for Education and Skills (DfES), and the National Institute for Careers Education and Counselling (NICEC).
The text has been taken either from the abstract or main body text of the included publications.

Models from guidance and guidance-related disciplines

Impact analysis models are considered from within the field of guidance or guidance-related fields. There are studies from the UK and the USA, and subjects covered include the economic outcomes of guidance, and accountability and evaluation in careers services.

Killeen 1996
Killeen, J. (1996) The Learning and Economic Outcomes of Guidance, in A.G. Watts, B. Law, J.M. Kidd and R. Hawthorn (eds) Rethinking Careers Education and Guidance: Theory, Policy and Practice. London: Routledge.

This chapter presents a model summarising the relationships that need to be taken into account when examining the learning outcomes and economic effects of guidance.

Hughes et al 2003
Hughes, D., Gration, G. and Mayston, D. (2003) Measuring the Impact of Advice and Guidance within the Northamptonshire IAG Partnership. International Centre for Guidance Studies, University of Derby.

The iCeGS research team developed an evaluation framework requiring an initial, face-to-face contact with the client, followed by a second contact, typically by telephone, four to six weeks later.
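A two-contact design of this kind lends itself to simple "distance travelled" measures. The sketch below is purely illustrative and is not taken from the iCeGS framework: the client identifiers and the 1-5 self-rating scale are assumptions. It pairs each client's baseline self-rating on a soft outcome with the rating recorded at the follow-up contact and summarises the change.

```python
# Illustrative "distance travelled" calculation for a two-contact
# evaluation design (baseline interview, follow-up call 4-6 weeks later).
# Scores are hypothetical 1-5 self-ratings of, say, career confidence.

from statistics import mean

# client_id -> (baseline_score, follow_up_score); all values invented.
records = {
    "C001": (2, 4),
    "C002": (3, 3),
    "C003": (1, 3),
    "C004": (4, 5),
}

def distance_travelled(records):
    """Mean change in score between baseline and follow-up."""
    changes = [follow_up - baseline for baseline, follow_up in records.values()]
    return mean(changes)

def improved_share(records):
    """Proportion of clients whose score rose at follow-up."""
    improved = sum(1 for baseline, follow_up in records.values() if follow_up > baseline)
    return improved / len(records)

print(distance_travelled(records))  # 1.25
print(improved_share(records))      # 0.75
```

Such summary statistics only become meaningful alongside the qualitative judgements discussed elsewhere on this page; a mean change says nothing about whether the change was valued by the client.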

Sampson et al 2004
Sampson, J.P. Jr., Reardon, R.C. and Lenz, J.G. (2004) Accountability and Evaluation in Career Services, in Career Counselling & Services: A Cognitive Information Processing Approach. Thomson Brooks/Cole.

Bezanson 1995
Bezanson, L. (1995) Quality Career Counseling Services: A Developmental Tool for Organizational Accountability. ERIC Digest.

This briefing draws on a recently completed study of the effectiveness of Careers Services in Scotland to consider some of the issues involved in assessing the quality and effectiveness of careers guidance work.

Kraft 1993
Kraft, N.P. (1993) An Analytic Framework for Evaluating the Impact of Education Reform. Paper presented at the Annual General Meeting of the American Educational Research Association (Atlanta GA April 12-16 1993).

Marzano et al 1993
Marzano, R.J., Pickering, D. and McTighe, J. (1993) Assessing Student Outcomes: Performance Assessment Using the Dimensions of Learning Model. Mid-Continent Regional Educational Lab., Aurora, CO. (BBB23081). p. 143.

Models from other disciplines

Impact analysis models are considered from other disciplines. The seven studies in this section cover a wider range of topics, including benchmarking and outcome mapping in other settings such as psychotherapy.

Pawson and Tilley 1997
Pawson, R. and Tilley, N. (1997) Realistic Evaluation. London: Sage Publications.

The purpose of a realistic evaluation is to establish whether there is an 'unequivocal causal relationship between a program and its outcome'.

Earl and Carden 2002
Earl, S. and Carden, F. (2002) Learning from complexity: the International Development Research Centre’s experience with Outcome Mapping. Development in Practice. Vol. 12:3&4. pp. 518-524.

This paper introduces the major concepts of Outcome Mapping and discusses the International Development Research Centre’s experience in developing and implementing Outcome Mapping with Northern and Southern research organisations.

Campenhausen and Petrisch 2004
Campenhausen, C. von and Petrisch, G. (2004) The Benchmarking Matrix. Managerial Auditing Journal. Vol. 19:2. pp. 172-179.

Barkham et al 1998
Barkham, M., Margison, F., Evans, C., McGrath, G., Mellor-Clark, J., Milne, D. and Connell, J. (1998) The rationale for developing and implementing core outcome batteries for routine use in service settings and psychotherapy outcome research. Journal of Mental Health. Vol. 7:1. pp. 35-47

This paper presents the rationale for developing and implementing a core outcome battery in routine clinical practice as well as psychotherapy research.

Ambrozic 2003
Ambrozic, M. (2003) A few countries measure impact and outcomes – most would like to measure at least something. Performance Measurement and Metrics. Vol. 4:2. pp. 64-78.

This paper discusses theoretical and practical approaches to the problems of assessing library performance and especially of measuring outcomes in the developing world.

Liu and Walker 1998
Liu, A.M.M. and Walker, A. (1998) Evaluation of Project Outcomes. Construction Management and Economics. Vol. 16. pp. 209-219.

The evaluation of the outcome of construction projects has been the subject of unresolved debate for many years. This paper argues that previous views have tried to find a simple solution to a complex problem.

Stufflebeam et al 2000
Stufflebeam, D.L., Madaus, G.F. and Kellaghan, T. (2000) Evaluation Models: Viewpoints on Educational and Human Services Evaluation. 2nd edn. Kluwer Academic Publishers.

Evaluation Models is an up-to-date revision of the classic text first published in 1983.

Discussions on what matters and what counts in guidance

Does guidance make a difference? How can we measure that? These questions generated a lively discussion among a group of practitioners, researchers and policy-makers about the reasons for undertaking impact analysis and the potential pitfalls. The consensus was that measuring outcomes is essential if improvements are to be made, with the proviso that any studies and data collected are used in meaningful ways.

Some opening thoughts ...

In simple terms, impact analysis attempts to answer the question 'Are we making a difference?' But how this is judged will depend on the particular perspective of whoever is standing in judgement; it might mean asking:

Does Impact Analysis reflect that the service provides value for money?
Does Impact Analysis help me judge if what I do is meaningful and of value to my users?
Does Impact Analysis suggest that practitioners are achieving organisational goals?
Does Impact Analysis demonstrate that using guidance services will help me in some way?
How can Impact be measured?
What relevance do Impact Analysis studies have for guidance practitioners?

What is Understood by Impact Analysis

Here is a summary of an earlier discussion on the theme, which includes some later comments by participants.

We can’t measure all that matters!

In simple terms, impact analysis attempts to answer the question 'Are we making a difference?' However, there was animated discussion about how that might be judged. It was recognised that a number of drivers influence perceptions of what might be regarded as a 'positive impact', and different views were expressed on the relative merits of these potentially conflicting perspectives.

Drivers could take a number of forms:

Political – Does Impact Analysis reflect that the service provides value for money?
In a context in which successive governments represent the main funding source for adult guidance, it is inevitable that, for policy makers, impact analysis might be expected to reflect how career guidance links to government agendas. There may be pressure to demonstrate achievement of pre-determined outcomes such as entry into employment or raised skills levels among particular client groups. Furthermore, this impact may need to be achieved within relatively short timescales. Such expectations could shift analysis of impact from holistic investigation and enquiry to the potentially more suspect arena of meeting the requirement to demonstrate success against project criteria: has a particular project, for example, met agreed outcomes that reflect the desire of government to pursue specific targets related to participation in learning, and does the investment of resources therefore represent value for money?

Professional. Does Impact Analysis help me judge if what I do is meaningful and of value to my users?
For the guidance professional, impact measurement may be more subtle. Practitioners may view the soft outcomes of their work as absolutely fundamental. Can the practitioner gauge the extent to which what they do matters and makes a difference to their clients? A careers adviser may sense that soft outcomes such as increased motivation, increased self-confidence or greater self-awareness are beneficial to individuals, but how can that perception be translated into a more robust measurement? If it is possible to find a mechanism for impact analysis that relates to professional experience, then it could be a basis for improving practice. Subjective anecdote needs somehow to metamorphose into a more robust evidence base, and from this perspective impact analysis needs to find a mechanism by which to capture those impressions. Professional satisfaction – and therefore performance – may relate to impact analysis, but how can this be judged?

Employer-led – Does Impact Analysis suggest that practitioners are achieving organisational goals?
Employing organisations inevitably dictate how practitioners spend their time, so here Impact Analysis could take a different form: it might link to broader aims such as partnership development or maintenance of a service, with productivity judged in terms of numbers of clients seen, and the effectiveness of individual interactions placed in a broader context of efficiency. A cynic might argue that sustainability of project funding may at times seem to depend more on correctly completed paperwork than on the quality of an individual client's experience. A properly recorded National Insurance number may be more important to an employer than the practitioner's claim to have 'moved a client on' in some significant, but intangible, way.

Personal – Does Impact Analysis demonstrate that using guidance services will help me in some way?
For the individual on the receiving end of guidance, impact analysis may be even harder to pin down. It may link to the extent to which a client's expectations have been met, but given that some research suggests expectations of careers guidance are at times pitifully low, that may be an insufficient guide. Equally, a client may initially perceive their encounter with guidance negatively because it was challenging in some way, but over time recognise it as having value in helping them to face up to issues that needed to be addressed. In isolation, the client perspective, though pivotal, may not be enough. It may also be possible to show that an intervention has made a difference, but it does not follow that the difference is one the client values positively (e.g. if they are placed in employment that is not sustainable, or not linked to their particular talents or interests). There remains a debate about the extent to which individuals – or policy makers – understand what guidance is, and therefore its potential.

The collective experience of the group suggested that the drivers that determine success can conflict with one another. For example, a unit providing impartial advice to individuals enquiring about college courses may be judged as performing poorly if the time spent on dispensing advice does not translate into the expected number of enrolments. However, if the service was to be judged in terms of the retention rate of students who had enrolled on courses following their advice sessions it might be found to score very highly. When conflict occurs, it may be that the dominant driver becomes the economic imperative, irrespective of the front line experiences of guidance practitioners. This creates a dilemma. In relation to Impact Analysis, there is a need for 'joined-up thinking' to use the cliché, but nevertheless some studies of impact are too narrow in focus to take account of the broader context of the work they attempt to evaluate.

Impact Analysis needs to address these conflicting drivers, and yet evolve a mechanism for communicating the effectiveness of practice against a wide range of possible outcomes that is sufficiently robust to be meaningful. There was some discussion around the theme of 'objectivity'. Given that guidance is often more of an art than a science, this raises questions about the extent to which it is possible to illustrate success. There is an ever-present danger that what is measured is that which is easy to measure, rather than that which is most relevant.

Much guidance is linked to outcomes that may only emerge over time. These outcomes are particularly hard to measure, and it may be hard to link progression to any single intervention where a client may be seeking support from a wide range of different sources.

How can Impact be measured?

The discussion around drivers led to consideration of how impact might be measured. It was agreed that to demonstrate impact, it was necessary to know the starting point of the client, in order to judge how any intervention had impacted on them. Impact Analysis therefore necessitates finding a means by which to capture that shift. It was agreed there is no single obvious way of achieving this, though a number of different approaches might be valid, appropriate and helpful.

'Distance travelled' might involve a comprehensive survey of clients pre- and post-interview. However, this approach could be intimidating and burdensome for a client, and not conducive to building the rapport needed as an opening for effective guidance.
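As a purely illustrative sketch, a 'distance travelled' measure from pre- and post-interview self-ratings might be computed along these lines. The outcome names, the 1–5 rating scale and all scores below are invented for illustration; no particular instrument is being endorsed:

```python
# Hypothetical sketch of a "distance travelled" calculation from pre- and
# post-interview self-ratings on soft outcomes. The outcome names, the
# 1-5 rating scale and all scores below are invented for illustration.

def distance_travelled(pre, post):
    """Mean shift per outcome across all clients (positive = improvement)."""
    outcomes = next(iter(pre.values())).keys()
    return {
        outcome: sum(post[c][outcome] - pre[c][outcome] for c in pre) / len(pre)
        for outcome in outcomes
    }

# Two illustrative clients rating themselves 1-5 before and after guidance.
pre = {"client_a": {"confidence": 2, "self_awareness": 3},
       "client_b": {"confidence": 3, "self_awareness": 2}}
post = {"client_a": {"confidence": 4, "self_awareness": 4},
        "client_b": {"confidence": 4, "self_awareness": 4}}

print(distance_travelled(pre, post))  # {'confidence': 1.5, 'self_awareness': 1.5}
```

Even such a crude average leaves open the questions discussed above: it says nothing about whether the shift is attributable to the intervention, nor whether the client values it.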

It could be argued that the effective practitioner will as part of good practice gain this information through a structured guidance interview. It was accepted that most careers guidance workers have been trained to begin by 'hearing the client’s story' and that involves understanding the starting point of the individual, in order to move the client on in some relevant and appropriate way during the course of an interview. Progression might then be reflected in any action points arising, or in asking directly of the client what they feel they have gained and/or will now do as a result of the interview, as part of the closing dialogue. However, it is recognised that for many practitioners the individual vocational guidance interview with time for reflection is now an unimaginable luxury, and it would be naïve to imagine everyone involved in giving IAG is consistent in their adherence to good practice when practical considerations rather than idealised theory become the main influences on day to day practice.

Prof. Sampson at Florida State University has been developing a system of output measurement linked to assessing the extent to which individuals have moved in relation to skills, knowledge and attitudes; perhaps this approach will be found to have resonance for Impact Analysis in the UK.

Tol Bedford's FIRST framework may provide a basis for analysing what goes on in a guidance interview. The mnemonic suggests that effective guidance should demonstrate movement in the areas of focus, information, realism, scope and tactics. How relevant do these goals seem in today's context?

There is an issue around proxy indicators, which are meaningless without context: consider, for example, the individual who drops out of a course (a negative outcome) but enrols on and completes a different, more appropriate course of action or study (a positive outcome). Around 50 indicators that could be termed proxies in this way have been identified.

It is increasingly recognised that different clients have different needs, and therefore different outcomes may be appropriate – this should be taken into account in any Impact Analysis enquiry.

What about forms of interaction that are based on, for example, group work, information provision or brief staff-assisted queries? Where the contact is more minimal, how does this influence the process of Impact Analysis?

What relevance do Impact Analysis studies have for guidance practitioners?

The immediate response of the group was that although relevant and effective Impact Analysis could be of fundamental value for guidance practitioners, in reality such studies were probably perceived as at best irrelevant and at worst actively threatening.

It was felt that Impact Analysis was sometimes linked to a sense of being judged. This observation led to discussion around the theme of peer review. Practitioners may welcome input from others who understand the issues they face as a device for improving practice, yet the experience of being watched and evaluated by a manager perceived to be operating from a different set of objectives might be viewed very differently. How Impact Analysis studies are viewed may link to who is doing the studying, against what criteria, and the audience to which any Impact Analysis study is addressed.

Impact Analysis is not unproblematic, yet the group were unanimous in agreeing that these obstacles need to be overcome. Whilst recognising the limitations of particular approaches, and the influence of the different drivers, it was felt that Impact Analysis is crucial to building an effective evidence base that could improve practice. If practitioners could be involved in Impact Analysis, they may be better placed to inform the various debates around guidance. By capturing 'soft outcomes' through documented case study or longitudinal follow-up, practitioners might then be better placed to lobby policy makers and demonstrate the professional value of their work in terms other than those imposed by financial imperative. Impact Analysis studies might then cease to be perceived as threatening to practitioners, or as focusing on what is irrelevant or unreasonable, and instead be regarded as a useful tool for collecting qualitative data that could be used to champion and disseminate good practice. For example, somebody 'dropping out' of a course following guidance might be judged by policy makers as inherently negative and a 'failing', yet case study research might indicate that for the individual concerned this was a positive outcome, leading to a more appropriate alternative.

The group felt that there is common ground to be found between policy makers and practitioners operating from a client-centred approach. Outcomes are not necessarily mutually exclusive. However, at present there is a lack of shared language and dialogue between the different groups. Practitioners may feel isolated and far removed from the decision makers who dictate how they work. Perhaps the development of this strand of discussion may help to bridge that gap. By providing a platform for individual practitioners to share concerns and views, the process of knowledge creation will develop arguments that can be presented in a coherent and persuasive way, so that practitioners are able to influence the future direction of career guidance work.

How might Impact Analysis discussions develop?

At present there is a mass of evidence in existence, but it is often fragmented, or collected and stored rather than put to any particular purpose. It was hoped that this strand of discussion might represent an opportunity to bring together some of this data so that links can be made and findings synthesised, to begin to accumulate an evidence base that informs practice, rather than existing in a vacuum.

It was felt important that the language used should be inclusive and recognise the different starting points of users of the forum. Contributions continue to be needed in a whole range of areas, but specific suggestions were made. One idea was that a straightforward argument could be put forward, in terms aimed at practitioners, to address the question 'Why bother with Impact Analysis?' in order to engage and stimulate their involvement in the forum. Practitioners need to 'own' Impact Analysis studies, rather than feel victims of them: understanding and influencing such studies could make them tools of empowerment, rather than tools for monitoring. Another suggestion was an outline of some of the many proxy indicators that may distract from, rather than illuminate, what is going on in practice.

Research and academic contributions, whilst highly relevant and pivotal to this project, should be contextualised for practitioners, and where possible accompanied by abstracts and comment that draw out key points using accessible, jargon-free language. Further debate is required about why policy makers value hard outcomes and how impact analysis can meet that requirement. A celebration of soft outcomes might also remind readers that qualitative data is (or should be) especially pertinent to any study of impact.

In conclusion, the group had a lively discussion that it is hoped will stimulate further contributions on the theme of Impact Analysis. It was agreed that Impact Analysis studies not only have value, but are crucial to improving practice; however, there is much to debate about how to ensure such studies are relevant, meaningful and used to inform the future rather than just reflect on the past.

What can be learned from other disciplines?

The guidance profession is not alone in having to regularly evaluate the effectiveness of its provision. This is a common feature for all organisations providing services, and this section explores two areas of potential comparison: the medical profession and research done by the Research Centre for Museums and Galleries. It provides access to a briefing which defines and compares impact analysis and evidence-based practice, and to a related discussion.

This discussion poses the question of whether the guidance community has anything to learn from other research disciplines. The challenges of trying to measure and evaluate the effectiveness of guidance are not unique: other fields of intervention, such as medicine and social welfare, face similar difficulties in evaluating the effectiveness of practice.

Participants explored whether comparisons were valid and could be used to develop more effective impact methodologies, or a better understanding of the issues involved.

Two areas of comparison were explored: one related to the medical profession and one to research done by the Research Centre for Museums and Galleries.

Learning from other disciplines - the medical profession

This briefing defines impact analysis and evidence-based practice (EBP). It compares the use of randomised controlled trials (RCTs) to measure the impact of interventions in the medical and guidance professions, highlighting the benefits and limitations of this approach in the different contexts. The conclusion is that the nature of guidance-related research lends itself to less scientific approaches; however, the methods adopted to measure impact at an organisational or national level may be more comparable.

Can the guidance community learn anything about impact from other disciplines?
This briefing paper introduces the concepts of impact analysis and evidence-based practice (EBP). Drawing on two empirical examples, it compares the use of randomised controlled trials to measure the impact of interventions in the medical and guidance professions, highlighting the benefits and limitations of the approach.

Links referred to in the paper above:

Medical Research Council electronic publications/ information.
http://www.mrc.ac.uk/index/current-research.htm
http://www.mrc.ac.uk/index/about.htm
http://www.mrc.ac.uk/index/current-research/current-clinical_research.htm
http://www.mrc.ac.uk/index/publications.htm
National Library of Medical Electronic Publications.
Database of Controlled Trials.
Find articles search engine – articles in BMJ, HSJ.
British Medical Journal
Electronic publications of articles appearing in the LANCET journal
Health service journal
National Institute for Clinical Excellence
Department of Health website

Learning from other disciplines - museums and art galleries

Measuring the outcomes and impact of learning in museums, archives and libraries
The Generic Learning Outcome system: measuring the outcomes and impact of learning in museums, archives and libraries. This link will take you to the Learning Impact Research Project report of the Research Centre for Museums and Galleries (RCMG), Department of Museum Studies, University of Leicester.

Inspiring Learning for All is a website that describes what an accessible and inclusive museum, archive or library which stimulates and supports learning looks like. It demonstrates how this sector evaluates impact and perhaps provides a point of comparison with guidance.


Bibliography

Abrams, H. (Feb 2001) Outcome measures: in health care today, you can't afford not to do them. Hearing Journal.

Alderson, P. Roberts, I. (Feb 5, 2000) Should journals publish systematic reviews that find no evidence to guide practice? Examples from injury research. British Medical Journal. Vol. 320, pp. 376-377.

Barber, J.A. Thompson, S.G. (Oct 31, 1998) Analysis and interpretation of cost data in randomised controlled trials: review of published studies. British Medical Journal. Vol. 317, pp. 1195-1200.

Barker, J. Gilbert, D. (Feb 19, 2000) Evidence produced in evidence-based medicine needs to be relevant. (Letter to the Editor). British Medical Journal. Vol. 320, p. 515.

Barton, S. (Jul 29, 2000) Which clinical studies provide the best evidence? (Editorial). British Medical Journal. Vol. 321, pp. 255-256.

Barton, S. (Mar 3, 2001) Using clinical evidence: having the evidence in your hand is just a start – but a good one. (Editorial). British Medical Journal. Vol. 322, pp. 503-504.

Braun, J. et al. (2002) Treatment of active ankylosing spondylitis with infliximab: a randomised controlled multicentre trial. The Lancet. Vol. 359, pp. 1187-1193.

Chantler, C. (2002) The second greatest benefit to mankind? The Lancet. Vol. 360, pp. 1870-1877.

Culpepper, L. Gilbert, T.T. (1999) Evidence and ethics. The Lancet. Vol. 353, pp. 829-831.

DeMets, D.L. Pocock, S.J. Julian, D.G. (1999) The agonising negative trend in monitoring of clinical trials. The Lancet. Vol. 354, pp. 1983-1988.

Falshaw, M. Carter, Y.H. Gray, R.W. (Sept 2, 2000) Evidence should be accessible as well as relevant. (Letter to the Editor). British Medical Journal. Vol. 321, p. 567.

Gilber, J. Morgan, A. Harmon, R.J. (Apr 2003) Pretest-posttest comparison group designs: analysis and interpretation. (Clinicians' Guide to Research Methods and Statistics). Journal of the American Academy of Child and Adolescent Psychiatry. Vol. 42(4), p. 500.

Glanville, J. Haines, M. Auston, I. (Jul 18, 1998) Finding information on clinical effectiveness. British Medical Journal. Vol. 317, pp. 200-203.

Haynes, B. Haines, A. (Jul 25, 1998) Barriers and bridges to evidence based clinical practice. (Getting Research Findings into Practice, part 4). British Medical Journal. Vol. 317, pp. 273-276.

Irvine, D. (Apr 3, 1999) The performance of doctors: the new professionalism. The Lancet. Vol. 353, pp. 1174-1177.

Lipsey, M.W. Cordray, D.S. (2000) Evaluation methods for social intervention. Annual Review of Psychology. Vol. 51, pp. 345-375.

Lock, K. (May 20, 2000) Health impact assessment. British Medical Journal. Vol. 320, pp. 1395-1398.

Luzzo, D.A. Pierce, G. (1996) Effects of DISCOVER on the career maturity of middle school students. Career Development Quarterly. Vol. 45(2), pp. 170-172.

Malterud, K. (Aug 4, 2001) The art and science of clinical knowledge: evidence beyond measures and numbers. The Lancet. Vol. 358, pp. 397-400.

Mant, D. (Feb 27, 1999) Can randomised trials inform clinical decisions about individual patients? The Lancet. Vol. 353, pp. 743-746.

March, J.S. Curry, J.F. (Feb 1998) Predicting the outcome of treatment. Journal of Abnormal Child Psychology. Vol. 26(1), pp. 39-51.

Mariotto, A. Lam, A.M. Bleck, T.P. (Jul 22, 2000) Alternatives to evidence based medicine. British Medical Journal. Vol. 321, p. 239.

McColl, A. Smith, H. White, P. Field, J. (Jan 31, 1998) General practitioners' perceptions of the route to evidence based medicine: a questionnaire survey. British Medical Journal. Vol. 316, pp. 361-365.

Medical Research Council (MRC) (Apr 2000) A Framework for Development and Evaluation of RCTs for Complex Interventions to Improve Health: a discussion document.

Medical Research Council (MRC) (Nov 2002) Cluster Randomised Trials: Methodological and Ethical Considerations. MRC Clinical Trials Series.

Moher, D. et al. (Aug 22, 1998) Does quality of reports of randomised trials affect estimates of intervention efficacy reported in meta-analyses? The Lancet. Vol. 352, pp. 609-613.

Newburn, T. (2001) What do we mean by evaluation? Children & Society. Vol. 15, pp. 5-13.

Paton, C.R. (Jan 16, 1999) Evidence-based medicine. British Medical Journal. Vol. 318, p. 201.

Pogue, J. Yusuf, S. (Jan 3, 1998) Overcoming the limitations of current meta-analysis of randomised control trials. The Lancet. Vol. 351, pp. 47-52.

Rosser, W.W. (Feb 20, 1999) Application of evidence from randomised controlled trials to general practice. The Lancet. Vol. 353, pp. 661-664.

Sheldon, T.A. Guyatt, G.H. Haines, A. (Jul 11, 1998) When to act on the evidence. (Getting Research Findings into Practice, part 2). British Medical Journal. Vol. 317, pp. 139-142.

Smith, G.D. Ebrahim, S. Frankel, S. (Jan 27, 2001) How policy informs the evidence: "evidence based" thinking can lead to debased policy making. (Editorial). British Medical Journal. Vol. 322, pp. 184-185.

Sniderman, A.D. (Jul 24, 1999) Clinical trials, consensus conferences, and clinical practice. The Lancet. Vol. 354, pp. 327-330.

Trinder, L. Reynolds, S. (2000) Evidence-Based Practice: A Critical Appraisal. Blackwell Science Ltd. Chapters 1-2.

Van Weel, C. Knottnerus, J.A. (1999) Evidence-based interventions and comprehensive treatment. The Lancet. Vol. 353, pp. 916-918.

Measuring impact: workshop at the University of Derby, 26 April 2005

At a workshop event on 26 April 2005 at the Centre for Guidance Studies, University of Derby, practitioners, researchers and managers came together to consider some of the key issues associated with assessing and measuring the impact of guidance.

Summary of the main discussion themes

Participants were asked to consider four key questions associated with assessing and measuring the impact of guidance. The main themes that emerged are summarised below.

Why is impact assessment important for the guidance profession?

· To help identify good practice and support continuous professional development.

· To help maximise the benefits to clients and enhance the credibility and standing of the profession.

· To provide evidence to justify current levels of funding for the service and future investment.

· To provide key performance indicators and benchmarks to enable comparisons to be made between different approaches and delivery mechanisms.

What do we want and need to know, and what measures should be used to demonstrate the impact of guidance?

· As a profession we need to “develop a language” about what we do and how we would measure/define our success.

· Some commercial organisations offering advice to young people define “effectiveness” as simply the ability to generate repeat business.

· “Hard outcomes” – volume of business, rates of entry into employment etc – are usually required by “third parties”, but “soft outcomes” – confidence building, empowerment etc – are just as important.

· We don’t need to know the eventual outcomes for all clients: a sample of follow-ups should be sufficient to tell us something significant, and would be more cost-effective to manage.

· Precise measures would depend on the context and the rationale for funding, e.g. in educational guidance, retention and achievement would be important; in workforce guidance, promotion, increased job satisfaction and productivity could be relevant.

How can we know that our results are significant and are telling us something meaningful?

· With “hard outcomes”, a key issue is knowing whether a particular outcome is as a result of the guidance intervention or whether it would have happened anyway.

· It is important to have data from “control groups”, and to have access to other comparative data, to be confident that outcomes are linked to interventions.

· Data in relation to “soft outcomes” should be used to help “corroborate” results.
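To illustrate why comparative data from "control groups" matters, here is a minimal, hypothetical sketch of subtracting a control group's outcome rate from the intervention group's rate, so that the control rate stands in for what "would have happened anyway". All figures are invented:

```python
# Hypothetical sketch: estimating the additional impact of a guidance
# intervention over a control group. All figures below are invented.

def additional_impact(intervention_positive, intervention_total,
                      control_positive, control_total):
    """Difference in positive-outcome rates: intervention minus control.

    The control group's rate approximates what would have happened anyway.
    """
    return (intervention_positive / intervention_total
            - control_positive / control_total)

# 60 of 100 guided clients vs 45 of 100 control clients achieve the outcome:
# an estimated additional impact of about 15 percentage points.
print(round(additional_impact(60, 100, 45, 100), 2))  # 0.15
```

A raw difference like this is only suggestive: whether a gap of this size is statistically significant, and whether the two groups are genuinely comparable, are exactly the questions raised above.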

What are the significant factors that could support and/or limit our ability to assess and measure the impact of guidance?

· If practitioners are required to be involved themselves, there must be adequate time allocated to do this.

· If practitioners are to be involved, they need to be ‘willing researchers’ in order for the process to be successful and they need to be “comfortable” with research criteria and methodology.

· Using practitioners to collect data about their own effectiveness, or that of their peers, raises a number of ethical issues, including impartiality.

· Clarity is needed around protocols, codes of practice and ethics, including client permissions and data protection.

Supporting information


Further information is given below to support some of the key themes emerging from the workshop.
What are the possible benefits and outcomes of guidance?

One of the key themes that emerged from the workshop was the variety of differing measures for the impact of guidance, often linked to the context in which guidance is delivered and the basis of funding. Participants offered a number of possibilities that they felt were most relevant. However, a more comprehensive schedule detailing the many possible outcomes and potential benefits of guidance is given in S. Bysshe, D. Hughes and L. Bowes (2002) The Economic Benefits of Guidance: A Review of Current Evidence, Derby: Centre for Guidance Studies, University of Derby, which can be accessed elsewhere in the NGRF via the link below:

http://www.guidance-research.org/EG/ip/theory/tp/evidence/view?searchterm=Byshhe,%20Bowes%20and%20Hughes

How can we be sure that benefits are truly attributable to the guidance intervention, and not to some other factor?

During the workshop, participants identified as a major issue the difficulty of demonstrating that an outcome is genuinely attributable to the guidance intervention and not to some other factor. A major longitudinal study into the possible benefits of information, advice and guidance was commissioned by the DfES, and the report is now published. This study throws light on the question of whether a particular outcome is a result of the guidance intervention, since clients receiving information only were used as a comparison control group. The report can be downloaded via the link below.

http://www.dfes.gov.uk/research/data/uploadfiles/RR638.pdf

A discussion around the issues contained within this report has started via a weblog accessed at:

http://www.guidance-research.org/collaborate/discussions/weblog.entries/4481160190

What are the priorities in other countries and how do they measure impact?

Some of the workshop participants were keen to know how other countries viewed issues around the impact of guidance.

The European Commission has made clear that it views career guidance as one of the crucial elements in the achievement of the four public policy goals of the so-called "Lisbon Strategy" – lifelong learning, social inclusion, labour market efficiency and economic development. Accordingly, the Commission has set up an "Expert Group on Lifelong Guidance" to make recommendations on, amongst other things, priorities for indicators and benchmarks in connection with careers guidance. Preliminary work has already been carried out, including the development of a basic typology for guidance provision and the collation, for 11 countries including the UK, of views on the typology and possible sources of benchmark and performance indicator data.

Although this development is still in progress, some interesting work has been published that resonates with many of the issues raised by the participants in the University of Derby April workshop. Further details of the Expert Group, and related papers, can be accessed via the Cedefop (European Centre for the Development of Vocational Training) virtual community on guidance. You can access the virtual community by going to the link below and registering:

http://cedefop.communityzero.com/lifelong_guidance

Weblog discussion of DfES report on impact

The DfES has recently published a report summarising the findings of a large longitudinal study to evaluate the intermediate impact of advice and guidance. A discussion on this site has started via a weblog, and all interested members are encouraged to join in.

The link address is: http://www.guidance-research.org/collaborate/discussions/weblog.entries/4481160190