About

This section describes the practice of using research and evidence to inform the design of interventions, and to draw conclusions from evaluation and measurement.

We reference a number of papers that discuss considerations, frameworks and perspectives in the area of evaluation and evidence-led practice, before presenting some case studies that show how research and evidence have been used or gathered. These range from small-scale pilots to larger programmes.

For each selected publication, we have noted the insight it might offer to career practitioners, service managers and policymakers.

Contents


  1. Practices and outcomes - Demonstrations of achieving different outcomes in a range of settings

  2. Illustrations and discussions - Examples and case studies of evaluation work, from small-scale pilots to larger programmes

  3. Future research questions - Candidate topics for future research based on the CDI’s discussions with stakeholders.

1. Practices and outcomes

Selected publications that describe practices and outcomes for different challenges are listed below, with links in the title column. We have mostly included open access sources; where a source requires payment, this is noted next to the link as “(Paid)”.

Title

Insight

Brief description

Maguire, M. (2004). Measuring the outcomes of career guidance. International Journal for Educational and Vocational Guidance, 4, 179-192. (Link)(Paid)

Considering measures in light of contextual factors that affect outcomes from guidance

This paper is oft-cited in evaluation research. The author draws attention to the various factors that can characterise a career guidance intervention and influence an outcome. From these reflections, the author proposes how to select suitable evaluation measures. The implications are discussed for practice, research and policy-making.

Crust, G. (2007). The impact of career related interventions in higher education. Journal of the National Institute for Career Education and Counselling, 17(1), 16-22. (Link)

Making the case for evaluation

The paper sets out the case for evaluating career services and their effectiveness, using the context of a higher education setting. (Similar arguments could be made in many other settings.) The topics covered span the commercial (cost-effectiveness), effectiveness (the need to target the capability gaps of potential users, e.g. career management skills, in order to help them effectively) and standards (the critical value of implementing an underlying process of change and eliciting feedback to drive further improvement).

Baudouin, R., Bezanson, L., Borgen, B., Goyer, L., Hiebert, B., Lalande, V., Magnusson, K., Michaud, G., Renald, C., & Turcotte, M. (2008). Demonstrating Value: A Draft Framework for Evaluating the Effectiveness of Career Development Interventions. Canadian Journal of Counselling and Psychotherapy, 41(3). (Link)

Creating an effective evaluation framework for practice

This article was written in Canada, against a backdrop where evaluation of practice was viewed as an exception rather than the norm. The authors develop and propose an evaluation framework that links the services provided with the client outcomes being achieved. The paper starts with a review of some existing evaluation frameworks from the literature, but recognises that “no one evaluation model is ‘best’ in all regards”. Criteria are suggested for what makes a “good” evaluation framework. The paper thereby offers both a practical tool and insights into the criteria of an effective evaluation framework.

Dany, F. (2014). Time to change: The added value of an integrative approach to career research. Career Development International, 19(6), 718-730. (Link)(Paid)

Conducting and interpreting research to reach deeper insights

The paper calls for an integrative approach to research. A criticism is levelled at some research approaches in the careers field, which are reviewed, that they “stick to narrow views of [what a] career [is]”. The paper provides examples that invite career differences to be re-examined, seeking alternative explanations to those offered by some treatments. The author proposes that adopting these wider perspectives will create richer discussions amongst researchers.

Hiebert, B., Schober, K., & Oakes, L. (2014). Demonstrating the impact of career guidance. In Handbook of career development: International perspectives (pp. 671-686). New York, NY: Springer. (Link)(Paid)

Making the case for evaluation, and reviewing approaches to determine a choice of framework

This book chapter is contextualised by an age in which there is an onus on career guidance practitioners to “prove it works”, and the importance of the topic is described. The chapter presents some alternative approaches for documenting the impact of career guidance services that account for the emphasis on evidence-based practice and outcome-focused intervention observed by the researchers. Drawing on examples from Canada, the US and Europe, the authors highlight a number of different frameworks that have been employed to measure impact.

Haug, E. H., & Plant, P. (2016). Research-based knowledge: researchers’ contribution to evidence-based practice and policy making in career guidance. International Journal for Educational and Vocational Guidance, 16, 137-152. (Link)(Paid) 

Considering the role and opportunity for valuable practitioner research

The paper focuses on researchers’ contribution to evidence-based practice and policymaking in career guidance. The article puts a specific focus on the need for a stronger involvement of the voice of users.

Neary, S., & Johnson, C. (2016). CPD for the career development professional. Crimson Publishing. (Link)

Recognising the role of research in practitioner CPD.

This publicly available book chapter does not introduce new research per se, but draws together thinking to explain the role of research in a practitioner's professional practice. A broad definition is taken of what constitutes “research”, with varied suggestions on where to source it. References are provided to other texts which discuss the wider concepts and rationale for “evidence based practice”.

Elliott, J., Stankov, L., Lee, J., & Beckmann, J. F. (2019). What did PISA and TIMSS ever do for us?: The potential of large scale datasets for understanding and improving educational practice. Comparative Education, 55(1), 133-155. (Link)

Using large public datasets alongside qualitative studies

The authors of the paper suggest that a gulf exists between researchers who use large datasets and those who develop deeper qualitative understanding of individuals and groups and how they make career choices. The authors suggest how analysis of large datasets can be employed alongside the latter type of research to provide richer and deeper insights into, for instance, cross-cultural and regional differences between the career experiences of different groups. The paper notes specific examples of using PISA and LEO data, which measure outcomes at two stages of early adulthood.

Hooley, T., & Rice, S. (2019). Ensuring quality in career guidance: A critical review. British Journal of Guidance & Counselling, 47(4), 472-486. (Link)

Reflecting on the ideas of "quality" and "quality assurance" in guidance, while guarding against unintended consequences

This article discusses quality and quality assurance in career guidance. The authors note there is no clear, agreed international understanding of what quality career guidance looks like, though a review of approaches in education shows two distinct strands: economic evaluations and humanist/progressive perspectives. The current approaches in the careers field lead the authors to propose six pillars of evaluation: 1) Policy, 2) Organisation, 3) Process, 4) People, 5) Product, 6) Consumption. The article also takes a balanced view of quality, suggesting that the pursuit of quality might undermine the goals it seeks to achieve if it "distorts" decision making (e.g. reducing negative appraisals by removing elements of a curriculum that lower scores).

Whiston, S. C., Mitts, N. G., & Li, Y. (2019). Evaluation of career guidance programs. International handbook of career guidance, 815-834. (Link) (Paid)

Designing an evaluation activity having reviewed past evaluations

The authors first examine previous research on the effects of career guidance programs or interventions, discussing the effectiveness of career guidance programs, which modalities are preferable in providing career guidance, which clients benefit from these interventions, and the outcome measures typically used in evaluating career guidance programs. Second, the authors provide a summary of how to conduct an evaluation of a career guidance program. This overview utilises a six-step process for evaluating career counselling programs proposed in previous studies.

Robertson, P. J. (2020). Evidence-based practice for career development. (Link)(Paid)

Thinking through an evidence-led strategy for career guidance and mitigating risks and obstacles

The author discusses the ambition of an evidence-led approach to career guidance, and highlights some obstacles and challenges to this goal. Firstly, there are innate differences between career guidance and the medical profession, which is often seen as the “standard” for such practice. Secondly, “policymaking and practice are political processes and research evidence is necessary but not sufficient to influence decision-making.” It is therefore suggested that “to best inform practice, research evidence should be combined with local knowledge, practitioner experience, and input from service users”.

Percy, C. and Hooley, T., (2023) Lessons for career guidance from return-on-investment analyses in complex education-related fields, British Journal of Guidance & Counselling  (Link)

Building return-on-investment cases for career guidance to influence policy or institutional decision making

The authors tackle the problem of finding a methodology to calculate the return on investment of career guidance, as a foundation for making investment cases. A review of 32 studies in different countries that measured return on investment in education and related settings concluded there was a high degree of inconsistency. A practical method is ultimately proposed, while the discussion in the paper provides insights that can stimulate a critical appraisal of different approaches.

Winter, D. (2023). A framework for analysing careers and employability learning outcomes. Journal of the National Institute for Career Education and Counselling, 51(1), 15-25. (Link)

Taking a strategic and critical perspective on measuring outcomes, drawing awareness to inadvertent biases

Set against the context of greater integration of careers education within the curriculum, the author questions how to create suitable measurement frameworks that overcome ideological biases and account for the different forms of “capital” that a graduate might accrue on the path towards the labour market (e.g. sociological). A framework is suggested which profiles career interventions in terms of “depth” of learning (we can either ‘discern’, ‘acquire’, ‘adapt’ or ‘enhance’ while learning) across various “domains” where learning could occur (e.g. forms of capital such as social networks and personal identity).

Cedefop (2024). Learning outcomes going global: a multifaceted phenomenon. Luxembourg: Publications Office. (Link)

Building learning outcomes into guidance to enable global comparisons for better learning

This study from Cedefop examines the international trend towards measuring learning outcomes across different education systems, in careers and beyond. The move to learning outcomes is described as “one of the most significant trends to have influenced European VET over the past two decades.” The learning outcomes approach brings new benefits, such as the ability to compare international policies and to create foundations for designing lifelong learning systems. A stakeholder analysis of these measures is also included.

Hughes, D., Mc Cormack, D., Neary, S., & King, P. (2024). Praxis in guidance and counselling: new frontiers. British Journal of Guidance & Counselling, 1-6. (Link)

Understanding the value of involving practitioners in research studies to clients, professionals and the evidence base

The authors draw attention to the fact that much research is conducted by academics without the input of practitioners - a feature of past studies that has been observed by other researchers. Reasons for this ‘praxis gap’ are cited. However, the authors make the case for involving practitioners more, drawing attention to the ways this can be done and how it can increase the value of the study: for instance, “a credible and sustainable model of professionalisation in careers practice depends on narrowing the gaps between both theory and practice.”



2. Illustrations and discussions

The following publications present examples and case studies of evaluation work that tests the impact of career interventions. These straddle smaller-scale pilots through to larger-scale programmes using multiple-stakeholder inputs. The papers also often refer to the use of an underlying theory or framework to guide the evaluation, which differs based on context.

Title

Insights

Brief description

Hughes, D., & Gration, G. (2006). Performance Indicators and Benchmarks in Career Guidance in the United Kingdom. University of Derby. (Link)

Selecting indicators by which to measure career guidance effectiveness

The report reviews the use of indicators of career guidance effectiveness in Europe, and benchmarks the UK against other nations. The paper reviews a range of potential indicators that could be used, explaining that they cover inputs, processes and outputs. The use of indicators by different UK institutions involved in the careers landscape is described.

ETI (2009), Evaluating the Quality of Careers Information, Advice and Guidance provided by Career Information, Advice and Guidance Providers (Link)

Assessing information quality used in guidance using a holistic framework of measures

While this study is now dated in terms of its findings, it provides a systematic method for assessing the quality of information (as well as wider CIAG) provision from different service providers, looking at the information itself as well as policy and infrastructure dimensions (amongst others).

Frigerio, G. (2010). “Narratives of Employability: Effective Guidance in A Higher Education Context. A Qualitative Evaluation of the Impact of Guidance.” Higher Education Career Services Uni (Link)

Conducting a small scale case study project to better understand expectations of service users and the impact of the service.

This study is an example of a small-scale research study at Warwick University, following six case-study students who underwent a career consultation. Expectations were elicited before the event, and outcomes reviewed two months later. While the author stresses the limited applicability of the findings beyond the specific context where the study was deployed, it offers a practical example of a small exploratory study used to increase understanding of students’ expectations, build them into practice, and review the effectiveness of that practice.

Reese, R.J., & Miller, C. (2010). Using Outcome to Improve a Career Development Course: Closing the Scientist-Practitioner Gap. Journal of Career Assessment, 18, 207 - 219. (Link)

Anticipating unexpected or anomalous outcomes

In this study, the authors conduct a follow-up to a previous study to understand whether modifications that had been made led to the desired improvements. The authors found a large increase in the effect size of a career class in terms of students’ self-efficacy, which was sustained into a second year of the course. The “uneven” nature of the results, however, prompted the researchers to evaluate further improvements and to include measurements for outcome data. The paper provides an example of learning iteratively from outcome data.

Hiebert, B., Schober, K., & Oakes, L. (2014). Demonstrating the impact of career guidance. In Handbook of career development: International perspectives (pp. 671-686). New York, NY: Springer New York. (Link)(Paid)

Reviewing the different ways that evidence for career guidance impact and value has been presented

This chapter discusses the challenge of demonstrating the value of career guidance services. The authors review international practices and present alternative approaches for documenting the impact of career guidance services that embrace the current emphasis on evidence-based practice and outcome-focused intervention. In particular, to address a “prove it works” challenge, the authors provide examples of two approaches to dealing with this situation. (More studies have followed since this one in 2014).

Jacquin, P., & Juhel, J. (2017). An individual mixed‐evaluation method for career intervention. The Career Development Quarterly, 65(1), 16-28. (Link)

Demonstrating impact on clients over the course of a career programme

The paper addresses the challenge that career counsellors face in demonstrating the value of their services to policymakers. It proposes a mixed-methods approach to demonstrating impact: five items related to a client’s career decision self-efficacy were tracked throughout a 43-day intervention by one career counsellor to show improvements.

Frigerio, G. (2018). Making connections through practitioner research. In Graduate Careers in Context (pp. 179-192). Routledge.

Embarking on a practitioner research initiative to measure impact

This chapter focuses on the career development practitioner integrating theory with their practice by engaging in practitioner research. It uses the systems theory framework, developed to show the complexity of career development, in which individuals are mutually influenced by a range of other people.

Whelan, N., et al. (2018). EEPIC - Enhancing Employability through Positive Interventions for improving Career potential: The impact of a high support career guidance intervention on the wellbeing, hopefulness, self-efficacy and employability of the long-term unemployed - a study protocol for a randomised controlled trial. Trials, 19, 1-18. (Link)

Designing an evaluation activity with a control group

The paper provides a detailed outline for an example study that involves a single-centre randomised, controlled, partially blinded trial. A total of 140 long-term unemployed job-seekers from a disadvantaged urban area will be randomly assigned to two groups: (1) an intervention group; and (2) a ‘service as usual’ group. Each group will be followed up immediately post intervention and six months later.

Maree, J. G. (2019). Group career construction counseling: A mixed‐methods intervention study with high school students. The Career Development Quarterly, 67(1), 47-61.(Link)(Paid)

Measuring an intervention using multiple tests and synthesising the results

This study investigated the value of group career construction counselling with high school students (n = 57). The paper describes the intervention, as well as the range of tests and evaluations completed by students to measure different facets of pre- and post-intervention attitudes and capabilities: a) the Career Adapt‐Abilities Scale–South Africa (CAAS‐SA), b) the Career Interest Profile, and c) the Maree Career Matrix. Results showed that the students’ career adaptability scores had improved meaningfully and no gender‐based differences had been introduced. However, differences were detected between both the boys’ and the girls’ pre‐ and post-test Control and Confidence subscale scores.

Hanson, J., Moore, N., Neary, S., & Clark, L. (2021). An evaluation of the North East of England pilot of the Gatsby Benchmarks of good career guidance. University of Derby (Link)

Designing a comprehensive evaluation into a major or complex intervention.

The evaluation used longitudinal research to examine the impacts of a career programme implemented in schools in the North East of England. It was particularly focussed on the impact of the programme on helping schools and students to reach Gatsby Benchmarks, which are indicators of progress in career knowledge, experience and capability. There were six different components to the measurement and evaluation exercise, which combined to give a robust and holistic understanding of how well the pilot performed. The evidence used for this work has helped to “make the case” for careers education in other schools, showing the wider value of such work.

The Careers & Enterprise Company (2021). Swindon and Wiltshire Careers Hub: Evaluation Guide for Careers Activities and Programmes: Edition 1 - February 2021. London: The Careers & Enterprise Company. (Link)

Creating a matrix of research subjects and research instruments to evaluate a programme

This document offers an example of a research approach employed by the Careers and Enterprise Company to measure the impact of a schools intervention programme. The document describes a matrix of stakeholders and research instruments that were used to give a rounded view of the programme, and come to robust conclusions that account for multiple stakeholder perspectives. Several methods are described for getting feedback, with some innovative ways offered for engaging target respondents.

Dodd, V., Hanson, J., & Hooley, T. (2022). Increasing students’ career readiness through career guidance: measuring the impact with a validated measure. British Journal of Guidance & Counselling, 50(2), 260–272. (Link)

Measuring career readiness amongst secondary school students

This research (1) details the development of a career readiness measure and (2) tests the relationship between career guidance interventions and career readiness among secondary school students over three separate studies. One factor across nine items was found to effectively capture career readiness. Greater participation in career guidance activities was also found to be significantly associated with increased career readiness.

3. Future research questions

From the CDI’s discussions with stakeholders, we have heard an appetite for more practitioner-based research, particularly recognising that there are many variables to test. Related suggestions for future research activities in this area included:

  • A pilot fund to enable practitioners or researcher-practitioners to conduct a series of faster, small-scale tests that can shed light on the impact of changing key variables in interventions (e.g. online vs offline delivery, group size, dosage effects with different client groups, etc.).
  • Deconstruction of the career guidance interview into components, noting the decisions taken by a practitioner at different junctures, and evaluating the best decisions that practitioners could take in different scenarios.
  • Development of metrics that can serve as useful proxies for mid- and longer-term outcomes from career interventions, which can be asked of clients shortly after an intervention and reduce the need for costly longitudinal studies.
  • Increasing the use of appropriate, proven career theories by practitioners, for both designing services and evaluating them, through training and support.