In this blog, our Head of Research, Anna Round, explores how our soon-to-be-launched Youth Employment Toolkit will make it easier to access insights and learning from research. The summaries it will provide are “the perfect espresso: a powerful distillation of a lot of different inputs”.
‘What works?’ is a simple question with complex answers. There is plentiful evidence on the effectiveness (or otherwise) of interventions that aim to help young people facing the greatest challenges get good-quality jobs. But it’s often hard to navigate, and the studies that are available vary in their quality, relevance, and reliability.
Relatively few use the kind of ‘experimental’ methodology that provides a robust assessment of whether a particular programme or activity makes a difference—by comparing people who take part with similar people who don’t. Where this kind of evidence is available, it may be reported in a format that is difficult to interpret or apply to a new context.
And of course, some of the things that might work haven’t been evaluated, or haven’t been evaluated thoroughly enough, to show whether they do. Finding the gaps in the research can take as much time as finding the best available answers.
Launching a new Youth Employment Toolkit
To remedy these problems, our new Youth Employment Toolkit, to be published by June this year, will be a free online resource that makes it easier to access insights and learning from research into youth employment interventions. The Toolkit will contain summaries of in-depth literature reviews that draw together evidence from multiple studies in high-income countries across the globe.
Each summary includes an assessment of the likely level of impact of the intervention, based on a ‘meta-analysis’ that pools evidence from numerous evaluations. It also offers guidance on how to implement the intervention well, notes on the kinds of evidence used, and links to in-depth resources and case studies.
The summaries are short and non-technical, designed for decision-makers who don’t have time to delve into the literature or work their way through a lot of specialist material. They’re a bit like the perfect espresso: a powerful distillation of a lot of different inputs, which demands a wide range of expertise and plenty of hard work!
Summarising seven different interventions
A team of researchers from the Centre for Evidence and Implementation, the Institute for Employment Studies and Monash University considered over 700 articles and reports—starting with those in the Youth Futures Evidence and Gap Map—to see if they were suitable for inclusion in the ‘Rapid Evidence Assessments’ (REAs) that underpin the Toolkit. In the end, the REAs drew on material from around 70 evaluations that met the criteria for quality and relevance; the REAs themselves are published with the Toolkit.
This is just the first version: it contains summaries of evidence on seven interventions that are of particular interest to policymakers, practitioners and employers in England. We will add evidence on more interventions and update the existing sections as more evaluations become available.
Being clear about evidence gaps
The Toolkit aims to make reliable analysis as accessible as possible. Inevitably, what it says is limited by what evidence is available to be included in a meta-analysis. Where there are gaps in the evidence base, those gaps will—unfortunately—be reflected in the Toolkit as well. If something hasn’t been evaluated, or hasn’t been evaluated well enough, we can’t include it. We do make recommendations about priorities for future research, which will inform our own agenda too.
Where published evaluations don’t go into enough detail to get a clear sense of whether, or how, something works, we point that out. We also suggest what might help to make things clearer. The Toolkit aims to answer that key question—‘What works?’—but it can only do that using the best evidence that’s available at the moment. Most decision-makers can’t wait for the ideal evidence to be published; they need to work with the best information that’s available now.
Supporting policymakers, employers and youth organisations
As well as the REAs, we commissioned user research to help us make the Toolkit as accessible and practical as possible for those who help marginalised young people enter employment. And we talked to young people with experience of the interventions in the Toolkit. The evidence summaries are designed to meet the diverse needs of policymakers, employers, and organisations that provide support for young people, in the context of the contemporary English labour market.
The best way to learn about how the Toolkit works for its users, though, is to get feedback, suggestions and ideas from them as they read it and start to put its evidence into practice. So, we’ll listen hard to what they say, and incorporate this learning as the Toolkit is updated and expanded. We’ll also build in links to our new Data Dashboard, and to other resources including findings from our evaluations and research projects, and the Evidence and Gap Map.
Building a Toolkit is a long and sometimes challenging process. One of the issues we encountered early on has to do with the way many youth employment interventions are structured. Instead of offering just one thing to the young people they support, most are programmatic; that is, they bring together several different kinds of activity.
This helps them address the different challenges that participants face—but it also makes those programmes harder to evaluate. If a programme that works has several different elements, it’s hard to tell which bits were effective, or whether the combination itself was the key factor. The next blog in this series explains how the research team solved this tricky problem!