Mobilising a support programme for a full impact evaluation
11 October 2023
This guest blog is the second of a two-part series looking at the work that Youth Futures Foundation and the Behavioural Insights Team (BIT) have been conducting to mobilise a support programme, ‘Reboot’, for a full impact evaluation.
The Reboot programme works with young people aged 16-25 who are (or have been) looked after by statutory care services in the west of England, and provides them with coaching support for up to three years to help them obtain and sustain employment, education and/or training (EET). The programme is delivered by 1625 Independent People (1625ip), a homelessness charity based in the south west of England, with funding from the Youth Futures Foundation and West of England Combined Authority (WECA).
The impact evaluation is a randomised controlled trial, in which participants for both the Reboot group and comparison group are sourced within WECA and North Somerset Council. Part one of this blog series set out how we approached a feasibility study to better understand the programme’s potential for evidence generation. This is part two of the series, written by BIT, detailing the findings from the qualitative ‘process study’ of the programme we conducted ahead of the randomised controlled trial.
The process study
The Reboot programme supports care-experienced young people to obtain and sustain EET outcomes through an intensive coaching model based on a youth version of Acceptance and Commitment Therapy. As noted above, it is delivered by 1625ip and funded by Youth Futures Foundation with matched funding from WECA.
Our process study sought to understand how Reboot is delivered and identify improvements that would support the programme and a future impact evaluation. We interviewed coaches and managers across the programme and analysed the transcripts to identify key themes across different aspects of the programme such as the induction process, how young people are referred to the programme, and the way that support is delivered.
Here’s some of what we found (you can also read the full report via the link at the bottom of this page). As you’ll see below, these findings helped both shape the design of our impact evaluation and ensure that the programme was sufficiently prepared to support one.
Why did we do a process study?
For many (if not most) impact evaluations, evaluators are expected to drop in, rapidly design and run an evaluation, and then get out. This leaves very little scope for evaluators to develop a good understanding of the programme, or to identify pre-launch changes that would help an evaluation take place.
By providing a comprehensive understanding of a programme’s implementation, process studies offer a unique opportunity to identify and address potential obstacles before a full-scale evaluation begins. This process study supported mobilisation of the evaluation in three main ways.
The process study has been pivotal in laying the groundwork for a successful impact evaluation. It has illuminated crucial areas for improvement and identified risks and opportunities, and these insights have paved the way for an evaluation that will ultimately benefit the young people Youth Futures Foundation strives to support. Given the value it has added here, we’ll continue to support Youth Futures Foundation’s work to embed process studies as best practice in social policy evaluations.
To delve deeper into the process study findings, you can read the full report, available here.