This guest blog is the second of a two-part series looking at the work that Youth Futures Foundation and the Behavioural Insights Team (BIT) have been conducting to mobilise a support programme, ‘Reboot’, for a full impact evaluation.
The Reboot programme works with young people aged 16-25 who are (or have been) looked after by statutory care services in the west of England, and provides them with coaching support for up to three years to help them obtain and sustain employment, education and/or training (EET). The programme is delivered by 1625 Independent People (1625ip), a homelessness charity based in the south west of England, with funding from the Youth Futures Foundation and West of England Combined Authority (WECA).
The impact evaluation is a randomised controlled trial, in which participants for both the Reboot group and comparison group are sourced within WECA and North Somerset Council. Part one of this blog series set out how we approached a feasibility study to better understand the programme’s potential for evidence generation. This is part two of the series, written by BIT, detailing the findings from the qualitative ‘process study’ of the programme we conducted ahead of the randomised controlled trial.
The process study
The Reboot programme supports care-experienced young people to obtain and sustain EET outcomes through an intensive coaching model based on a youth version of Acceptance and Commitment Therapy.
Our process study sought to understand how Reboot is delivered and identify improvements that would support the programme and a future impact evaluation. We interviewed coaches and managers across the programme and analysed the transcripts to identify key themes across different aspects of the programme such as the induction process, how young people are referred to the programme, and the way that support is delivered.
Here’s some of what we found (you can also read the full report by clicking the link at the bottom of this page):
- Young people experienced significant delays starting on the programme – Reboot had good processes in place to make sure young people would benefit from the programme, but staff felt that later stages of the referral process took too long, largely because it was difficult to coordinate the busy schedules of both coaches and local authority staff (who refer young people to the programme).
- Management capacity was limited – Reboot coaches were well-supported in their roles, but managers’ time was in short supply, and we heard that this had repeatedly delayed important pieces of work to support the programme.
- Some coaches found it difficult to manage the tension between EET outcomes and more holistic work – as with many programmes that take a more holistic approach to supporting young people, some Reboot coaches found it difficult to strike the right balance between the immediate needs of their young people (which may not be directly related to EET) and the ultimate objectives of the programme.
- The COVID-19 pandemic, and the national lockdown from mid-March 2020 that lasted several months, disrupted service delivery – for most coaches this initially meant ceasing in-person delivery and co-working. Remote working during the pandemic reduced some Reboot coaches’ sense of connectedness and changed their ways of working, as they were no longer co-located with local authority (LA) leaving care teams in LA offices.
As you’ll see below, these findings helped both to shape the design of our impact evaluation and to ensure that the programme was sufficiently prepared to support an evaluation.
Why did we do a process study?
For many (if not most) impact evaluations, evaluators are expected to drop in, rapidly design and run an evaluation, and then get out. This leaves very little scope for evaluators to develop a good understanding of the programme or identify pre-launch changes that could support an evaluation to take place.
By providing a comprehensive understanding of a programme’s implementation, process studies offer a unique opportunity to identify and address potential obstacles before a full-scale evaluation. This process study supported the evaluation mobilisation in three main ways:
- It helped identify and mitigate risks to the impact evaluation – for example, insights around management capacity led to recommendations for reviewing the management structure and considering additional staffing resources. Such actions not only strengthen the programme’s delivery but also ensure the future evaluation can be conducted without unnecessary delays.
- It helped identify programme improvements – following a review of the referral process, 1625ip have now redesigned the process to move referral forms online, ensure important information is provided up front, and minimise the steps needed to onboard a young person to the programme. This is likely to speed up onboarding and improve referrals to the programme.
- It helped us identify opportunities for the evaluation design – for example, developing an in-depth understanding of the referral process helped us to identify opportunities to introduce randomisation (to enable a randomised controlled trial), and also to develop a more robust way of collecting baseline EET outcomes.
The process study has been pivotal in laying the groundwork for a successful impact evaluation. It has illuminated crucial areas for improvement while also identifying risks and opportunities, and these insights have paved the way for an evaluation that will ultimately benefit the young people Youth Futures Foundation strives to support. Given the value it has provided to this evaluation, we’ll continue to support Youth Futures Foundation’s work to embed these studies as best practice in social policy evaluations.
To delve deeper into the process study findings, read the full report available here.