Launching our first randomised controlled trial

Sep 21, 2023

By Jane Mackey (Evidence and Evaluation Manager)

Our Evidence and Evaluation Manager, Jane Mackey, works on our evaluation of Reboot West 3. Below, she explores our journey to launching Youth Futures Foundation’s first randomised controlled trial.

Today we are celebrating the launch of Reboot West 3, Youth Futures Foundation’s first randomised controlled trial. This is a big milestone in our journey as a What Works Centre, and we will raise a glass to several years of collaborative hard work on the part of the teams at Youth Futures Foundation, 1625 Independent People (1625ip), the Behavioural Insights Team (BIT), and Local Authorities (LAs) in Bristol, North Somerset, Bath and North East Somerset, and South Gloucestershire.  

Our commissioning philosophy 

As the What Works Centre for youth employment, we are committed to generating high-quality evidence about what works to support young people into good jobs. When it comes to commissioning evaluations, our ultimate goal is to fund and deliver the most credible impact evaluation possible. However, mobilising towards a trial can be complex. We therefore commissioned BIT to work with 1625ip, Local Authorities and other stakeholders to agree the trial approach, so that BIT could deeply understand how the Reboot model of support works. We also recognise that delivery organisations are likely to need capacity-building support before engaging with more rigorous evaluation methods. As such, our first grants and evaluation programme (the ‘What Works Programme’) was designed to meet grantees where they were on the evaluation journey:  

  • Our development grants supported grantees to build their evaluation capacity and scale programmes to an appropriate size for an impact trial; 
  • Our impact pilot grants were designed to help grantees to fully evidence their programme’s theory of change;  
  • Our impact efficacy grants were designed to support mobilisation work prior to launching a full impact evaluation.  

Taking this staged approach has been hugely beneficial. It has allowed us to invest in impact evaluations where there is evidence of promise; where we are confident that an impact evaluation is feasible; and where we believe that an impact evaluation has a good chance of success.  

The Reboot journey 

Our partnership with 1625ip began in April 2021, when we agreed to fund the second phase of their Reboot West programme (Reboot II) through an impact efficacy grant.  

The Reboot West programme supports care-experienced young people to obtain and sustain employment, education and training outcomes through an intensive coaching model based on a youth version of Acceptance and Commitment Therapy. 1625ip works in partnership with four LAs in the south-west of England, each of which makes referrals into the programme.   

The first phase of the programme was funded through a Social Impact Bond, and the learning data available during this phase suggested that it could be a suitable candidate for an impact evaluation. To test these assumptions, we commissioned BIT to conduct three key pieces of analysis:   

  • A ‘feasibility study’ exploring whether the Reboot intervention was suitable to measure through an impact evaluation and which evaluation design would be optimal;  
  • A qualitative ‘process study’ looking at how the programme was operating, and whether there were any obstacles to implementing an impact evaluation in practice;  
  • A theory of change and participant tracing analysis to deeply understand how the model of support works and how young people achieve outcomes through the model. 

BIT concluded that an impact evaluation was feasible and that a randomised controlled trial was the most suitable design. However, we weren’t ready to launch a trial just yet. BIT’s analysis identified that delivering a trial would require new procedures and increased staffing time on the part of both 1625ip and LAs. We wanted to test these new ways of working before we launched the full trial, and decided to run a pilot phase prior to the full launch. 

The pilot proved invaluable, enabling us to resolve risks and issues before the full trial went live. Taken together with the previous work that BIT, 1625ip and the LAs had conducted, this meant that when the trial launched on 1st August we were reassured that we had done as much as we could to make it a success.  

The Youth Futures Foundation evaluation journey 

Preparing for our first trial has not only involved providing capacity-building support to 1625ip – we have also been rapidly developing our own capacity as a What Works Centre to be able to adequately support a trial. 

In the lead up to the trial we worked with our expert advisory group to develop a trial protocol template, statistical analysis plan template, and statistical analysis guidance.  

Alongside this, we have also had to make important strategic decisions; of key importance has been deciding the primary outcome that would be measured as part of the trial. In ‘The What Works Centres’ (2023), my colleagues discussed the challenges of outcome measurement in our policy area. When designing Reboot West 3, we have had to align the trial’s primary outcome to Reboot’s theory of change and Youth Futures Foundation’s priorities, while also ensuring that the outcome measure (1) maximises our ability to make impact claims, (2) is relevant (i.e. it relates to real-world behaviour), (3) is sensitive (i.e. it captures a high amount of information), and (4) can be measured through the available government administrative datasets (which has been a moving target). To find out more about the design choices we have made for the Reboot West 3 trial, read our Trial Protocol and Statistical Analysis Plan. 

What’s next? 

We are proud of the work we have accomplished in collaboration with our What Works Programme grantees over the last three years, and we are continuing to explore opportunities to take more of our grant portfolio through to an impact evaluation and continuing to mobilise other trials. 

We are also working with Amber Foundation to deliver a qualitative impact evaluation. To hear more about this, join my colleagues Hannah Murphy and Vera Stiefler Johnson on Day Three of the 2023 UK Evaluation Society Conference “Rising to Challenges”. 

Alongside the launch of new impact evaluations, we are also continuing to develop our evaluation guidance, and we will be releasing an invitation to tender for theory-informed implementation and process guidance soon. 

We are also beginning to think about the shape of our next grants and evaluation programme, based on learnings from our first round of grant-making. 

To stay up to date with our work at Youth Futures, sign up for our newsletter and check out the latest developments on our website.  