
Putting Evidence to Work: A Case Study

March 29, 2019

Schools are busy places. With so many conflicting priorities, it can be hard to know where to start in terms of improvement planning. However, evidence can point us not only towards the ‘best bets’ in terms of teaching strategies, but also towards the practicalities of how to implement them. Working with one of the new schools in our Trust, we have used the EEF’s Implementation Guidance ‘Putting Evidence to Work’ as a framework for developing our teaching and learning priorities. It provides a series of steps for schools to consider in their implementation plans. Here I aim to outline how we have used this guidance to develop a new teaching and learning strategy based around retrieval practice.


Setting the foundations for good implementation

This is firstly about creating a school climate that is conducive to good implementation. Culture is key here. Do staff feel empowered to take on new approaches and new responsibilities? Have we set aside adequate time for training?


Secondly, we need to check that we are treating implementation as a process, not an event. Have we given enough time to the planning and preparation for implementation? Do we think about implementing changes in a structured and staged manner? Good implementation is often about doing fewer things better. Crucially, it is just as important to make decisions about what we stop doing, as well as what new strategies we might introduce.



Explore

This phase of implementation is crucially important. It is about defining the problem you want to solve and identifying appropriate practices or programmes to implement.


For us, this firstly involved observing what was currently happening in school and why it wasn’t having the desired impact. We spoke to teachers, who cited students’ lack of resilience, and lack of independent work or revision, as key factors in their underperformance. We realised that teachers themselves lacked an understanding of the evidence on how students learn, and that the school’s teaching culture favoured discovery learning and skills-based approaches over the acquisition of knowledge. Students tended to be passive in lessons and were often resistant when presented with challenging tasks. Upon further exploration, it became clear that this was because they lacked the core knowledge to attempt these tasks. Many students, particularly those from disadvantaged backgrounds, had a limited range of everyday (Tier 2) and subject-specific (Tier 3) vocabulary, which was preventing them from accessing the curriculum. The sum of these factors was clearly evident in the school’s performance data, with low progress and attainment levels across the curriculum.


Having identified our problems, we needed to start looking for solutions. Here, we turned to the evidence for strategies which would help us improve students’ knowledge, and consequently their confidence and resilience. We quickly identified retrieval practice as a strategy we wanted to explore further. Many studies, for example Karpicke & Roediger (2007), point to the power of the ‘testing effect’: retrieval practice, particularly spaced retrieval of previously learned information, is a powerful tool for long-term retention. Additionally, we were interested in Bjork’s work on ‘desirable difficulties’ (the idea that introducing an optimum level of challenge into the learning process improves long-term retention of the learned material), and in Sweller’s work on Cognitive Load Theory, with its implication that strengthening long-term memory reduces the load on working memory and so improves students’ ability to tackle challenging problems.



Prepare

With our exploration complete, we were confident that retrieval practice was an area we wanted to focus on as a priority (in implementation terms, this is known as the ‘adoption decision’). We therefore entered the preparation phase and began to create an implementation plan and prepare staff and resources.


The first step for us in this phase was defining the ‘active ingredients’ of our strategy. These are the aspects which define the strategy or intervention, and to which we want to ensure fidelity. We identified, for example, that we wanted retrieval practice to take place at the start and end of every lesson, that it must take place from memory (closed book), that it must take the form of quizzing, and that each episode of retrieval should last no more than five minutes.


Next in our preparation was a trial of the strategy. As a Research School, we wanted this trial to be as rigorous as possible, so we conducted a pilot Randomised Controlled Trial (RCT) to test whether retrieval practice had an impact on students’ long-term memory. The trial took place over three months and involved over 300 Year 8 students from two of our Trust’s schools. The ‘control’ groups were taught the standard scheme of work for English Literature, whilst the ‘intervention’ groups received a daily retrieval quiz in place of the standard ‘starter’ activity at the beginning of the lesson. The results (which we have written about before here) were certainly promising, and encouraged us that this was a strategy worth introducing across the curriculum.
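For readers interested in how a comparison like this might be analysed, the sketch below shows one simple approach: comparing group means and a standardised effect size. The quiz scores are invented for illustration only; they are not data from our trial, and a real analysis would involve far more than this.

```python
# Hypothetical illustration: comparing end-of-trial quiz scores for a
# 'control' and an 'intervention' group in a pilot RCT like the one
# described. The scores below are invented for demonstration purposes.
from statistics import mean, stdev

control = [52, 48, 61, 55, 47, 58, 50, 53, 49, 56]       # standard scheme of work
intervention = [60, 57, 66, 59, 63, 55, 68, 61, 58, 64]  # daily retrieval quiz

def cohens_d(a, b):
    """Standardised mean difference (pooled SD), a common RCT effect size."""
    na, nb = len(a), len(b)
    pooled_sd = (((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2)
                 / (na + nb - 2)) ** 0.5
    return (mean(b) - mean(a)) / pooled_sd

print(f"control mean:      {mean(control):.1f}")
print(f"intervention mean: {mean(intervention):.1f}")
print(f"effect size (d):   {cohens_d(control, intervention):.2f}")
```

A positive effect size here would simply indicate that the intervention group scored higher on average; with a pilot of this size, a formal significance test and checks on baseline equivalence would also be needed before drawing any conclusions.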


Next in our preparation was the development of a training (CPD) plan for staff. Again, the evidence was important in influencing the model we chose. We consulted the evidence-based DfE Standard for Professional Development and the Teacher Development Trust review ‘Developing Great Teaching’, for example, and evaluated our proposed programme against them. We recognised the importance of creating a programme which was sustained and iterative, and which provided opportunities for individual and subject-specific reflection as well as whole-staff, face-to-face training.


Finally in this phase, we established our implementation plan and considered how we would monitor and evaluate the change.



Deliver

Once delivery begins, the key features of this phase are supporting staff, identifying and solving problems, and adapting strategies.


Having launched our strategy and the training programme, the focus now was on monitoring the initial implementation. Working on the basis that ‘anything worth doing well is worth doing badly’ in the first instance, it was important to let staff experiment and to praise them for doing so. We spent a lot of time in the early weeks doing classroom drop-ins, identifying the ‘bright spots’, and then promoting and sharing the good practice we saw. This also allowed us to identify staff or subject teams who needed further support. Expert coaching and mentoring were key here, and we were grateful that in the preparation phase we had identified and developed a delivery team to provide them.


One of the key considerations for us in monitoring implementation was deciding where we needed to be ‘tight’ and where we could be ‘loose’ in terms of fidelity to the active ingredients of the strategy. We identified, for example, that we wanted absolute consistency in the language used and the positioning of retrieval at the start and end of lessons, but were happy with intelligent adaptation of the format of the quizzing in each subject domain. This immediately gave us clear criteria for our monitoring activities. When we spotted problems, for example that the quizzing was not effectively incorporating spaced retrieval of previously learned material and was too focused on what students had studied most recently, we were able to refine subsequent training to reflect this.
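One way to picture the refinement described above is as a deliberate mix of old and new material in each quiz. The sketch below is a hypothetical illustration of that idea: the topic names and the ‘recent / last month / last term’ age bands are invented assumptions, not our school’s actual scheme.

```python
# Hypothetical sketch: drawing quiz questions so that retrieval is spaced
# across older and newer material, rather than only the most recent lesson.
# The age bands and topic names are illustrative assumptions.
import random

def build_quiz(topics_by_age, n_questions=6, seed=None):
    """topics_by_age maps 'recent' | 'last_month' | 'last_term' to topic lists.
    Returns a quiz drawing an equal number of questions from each age band."""
    rng = random.Random(seed)
    bands = ["recent", "last_month", "last_term"]
    per_band = n_questions // len(bands)
    quiz = []
    for band in bands:
        quiz.extend(rng.sample(topics_by_age[band], per_band))
    return quiz

topics = {
    "recent":     ["Act 3 key quotes", "stagecraft terms", "plot of Act 3"],
    "last_month": ["Act 1 characters", "context: Jacobean England", "soliloquy"],
    "last_term":  ["poetic metre", "sonnet form", "Romantic poets"],
}

print(build_quiz(topics, seed=1))
```

The design choice this illustrates is simply that the spacing is built into how questions are selected, so individual teachers can still adapt the quiz format while the ‘spaced retrieval’ active ingredient is preserved.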


As well as monitoring the process, it was important to monitor the impact on staff, and of course on students. For the former, the impact on culture was key. We observed a high level of buy-in from staff early in the process, and reflected that this was because they could see the robust evidence behind the strategy. Staff told us they felt the strategy was working, and that they perceived students to be more confident. But crucially, we started to see an impact on outcomes, with significant gains in progress and attainment at the end of the academic year. And whilst we are all too aware of the dangers of conflating correlation with causation, it was clear from students’ exam performance that they simply ‘knew more’.



Sustain

And so to the final stage of the cycle. For us, sustaining the momentum of our strategy is now key, and we are looking for ways to enhance the skilful use of retrieval practice across the school. In many ways, this is equivalent to beginning an entirely new implementation process: thinking about what training is required, what refinement of the active ingredients is necessary, and what new monitoring and evaluation milestones are now needed. One of the key dangers in this phase is assuming that once a new practice is integrated into the normal routines of a school, the implementation process requires no further leadership support. We have been careful to continue to monitor and refine the strategy, and to build time into our CPD programme to revisit the active ingredients of effective retrieval.
