Prototyping Impact Assessment Methodologies in Humanitarian Training
RedR UK and the University of Sussex are designing and testing innovative methods to better understand participants’ learning and changes in behaviour following humanitarian capacity building interventions.
In this blog post, project researchers from the University of Sussex describe the process of designing a prototype for an innovative methodology to better assess the impact of humanitarian training. The methodology focuses on video recording in-class exercises, such as role plays or simulations. Capturing participants’ performance before and after instruction will allow them, as well as RedR trainers, to see where participants have improved and where further work is needed. We are also exploring how this methodology might support participants’ reflective practices and ongoing self-evaluation, and serve as a tool for peer evaluation.
In preparation for the upcoming first pilot run of this methodology, we have conducted various prototyping activities: a research day at RedR’s London offices, a literature review and technology experiments. We decided early on in the process to focus on one specific training course – Training of Trainers (TOT) – and to undertake the pilot in three different locations. Testing our prototype during a regular TOT training in London and in two other regions means that we will be able to see how different cultural assumptions around video recording in the classroom, and different levels of equipment resources – from cameras to microphones to lighting – influence the methodology. This is especially important as we are planning for replication in different contexts.
During our research day at RedR’s offices in London we interviewed not only the main TOT trainer, but also two other trainers about their thoughts on using video in the classroom. One of the main insights was that this needs to be done sensitively in order to maintain the classroom as a safe space. It also needs to be integrated organically into the curriculum so as not to disrupt the teaching. Lastly, the rationale for video recording needs to be explained to participants in a positive way, focusing on how it will improve learning rather than on finding flaws. How the methodology is framed is therefore of critical importance, as is the voluntary nature of participation. We also met with one of RedR’s M&E staff to better understand how these impact assessment methods would fit into current M&E capture and reporting. We have since refined our thinking based on the findings from the research day through ongoing email and phone conversations with the TOT trainer and the RedR project lead.
Together with the trainer, we have also explored some of the pedagogical literature on video recording in the classroom. This provided helpful practical tips for creating an unobtrusive recording environment, as well as pedagogic considerations around sharing the outputs with individual training participants. From these readings and interviews, the possibility has emerged of linking this methodology to both reflective practices and ongoing engagement (the other two methodologies this project is exploring). As the date for the first pilot run approaches, we will be taking our thinking and equipment into an actual classroom (at the University of Sussex) to test some of our assumptions.
Authors: Anke Schwittay and Paul Braund