Ongoing Engagement and Reflective Practices
RedR UK and the University of Sussex are designing and testing innovative methods to better understand participants’ learning and changes in behaviour following humanitarian capacity building interventions.
The innovative impact assessment project is prototyping and testing three methodologies to capture learning, behaviour change and, where possible, results from participants of humanitarian capacity building interventions. Alongside video capture, which has been the subject of the two most recent blogs, we are also looking at the use of ongoing engagement and reflective practices. Our key questions for all three methodologies are:
- Are improvements in learning, behaviour change and results more likely?
- Do these methodologies promote such improvements?
- Are we better able to assess level 2 and level 3 impact (Kirkpatrick)?
- Are they cost-effective?
- Do they support internal learning and ongoing improvement?
This blog will provide an update on the activities to date for the ongoing engagement and reflective practices methodologies.
Bi-weekly email scenarios to prolong participants’ engagement with course content and support learning being put into practice after the face-to-face course
Update: For two courses, the Essentials in Humanitarian Practice and the Certificate in Security Management, a series of six short scenarios with discussion questions has been developed, with each scenario relating to an area of the course content. Participants on these courses in late 2017 have been receiving these scenarios and discussion questions in a group email approximately every two weeks since attending their course. The idea behind these bi-weekly email scenarios is two-fold: first, to prolong participants’ engagement with the course content and thereby support learning being put into practice after the face-to-face course; second, to increase the response rate to follow-up surveys designed to capture information on participants’ learning, and their use of that learning, in the first three months after the course.
We are now in the process of reviewing the data and collecting user feedback from participants on the pilot courses.
RedR already uses a range of reflective practices, both in our face-to-face training and through the delivery of other forms of capacity building, particularly coaching and mentoring. Our previous work with the University of Sussex recommended that redesigning some of the processes and tools used in our delivery of these learning interventions could improve our ability to capture information on outcomes beyond Kirkpatrick’s level 1 (reaction).
We have therefore reviewed the monitoring and evaluation templates for coaches, coachees, mentors and mentees, and revised them to include questions better designed to capture information at the higher levels of evaluation: learning, and particularly behaviour change and results. Guidance was also developed for trainers on credit-rated courses, who are being asked to identify and write up short case studies that exemplify learning, behaviour change or results.
Since these tools and guidance documents were put in place, they have been piloted over the past three to five months on two coaching schemes, on the open mentoring programme, and for written assignments submitted for our credit-rated courses. We are now in the process of reviewing the data and collecting feedback from a sample of all involved: coaches, coachees, mentors, mentees, course participants, and the trainers marking the written assignments.