I’ve just emerged from having my head deep in data, budgets, evaluations and reports, wrapping everything up before the end of the User-Centred Design (UCD) for Rapid Community Engagement project in which Oxfam was the research and evaluation partner. Amongst all the detail you can almost lose sight of the bigger picture – good sanitation saves lives, but what ‘good’ looks like is hard to know if we don’t put the user at the heart of the process.
This project brought together two concepts: user-centred design and community engagement.
Engaging affected populations is nothing new or particularly controversial; the sector seems to embrace the principle, but it often doesn't translate well enough into practice. Participation in sanitation projects may involve the community only in the construction phase, as a paid labour force or through a cash-for-work initiative. And in fact, as our Landscape Review showed, we don't know enough about what is happening in practice, especially in sanitation, to drive sector learning in community engagement and ultimately provide better sanitation. Practitioners in rapid-onset emergencies have little to draw upon in the way of quick and simple methods to measure the extent to which communities engaged with projects, and to what effect.
After the Landscape Review, we worked with the HIF to select three partners for five community engagement pilot projects. We established a Steering Committee of all partners and, with Save the Children, developed a monitoring and evaluation framework for the projects.
Following feedback from the Bangladesh and Iraq evaluation teams, a shorter format was used for Lebanon and Uganda, whilst still retaining the same framework to ensure research coherence across all four evaluations. Each evaluation produced a report for the partners (we didn't publish them), and those reports, along with a detailed analysis of the transcript data, formed the basis of the final Evaluation Report. We also collaborated fruitfully with ALNAP on their UCD research, producing an anonymised synthesis report of findings for them. Sharing findings in both directions helped triangulate the data and deepen the analysis.
We originally set out to evaluate community engagement in rapid-onset emergencies but ethical and logistical considerations led to a change of focus to early-onset and protracted contexts in Bangladesh, Iraq, Lebanon and Uganda. The partners piloted UCD approaches – bringing in new design-led thinking from outside the humanitarian sector.
This introduced new approaches and some interesting innovations, such as Save the Children and Eclipse's digital survey tool: engaging visuals on software uploaded onto tablets, which allowed community members to touch problem areas on images of latrines. Great for people who can't read, and for kids too!
However, this meant that the findings didn’t necessarily ‘fit’ with the research questions. Challenging at times…
The findings from our evaluation report strongly indicated that engaging users has an impact on design, increasing appropriateness and therefore satisfaction. There has been limited documented evidence of this in the sector to date. The detail and rigour of the evaluations has also generated new evidence and challenged assumptions.
For example, the assumed link between appropriate latrine design, satisfaction and ownership (use, maintenance, cleaning) did not turn out to be a tidy causal chain. The findings indicated that multiple factors affect satisfaction.
The findings also showed that to generate ownership, appropriate design needs to be combined with other factors, such as quality construction and higher-value outputs, and, importantly, with whether or not the latrines are shared. Unsurprisingly perhaps, people don't like cleaning and maintaining latrines that are used by lots of other people! These findings could contribute to further research to test the relative importance of the different factors in sanitation. The evaluation process also generated considerable learning about the challenges of evaluating community engagement, which proved complex, as well as about the particular methodology used by Oxfam, which prioritised research rigour over 'quick and simple' methods. Revisions to that methodology are needed.
Oxfam worked informally with Science Practice, a design partner of the HIF, to produce insights for dissemination from the evaluations and to bring new design thinking to the next steps of the project.
The Evaluation Report will be promoted widely within Oxfam. The Public Health Promotion team will be redesigning the evaluation methodology for further testing. We have also coordinated with Sani Tweaks, a series of communication tools promoting good practice in sanitation which are highly coherent with the findings of this project.
Author: Peta Sandison, Project Manager