The webinar examined some of the roadblocks in evaluating health projects rooted in adaptive implementation and human-centered design (HCD) approaches. Speakers shared insights from the CyberRwanda, Adolescents 360 (PSI), and Vihara Innovation Network projects.
Senior Advisor, HCDExchange
Director of Design Strategy and Innovation, Vihara Innovation Network
Project Director, Adolescents 360
Deputy Director, Adolescents 360
Senior Product Designer, YLabs
Research Manager, YLabs
Combining HCD and adaptive implementation (AI) approaches to attain robust evidence and greater impact for programs
- Using the adaptive implementation approach, findings from routine user testing inform iterations to the program.
- Applying HCD during the implementation and evaluation stages of the intervention lends itself well to adaptive learning and iteration, allowing new information to be continuously introduced.
- HCD can support the long-term sustainability and cost-efficiency of the intervention by addressing implementation challenges early and often, rather than relying solely on endline results.
Challenges of conventional outcome evaluations
- The design of the intervention and the implementation strategy are not known at the outset.
- Practitioners cannot assume consistency in the intervention design and implementation strategy, as these evolve over the life of the project.
- Documentation of the iterative design process, decision-making, and testing along the way is either absent or incompatible with accepted measurement metrics.
- Differing learning priorities and styles across disciplines, leaving limited common language
- Limited financial resources
Mitigating challenges to rigorously evaluate AI and HCD interventions
- Keeping the ‘essence’ of the intervention in mind
- Evaluation designs need to be responsive and adaptive too
- Implementers also need to have a robust understanding of evaluation protocols
- Implementers need to manage adaptation effectively
- Think beyond outcome evaluations
- Evaluators need to be resourced to reconcile results emerging from different evaluation approaches
“The dream would be to see all disciplines kind of bring their individual strengths to bear but for the boundaries to blur, small cross-disciplinary, hybrid design and measurement teams working on a day-to-day basis together, optimising implementation strategies for outcomes.” – Divya Datta
- Building designers' and design teams' capacity to understand and account for indicators of program success. For example, involving design teams in program-shaping discussions grounded in the Theory of Change (ToC)
- Promoting collaboration between design and research teams to find the right balance. This collaboration helps project teams identify and align on key research objectives, using both quantitative and qualitative data, to measure the project's impact and surface current challenges.
- Using rapid, utilitarian measurement approaches during the design phases to inform creative hypotheses and prototype development, and to link those efforts to indicators.
- Moving toward agile, adaptive, and utility-focused evaluations.
- Building better documentation practices to record every change and adaptation made.
- Being adaptive in implementation is not enough; practitioners must also embrace flexibility in their processes.
- Implementing adaptive youth-centered adolescent sexual reproductive health programming: learning from the Adolescents 360 project in Tanzania, Ethiopia, and Nigeria (2016-2020)
- Challenges and opportunities in evaluating programmes incorporating human-centred design: lessons learnt from the evaluation of Adolescents 360
- Monitoring and evaluation: five reality checks for adaptive management
- How to Monitor and Evaluate an Adaptive Programme: 7 Takeaways
- Randomised Controlled Trial (RCT) – Impact Study (2018-2021)