Purpose
Presenters
Anne LaFond
Senior Advisor, HCDExchange
Divya Datta
Director of Design Strategy and Innovation, Vihara Innovation Network
Matthew Wilson
Project Director, Adolescents 360
Fifi Ogbondeminu
Deputy Director, Adolescents 360
Hanieh Khosroshahi
Senior Product Designer, YLabs
Laetitia Kayitesi
Research Manager, YLabs
Combining HCD and adaptive implementation (AI) approaches to attain robust evidence and greater impact for programs
Insights
- Under the adaptive implementation approach, findings from routine user testing inform iterations to the program.
- Applying HCD during the implementation and evaluation stages of an intervention lends itself well to adaptive learning and iteration, allowing new information to be introduced continuously.
- HCD can support the long-term sustainability and cost-efficiency of an intervention by addressing implementation challenges early and often, rather than relying solely on endline results.
Challenges of conventional outcome evaluations
Insights
- The design of the intervention and the implementation strategy are not known at the outset.
- Practitioners cannot assume consistency in the intervention design and implementation strategy, as these evolve over the life of the project.
- Documentation of the iterative design process, decision making, and testing along the way is either absent or incompatible with accepted measurement metrics.
- Differing learning priorities and styles across disciplines, and thus limited common language
- Limited financial resources
Mitigating challenges to rigorously evaluate AI and HCD interventions
Insights
- Keeping the ‘essence’ of the intervention in mind
- Evaluation designs need to be responsive and adaptive too
- Implementers also need to have a robust understanding of evaluation protocols
- Implementers need to manage adaptation effectively
- Think beyond outcome evaluations
- Evaluators need to be resourced to reconcile results emerging from different evaluation approaches
Key Points
- Enhancing designers' and design teams' ability to understand and account for indicators of program success; for example, involving design teams in program-shaping discussions grounded in the Theory of Change (ToC).
- Promoting collaboration between design and research teams to find the right balance. This collaboration helps project teams identify and align on key research objectives, using both quantitative and qualitative data to measure the project's impact and identify current challenges.
- Using rapid, utilitarian measurement approaches during the design phase to inform creative hypotheses and prototype development, and to link these efforts to indicators.
- Moving toward agile, adaptive, and utility-focused evaluations.
- Building better documentation capabilities to record every change and adaptation made.
- Being adaptive in implementation is not enough; practitioners must also embrace flexibility in their processes.
Resources
- Implementing adaptive youth-centered adolescent sexual reproductive health programming: learning from the Adolescents 360 project in Tanzania, Ethiopia, and Nigeria (2016-2020)
- Challenges and opportunities in evaluating programmes incorporating human-centred design: lessons learnt from the evaluation of Adolescents 360
- Monitoring and evaluation: five reality checks for adaptive management
- How to Monitor and Evaluate an Adaptive Programme: 7 Takeaways
- Randomised Controlled Trial (RCT) – Impact Study (2018-2021)