Measuring A Moving Target

Sep 17, 2022

Presenters

Anne LaFond

Senior Advisor, HCDExchange

Divya Datta

Director of Design Strategy and Innovation, Vihara Innovation Network

Matthew Wilson

Project Director, Adolescents 360

Fifi Ogbondeminu

Deputy Director, Adolescents 360

Hanieh Khosroshah

Senior Product Designer, YLabs

Laetitia Kayitesi

Research Manager, YLabs

Combining HCD and adaptive implementation (AI) approaches to attain robust evidence and greater impact for programs

 

  • Under the adaptive implementation approach, findings from routine user testing inform iterations to the program.
  • Applying HCD during the implementation and evaluation stages of an intervention lends itself well to adaptive learning and iteration, allowing new information to be introduced continuously.
  • HCD can support the long-term sustainability and cost-efficiency of an intervention by addressing implementation challenges early and often, rather than relying solely on endline results.

Challenges of conventional outcome evaluations

  • The design of the intervention and the implementation strategy are not known at the outset.
  • Practitioners cannot assume consistency in the intervention design and implementation strategy, as both evolve over the life of the project.
  • Documentation of the iterative design process, decision-making, and testing along the way is either absent or incompatible with accepted measurement metrics.
  • Different disciplines have differing learning priorities and styles, and therefore limited common language.
  • Limited financial resources.

“The dream would be to see all disciplines kind of bring their individual strengths to bear but for the boundaries to blur, small cross-disciplinary, hybrid design and measurement teams working on a day to day basis together, optimising implementation strategies for outcomes.”

– Divya Datta

Mitigating challenges to rigorously evaluate AI and HCD interventions

  • Keeping the ‘essence’ of the intervention in mind.
  • Evaluation designs need to be responsive and adaptive too.
  • Implementers also need a robust understanding of evaluation protocols.
  • Implementers need to manage adaptation effectively.
  • Think beyond outcome evaluations.
  • Evaluators need to be resourced to reconcile results emerging from different evaluation approaches.

Suggested Solutions

  • Enhancing the ability of designers and design teams to understand and account for indicators of program success; for example, involving design teams in program-shaping discussions grounded in the Theory of Change (ToC).
  • Promoting collaboration between design and research teams to find the right balance. This collaboration helps project teams identify and align on key research objectives, using both quantitative and qualitative data, to measure the project’s impact and identify current challenges.
  • Using rapid, utilitarian measurement approaches in the design phases to inform creative hypotheses and prototype development, and to link these efforts to indicators.
  • Moving toward agile, adaptive, and utility-focused evaluations.
  • Building better documentation capabilities to record every change and adaptation made.
  • Being adaptive in implementation is not enough; practitioners also have to embrace flexibility in their processes.