To build on this knowledge, the HCDExchange brought together a panel of MLE experts and designers in 2024 to discuss the evolution of MLE approaches and techniques, the learnings and strategies they have developed over the years, the challenges that persist, and how the public health field can address them. The speakers included Abednego Musau from PSI Kenya, Felician Luchagula from Pathfinder Tanzania, Stefanie Wallach from IPPF, and Kapil Vacchar from Commonplace, India.
Key learnings that emerged
Over the past decade, the integration of HCD into global health has transformed approaches to MLE, a critical driver of progress in health programming. As MLE experts began collaborating with designers to align evaluation techniques with the HCD process, traditional notions of measurement required reevaluation. This shift has led to a deeper exploration of how to measure the outcomes and impacts of HCD-driven initiatives effectively. As human-centered programming gains momentum, it has become necessary to bridge the gap between practice and evaluation by focusing not only on learning how to design but also on measuring in ways that reflect the values and complexities of this approach. To this end, our fireside chat speakers shared their experiences, shedding light on the evolving role of measurement in HCD and its broader implications for global health.
The panel emphasized the importance of integrating evaluation and implementation throughout HCD programs, highlighting the need for evaluators to be actively involved during the design phases. This involvement ensures that findings and recommendations emerging from both the design and program processes are robust and actionable. Speakers also stressed the growing significance of embedding MLE into HCD processes from the outset rather than treating it as an afterthought. In health programs, this early integration enriches outcomes by blending diverse perspectives. Collaboration between designers and MLE experts allows program teams to define success more comprehensively, moving beyond traditional metrics such as service uptake to a deeper, more nuanced understanding of user experiences and their measurement. This integrated approach ultimately enhances the design process and the overall impact of health programs.
Learnings on integrating MLE and HCD more effectively:
- Adopting an open and creative mindset: Speakers emphasized that MLE experts must adopt open, creative mindsets that prioritize human desires, behaviors, and experiences in design decisions. They noted that traditional MLE approaches must evolve to accommodate the dynamic and iterative nature of HCD. This evolution requires embracing ambiguity, uncertainty, and flexibility in evaluation processes to keep pace with the rapid changes in design interventions. Iterative solution development, involving active user participation, was highlighted as essential for refining interventions. It was also shared that MLE could benefit from using flexible research methods and continuous feedback loops with clients and providers to adapt services in real time. Additionally, speakers noted that user-centered metrics, focusing on elements such as satisfaction, privacy, respect, and communication, are important for evaluating service delivery and health outcomes effectively.
- Adjusting traditional MLE to be more adaptive and iterative: Traditional M&E approaches, often centralized and systematic, were observed to require adjustments to better align with the iterative nature of HCD. Speakers highlighted the importance of adaptive learning and iterative feedback to demonstrate the value of incorporating user insights, even when the process may appear less structured than conventional practices. This approach was seen as a way to refine solutions before making significant investments, ensuring the design process is efficient and effective. Participants also emphasized that rigid, one-size-fits-all methods are inadequate, as outcomes must be defined at different stages. They noted that insights gathered early in the process often differ from those needed to determine solution effectiveness later on. It was shared that defining evaluation frameworks too early, before interventions are fully designed, could lead to misalignment. Instead, evaluation processes were recommended to evolve alongside interventions, even if this required extending timelines, to maintain alignment with program implementation.
- Integrating HCD processes into traditional MLE to make it more learning-focused: Participants discussed how integrating HCD processes, such as data gathering, small experiments, and generating insights, could enhance traditional M&E approaches. These methods were described as effective for better understanding user needs and refining solutions. A learning-focused measurement approach was observed to increase the relevance of interventions across different communities by aligning evaluation practices with real-world user experiences.
- Participatory evaluation enhances insights: Speakers emphasized the importance of participatory evaluation, mirroring the user involvement central to HCD design processes. They shared examples of engaging users (those directly experiencing the intervention) in activities such as conducting interviews, analyzing data, and interpreting findings. This approach was seen as a way to generate richer, more meaningful outcomes and provide deeper insights into the effectiveness of interventions.
- Integrating user experience into the theory of change and creating space for soft and experiential indicators: Speakers highlighted the value of adapting the traditional theory of change to include intentional sequences of user experiences. This adaptation was observed to provide a shared framework for understanding how design impacts program outcomes. Participants noted the importance of including user-centered insights in evaluation processes while also recognizing the continued need for clear and standardized metrics. It was shared that incorporating qualitative and experiential indicators allows for a more nuanced understanding of program outcomes. Centering the theory of change around user experiences was also described as a way to align HCD and MLE practices while fostering a common language among stakeholders.
Learnings about alignment and collaboration amongst different stakeholders:
- Managing stakeholder expectations: Clear communication of evaluation objectives was highlighted as essential to avoid misalignment between stakeholder expectations and what is feasible at different stages of a program. For instance, expecting impact data during the pilot phase, when the primary goal is feasibility testing, can lead to misunderstandings. Managing these expectations ensures that evaluation results are both relevant and actionable.
- Managing funders’ expectations: Funders of design-driven programs often seek detailed clarity upfront regarding evaluation outcomes, methodologies, and costs. However, since interventions evolve throughout the HCD process, speakers emphasized the importance of managing funder expectations to allow for flexibility and adaptability in both program implementation and evaluation.
- Establishing a shared understanding and common goal among stakeholders: A shared understanding among stakeholders is critical for achieving user-centered, high-quality programming, particularly in reproductive health and global health initiatives. Participants emphasized the need to intentionally align objectives from the outset rather than assuming all stakeholders are naturally aligned.
- Collaboration between HCD and MLE experts: Rather than working in silos, stakeholders were encouraged to engage actively with one another. Collaboration was seen as vital for integrating insights from designers, evaluators, and other experts, creating a feedback loop where MLE shapes HCD design decisions and HCD principles inform evaluation approaches. Speakers noted that trust and collaboration within teams play a key role in navigating challenges and adapting to evolving needs. Additionally, fostering a mindset of curiosity and continuous learning, which are core principles of the design process, helps team members remain open to new insights and collective problem-solving.
- Active stakeholder engagement: Involving stakeholders in decision-making, ideation, and prototyping was described as essential for aligning their insights and expectations with program design and evaluation. Examples included engaging community members, healthcare providers, and key decision-makers, such as Ministries of Health, local governments, and facility managers. This collaborative approach integrates diverse perspectives, fosters collective ownership, and strengthens stakeholder buy-in. In turn, these actions inform program adaptations and enhance implementation and program success.
Learnings from specific public health programs on the use of MLE in HCD processes
Learnings from the A360 program
About A360: A360 aims to improve adolescent sexual and reproductive health, with a focus on girl-centered contraceptive programming to increase the uptake of modern contraceptives.
A360 leveraged HCD to develop engaging, relevant solutions tailored to the needs of adolescent girls aged 15 to 19. Its MLE approaches focused on assessing user delight, experience quality, and solution effectiveness. By employing qualitative and quantitative methods, including structured client surveys and interviews, A360 gathered insights into how users interacted with solutions and the challenges they encountered.
Recognizing the impact of environmental factors on user experience, A360 ensured that evaluation metrics evolved over time to capture individual and contextual influences. Sustainability was a key focus, as donor-funded solutions must remain viable beyond the program’s direct involvement. To achieve this, A360 integrated solutions within existing health systems and monitored their continued implementation to prevent fidelity loss. They assessed user experiences when the program led implementation and after transitioning solutions to government providers, ensuring consistency in service delivery.
Additionally, A360 triangulated data from multiple sources, combining structured evaluation methods with unstructured insights to obtain a more authentic understanding of user experiences. Acknowledging that participants may alter their responses when they know they are being studied, the organization employed innovative strategies to capture genuine feedback outside traditional research settings. This adaptive and user-focused approach ensures that A360’s solutions remain effective, contextually relevant, and sustainable over time.
ITAD, the external evaluator of the A360 program, engaged the people using the evaluation, including implementers and program managers, through participatory approaches. It employed participatory action research to collaborate closely with A360 program managers. Instead of relying solely on traditional evaluation methods, this approach focused on identifying real-time learning needs at different stages of the program and designing rapid evaluation techniques to collect and deliver actionable insights quickly.
By ensuring that evaluation findings were immediately fed back to program implementers, this method enabled timely adaptations and course corrections, making the program more responsive to user needs. This approach highlights the necessity of building time into the evaluation process for sharing insights across designers, evaluators, and implementers, ensuring that findings directly inform program improvements.
Learnings from the Beyond Bias program
About Beyond Bias: A project that addressed the types of provider biases and behaviors that create barriers for youth seeking contraceptive counseling and services.
Beyond Bias employed various quantitative and qualitative data collection methods to assess provider bias and client experiences. These included client exit surveys, mystery clients, discrete choice experiments within provider surveys, and in-depth interviews. However, the team later concluded that mystery clients provided the most reliable data.
The mystery clients were extensively trained female enumerators who were well-versed in identifying key behavioral indicators during provider interactions. Unlike exit surveys, where clients often provided socially desirable responses, mystery clients were incentivized to report observations honestly. Their unannounced and anonymous visits ensured that their feedback was unbiased and reflected real-world interactions.
In contrast, provider surveys, client exit surveys, and in-depth interviews relied on self-reported data, which is susceptible to social desirability bias. Providers at intervention facilities, having been trained on expected responses, may have overstated positive attitudes toward clients, while actual clients may have downplayed negative experiences to avoid portraying providers negatively. Due to these limitations, data from these self-reported methods were ultimately excluded from the final analysis of the effectiveness of solutions designed to address provider bias.
This process highlighted the importance of employing multiple data collection methods to enhance the reliability of findings. By using diverse approaches, Beyond Bias was able to critically assess which methods provided the most accurate insights, ensuring the integrity and effectiveness of their evaluation process.
Learnings from a health program in India
One of the panelists shared a case example of a health program in India that addressed multiple subject areas, including family planning, contraceptive care, and post-pregnancy care, while working with migrant communities who typically remained in a specific location for only six to nine months. This transient nature posed significant challenges for frontline workers, who had limited time to onboard participants, link them to relevant program services, and evaluate service effectiveness.
To address these constraints, the program adopted a human-centered approach that integrated MLE directly into the user experience. Rather than collecting large amounts of programmatic data from participants upfront, the approach prioritized understanding users’ aspirations and motivations. For instance, if a woman expressed a goal of saving a specific amount of money to start a small business, this aspiration became the entry point for engaging her in the program.
Frontline workers then tailored service delivery by linking these aspirations to relevant health interventions. For example, rather than presenting health insurance as a standalone offering, workers framed it as a financial strategy to reduce medical expenses, enabling participants to redirect savings toward their personal goals. This approach transformed the onboarding process into both an engagement tool and an embedded M&E platform, allowing the program to track whether participants were benefiting from the services provided.
Additionally, this user-driven framework enhanced program accountability by ensuring participants remained engaged for both health-related reasons and broader personal motivations. By aligning program objectives with user aspirations, the program demonstrated an innovative way to merge MLE and design, moving beyond traditional phased approaches to a more dynamic, integrated model that better serves both the program and the community.
In the context of HCD and MLE, speakers in the fireside chat agreed that integration is an iterative and labor-intensive process that requires careful navigation rather than an expectation of immediate change. Practitioners must recognize that building new measurement systems and integrating emerging processes takes time, effort, and collaboration. By sharing the cost and labor of this evolution, HCD and MLE experts can work together to create more sustainable and impactful change.
Resources
- To view the complete webinar, please click here.