Monitoring, Evaluation and Learning (MEL) in HCD-influenced Programming

October 9, 2023

Katrina is a designer and strategist with 15+ years of experience using her vibrant thinking to facilitate change in the public and private sectors. She recently joined Jhpiego as the Senior Technical Director for Applied Design. In this new role she supports project teams in using human-centered approaches to design across a variety of challenges and purposes, including generating demand for health products and services, designing for behavior change, and knowledge production and use. Katrina co-founded and led Picture Impact, a human-centered design and feminist evaluation firm working to facilitate change across a wide range of issues, including public health, child protection, community development, food systems and agriculture. She holds a Master of Urban Planning from the University of Minnesota's Humphrey Institute, where she studied participatory, people-centered and systems-aware approaches to international development, with a focus on sustainability.

What is your experience merging evaluation and design in the health field? 

I see design and evaluation as collaborators and co-conspirators; partners in a beautiful dance through shared spaces of problem definition, challenge framing, understanding context, listening for insights, creating and learning. For more than a decade I have nourished a practice that purposely mixes these trans-disciplines. My colleagues and I often commented that evaluation and design are “same-same, but different.” Evaluation often looks back at where we came from, and provides clarity on where we are so that we can move forward. Design starts with a deep understanding of the current context while looking toward the horizon, defining how we might journey from where we are now toward preferable, possible, or desired futures. In both disciplines, how we approach the work determines not only the quality of the journey but also the possible destinations. 

I will use the terms data generation and data harvesting, and want to take a moment to talk about this purposeful language. Data is created. We generate data throughout our days, going about our lives. To make meaning out of this data we have to be able to see it. When we track, harvest, prepare, and store data well, we are able to shape it into something useful and nourishing. Data generation is the ongoing creation of this information as we go about our lives and work, whereas data harvesting is the process of making it visible and preparing it so that we can make sense of it.

I have intentionally woven design and evaluation together in a number of ways. One of the most satisfying is by approaching data generation and data collection using design research methods such as serious games, prototyping, storytelling, role playing, and low-tech interactive tools. Building on the work of my mentor, Dr. Helzi Noponen, I have also woven data generation and data harvesting into the ways that people interact with program delivery, materials, products and services; for example, through client workbooks with spaces to record goals, track progress, and identify resources. When we generate data for ourselves, it's not only more accurate but also more relevant. This kind of data is all around us, but often we don't create ways for people to really see it for themselves, harvest it and make meaning out of it.

Another way that evaluation and design are intertwined is that, when done well, they both rely on feedback loops connecting doing and learning. Iterative, learning-by-making approaches such as HCD afford us an opportunity for continuous learning and adaptation. Mixing design and evaluation gives us an opportunity to be even more intentional about learning along the way, through making and through doing.

This emergent, flexible way of working necessitates a shift away from mechanistic, attribution-focused, causal models toward recognizing that the horizon keeps changing. As our design goes out into the wild, it will continue to interact, shift and change. What might we learn from it? How might we continue to evolve it? What impact is it making? How are people's lives improved through it? These are all questions at the intersection of design and evaluation.

What methods, frameworks or tools do you use to monitor and evaluate design-led programs?

There are many opportunities to weave design and evaluation together in intentional ways. Any of the participatory and complexity-aware methods could be appropriate for understanding how design facilitates change. Common methods in participatory MEL include Outcomes Harvesting, Most Significant Change, Ripple Effect Mapping, and other methods and tools for adaptive action. Developmental Evaluation is particularly useful for understanding what is emerging through a longer design process, as ideas become real through prototypes and are refined toward pilot and implementation.

Both action research and participatory data collection methods merge program delivery with M&E in ways that make data generation and harvesting an activity of the program users, for their own uses. This is a meaningful way to shift from extracting data toward sharing knowledge production and knowledge access. 

Design is hungry for meaningful inputs. The more evidence that goes into the design process, the better the outcome. This isn't an argument for lengthy formative research processes, but rather for recognizing the value of creating feedback loops between design and evaluation, particularly when evaluation is understood as a process for learning.

Designers can build data capture directly into their work. We do this all the time: from writing on flip charts and sticky notes, to creating assessments and debrief forms for program users, to developing workbooks and interactive apps. All of this data is generated for our use in the design process, or for the user's purposes within the implementation. This data can be harvested to make meaning.

For example, a habit tracker that lets new PrEP users set a goal for themselves, write down who will support them along their PrEP journey, decide how they will remember to take PrEP, and track their PrEP use for the first 30 days can yield meaningful real-time data for program M&E and provide feedback to the design team, all while supporting habit formation and continued use.

We can build M&E directly into the materials, services, interactions, and experiences we are designing, allowing users to generate and make use of their own data (e.g., tracking how many calories they have eaten or whether they have taken their daily dose of PrEP), while the program also uses that data to understand what is happening, why, how, for whom, and under what circumstances (such as real-time tracking of PrEP use and continuation without an additional data collection process).
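To make this concrete, here is a purely illustrative sketch, not drawn from any actual program, of how a digital version of such a tracker might structure the data a user records for themselves so that it can also be harvested for program M&E without a separate data collection step. The field names and the continuation indicator are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical data model: the same record the user fills in for themselves
# (goal, supporter, reminder plan, daily check-ins) doubles as the data the
# program harvests for M&E, so no separate collection process is needed.

@dataclass
class PrepHabitTracker:
    user_id: str
    goal: str            # the user's own 30-day goal
    supporter: str       # who will support them along their PrEP journey
    reminder_plan: str   # how they plan to remember their daily dose
    check_ins: dict = field(default_factory=dict)  # date -> dose taken (bool)

    def record_day(self, day: date, dose_taken: bool) -> None:
        """User-facing: mark whether the dose was taken on a given day."""
        self.check_ins[day] = dose_taken

    def continuation_rate(self) -> float:
        """Program-facing: share of tracked days with a dose taken (a real-time M&E signal)."""
        if not self.check_ins:
            return 0.0
        return sum(self.check_ins.values()) / len(self.check_ins)


# Example use: the user's own tracking feeds program monitoring directly.
tracker = PrepHabitTracker("client-001", "Take PrEP daily for 30 days",
                           "my sister", "phone alarm at 8 a.m.")
tracker.record_day(date(2023, 10, 1), True)
tracker.record_day(date(2023, 10, 2), False)
print(f"Continuation rate so far: {tracker.continuation_rate():.0%}")
```

The point is not the code itself but the pattern: data that users generate for their own purposes is already structured for harvesting, so evaluation rides along with use rather than requiring an extra step.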

We must recognize that this, too, necessitates some shifts in how we typically do program delivery and program evaluation, which are usually treated as separate activities rather than being intimately intertwined. Adaptation and learning call us to consider more integrated ways of working and learning, where we use program implementation to get real-time feedback that then helps us improve. The private sector has been using these kinds of continuous improvement techniques for a long time. It is possible and practical to build evaluative capacity throughout program and service delivery, and to harvest the data that we are already generating for evaluation.

What are the key elements that practitioners should plan for and keep in mind when monitoring and evaluating design-led interventions?

We are often using design to facilitate change in complex contexts and on challenges that have resisted intervention. Participatory design processes, particularly co-design but also human-centered approaches, give us an opportunity not only to evaluate the impact of the intervention, and whether the design-led approach was more responsive or effective, but also to look at the impact of the process itself. The list below is not exhaustive, but it is a jumping-off point:

  • Consider going meta with your evaluation of design-led interventions. Look beyond the impact of what was designed through HCD. How is a human-centered design approach creating change? 
  • Consider how you will weave data collection into the intervention in ways that are meaningful and provide real-time feedback to the program users.
  • Look for ways to leverage existing program structures that can be enhanced with evaluative thinking and activities. Where can evaluative thinking make implementation more effective? How will you harvest this thinking?
  • Who will be part of the design and evaluation team? Have you included the end-users or people with lived experience? 
  • How frequently will you be able to harvest data, synthesize it, and feed insights back into program implementation for adaptation? 

How do you use the data and insights from HCD and M&E to inform program/intervention improvement or adaptation?

This is really about creating opportunities to adapt and derive meaning. If you are doing co-design and have a core design team, this group will be constantly using data and insights from design and MEL to inform or adapt the program/intervention/product/service. If co-design isn't part of your process, consider a co-evaluation team with participants from across the project (from leadership and MEL to clients/end-users, and possibly even donors) that meets regularly to pause, look at data together and make or foster changes.

How can taking an HCD approach improve standard ways of conducting MEL in health programs? 

This question is really asking: what does a human-centered approach to MEL yield that other approaches might not? Looking at the four principles of HCD, we can begin to see some opportunities, noting that many of these principles are aligned with complexity-aware MEL approaches, which also offer similar benefits.

  • Understanding issues deeply and looking at the challenge in context asks us to understand not only what happened, but why, how, when, and what it was about the context that either facilitated or impeded the program’s progress or impact.
  • Making and learning together is an opportunity to use participatory approaches and to put data generation into the hands of those closest to its source, and to consider co-research and participatory meaning-making, recognizing that we each might ask different questions, generate different data or make meaning from the same data set in different ways. 
  • ‘Everything is interconnected’ means we must take a complexity-aware and systems thinking approach — looking at relationships and dynamics that impact and are impacted by the desired change. 
  • Understanding people — core to a human-centered design approach — may mean bringing different frames or lenses to the analysis, looking beyond behavior and understanding capability, motivation, opportunity as well as trauma, influence and power. 

How can we use MEL approaches and tools to increase understanding of the influence of HCD in health programming?

Evidence continues to emerge that an HCD approach to design makes for more effective interventions. This has long been understood in the corporate sector. We can use MEL to not only continue to build this evidence base, but to understand when, why and how this is true, and for whom. We can also use MEL as we work toward design standards and maturity models that help us better understand quality in the design approach and process. 

Co-creation in project design and implementation is becoming a more established practice; how do we ensure that we are also being human-centered in the way we do our program MEL?

This is a great space for exploration. If being human-centered in the way we approach MEL means sharing power with people and co-evaluating, that opens up more space for people and programs to learn together. 

Inclusive approaches to MEL — such as community-based, feminist, indigenous and decolonizing methodologies — ask us to consider power, how research questions shape the research inquiry, how the methods shape the available data, and how the available data shapes our conclusions. At the least, co-creation of the research questions, process, and results is essential to designing, implementing and understanding the real impact of interventions.

This is not a space unique to HCD; there is a long history of approaching meaning-making with people, as those most impacted by an issue are the ones who can most accurately measure, manage and make meaning around it. You can look to citizen science, participatory planning processes, community-led development and community-based participatory research (CBPR) for ideas on how to make meaning with people. Some of my favorite participatory methodologies include Photo Voice, Outcomes Harvesting, and Most Significant Change. Many design thinking tools can be used within M&E for people to think and learn together, and ethnographic, exploratory, semi-structured, story-based qualitative interviews, immersive observation and group dialogues are all methods that can be used to better see and understand people, their context and the nature of change.

If this all sounds too idealistic or too hard to do, consider these ideas: integrate data generation into program delivery; look for ways to enhance existing program structures, such as regular meetings or site visits, with reflection questions; and let design and evaluation continue to inform adaptation and learning.

