In this evidence story, WFP’s Chief of Social Protection, Sarah Laughton, highlights the value of evaluative thinking within WFP.

It seems as if social protection has really gained prominence in WFP during Covid-19. What are your views about this?

Covid-19 has put a spotlight on social protection, but the truth is that most of WFP’s country offices were involved in supporting social protection long before the Covid-19 pandemic hit. We did a review about a year ago and found that 78 of our country offices are supporting national social protection systems in one way or another. Covid-19 accelerated WFP’s work in social protection, but it’s not the reason we’re doing it.

WFP is mandated to help governments move towards Zero Hunger, and social protection is one evidence-based way to achieve that. There are many different paths to Zero Hunger, yet many people are simply not going to achieve it without the significant expansion of nationally-managed, nationally-owned social protection programmes.

Social protection is also a policy tool that governments use to help people manage risks and shocks better. When we consider how risks and shocks are multiplying and affecting more and more people, Covid-19 being only one example, then we must look beyond what we do ourselves and start looking at how we support governments with strategies to move towards Zero Hunger.

Evidence seems to play a central role in the social protection policy. Why this focus on evaluative evidence in social protection?

Well, firstly, because the evidence showing why social protection works is one of the key reasons there's so much more investment in it now than there was five years ago. And that's a significant point.

Evidence is also important because every government around the world faces difficult choices about where to put limited resources to achieve the best possible impact.

Many countries don’t have enough resources to support everybody who needs it, and evidence is the missing link to help policymakers understand what is going to work best, what courses of action to take, and what strategies are going to be most effective.

Even countries with sufficient resources need guidance on prioritization. Evidence is an important part of how any government makes decisions about where they will allocate funding. And for WFP it becomes critical, because it ensures and improves the quality and the relevance of our technical support to governments.

How did the evaluation of the Safety Nets Policy in 2012 help to inform the social protection strategy?

We used the evaluation of the Safety Nets Policy in 2012 to inform our thinking about how we should move forward in social protection.

For example, our entire work plan for the past two years has been moving forward on the five recommendations that came out of the evaluation. It highlighted that we needed a strategy. The safety nets policy was good, but there wasn’t a vision; there wasn’t overall strategic direction, or a theory of change.

We spent a year and a half working with that recommendation, and launched our social protection strategy on 1 July 2021. The evaluation set us on the path and gave us a mandate.

WFP has a strong history of supporting the design and delivery of nationally-led social protection. Are there any lessons you can draw from evaluation?

Evaluation as a science and a practice can teach us a lot. For example, when working with governments, specifically in the field of social protection, it’s important to ensure that you understand the context and plan appropriately. What are the main problems? Are you diagnosing those problems correctly? Do you understand how government is set up to deal with them?

In social protection, we need to ask these questions, because we work with long-term projects. Governments spend millions of dollars on programmes that are going to run for 20, 30, 40 years.

We also need to shift our approach. We are often too concerned with looking for entry points for ourselves; what we need to do is look at it in a different way. The question needs to be: "What are your needs and priorities as a government, and how can we fit in?"

At a personal level, what has been some of the biggest lessons you have learnt or insights you have gained from an evaluation?

There’s a lot of value in consciously, intentionally asking thoughtful questions, reflecting on the answers and trying to seek out different perspectives.

I think the spirit of curiosity and deliberate, thoughtful questioning that evaluation teaches us is critical. There's nowhere that evaluative thinking is not relevant, and there's no job in WFP where it's not going to help you, especially if you're in a country office.

In WFP, we are still on a journey towards fully embracing evaluations and evaluative thinking. Like everyone else, we're afraid of having our mistakes pointed out publicly, and we don't always fully embrace and learn from them.

WFP’s country offices are full of people who genuinely want to be doing a good job and work on programmes that produce impact. We work for WFP because we want to make a difference. And therefore, we should welcome evaluation. It’s an ally that is helping us do a better job, but we must be willing to see it that way: evaluation as a strategic partnership to help us be a good agency and reach our goals.

Do you think the evaluation function has a role to play in learning within social protection?

I think organizational learning in social protection is very important. A whole pillar of the strategy is how we support knowledge and learning.

The strategy recognizes that social protection requires systematic efforts to produce evidence, and then to exchange that knowledge, communicate it effectively, and support a process of learning through capacity strengthening.

It focusses on what we need to do for research and evidence generation, and one specific output we need more of is impact evaluations.

In your career at WFP, you may have come across stories from the field. Can you think of encounters where you have witnessed evidence in action, or where the insights from an evaluation have played a role in directly changing a programme outcome?

The question makes me think of a specific evaluation that was so impactful it had the potential to really change the way a programme was run.

When I was the Head of Programme in Uganda a number of years ago, we did a very serious impact evaluation with IFPRI on the impact of food and cash transfers in early childhood development centres. UNICEF was supporting the government in a very poor region of Uganda. They looked at everything: the impact on food security, on nutrition, even on non-cognitive outcomes. The evaluation was done over a year and a half.

And these early childhood development centres, many of which weren't formal facilities but teachers and children meeting under a tree with very limited study materials, produced amazing results.

They found that small amounts of cash transfers had broad impacts across a whole range of outcomes, including positive impacts on household food security, on anaemia, on participation in the centres, on child cognitive development and even on non-cognitive development. I was just astounded. It was a very powerful piece of evidence.