At Itad, we want our work to have a positive impact on people and planet. This means thinking explicitly about how we influence our clients through evidence, insights and learning to help them do their best work in the world.
So, as part of our first-ever company-wide Knowledge and Learning Day, we convened a special panel event to hear reflections and insights on this issue from some of our closest partners.
Our panelists were:
- Jeremy Astill-Brown, independent consultant and former British diplomat specialising in peace, governance and security. Jeremy has played a technical leadership role on several Itad monitoring, evaluation and learning (MEL) projects since 2018.
- Wycliffe Owanda, Regional MEL Manager for the Conflict, Stability and Security Fund in East Africa, who has worked closely with Itad since 2016.
- Isabel Vogel, independent consultant and expert in learning, evidence and innovation. Isabel has led some of Itad’s highest-profile evaluations over the past ten years, including of the Global Challenges Research Fund and the Building Capacity to Use Research Evidence programme.
Our panelists shared stories of the impact they had seen through their work with Itad. One very clear theme stood out: impact often stems from long-term collaborations, and through shifting mindsets around the value of MEL.
Long, strong collaborations and mindset shifts
The clearest examples of impact we see often come from MEL projects that involve long-term engagement between Itad and our partners, working collaboratively to build partners’ skills and systems to monitor and learn from change. This has the potential to shift mindsets: from seeing MEL as ‘another bureaucratic exercise’ or, even worse, ‘an accountability stick’, to understanding how it can support strategic thinking, genuine learning and innovation. Jeremy recalled one of his most satisfying MEL experiences:
“We took a bunch of people who were afraid of their results frameworks and got them to see themselves as part of a machine that was delivering strategic gain.”
We achieve impact in these situations by taking the time to understand culture, dynamics and challenges, and providing bespoke support to partners on their ‘MEL journey.’ This often boils down to fairly simple tools, alongside the much harder task of helping partners make the time and space to genuinely learn and reflect. Isabel recalled a conversation she had with a partner who had started off feeling fairly ambivalent towards MEL:
“She said: ‘Ahh, I get it now. You want us to stop thinking of MEL as compliance, and you want us to fall in love with it.’ Then she said: ‘I think we’re not quite falling in love, but we’re definitely dating!’”
So, what can we do to maximise our impact?
Our panel discussed many factors that limit the potential of evidence to make a real difference to decision making and practice.
Some of these are outside of our control. For example, we can’t influence whether the decision to approve the second phase of a programme is taken before we have completed an evaluation of the first phase. We are also fighting a losing battle if clients or partners don’t have the time, interest or incentives (and the three are usually interrelated) to use evidence for learning and reflection.
But other factors we can affect. What struck us during the discussion is that although much of the conversation around MEL is often about the technical side – approaches and methods – these are not the things that lead to impact. Two important themes kept cropping up in the stories our panelists shared:
Be robust yet humble
It is important to ensure our study designs and methods are robust, while remaining humble enough to ‘keep MEL in the back room’.
Robust design gives a client confidence that the evidence we generate will stand up to challenge. We can also build confidence by assembling a team with technical depth in MEL, in the subject area, and in the organisations we are working with.
However, our panel agreed that MEL is not an end in itself, but a means of helping people do their jobs better. This comes with a challenge: how can we avoid the common pitfall of introducing jargon and systems that risk creating new problems on top of the ones our clients are grappling with already?
It is too easy to fall into the trap of developing frameworks that are gold-plated but not at all what the organisation needs. Wycliffe recalled that early in his career with the Conflict, Stability and Security Fund, he faced the challenge of partners having limited capacity to absorb and respond to the learning and knowledge generated through MEL. It took time to adapt, and to develop systems and processes suited to partners’ needs and realities. He reflected:
“Success is best achieved when MEL is well tailored to the context, for practical application, and well fitted to the various users.”
Be empathetic and build trust
We must use our power as evaluators with empathy, in order to build the trust required to drive change. For us, this means delivering evidence and insights in a way that appreciates our clients’ realities and challenges. Without this it is very hard to have impact.
We are privileged as evaluators to be invited into an organisation to ask challenging questions, surface tensions, and make judgments on what is of value or has merit. This gives us power. To use it effectively, we need to take the time to understand where people are coming from, why actions have been taken, and why other actions haven’t been taken. This means listening deeply, rather than arriving with predefined ideas about how things should be done. It requires us to have empathy.
Empathy also lays the foundations for trusting relationships. If a client doesn’t trust that we are there to support them in moving forward, then no evidence or insights, however clear and credible, are likely to influence their thinking or decisions. Our job is often to deliver tough messages, but we can still do this while showing understanding of the clients’ perspective and constraints. We’ve seen good, credible evaluations fall flat because the team didn’t invest enough in relationships, instead taking the position that they were the independent experts and the client needed to simply accept the evidence and recommendations. This approach does not lead to impact.
Where do we go next?
Our challenge now is to continue learning from stories of success – and of failure – to help us get better at making an impact through our work.
We’ve developed an internal ‘impact tracker’ to help us do this, and are currently working on embedding learning and feedback loops more consistently across our project cycle.
We are also evolving our internal Learning and Development programme to ensure our teams are not only skilled in evidence generation, but also experts in building the trusting relationships with clients that we know are so important to us having an impact.
And lastly, after the success of our first Knowledge and Learning Day we’re planning another one next March as part of our ongoing efforts to be a better learning organisation.