

Three lessons from the 2017 UK Evaluation Society Conference

Itad was well represented at this year's UK Evaluation Society (UKES) conference.

We were involved in four presentations across a range of subjects (though all rooted in this year's theme of 'demonstrating and improving the usefulness of evaluation'), and we had several staff members among the general delegates. Here, some of them reflect on the lessons they took away from the sessions.

Ben Murphy, consultant in the Climate Change theme:

The question of who uses evaluations was raised in at least one session – what do smaller, front-line organisations get out of them? We heard that many charitable organisations find evaluations too intimidating to get involved in, focused as they are on delivery and fundraising. I wondered how well the evaluation community is equipped to understand the perspectives of organisations that are responding to pressing human needs and throwing everything into the response. There is a difficult balance for the evaluator to strike between an honest assessment and a demoralising report. But is a positive or neutral account of their efforts the best such an organisation can expect? If an evaluation can be facilitated sensitively and with minimal intrusion, a front-line charity can benefit from external perspectives, and there is a great opportunity to get learning quickly into practice.

Aoife Murray, consultant in the Governance theme:

This year's UKES conference focused on the theme of evaluation 'use and usability' – a very topical area, and one that appears increasingly prominent in how evaluators structure their approaches. Having worked in public policy in international development for a number of years, I am aware of how evaluation can influence decision-making – if it is well timed and planned into the design of the particular intervention. A key conference takeaway for me was the ongoing struggle to get decision-makers to recognise the value of evaluation, the challenge of placing evaluation at the centre of planning and budgeting, and the challenge for evaluators to put 'purpose' and 'use' ahead of the other reasons to evaluate that commissioners often prescribe.

Chris Perry, principal consultant in the Governance theme:

Our presentation was part of a wider session on adaptive programming, with others presenting their experiences of evaluating adaptive programmes – in contrast to our focus on building MEL systems within them. Adaptive programming poses some specific challenges for M&E professionals, and there is not yet any consensus on how best to apply (or adapt) M&E approaches to the needs of these kinds of programmes. It was therefore very encouraging to see some common conclusions on appropriate methods emerging from the different presentations, including the use of outcome mapping and harvesting (which Itad is helping to implement in PERL in Nigeria) and mechanisms that enable reflection and adaptation.