
Reflections from Development Studies Association Conference 2019

Itad's Ekaterina Shaleva reflects on her recent trip to the DSA conference and recent thinking on logframes and their utility in accountability and learning.

26/07/2019

In June, I attended the annual Development Studies Association (DSA) conference on the theme of ‘Opening up Development’, hosted by the Open University. The goal was ‘to draw attention to shifts in the global political economy; new forms of development intervention and activism; and the call to ‘de-colonise’ the teaching and learning of development studies.’

The thematic stream of panels that caught my attention while skimming through the booklet was on ‘impactful development’, which aimed to explore some of the challenges surrounding impact, evaluation and capacity strengthening for development research and practice. Aoife Murray from Itad and Dr Elizabeth Dávid-Barrett from the University of Sussex presented learning from two DFID-funded anti-corruption evaluations in the Caribbean as part of this stream. They shared the key findings of the evaluations and the purpose-built tools they used to measure capacity-building efforts.

Logframe Utility

The discussion after my colleagues’ session focused on the issue of logframes and their utility in accountability and learning. While the importance of having a plan of action during the design stage to achieve impactful development was not dismissed, people felt it had to be fully adapted to the specific context and remain flexible to change throughout the duration of the programme if it were to lead to meaningful learning.

Rigid logframes can exclude certain groups (e.g. small grassroots not-for-profit organisations), which often lack the financial resources or capacity to fill out pages of reporting forms. It makes sense: as speed and efficiency are prioritised, what can be easily measured almost always takes precedence over what is not clearly quantifiable. This leads organisations with underdeveloped M&E systems to resort to ‘vanity’ metrics such as website traffic and attendance sheets. These metrics are often biased toward the short term, making them unsuitable for use in strategic decision-making.

Flexible logframes, suited to the needs and capabilities of a wider range of organisations, would shift attention away from their regulatory purpose alone toward the dual objectives of accountability and learning. They would also help traditionally excluded groups to join the reporting cycle and, potentially, scale up impact.

‘Opening up’ the Audience

While such discussions are not in any way new or innovative in their propositions, I was acutely aware of who was in the room. The conference’s theme of ‘opening up development’ was manifest in the discussion, partly because it was driven by the audience, many of whom were heading such organisations; the absence of donor organisations, however, was evident. Collaboration with donors, bringing them into the same room as researchers and practitioners, will be essential to ‘open up’ development teaching and practice, and eventually lead to the impact we hope for.

Looking Ahead

The next step proposed by the facilitator and conference participants was the creation of a community of practice for people keen to continue the conversation – it was encouraging to see the level of interest in organising and learning more about evaluation practice in development.

Thankfully, the evaluation community is long past the mantra of ‘logframe as the Bible’. The movement toward adaptive management, or ‘doing development differently’, is gaining followers and attracting attention from high-profile stakeholders. At its core, it emphasises flexible project and activity design, and management that responds to new information and changes in context. If you want to learn more about adaptive programming, we’ve been blogging about it for the last few years – search ‘adaptive programming’ on our website to find out more!