Documenting lessons learned from your programme is a key responsibility for monitoring and evaluation advisers. This is typically part of donor reporting requirements, but it should also be an important part of internal learning and knowledge sharing. However, all too often this section of a report is left to the last minute and written under time pressure. The result can be something bland that talks in generalities.
Knowing how to write good lessons learned is essential for both internal and external learning and knowledge sharing. With the advent of Fail Faires and failure reports, this is an increasingly important area for M&E advisers to learn about.
So, what things should you consider when assessing and documenting lessons learned?
Actively collect information on lessons as you implement
Analysing and writing lessons learned from your programme is so much easier if you've thought in advance about how to collect relevant information as you go along. You'll be amazed at how much you (and your colleagues) will forget. If you rely on people's memories at a later stage you can miss a lot.
Consider developing a tool specifically to collect data on lessons, or integrate this with other tools. PRINCE2 has a specific template for this purpose. I prefer something simpler and typically use the following questions:
- What challenges did you face?
- How were they overcome?
You can easily adapt or expand these types of tools to fit your own context. Depending on the scale of your programme, it may also be valuable to build in specific times when you review and reflect on the responses. This is particularly helpful for identifying common issues affecting different projects in your programme.
Be clear on your audience
This point applies to pretty much anything you might write. The better you understand your audience and their context, the better you can communicate. Consider the following:
Project and field staff - Provide a practical overview of issues that affect them during implementation. For example, challenges that come up in training workshops they are delivering, and strategies their colleagues have used to overcome them. This might be linked to updated guidance on your programme processes and possibly changes to the data collection tools you are using.
Partners and other programmes - Focus on operational level analysis of approaches or methodologies used by the programme. For example, ways in which an approach that worked well elsewhere had to be adapted to overcome challenges relevant to your context.
Donors and policymakers - Give a high level overview of challenges or opportunities that have broader relevance to national or international policy agendas. For example, an emerging issue that is affecting your programme and could also affect similar interventions.
Tie it back to your goals and objectives
Your programme intervention is guided by goals, objectives and a theory of change. This sets out what you aim to achieve, how and what assumptions you've made about how this will work. When analysing and reflecting on lessons, consider what implications they have for the programme framework. For example:
- Are the programme's objectives still relevant and appropriate?
- Has anything changed in relation to the assumptions you've made?
- Is the programme meeting its overall objectives?
- Are specific activities and approaches working?
- Should these activities or approaches continue, stop, or be adapted to address the challenges faced?
Value for money
Programme effectiveness and value for money are key areas to consider. If you have the data, consider posing the following questions in your analysis:
- What is the cost associated with the programme delivering specific results?
- How do the costs for the same activity compare across the programme?
- Are there examples of different activities that achieve the same result but cost less?
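The second question above comes down to a unit-cost comparison: divide each activity's spend by the results it delivered and rank the options. As a minimal sketch, using entirely hypothetical cost and result figures:

```python
# Hypothetical figures: the same training activity delivered by three partners.
activities = [
    {"partner": "Partner A", "cost": 12000, "people_trained": 300},
    {"partner": "Partner B", "cost": 9000, "people_trained": 180},
    {"partner": "Partner C", "cost": 15000, "people_trained": 500},
]

# Unit cost: spend per person trained, the simplest value-for-money measure.
for a in activities:
    a["unit_cost"] = a["cost"] / a["people_trained"]

# Rank from cheapest to most expensive per result delivered.
for a in sorted(activities, key=lambda a: a["unit_cost"]):
    print(f"{a['partner']}: {a['unit_cost']:.2f} per person trained")
```

Note that the partner with the largest budget is not necessarily the least cost-effective; what matters is cost relative to results, which is why comparing raw spend alone can mislead.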
Programme and partnership dynamics
In larger programmes the dynamics of working with partners are important to consider. Different partners bring different capacities, agendas, strengths, weaknesses and ways of working. Building a strong partnership means learning about each other and finding ways of working well together. Are there lessons you can reflect on around partnership dynamics? Have you found particular models that work well for you?
Include both positive and negative lessons
This shouldn't need to be said: there's no point looking only at the positive issues coming up. Equally important (if not more so) are the things that didn't go as expected. Your monitoring may highlight which activities are delayed or not working well. However, you may need to be proactive in following up with colleagues and partners to understand why things are not working as expected.
Documenting why things didn't work and what corrective action your programme took to address this can be one of the most valuable lessons you can share with others.
Don't forget the unexpected
Keep an eye open for the truly unexpected. In most cases the lessons we document are things that others have experienced before. They are important, but not necessarily new or innovative. Alongside this, consider what was not part of the intentional design of your programme. Did you see unexpected results? These can be the start of something truly new and innovative.
Share your lessons learned
Finally, don't forget to share your learning with others. Consider creating a section on your website to publish learning reports. You could also share them more widely via knowledge services like Eldis or Zunia.