I listened today to the MPA’s recording of Chris Collison on lessons learned in projects. He provides some great advice on facilitating lessons learned reviews. Many thanks to the Major Projects Authority for facilitating it.
There are some excellent points to take away and I would commend it to those who have an interest in developing lessons learned methods. But I fear that we are walking ground that is well trodden, and history has illustrated that organisations struggle to evidence the value that these processes deliver. I have seen numerous organisations stop-start-stop these initiatives, which are all based on common sense but struggle to compete with other organisational initiatives for scarce funds. The cost of collecting lessons learned extends far beyond the knowledge manager or PMO; it can consume a huge amount of resource, and organisations need to be assured that it is delivering value. It often doesn’t.
I’ve reviewed lessons learned from a wide range of organisations and would suggest that as a profession we need to question whether we stop ‘lessons learned’ altogether or identify a new path. I have pulled together a list of around 80 user stories that lessons learned processes are trying to address, and for a lot of organisations the process will only satisfy between 10% and 30% of them. It is failing to deliver the value that the organisation needs.
Why? Mainly because a lot of organisations treat it as a tick-box process and don’t link the process back to the original intent, nor ensure that the outcome is evidenced.
Simplistic

I also fear that Chris’ approach is a little simplistic. We know that ‘lessons learned’ is a wicked problem, so it requires a level of rigour to resolve. There is a big difference between ‘knowledge’ and the application of that knowledge. There is a raft of NAO reports that summarise project failure, and the issues are not associated with PMs not having the knowledge. They know how to manage risks or schedules, but they often don’t get the chance to apply the methods as they would want because of real-world constraints. Let me make an analogy: if you know that a stretch of road has a lot of accidents, you adapt your driving style to match the conditions. Putting up a sign saying don’t drive at 60mph will not influence some people’s driving style. In projects, PMs rarely have sight of the statistics, so they use their own intuition to make priority decisions. This isn’t a matter of lessons learned, because the environment for each project is different. It’s a matter of a probabilistic distribution of factors that influence success and the impact of these factors on key project parameters. We need to move the science on from ‘lessons learned’ to ‘leveraging experience’ based upon data and evidence.
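To make the ‘probabilistic distribution of factors’ idea concrete, here is a minimal sketch of the kind of evidence-based view I mean: a toy Monte Carlo simulation over delivery risk factors. The factor names, probabilities and schedule impacts are entirely hypothetical, invented for illustration.

```python
import random

# Hypothetical risk factors: (probability the factor occurs, schedule slip
# in weeks if it does). These numbers are invented, not real project data.
FACTORS = {
    "late requirements change": (0.30, 6),
    "key staff turnover":       (0.15, 4),
    "supplier delay":           (0.20, 8),
}

def simulate_slip(rng):
    """One simulated project: sum the slips of the factors that occur."""
    return sum(weeks for p, weeks in FACTORS.values() if rng.random() < p)

def expected_slip(runs=10_000, seed=42):
    """Estimate the mean schedule slip across many simulated projects."""
    rng = random.Random(seed)
    return sum(simulate_slip(rng) for _ in range(runs)) / runs
```

With enough real data behind the distributions, a PM could see expected impact and prioritise accordingly, rather than rely on intuition alone.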
Could I provide a few examples of why I believe that the current processes are too simplistic:
- Six Sigma There is a blurred boundary between lessons learned and Six Sigma. This is linked to whether an organisation delivers similar projects one after the other or, at the other end of the spectrum, runs a project once every 10 years as an enabler to core business. How organisations leverage this experience will be very different.
- Agile vs waterfall Agile projects have retrospectives, which tend to look back at a sprint or major piece of work. They can often lose sight of the bigger picture. How is this experience, particularly the ‘failed experiments’, captured for those who follow? The approach will differ between agile and waterfall projects.
- P3M vs Technical Leveraging experience from P3M is very different to leveraging technical experience. An organisation may decide to codify how they solved a particular technical problem, particularly where it saved the organisation a lot of money. P3M tends to be about the application of a body of knowledge, which is largely stable. P3M failures occur when processes aren’t followed. They may also occur because of emergence in the project environment, which is more of a complexity management issue than a failure of the core principles in the P3M body of knowledge. Organisations will leverage these kinds of experience very differently.
- Frequency I would recommend that readers review the Public Audit Committee transcript on NHS24. It provides a very valuable opportunity to leverage experience. The ex-CEO commented that their project was a once-in-10-years opportunity and therefore they came across problems they hadn’t seen before. However, how many call centres have been implemented in the UK in the last 10 years? More than one? The answer will rarely be found within the organisational ‘lessons learned’ but within the wider enterprise, with data filtered to the particular circumstances of the project: for example, projects that have tested solutions on communities of millions of users, call-centre projects, safety-critical projects. How users leverage this experience will be very specific to context.
- Positive lessons I’ve seen thousands of positive lessons learned and although the process of identifying them can be very cathartic, many aren’t useful. They are often self promotional and should only be captured where they reset the bar or provide a unique recipe that delivered demonstrable value. Repeating known good practice results in anodyne material and creates unwelcome fog.
- Impenetrable Learning legacies, lessons learned databases and wikis are all extremely valuable, but as the volume of material grows they become increasingly impenetrable. Which of the 120 records of good practice is the best practice to apply in a specific circumstance? Users need to be able to slice and dice the data to gain insights relevant to their own particular situation. P3M material will be very different to engineering technical know-how.
- Users The target audience will have a major impact on how this experience is leveraged. Generalising: a 25-year-old PM will have a hunger for knowledge, to learn new methods and tools, review case studies and hunt out good practice. Conversely, a 50-year-old seasoned practitioner who has managed twenty >£100m projects in their career will be much more reliant on their intuition and personal experience. I accept that there are 50-year-olds who have a thirst for knowledge, but my point is that how this experience is leveraged will be very specific to the individual.
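The ‘slice and dice’ point above can be sketched in a few lines: filtering a store of experience records down to those matching a user’s context. The record shape, domain labels and tag names here are hypothetical, purely to illustrate the mechanism.

```python
# Hypothetical experience records, each tagged with a domain and context tags.
RECORDS = [
    {"id": 1, "domain": "P3M",       "tags": {"call-centre", "safety-critical"}},
    {"id": 2, "domain": "technical", "tags": {"call-centre"}},
    {"id": 3, "domain": "P3M",       "tags": {"rail"}},
]

def slice_records(records, domain=None, required_tags=frozenset()):
    """Return only the records matching the user's domain and context tags."""
    return [
        r for r in records
        if (domain is None or r["domain"] == domain)
        and required_tags <= r["tags"]          # all required tags present
    ]
```

The point is not the code but the prerequisite: records must carry structured context metadata at capture time, or no amount of filtering will surface the one relevant record among the 120.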
Chris has another video on YouTube on lessons learned that I would also commend. He provides some great insights. However, I don’t agree with the principle that lessons written down degenerate into a ‘dead butterfly collection’. If the experience is collected in such a way as to satisfy user needs, then writing things down can provide aggregated insights that far exceed the value of an individual experience. It’s where data analytics and machine learning can really influence how users engage with the data.
I believe that we are reaching a point where we should either pivot what we are doing or abandon the processes of lessons learned and expand aligned methods such as Six Sigma. The conclusion that I have come to is that the answer is in the data, underpinned by knowledge management practices such as communities of practice and peer assists. But without the data, organisations cannot evidence the investments needed to avoid the avoidable.