I recently read the lessons learned report (Lessons Learned from UKCS Oil and Gas Projects 2011-2016) published by the UK's Oil and Gas Authority. It is a fascinating read with some very important insights. However, like the majority of lessons learned reports, it's not clear how to exploit it. As Brexit nears and UK PLC focuses on improving its competitiveness, it's important that we leverage these insights and translate them into tangible and measurable improvements in project delivery productivity.
Understand the data

The starting point is to get a better understanding of the dataset. By segmenting the areas of technical knowhow from portfolio, programme and project delivery good practice, we can tailor how we manage the lessons.
Knowhow

The OGA report contains a number of lessons associated with technical knowhow. This information is typically disseminated through a range of mechanisms such as learning legacies, online wikis, peer assists, knowledge cafes and personal connections, but it's not clear to what extent this happens in practice. Its effectiveness will inevitably be constrained by the extent to which organisations are willing to share knowhow that may be regarded as commercially sensitive.
As the dataset grows it becomes increasingly unwieldy, and although the latest search tools improve the relevance of results, it can be difficult to identify adjacent knowhow that may be relevant to the project…if only the person conducting the search knew about it. Organisations such as NASA are leading the way: with a dataset of 170,000 lessons, they use tools such as R for modelling the data, Neo4j for holding it as a graph database and Linkurious for visualising that graph. Recent advances in data association tools make the job of managing, discovering and consuming complex repositories of technical knowhow much easier. We are turning a corner.
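To make the idea of "adjacent knowhow" concrete, here is a minimal sketch in plain Python. The lesson IDs and keywords are entirely hypothetical, and real tools such as Neo4j would operate on far richer graphs at far greater scale; the point is only to show how shared attributes can surface related lessons that a plain text search would miss.

```python
# Minimal sketch: surfacing "adjacent" lessons via shared keywords.
# The lesson IDs and keyword tags below are hypothetical examples,
# not drawn from the OGA or NASA datasets.

lessons = {
    "L1": {"subsea", "welding", "inspection"},
    "L2": {"welding", "procurement", "schedule"},
    "L3": {"schedule", "estimating", "sanction"},
    "L4": {"subsea", "inspection", "corrosion"},
}

def adjacent(lesson_id, min_shared=1):
    """Return other lessons sharing at least min_shared keywords,
    mapped to the keywords they share with the query lesson."""
    base = lessons[lesson_id]
    hits = {}
    for other, keywords in lessons.items():
        if other == lesson_id:
            continue
        shared = base & keywords
        if len(shared) >= min_shared:
            hits[other] = shared
    return hits

# A query on L1 surfaces L4 (subsea, inspection) and L2 (welding),
# even though their full texts may never co-occur in a keyword search.
print(adjacent("L1"))
```

Raising `min_shared` tightens the notion of adjacency, which is the same trade-off a graph tool exposes when you filter relationships by weight.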
Good practice

Lessons associated with Portfolio, Programme and Project Management good practice are a little more challenging. Some of the lessons are bland, or to the seasoned project practitioner read as 'motherhood and apple pie'. Advising a project manager to do their job properly is unlikely to result in targeted action, and lessons reports rapidly become shelfware. As a profession we have a good understanding of the causes of project failure… but they keep recurring. The route to resolving this is to forensically examine each lesson and translate it into a series of actions. We can then monitor how these lessons evolve over time, including their impact on project performance: a data-driven, evidence-based approach.
Assurance

The main theme emerging from the OGA analysis, by a significant margin, relates to assurance. Noting that the lessons are viewed through the lens of the OGA rather than the industry, this is unsurprising because assurance is its primary role. However, there are a number of sub-themes that need attention, particularly challenging the stability of the basis of estimates that emerges from FEL, including the fluidity of scope and the overall level of uncertainty. The level of governance at sanction is also a recurring theme, but it's not clear how the body of evidence on lessons is influencing how organisations undertake assurance to support sanction. The lessons should guide the assurance community to ask probing questions that explore the validity of the estimates and the maturity of the overall technical solution. It's a rich dataset that is currently underexploited.
Scheduling

Scheduling is also a recurring theme. Lessons range from the quality of the forecasts through to the implementation of good planning practices. It's not clear how the experience and project performance outcomes of the past influence the future, or how planning modules are reused and tailored to similar projects. By way of example, the lesson "Ensure you have a robust, resourced cascading project schedule created by competent planners" is difficult to implement. Multinationals will have teams of planners who I know are generally proficient at their craft and will swear that the lesson doesn't apply to them. But the evidence indicates that it applies across the industry. As a result, key lessons go nowhere because they lack context and forensic insight into the factors that influence the shape and impact of the lesson.
Data-driven approach

I'm currently doing some work to analyse a dataset of nearly 8,000 lessons from a range of organisations. With a dataset of this size it becomes easier to extract themes and a taxonomy of lessons, but the themes need to be backed up with evidence. It is also important to quantify the impact of each lesson. Without an understanding of impact it is difficult to engage the interest of senior management or to prioritise investment in addressing the lessons; with it, prioritisation becomes a simple cost/benefit decision.
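As an illustration of the kind of first-pass analysis a larger dataset enables, here is a minimal sketch that counts recurring terms across lesson texts and ranks them by aggregate impact. The lesson texts, the impact figures and the crude word-level "themes" are all hypothetical stand-ins, not the real dataset; a serious taxonomy would use proper text analytics.

```python
# Minimal sketch of a data-driven pass over a lessons dataset:
# extract candidate themes by term recurrence, then rank them by
# the total (hypothetical) cost impact attached to each lesson.
from collections import Counter, defaultdict

# Hypothetical records: (lesson text, estimated impact in £k)
lessons = [
    ("schedule slipped due to late vendor data", 450),
    ("estimate at sanction excluded commissioning scope", 900),
    ("schedule float consumed by rework", 300),
    ("assurance review missed immature scope at sanction", 700),
]

STOPWORDS = {"due", "to", "at", "by", "the"}

theme_freq = Counter()
theme_impact = defaultdict(int)
for text, impact in lessons:
    for word in set(text.split()):
        if word not in STOPWORDS:
            theme_freq[word] += 1
            theme_impact[word] += impact

# Recurring terms ranked by aggregate impact (ties broken
# alphabetically): a crude proxy for which themes deserve
# senior-management attention first.
recurring = {w for w, n in theme_freq.items() if n > 1}
ranked = sorted(recurring, key=lambda w: (-theme_impact[w], w))
print(ranked)
```

Even this toy pass shows the cost/benefit framing: "sanction" and "scope" carry more aggregate impact than "schedule", so that is where investment in addressing lessons pays back first.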
Lessons become risks

If negative lessons aren't addressed they should translate into risks for future projects, and positive lessons into opportunities. Very few organisations track risk trends across their portfolio or the success of management and mitigation actions. This knowledge has the potential to enable significant productivity improvements and secure competitive advantage. I remain puzzled that this information isn't exploited.
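The mechanism itself is simple enough to sketch. The fields, statuses and example lessons below are hypothetical; the point is only that an unaddressed lesson can be carried forward mechanically onto the next project's register.

```python
# Minimal sketch: promoting unaddressed lessons onto the next
# project's register. Fields and example lessons are hypothetical.
from dataclasses import dataclass

@dataclass
class Lesson:
    text: str
    addressed: bool
    positive: bool = False

def carry_forward(lessons):
    """Turn open negative lessons into risks and open positive
    lessons into opportunities; addressed lessons are dropped."""
    register = []
    for lesson in lessons:
        if lesson.addressed:
            continue
        kind = "opportunity" if lesson.positive else "risk"
        register.append((kind, lesson.text))
    return register

backlog = [
    Lesson("estimate immature at sanction", addressed=False),
    Lesson("early vendor engagement cut lead times",
           addressed=False, positive=True),
    Lesson("planning resource shortfall", addressed=True),
]
print(carry_forward(backlog))
```

Run repeatedly across a portfolio, the same mechanism yields exactly the risk-trend data that so few organisations currently track.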
A call for action

The OGA report is a welcome step forward and will hopefully energise the oil and gas community. But having studied the take-up of lessons in other sectors, I doubt it. We need a new approach that is driven by analytics and evidence, where lessons are correlated with project performance and segmented according to how they are exploited. Emerging data analytics tools make this possible, and I believe we are entering a new era. I'd welcome your thoughts and insights into how we exploit the OGA report and inject momentum into the debate. As a profession, it would be great if we could aim to reduce the arisings of known negative lessons by 50% within three years, or at least reduce their impact. With access to appropriate data we may also be able to estimate the effect this will have on overall productivity.
Any lessons data out there?

I've been fascinated by lessons learned analysis for years and generally frustrated that very few organisations exploit this knowledge. Of those organisations that profess to exploit it, many struggle to evidence it. We have a wonderful opportunity ahead of us and I'm delighted to help shape it.
In support of this quest, I’m very interested in expanding the dataset of lessons learned and using this to derive insights. If any organisations have lessons data and analysis that they are willing to share (even under the control of an NDA) then please get in contact and I’d be delighted to discuss it further.