I recently met a senior civil servant from the MOD and discussed the potential for advanced project data analytics. His perception was that every project is unique, leaving very little opportunity to leverage analytics. His mind was made up.
This is something I experience on a regular basis.
Although the ultimate ambition for data science is to make project management autonomous, that goal is a long way off. But there is a huge amount we can do now.
One small component of this is to develop fantasy-football-style lead indicators for each member of your team. How thorough is their risk analysis? How often do they update the risk register, close down actions, or update the schedule?
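As an illustration, lead indicators like these could be combined into a single score. The metric names and weights below are hypothetical assumptions, not an established standard; a real model would calibrate the weights against observed delivery performance.

```python
# Hypothetical "lead indicator" score for a project team member.
# Metric names and weights are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class TeamMemberStats:
    name: str
    risk_register_updates_per_month: float  # how often the register is updated
    actions_closed_ratio: float             # actions closed / actions raised, 0..1
    schedule_updates_per_month: float       # how often the schedule is refreshed


# Illustrative weights; in practice these would be calibrated against
# measured project outcomes rather than guessed.
WEIGHTS = {
    "risk_register_updates_per_month": 2.0,
    "actions_closed_ratio": 50.0,
    "schedule_updates_per_month": 3.0,
}


def lead_score(stats: TeamMemberStats) -> float:
    """Weighted sum of process-adoption metrics."""
    return (
        WEIGHTS["risk_register_updates_per_month"] * stats.risk_register_updates_per_month
        + WEIGHTS["actions_closed_ratio"] * stats.actions_closed_ratio
        + WEIGHTS["schedule_updates_per_month"] * stats.schedule_updates_per_month
    )


alice = TeamMemberStats("Alice", 4.0, 0.8, 2.0)
print(round(lead_score(alice), 1))  # prints 54.0
```

The point is not the particular weights but that the raw behaviours are already being recorded in most project tools and could be scored automatically.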
The danger of such an approach is that it turns everyone into process monkeys at the expense of delivery. But there is a correlation between adoption of process and performance, and organisations are beginning to quantify it. There are also correlations between experience and delivery. Many of the risks are foreseeable, yet the risk registers end up in file 13 at the end of a project. There are a number of opportunities to leverage data.
In 2015 I delivered a small contract to develop a schedule for a major MOD project to get through its approval process. I had to dig through pages of process and hold numerous meetings to discover the length of each task. Yet the MOD was delivering hundreds of approvals every year, with each type of project following a similar process. We developed a three-point, risk-loaded estimate to show how long the work would take us. However, we didn't correlate this back to the distribution of other projects' performance; the evidence just didn't exist. The 'inside view' ruled, yet the 'outside view' contained the insights on issues that had tripped up other projects. Furthermore, the inside view was largely the MOD view rather than the end-to-end view.
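A three-point, risk-loaded estimate of this kind can be sketched in a few lines. This uses the conventional PERT weighting (the most-likely duration counted four times); the task durations are invented for illustration, not drawn from the project described above.

```python
# Three-point (PERT-style) estimate: given optimistic, most-likely and
# pessimistic durations for each task, the PERT mean weights the
# most-likely value four times as heavily.

def pert_mean(optimistic: float, most_likely: float, pessimistic: float) -> float:
    return (optimistic + 4 * most_likely + pessimistic) / 6


def pert_std_dev(optimistic: float, pessimistic: float) -> float:
    # Conventional PERT approximation of a task's standard deviation.
    return (pessimistic - optimistic) / 6


# Hypothetical approval-process tasks, durations in weeks:
# (optimistic, most likely, pessimistic)
tasks = [(2, 4, 10), (1, 2, 5), (3, 6, 12)]

total_mean = sum(pert_mean(o, m, p) for o, m, p in tasks)
print(round(total_mean, 1))  # prints 13.5
```

The author's point stands regardless of the estimating method: without a distribution of actual outcomes from comparable past approvals, there is nothing to anchor these numbers against.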
I have also experienced the perception that, because industry will work really hard, work over Christmas, and do anything for a fast buck, the MOD could rely on industry to recover any delay. That is an unqualified and dangerous assumption, and it stores up issues for subsequent phases: 'industry developed a plan with a load of holes in it and we just can't trust them'. Has anyone else experienced this?
Although MOD projects are all different, they follow a similar process of procurement, requirements management, stakeholder management, management of complexity and so on.
They spend billions every year.
Yet much of their analysis is inward-facing, considering how the MOD procures equipment rather than the end-to-end supply chain. The latter includes petabytes of data the MOD will never see, yet it could provide access to insights that improve the process.
A Case Study
In around 2007 I initiated a research programme to reduce the cost of integrating a weapon onto an aircraft. Each integration cost roughly £200m, and ten integrations were programmed in the plan. Our challenge was to reduce the cost by 20%, with a stretch target of 50%. I bid for £50k of research funding and, despite a similar investment having failed previously, I secured the funds. This was followed by a second investment of £120k. It was essentially seed investment to align the supply chain towards a common goal, underpinned by the evidence that each of us held. As a collective we confirmed that we could easily save 20%, and that the 50% savings target was realisable if we changed processes. The initial £50k investment I secured could potentially deliver savings of £1bn; all from an idea to share data securely and work collaboratively.
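The arithmetic behind the £1bn figure follows directly from the numbers above, as this short check shows:

```python
# Savings arithmetic from the case study: ten integrations at ~£200m each.
cost_per_integration = 200e6   # £200m per integration
integrations = 10
baseline = cost_per_integration * integrations  # £2bn programme total

savings_20 = 0.20 * baseline   # challenge target: £400m
savings_50 = 0.50 * baseline   # stretch target: £1bn

print(f"£{savings_50 / 1e9:.0f}bn")  # prints £1bn
```

A £50k investment unlocking a potential £1bn saving is a leverage ratio of 20,000:1, which is the author's underlying point.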
When I was project lead for a $1bn infrastructure project I experienced similar challenges. They are certainly not unique.
Those were the days before we could unleash the power of advanced data analytics. Imagine what we could do today.
This is a plea to the whole of the public sector to take a step back and imagine what we could do with such a capability. It has the potential to transform how we deliver projects.