Project managers can often be linear thinkers; I have been one of them. But in the last 10 years I’ve realised that such a mindset can stop data driven project delivery in its tracks, from the need to see a detailed benefits case up front to a plan with all the relevant lines of development mapped out.

We work with around 50 different organisations and I wanted to share some observations that I hope will help you to shape your own journey.

I see two types of organisation, at opposite ends of a spectrum.

1. A systems approach. Organisations that spend a lot of time mapping the current landscape, developing a strategy, then outlining all the separate swim lanes of activity that need to be delivered to underpin a data driven project delivery strategy.

I advocate such an approach because it enables the organisation to develop a roadmap, prioritise, then deploy resources. But many then get stuck in a Gordian knot: they can’t do x until they have done y. They need to put a cloud architecture in place, then work out how to manage personal data, then get it approved by security, then integrate it with their supply chain.

They can’t put the data pipelines in place until they have the cloud services live. But they don’t have a right to supplier data because it isn’t defined in contracts, so they have to write this into future contracts and wait for data volumes to build. But they don’t know what data they need because they haven’t understood their use cases, which takes another 6 months of consultation. They also need to train everyone.

This linear thinking can all take years, during which the technology stack continues to evolve.

2. An experimental approach. Other organisations focus on a hearts-and-minds campaign: developing solutions that solve specific pain points, demonstrate the art of the possible and build coalitions with like-minded people. They also tease out implementation issues, understand which partners are invested in a similar journey and use this feedback to shape next steps. An agile approach, but underpinned by a roadmap that tackles each of the swim lanes, often in parallel.

One organisation I know encountered pushback from an IT department acting more as a gatekeeper than an enabler. They rapidly worked around this by pushing the analytics closer to the point of data creation, embedded within the supply chain. Data pipelines can then be developed to assure the data and detect whether it has been manipulated. This lets them leverage existing infrastructure and work at pace. In parallel, they can explore emerging data pooling arrangements such as data trusts, building confidence and teasing out issues as they go, rather than taking a big bang approach.
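The assurance step in such a pipeline can be illustrated with a minimal sketch. This is a hypothetical example, not the organisation's actual implementation: each record is fingerprinted with a cryptographic hash at the point of data creation, and the fingerprint is re-checked downstream so that any manipulation of the record is detectable.

```python
import hashlib
import json


def fingerprint(record: dict) -> str:
    """Hash a record deterministically so it can be verified downstream.

    sort_keys and fixed separators give a canonical form, so the same
    content always produces the same hash regardless of key order.
    """
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


def verify(record: dict, expected_hash: str) -> bool:
    """Re-compute the fingerprint; False means the record has changed."""
    return fingerprint(record) == expected_hash


# At the point of data creation (e.g. a supplier system):
record = {"asset": "bridge-42", "inspection": "2023-06-01", "status": "pass"}
h = fingerprint(record)

# Later, in the central pipeline:
print(verify(record, h))                        # untouched record passes
print(verify({**record, "status": "fail"}, h))  # any change is detected
```

The point is not the hashing itself but where it sits: because the fingerprint is captured in the supply chain, the central team can build trust in the data without first owning the infrastructure that produces it.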

Such an approach tends to be driven by visionary individuals, who often fight against the system. They see the world differently and understand the emergent nature of the challenge.

There is a place within data driven project delivery for linear and non-linear thinkers, but I would urge linear thinkers to use architecture and roadmaps as a handrail rather than a straitjacket. It is almost impossible to get all of your ducks in a row from the outset, particularly when some of this work is pushing unexplored boundaries.

Use the roadmap as a mechanism to identify all the parts of the jigsaw and prioritise. Better still if you can do this in collaboration with your partners and clients. Take them on the journey with you, looking outwards rather than introspectively. When you crack this nut you’ll start to unlock end-to-end data pipelines and align them with the challenges that you are wrestling with.

Run pilots, experiment and iterate, but align them against a master plan.

Such an approach is exactly what we are advocating via the Project Data Analytics Task Force, where we are developing function-based visions for 2025, then developing modules incrementally to underpin each vision. By working collaboratively, we can accelerate the rate of change for the benefit of all.