Breaking the Cycle of ICT Project Déjà Vu

I’ve recently spent a lot of time reviewing reports on lessons learned in project delivery from the NAO (the UK’s National Audit Office) and the GAO (its US equivalent). I found the GAO’s report on ICT projects particularly compelling, especially given the scale of the challenge (see the summary of projects below). Add it up, combine it with similar reports from the UK, Canada and Australia, and it’s the equivalent of the GDP of a small nation!
I’ve copied below some observations that the GAO made on the reasons for project failure.

  • These and other failed IT projects often suffered from a lack of disciplined and effective management, such as project planning, requirements definition, and program oversight and governance. In many instances, agencies had not consistently applied best practices that are critical to successfully acquiring IT investments.
  • Federal IT projects have also failed due to a lack of oversight and governance. Executive-level governance and oversight across the government has often been ineffective, specifically from chief information officers (CIO). 
These observations align with hundreds of reports that have gone before, including the APM’s 12 project success factors and a wealth of supporting research. But does this come as a surprise to anyone?

Assurance reports and lessons learned reports are full of words of wisdom on why we should get better at risk management, scheduling, requirements and so on. We all know this, but we fail to do much about it, so it keeps happening, and we keep writing in our lessons logs that we need to do better next time. It’s a vicious circle that is costing the economy billions, and we are obligated to break it. Lessons learned logs often give the illusion of learning, but the evidence very rarely supports it.

Lessons learned are all grouped into one big bucket, but they are often very different: the lessons on managing interfaces on safety-critical rail infrastructure are very different from the lessons of failing to implement the P3M manual properly. They require different methods, and the investment in those methods is driven by priorities and return on investment. Both of these are driven by data.

The starting point is to understand the size of the problem: how frequently lessons arise, their impact, their root causes and trigger events, and how they correlate and snowball. All of this needs to be captured within an evidence-driven data model; otherwise we keep looking at snapshots from assurance authorities, lose sight of the overall picture and fail to create the case for change.
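To make this concrete, a minimal sketch of what such an evidence-driven record might look like is below. The field names and figures are hypothetical illustrations, not drawn from the GAO or NAO data:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Lesson:
    """One observed lesson, captured as structured evidence rather than free text."""
    project: str        # project that raised the lesson (hypothetical)
    root_cause: str     # e.g. "governance", "requirements definition"
    impact_cost: float  # estimated cost of the lesson, in millions
    trigger_event: str  # what surfaced the problem, e.g. "audit finding"

def cause_frequency(lessons):
    """How often each root cause recurs across the portfolio."""
    return Counter(l.root_cause for l in lessons)

def cause_impact(lessons):
    """Total estimated cost attributable to each root cause."""
    totals = {}
    for l in lessons:
        totals[l.root_cause] = totals.get(l.root_cause, 0.0) + l.impact_cost
    return totals
```

With lessons held in this form, frequency, impact and root-cause correlations become queries over the portfolio rather than anecdotes in a log.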

We can then apply the dataset to particular types of projects within a defined project environment and characterise what is likely to trip them up. A cost/benefit assessment can then determine where to prioritise investment. Without understanding the data we will never know what repeating mistakes is costing us, how much of it is avoidable and what avoiding the negative lessons would cost; we will continue to fail to create the impetus for the transformational change that is required. Is it time for us to combine our forces and do something about it?
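One crude way to turn such a dataset into an investment case is to rank root causes by avoidable cost per unit of mitigation spend. The function and figures below are an illustrative assumption of how that scoring might work, not a prescribed method:

```python
def rank_investments(causes):
    """Rank candidate fixes by avoidable cost per unit of mitigation spend.

    `causes` maps each root cause to a tuple of
    (total_cost, avoidable_fraction, mitigation_cost);
    all figures are hypothetical placeholders.
    """
    scored = {
        cause: (total * avoidable) / mitigation
        for cause, (total, avoidable, mitigation) in causes.items()
    }
    # Highest return per unit of mitigation spend first.
    return sorted(scored, key=scored.get, reverse=True)
```

For example, a cheap fix to a moderately costly recurring cause can outrank an expensive fix to the biggest one:

```python
ranked = rank_investments({
    "governance": (100.0, 0.6, 5.0),  # 60 avoidable / 5 spent = 12
    "scheduling": (40.0, 0.5, 1.0),   # 20 avoidable / 1 spent = 20
})
# → ["scheduling", "governance"]
```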

I’ve included a summary of some of the more newsworthy ICT projects below, but there are hundreds more. It’s frightening stuff.

  • The Department of Veterans Affairs’ Scheduling Replacement Project was terminated in September 2009 after spending an estimated $127 million over 9 years.
  • The tri-agency National Polar-orbiting Operational Environmental Satellite System was stopped in February 2010 by the White House’s Office of Science and Technology Policy after the program spent 16 years and almost $5 billion.
  • The Department of Homeland Security’s Secure Border Initiative Network program was ended in January 2011, after the department obligated more than $1 billion to the program, because it did not meet cost-effectiveness and viability standards.
  • The Office of Personnel Management’s Retirement Systems Modernization program was canceled in February 2011, after spending approximately $231 million on the agency’s third attempt to automate the processing of federal employee retirement claims.
  • The Department of Veterans Affairs’ Financial and Logistics Integrated Technology Enterprise program was intended to be delivered by 2014 at a total estimated cost of $609 million, but was terminated in October 2011 due to challenges in managing the program.
  • The Department of Defense’s Expeditionary Combat Support System was canceled in December 2012 after spending more than a billion dollars and failing to deploy within 5 years of funds initially being obligated.

In austere times, don’t we owe it to society to do something about it?