Lessons Often Not Learned: Reporting

I’ve acquired a dataset of nearly 10,000 lessons from a wide range of projects and programmes. There are some fascinating insights in this dataset that I will be sharing over the coming months. One topic of interest to much of the project management community is reporting. The following bullets summarise the raw (negative) lessons extracted from the dataset and associated reference material; how many apply to your organisation, and how are they impacting the performance of your portfolio?

I’ll draw on positive lessons and good practice towards the end of the article.

Lessons identified

    • Failure to define reporting processes, drumbeat, templates or exemplars.
    • Failure to agree tools and the level of automation, or to ensure compatibility. Lack of standards or standardisation.
    • Failure to clearly define reporting lines and/or reporting requirements.
    • Failure to follow reporting processes.
    • Failure to issue reports when required.
    • Failure to ensure that there is regular and careful progress (time, scope, cost) monitoring and review throughout the project.
    • There is a lack of confidence (and a lack of measurement) in the quality, completeness, timeliness, integrity and consistency of the data that underpins the report.
    • Failure to provide key management information to support decision making.
    • The project is tracked on the basis of key milestones rather than aggregated milestones (you get what you measure).
    • There is no measurement of the overall project ‘vector’ (its direction and rate of travel).
    • Failure to identify key issues within the report. Bad news is not included, or is disguised or masked; reporting is not open and honest. Key information is omitted.
    • Ignoring information that might show that the project is struggling, reinforcing built-in bias (this tends to be driven by the PM’s bias rather than deceit).
    • Failure to ensure that information is current and relevant. Reporting is not backed up by evidence from the schedule or similar. Reports are resubmitted with stale data.
    • Failure to provide evidence that the reporting is actually being used to influence decision making.
    • Failure to recognise and manage information overload.
    • Reports are produced in accordance with the process but are not read, so there is minimal benefit from their production.
    • Failure to challenge and assure the reporting process.

Portfolio level reporting

    • Failure to aggregate reporting to a portfolio level and extract portfolio level insights, resource conflicts, priority conflicts, hot spots, key interdependencies etc.
    • Reporting tools lack the ability to drill down, are overly complex and can take minutes to update when used live.
    • Lack of consistency in reporting across the portfolio.
    • Failure to flow project level information upwards and filter appropriately. Key data and analysis is omitted because of agendas, bias or a mistaken belief that the issue will be resolved before the next reporting period.
    • Failure to provide adequate management information to manage the portfolio.

(I’ll blog separately on project controls in due course).

The key take-home points from this analysis are:

    • Senior-level engagement. The greatest influence on successful reporting is senior-level engagement. If the reports are used by senior management to manage projects or the portfolio, the underpinning data tends to be challenged, quality improves and the reports enable effective governance.
    • Develop a reporting architecture, otherwise you’ll be swamped with more and more data, often overlapping. Ensure that the architecture aligns with the reporting drumbeat, otherwise data may be extracted at different times, leading to inconsistencies that are easy to detect and become an unwelcome distraction. Appoint a single point of accountability for designing the reports, and ensure they are receptive to feedback but also resilient to the inevitable criticism.
    • Develop the reports in an agile way. Explore what works and iterate. Be careful about it becoming an industry in its own right to the detriment of managing project delivery. It requires a careful balance which comes with experience.
    • Reporting costs money. Many organisations use reporting as a comfort blanket to give the illusion of control rather than demonstrating control. In these circumstances the people producing the reports can view it as a thankless task because there is no feedback loop; quality rapidly degrades. Ensure that it is scaled to the demands of the business and the capacity of the management team to absorb it. Periodically review the value that reporting delivers for your organisation.
    • Automation. Some organisations have automated reporting systems that extract data from corporate systems such as scheduling tools, project accounting, risk tools and resourcing, supported by narrative and insights. This saves a lot of effort, ensures that the reports accurately reflect reality and reduces bias. Tools such as PowerBI and Tableau help to visualise the data, particularly at a portfolio level where it can be difficult to see the wood for the trees. Try to limit PowerPoint engineering; as a senior manager you may not appreciate the burden that this places on your business. (A minimal consolidation sketch follows this list.)
    • Visualisations. There is an industry associated with visualisations covering symbology, colours and text. Once you have designed the layout, the challenging part is automating it, particularly if you would like to create bespoke symbology (I’d caution against it; it can become very costly to maintain). There is plenty of advice available on this: try GitHub, PowerBI forums and meetups. Be wary; bespoke reporting tools can run into millions, with a significant ongoing maintenance burden. (A simple automated-chart sketch also follows this list.)
    • Deep dives. You may choose to supplement your regular reporting with deep dives into specific themes such as resource management. Ensure that your reporting and data architecture considers this, because ad hoc reports can create a huge burden on the delivery teams.
    • Report horizon. The reporting horizon is important and is something that tends to be taken for granted. There is a balance to be struck between looking backwards and looking forwards. I’ve worked on projects where the past is quickly forgotten because ‘things have changed’, yet trend analysis can often give a good indicator of future performance, particularly when mapped against comparables within the portfolio. The vector is often more important than current conditions. There are a multitude of methods for forecasting the future, and I’ll cover these in a separate blog, but it’s important to recognise that they can be heavily influenced by inherent bias and groupthink.
    • Spin-off benefits. The process of creating a report can often encourage project teams to revisit actions, risks and issues which may otherwise go unmonitored. But would you prefer your team to manage these according to priorities, or because a report is due? The latter can often result in superficial updates to appease the reporting process rather than to resolve the issue. Maintaining metrics on action performance (completion against due date, the number of times a milestone was rebaselined, and so on) can often provide a better indicator of how well the team is managing actions, risks and issues; a sketch of such metrics follows this list.
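
To make the automation point more concrete, here is a minimal sketch of the kind of consolidation an automated reporting pipeline performs, assuming each corporate tool can export a CSV keyed on a shared project identifier. The file names, column names and thresholds are illustrative assumptions, not a prescribed data model.

    # Hypothetical sketch: consolidating CSV exports from scheduling, cost and
    # risk tools into a single portfolio view. File names, column names and
    # thresholds are illustrative assumptions only.
    import pandas as pd

    schedule = pd.read_csv("schedule_export.csv")  # project_id, milestones_due, milestones_hit
    costs = pd.read_csv("cost_export.csv")         # project_id, budget, actual_spend
    risks = pd.read_csv("risk_export.csv")         # project_id, open_risks, red_risks

    # Join the three exports on the shared project identifier.
    portfolio = (
        schedule
        .merge(costs, on="project_id", how="outer")
        .merge(risks, on="project_id", how="outer")
    )

    # Derive simple indicators for the portfolio-level report.
    portfolio["milestone_hit_rate"] = portfolio["milestones_hit"] / portfolio["milestones_due"]
    portfolio["cost_variance_pct"] = (portfolio["actual_spend"] - portfolio["budget"]) / portfolio["budget"] * 100

    # Flag projects needing attention; thresholds would be agreed with the governance board.
    portfolio["flag"] = (
        (portfolio["milestone_hit_rate"] < 0.8)
        | (portfolio["cost_variance_pct"] > 10)
        | (portfolio["red_risks"] > 0)
    )

    portfolio.to_csv("portfolio_summary.csv", index=False)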
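
On visualisations, a basic automated chart built with standard tooling rather than bespoke symbology can be regenerated every reporting cycle with no manual PowerPoint work. This sketch assumes the portfolio_summary.csv produced above and uses plain matplotlib; the column names are the same illustrative assumptions.

    # Hypothetical sketch: an automated visual regenerated from the consolidated
    # portfolio summary above, using standard matplotlib rather than bespoke symbology.
    import pandas as pd
    import matplotlib.pyplot as plt

    portfolio = pd.read_csv("portfolio_summary.csv")

    fig, ax = plt.subplots(figsize=(8, 4))
    colours = ["tab:red" if flagged else "tab:green" for flagged in portfolio["flag"]]
    ax.bar(portfolio["project_id"].astype(str), portfolio["cost_variance_pct"], color=colours)
    ax.axhline(0, color="black", linewidth=0.8)  # on-budget line
    ax.set_ylabel("Cost variance (%)")
    ax.set_title("Portfolio cost variance by project")
    fig.tight_layout()
    fig.savefig("portfolio_cost_variance.png", dpi=150)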
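
Finally, the action-performance metrics mentioned in the last bullet are straightforward to compute from an exported action log. The column names (due_date, completed_date, times_rebaselined) are assumptions for illustration; the point is that the metrics come from data the team already maintains rather than from chasing updates when a report is due.

    # Hypothetical sketch: action-performance metrics from an exported action log.
    # Column names are assumptions for illustration.
    import pandas as pd

    actions = pd.read_csv("action_log.csv", parse_dates=["due_date", "completed_date"])

    completed = actions.dropna(subset=["completed_date"])
    on_time_pct = (completed["completed_date"] <= completed["due_date"]).mean() * 100

    metrics = {
        "actions_total": len(actions),
        "actions_completed": len(completed),
        "completed_on_time_pct": round(on_time_pct, 1),
        "avg_rebaselines_per_action": round(actions["times_rebaselined"].mean(), 2),
    }
    print(metrics)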

Project summary. It can also help to maintain a separate two-page summary of the project for those who may be new to it, supplemented by a more comprehensive potted history of key events, decisions, facts and figures. This enables the author of the report to keep it brief.

You may view all this as motherhood and apple pie and believe that your organisation has learned its lessons. But a 2015 review by the Auditor General of Ontario on the SAMS project helps to bring reality to life: “Project staff told the Executive Committee that SAMS had 418 serious defects, and that 217 of them could be handled by just 27 workarounds. However, we found that SAMS actually had 737 serious defects. Ministry staff explained to us that the remaining 319 serious defects were not shared with the Executive Committee because they had started developing solutions or fixes for them. They also explained that these fixes were in various stages of development or testing, however they were not fixed before SAMS was launched and therefore continued to have an impact on SAMS. Project Staff Conducted Fewer Tests than Reported, Results Incorrectly Stated. Executive Committee Did Not Know One in Eight Interfaces Not Tested. Executive Committee Did Not Know that Pay Runs Not Fully Tested. Executive Committee Did Not Know that Converted Data Was Not Fully Tested And How Many Errors It Contained”. Are you sure it won’t happen to you?

Good Practice
Ultimately, the report needs to be appropriate for your business. If it isn’t, governance suffers and project performance will degrade. There is a wealth of information out there, but one of the best points of reference is the Crossrail Learning Legacy website. Their reporting page provides a useful summary and is supported by procedures, templates and guiding principles. If your organisation is going to invest heavily, then I’d recommend attending some of the data analysis meetups and sharing good practice (the London Business Analytics community is very supportive and provides a gateway to a wealth of useful videos). Safford Black’s series of reporting blogs also provides useful guidance. Google ‘project data visualisation’ images and you’ll find plenty of inspiration. But I would urge you to scale the approach to the conditions of your business, otherwise it won’t endure. Good practice is relatively easy to identify but, sadly, poor practice is commonplace.