What is the key to effective project management?


In a world increasingly governed by data, how many organisations truly understand the power of analytics?

In programme management, much as in business management, each time a project moves from one stage to the next, much of the data generated, such as schedules, risk registers, budgets and lessons learned, is stored but rarely used again, either within the project it came from or, even less likely, across the business or sector.

Imagine if historical data could be harnessed to benchmark future delivery. The approach to projects and programmes could be completely transformed and new possibilities discovered.

While the concept of data analytics isn’t new, it is clear that while the volume of data we have access to has increased significantly, the way it is used hasn’t evolved at the same pace. As a result, businesses are sitting on goldmines that hold the very answers to the challenges that keep them up at night.

Knowing the art of the possible and overcoming the barriers to data analytics is the key.

Standardising our tools

Too often our data is held in a disparate and ever-expanding number of tools.

Historic development of in-house or proprietary tools and systems often means data does not have the necessary standardised coding or attributes embedded within those systems to allow for the easy extraction and presentation of information for analysis. As a result, a significant effort is required to simply extract and cleanse data before being able to undertake analysis.

By adopting a standardised coding of datasets, we can spend more time extracting value and emerging trends from our data rather than cleansing and collating data that project managers already have.
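A standardised coding can be as simple as agreeing one set of field names and mapping each tool's ad-hoc export onto it before analysis. The sketch below illustrates the idea; the tool names, field names and mappings are all hypothetical, not drawn from any particular system.

```python
# One agreed schema for milestone data, regardless of which tool produced it.
STANDARD_FIELDS = {"project_id", "milestone", "baseline_finish", "forecast_finish"}

# Illustrative mappings from two (hypothetical) source tools' field names.
TOOL_MAPPINGS = {
    "tool_a": {"PrjRef": "project_id", "MS_Name": "milestone",
               "BL_Date": "baseline_finish", "FC_Date": "forecast_finish"},
    "tool_b": {"id": "project_id", "activity": "milestone",
               "planned": "baseline_finish", "expected": "forecast_finish"},
}

def standardise(record: dict, source: str) -> dict:
    """Rename a source system's fields to the standard coding."""
    mapping = TOOL_MAPPINGS[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

row = standardise({"PrjRef": "P-001", "MS_Name": "Design freeze",
                   "BL_Date": "2024-03-01", "FC_Date": "2024-03-15"}, "tool_a")
assert set(row) == STANDARD_FIELDS  # every renamed field sits in the standard schema
```

Once every tool's output passes through a mapping like this, the cleansing step shrinks to maintaining the mappings, and analysis can run across sources directly.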

We have seen the push for open-source coding for Building Information Modelling (BIM); is there really any reason why the same principle can’t be applied to key data used in project performance measurement and wider business operations?

Trusting the data

For project managers to confidently rely on data findings, they first need to trust the data sets and understand how the outputs are produced.

To extract value from data and analytics, we must employ, train and develop not just the coding specialists, but also the “translators” who combine data competency with industry and functional expertise.

In this way, we start by identifying and articulating the business need and project impact, rather than from the view of a solution looking for a problem. As architects of the output, project managers and project leaders are able to buy into and therefore trust the insights being provided.

Secondly, the means by which data is gathered, analysed and presented must be transparent. Company leaders will not buy into insights generated in an opaque manner, especially if those insights run counter to a project manager’s “gut” feeling.

Visualisation will prove to be the cornerstone of trusting data insights and analysis. As trust in the data sourcing and quality grows, so will the volume of data and its complexity. Distilling this data will be essential to make the insights of data analysis digestible for project and client leaders.

War on aggregation

Real-time and near real-time data is becoming much more common across the full spectrum of projects.

But highly complex programmes – particularly large-scale infrastructure programmes – produce such a range and mass of data points that aggregation becomes the norm for reporting and analysis. That approach brings with it its own disadvantages.

There is an inherent difficulty in making valid multi-level inferences from a single level of analysis. Similarly, too little data grouped by too few variables risks drawing macro-level conclusions from micro-level relationships, which is equally unhelpful.

While the draw of a mass of data is that it can offer unexpected insights, it can also be the enemy; it is worth taking time to consider the grouping hierarchy of any data set to ensure your analysis is not lost in the noise.

As noted in a 2016 McKinsey Global Institute report, The Age of Analytics: Competing in a Data-Driven World, when humans make decisions, the process is often muddy, biased, or limited by our inability to process information overload.

Data and analytics can change all that by bringing in more data points from new sources, breaking down information asymmetries, and adding automated algorithms to make the process instantaneous. As the sources of data grow richer and more diverse, there are many ways to use the resulting insights to make decisions faster, more accurate, more consistent, and more transparent.

Data and project success are inextricably linked. The ever-growing volume of data being made available provides a springboard from which organisations and analysts can optimise their activities. But it’s our capacity to manage this data - identifying the useful amidst the useless and turning it into coherent and informative headlines - that will usher in a new age of project management.
