Data in construction: getting the value

Partner: Atkins

Large-scale engineering projects can give rise to many unexpected issues that result in delays and higher-than-expected costs, but they also produce vast amounts of data.

Traditionally, for large engineering projects, collecting the data, identifying the useful elements and then analysing it all was a time- and labour-intensive job. Even if this was done successfully, using the data to make changes to a project involving many different teams and contractors was nearly impossible.

Atkins, a world-leading design, engineering and project management consultancy, have been finding ways to put all the data their projects produce to good use. By making use of machine learning and advanced analytics, they have developed an initiative called Lighthouse that streamlines the sorting of data and uses artificial intelligence to propose effective solutions.

Data management

Atkins CEO Richard Robinson says major engineering projects, those that cost at least £200m, can effectively become a series of large projects in their own right. Each part collects large amounts of data, but the different teams work mostly independently of one another.

Drawing all of this data into a centralised place, and making it available to every area of the project, gives all the different parts of the project a single source of truth about its state.

Tom Goldsmith, the product manager for Lighthouse, describes the issue Atkins was facing before implementing it. “We have a lot of different specialists, we have our project controls teams, we have our designers, we have project managers, a lot of different people who collaborate and interface together on design projects. What we found is that in some cases, perhaps not all, but in some cases, there’s a degree of siloing, between those different teams. And teams are often creating highly refined processes within their own silos.”

However, to understand where data-driven solutions could be deployed most effectively, the Lighthouse team first had to understand the data their staff were producing.

They identified three major data sets: the project’s schedule data, the design data produced over the course of design, and the project’s cost data.

As Tom Goldsmith puts it, “if we can triangulate between those three systems then we have a really clear picture of what’s going on now, and then what that might mean downstream for ourselves and different disciplines that are interacting with each other, and the construction partner or other stakeholders who are interfacing with the design teams.”
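The triangulation Goldsmith describes can be pictured as joining the three data sets on a shared deliverable identifier. The sketch below is purely illustrative; the field names, figures and the `triangulate` function are hypothetical, not Atkins’ actual schema:

```python
# Hypothetical records from the three systems, keyed by a shared deliverable ID.
schedule = {"BRG-001": {"planned_finish": "2024-03-01", "actual_finish": "2024-03-20"}}
design   = {"BRG-001": {"status": "issued", "revisions": 4}}
cost     = {"BRG-001": {"budget": 120_000, "actual": 151_000}}

def triangulate(deliverable_id):
    """Combine the schedule, design and cost views into one record,
    flagging deliverables that are both late and over budget."""
    s = schedule[deliverable_id]
    d = design[deliverable_id]
    c = cost[deliverable_id]
    record = {
        "id": deliverable_id,
        "late": s["actual_finish"] > s["planned_finish"],  # ISO dates compare lexically
        "over_budget": c["actual"] > c["budget"],
        "revisions": d["revisions"],
    }
    record["at_risk"] = record["late"] and record["over_budget"]
    return record

print(triangulate("BRG-001"))
```

With all three views joined on one key, every team queries the same record rather than its own silo’s copy.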

To make the most of this structured data, the Lighthouse team have been helping employees develop new skills. Senior, experienced data engineers lead junior team members, often from more traditional engineering backgrounds. This combination provides an understanding of what it’s like on location for the engineers, while also developing the engineers’ understanding of data science.

Gyan Mahate, the operations lead for Lighthouse, describes how this process went: “So we’ve got some what I’d call hardcore data scientists, data engineers that have done it through and through, who are probably leading some more junior members of the team that come from the more traditional engineering backgrounds, the people that have that digital aspiration to get involved in it, but come from a traditional infrastructure engineering, design background, then we’re kind of coaching them on that journey. And that’s where we get the benefit of them understanding what’s it like, what’s it like on the coalface, what actually needs to get done to deliver X, Y, and Z to a client and the project. But they’re also understanding the vision of data science and data engineering from those experts and in more research or theoretical spaces.”

Machine learning

After creating a large data set spanning an entire major project, with an interface available to all the teams from design to construction contractors, the Lighthouse team were not satisfied with simply alerting project managers to an issue; they also wanted to help resolve it.

“It’s not sufficient for us to just present a black box back that says, “your project’s a mess”, we need to be able to give them the tooling that allows them to click through in the level of detail that any user—and users might be different—can kind of satisfy themselves that what’s going on under the hood works.” says Tom Goldsmith.

Atkins are connecting three data sets: the schedule created for the project, the design deliverable data and the cost management data.

Using the data collected from across the entire project, along with machine learning algorithms, Lighthouse can forecast potential issues downstream and share that information across the project, including with third-party teams.
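One simple flavour of downstream forecasting is trend extrapolation: fit a line to progress so far and project when the work will actually finish. This toy example is our own illustration, not Lighthouse’s actual model; the figures and the planned finish week are invented:

```python
def forecast_completion_week(progress):
    """Fit a least-squares line to (week, % complete) points and
    extrapolate to estimate the week in which 100% is reached."""
    n = len(progress)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(progress) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, progress))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return (100 - intercept) / slope  # week at which the trend line reaches 100%

# Planned to finish by week 10, but weekly progress is trending slower.
observed = [0, 8, 15, 24, 31, 39]  # % complete at weeks 0..5
eta = forecast_completion_week(observed)
if eta > 10:
    print(f"Warning: forecast completion around week {eta:.1f}")  # ~12.8
```

A real system would use far richer models, but the principle is the same: flag the slippage weeks before the deadline arrives, and surface the warning to every team at once.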

Before Lighthouse, the different teams would create what Tom Goldsmith describes as “highly refined processes” within their own silos. Each department built effective systems and processes within its own team, but had no way to interface with the others.

Lighthouse allows teams to share and see the same data; it can then propose data-driven solutions while letting the engineers and designers see and understand how the system is working.

Beyond project data

However, Lighthouse could not rely on project-generated data alone; it also had to incorporate the business rules that Atkins follow and wider real-world considerations, such as regulations and contractual requirements, or even issues like labour shortages in a certain sector or supply chain problems at an independent subcontractor.

Marianna Imprialu, Lighthouse’s lead data scientist, explains the issue: “Business logic is stored in the heads of experienced staff. And it’s not necessarily the easiest thing to do, to extract this and make it translatable to a machine. We have found a way of introducing people into the project that actually represent this business logic and are responsible for liaising with multiple other people that are not directly part of the project, but are contributing to the project, and explain and provide, little by little revealing, exactly what is this complex system of rules that we have in our heads.”

By combining all the data from a project with this business logic, the Lighthouse system can offer solutions to issues as they arise. And a data-driven approach works differently from a human one.

This process is similar to a computer playing chess. Rather than taking the human approach of using experience and intuition to see how the game will develop, a chess engine can consider every possible move and evaluate how each future move and countermove will affect the game, all the way through to the end.

Marianna Imprialu says: “The analogy with chess is quite good. The analogy of what is your next movement is quite good. We call it in data science ‘the bin packing problem’ as well. And it is again, yes, it reminds of a game that can be played in thousands of different ways, but there is one that is by far the best, that would be optimal.”
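The bin-packing framing can be made concrete: of the many feasible ways to pack items into capacity-limited bins, exhaustive search finds the single best one, just as the chess analogy suggests. A minimal sketch, with invented numbers (real project scheduling problems are far too large for brute force and use smarter optimisation):

```python
from itertools import product

def min_bins(items, capacity):
    """Exhaustively try every assignment of items to bins and return
    the smallest number of bins that holds them all within capacity."""
    for n_bins in range(1, len(items) + 1):
        for assignment in product(range(n_bins), repeat=len(items)):
            loads = [0] * n_bins
            for item, b in zip(items, assignment):
                loads[b] += item
            if max(loads) <= capacity:
                return n_bins  # first feasible count is the minimum
    return len(items)

# Crew-hours per task, packed into 8-hour shifts: 5+3 and 4+3+1 fit in two.
tasks = [5, 4, 3, 3, 1]
print(min_bins(tasks, capacity=8))  # → 2
```

The search considers every assignment, which is exactly why a machine can guarantee the optimal packing where a human planner relies on rules of thumb.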

Testing the system

That all works well in theory, but Atkins had to find a way to test Lighthouse in practice. It is not easy to test an experimental analytics system in a live situation when working on a major project for a client.

Atkins decided to run a test by having a team of engineers use the Lighthouse system while mirroring the work of their colleagues on a highway project who were using traditional project management methods.

Will Squires, the project director for Lighthouse, explains the testing process, “We were testing the insights that our sort of AI models were creating, alongside the insights our design managers were coming up with, to very much provide that real time feedback in terms of what’s correct or not.”

The results of the test did not show that advanced analytics are superior to expert human design managers; instead, they showed that a design manager with access to advanced real-time analytics to help make their decisions is more effective.

Will Squires views the role of Lighthouse as “a way to give our design managers and engineering managers body armour, or, you know, one of my colleagues referred to it as kind of the Iron Man suit, and a way to help give us better insights, better decision support, to supercharge their ability to make decisions.”

On top of assisting decision making, it also frees up time for people to focus on more important tasks than collating, analysing and presenting data to other teams, as everyone has the same data available to them in real time, in a form that is easy to view and understand.

Richard Robinson views these factors as overwhelming evidence of the benefits of bringing advanced analytics and machine learning into the construction industry. “There’s an awful lot of time and effort that can be reassigned to something more value add in that process. And then by assigning that high level resource to something more value add, I think you start to help the leaders and the decision makers of the business have proper insight they need and therefore the ability to make decisions.”
