The New Way to Plan a City

Author: Alex Conacher

Partner: Atkins

On 11 December 1998, NASA launched its Mars Climate Orbiter. It was a robotic space probe intended to study the climate, atmosphere and surface changes on the Red Planet. It blasted off without issue and cruised the 669 million kilometres over the next nine months before beginning its orbital insertion.

It was then that the orbiter smashed to pieces and either burned up in the Martian atmosphere or skipped off into heliocentric space – the remains of its intricate, carefully designed components forming just so many tiny satellites of the sun, lost forever.

In a complex machine like a space probe, hundreds of thousands of components and processes have to work perfectly to achieve a successful result. Years of planning by some of the most intelligent people, backed by the budget of the world’s largest economy, were behind this mission. Troubleshooting every outcome, considering every eventuality. But the error, as is so often the case, was human.

A system supplied by an industry partner was producing data in imperial units, while the NASA system it was communicating with expected SI units. Specifically, the thruster impulse was being provided in pound-force seconds when it was expected in newton-seconds. The miscalculation was fatal for the craft.
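
To get a sense of the scale of the error, here is a minimal illustrative sketch – not the mission’s actual software – of what the missing conversion amounts to in code. The figures are arbitrary; only the conversion factor is real.

# Illustrative sketch only; not the Mars Climate Orbiter's actual software.
# The ground system reported thruster impulse in pound-force seconds (lbf*s),
# while the navigation software expected newton-seconds (N*s).

LBF_S_TO_N_S = 4.448222  # newton-seconds per pound-force second

def impulse_as_received(value_lbf_s: float) -> float:
    """The figure handed over with no conversion, then read as if it were N*s."""
    return value_lbf_s

def impulse_as_expected(value_lbf_s: float) -> float:
    """The same figure correctly converted to N*s."""
    return value_lbf_s * LBF_S_TO_N_S

example = 100.0  # an arbitrary impulse figure, in lbf*s
print(f"Read as N*s without conversion: {impulse_as_received(example):.1f}")
print(f"Correct value in N*s:           {impulse_as_expected(example):.1f}")
# Every modelled manoeuvre is under-counted by a factor of about 4.45,
# and the trajectory error accumulates over months of small burns.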

It often doesn’t matter how many problems you catch: if the one you miss is major enough, it makes all the others academic. NASA said later that it should have checked that the figures matched what was expected. But the answer was likely somewhere, lost in the documentation… buried in some dark corner of a PDF.

Coming back to Earth, what if we examine the most complicated machine we know of? The Mars Climate Orbiter was designed from scratch and self-contained… but what if we look at a city, with its chaos of interacting utilities and structures… overlapping industries and interests… blurred boundaries and ceaseless change. A city built on top of itself, going back to a time before history. What if we look at London?

For the episode this article is based on, we partnered with Atkins to look at a system that will allow us, for the first time, to make planning decisions in full knowledge of what they will involve.

It will put the impacts of planning decisions in front of planners. The data that underpins our built environment will be unlocked, and we will be able to assess capacity and define needs; even the oldest assets, dating back centuries, will be transparent and understood.

This system could be considered a digital twin, or perhaps it is better to think of it as a system of systems… a digital twin of the planning process itself. A policy response to the rise of the digital twin.

To understand this we will speak to the Head of Change and Delivery at the Greater London Authority, but first we need to take a step back and understand what the digital twin is to the built environment.

Digital Twins Explained

Neil Thompson is currently seconded to the Centre for Digital Built Britain as programme lead for the Construction Innovation Hub, but his day job is with Atkins as the Director for Digital Construction. Most of his time is spent on his work at the Centre for Digital Built Britain, a joint venture between the University of Cambridge and BEIS, the Department for Business, Energy and Industrial Strategy.

Neil says, “They have been working on the government’s BIM agenda, and that’s sort of where its origin comes from – the original task group for enabling UK Government to procure data – and now [we have] moved into the much broader space of digital twinning, connecting infrastructure to the internet and being able to connect interoperable data together.”

Neil is also the co-host of the Digital Twin Fan Club podcast. This is run by a team of professional enthusiasts who interview key figures in the digital twin space – a really wide-ranging group of people, right up to the man who came up with the original idea, Michael Grieves.

The podcast is interesting for anyone who wants to take a deeper dive into the cutting edge, and the future of digital construction or manufacturing. And Neil is the perfect man to explain the digital twin to us. Many readers by now know that it is a digital representation of something in the real world, but Neil sees the linking of twins as an extension of the internet in a very real sense.

“With the internet,” continues Neil, “the first platform was connecting thousands of applications and millions of people together via mainframe computers. So the earliest form of the internet was connecting these big, chunky computers together, mostly in government research and corporate environments. Then, as the internet progressed, the question became: instead of connecting mainframe computers together, can we connect smaller computers – PCs – together?”

This kicked off the process of creating the World Wide Web as we know it today… connecting documents together and giving the internet a more human touch.

“And so that gets us into sort of the hundreds of millions of people and tens of thousands of applications being pulled together. And what we’ve seen in more recent times is that medium moving from personal computers into mobiles. So mobile phones enabled us to connect billions of people together on the internet, and we’ve got millions of apps out there that enable lots of different types of services. And we’ve seen the banking sector transform, media transform – media in the sense of music, media in the sense of publication, and TV. You know, we’ve looked at YouTube and Netflix, for example, and SoundCloud, and podcasts and all that type of stuff.”

This is known as the third platform for the internet.

“What I’m talking about here is a proposed fourth platform for the internet, which starts off a little bit from the Internet of Things. So the Internet of Things is where we’ve now seen the connection of consumer products to the internet – light switches, fridges and, you know, certain types of environmental controls for buildings. But this is sort of the next step. The next step for me, for the extensibility of the internet, is moving that into the built environment. We’re going beyond connecting things to connecting the environment. So on the first platform of the internet we connected millions of people and thousands of apps, and we’ve worked up into connecting trillions of things, and quadrillions of connections and of data and what have you.”

This Internet of Twins is the eventual result. We have seen it coming for a while, but there have been convergent technologies… advances in a number of fields that mean it has only become possible in the last five years or so. The brute-force computation of cloud computing, for example.

But even so, construction has been slow compared to other sectors, and in many ways it still uses computers as a way to augment the paper process… dipping a toe into the shallow end of what the digital world has to offer, instead of diving in.

And we know that human opinions do not necessarily give us good outcomes; it is desirable to use machines and data to support our decision making. Although it is not necessarily fair to compare construction with every other field. Putting books on the internet is slightly different to putting wastewater treatment works on the internet, for example.

There are challenges particular to construction. Sharing data is all well and good, but so much of the information is from a pre-digital age.

A lot of our existing assets and infrastructure are old, some stretching back centuries, and much of it has been retrofitted to modernise.

Neil says, “It sort of boils down to two main areas: how far do you go with the digitalisation of the past? And how far do you go in understanding what the asset’s performance and condition is right now? So there’s a time horizon. How much of collecting the historical data of a road is of use to you? And how much of understanding as much as possible about that road right here and now is of use to you?”

This information, and information about all sorts of assets in the built environment, is currently incredibly obscure, but planners are now turning their attention to it. The processes in this space are more than ready for updating, so let’s look at the city.

The Planning London Datahub

Peter Kemp is the Head of Change and Delivery in the planning team at the Greater London Authority.

He has what is possibly one of the most interesting jobs in planning, in that he is involved in the day-to-day management of the planning system for the Mayor of London. But more importantly, he also looks after many data projects on behalf of the Mayor, joining together the work of all the planning authorities across London to centralise data and information.

And Peter is currently working on the Planning London Datahub, or PLD.

“The planning data hub is actually a database of planning applications. So rather than it being the constraint information that you take into account during the determination of an application, it’s actually the data about the developments proposed in an application,” says Peter. “So let’s say, for example, you were trying to find out about a planning application for more than one or two buildings on a site next to where you live. You would struggle to find a collective set of data about that development proposal… let alone how that cumulatively adds to all of the other developments that are happening around you.”

And there are a number of reasons for this, predominantly going back to how the planning acts are set up – they are very much led by individual applications. But secondly, the way that data is received in planning applications doesn’t make it very accessible: it is often hidden deep inside PDF documents. The Planning London Datahub unlocks that data and puts it in a format that enables you to interrogate it, to drill down through it and to cut it in different ways. You could look at individual applications, or at numerous applications together within an area. The logic behind it is that it enables better monitoring, but also far more strategic oversight of development happening across a wider area – something you might take into account when developing policy, and also when considering other applications.
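
To make that concrete, here is a minimal hypothetical sketch of the kind of question that structured planning data makes trivial. The records and field names below are invented for illustration; they are not the Datahub’s actual schema.

from collections import defaultdict

# Hypothetical extract of structured application data (illustrative fields only).
applications = [
    {"borough": "Hackney", "site": "Site A", "proposed_units": 120, "status": "consented"},
    {"borough": "Hackney", "site": "Site B", "proposed_units": 45, "status": "pending"},
    {"borough": "Camden", "site": "Site C", "proposed_units": 300, "status": "consented"},
]

# Total consented homes per borough: one loop over structured records.
units_by_borough = defaultdict(int)
for app in applications:
    if app["status"] == "consented":
        units_by_borough[app["borough"]] += app["proposed_units"]

for borough, units in sorted(units_by_borough.items()):
    print(f"{borough}: {units} consented units")

# When the same figures sit inside PDF documents, producing even this
# summary means reading every application by hand.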

This is really significant. Up to this point, data has been locked away, inaccessible except to those with specific knowledge and the time (or financial imperative) to actually look through the mountains of documents.

Peter adds, “There are numerous other uses for these data sets, because the private sector often has to produce them, both in the preparation of planning policy and in preparing development proposals. Infrastructure providers struggle to access data about where future development proposals are and the scale of them. And also transport – to name just a few use cases where they’re desperate for this data set. So whilst we’re using it at the moment with the local planning authorities, over the coming weeks it will be rolled out to all the other users that are waiting for this data set to be available to them.”

They might use it to assess the capacity of existing transport infrastructure and define what additional infrastructure might be needed. Say, for example, consent is granted for 500 more units across 20 different sites in an area: that might require an increase in the frequency of buses. Or it might, heaven forbid, require an increase in the frequency of trains. The important point is that it enables people to model existing capacity and future capacity at different points in time.
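
As a toy illustration of that kind of modelling – with the threshold and unit counts invented purely for the example – cumulative consented homes in an area could be checked against an assumed capacity trigger, year by expected completion year.

# Toy capacity check; not a real GLA or transport model.
consented_sites = [
    {"site": 1, "units": 25, "completion_year": 2024},
    {"site": 2, "units": 40, "completion_year": 2025},
    {"site": 3, "units": 60, "completion_year": 2025},
    {"site": 4, "units": 35, "completion_year": 2026},
]

BUS_UPGRADE_THRESHOLD = 100  # hypothetical: extra homes the current route can absorb

cumulative = 0
for year in sorted({s["completion_year"] for s in consented_sites}):
    cumulative += sum(s["units"] for s in consented_sites if s["completion_year"] == year)
    action = "consider higher bus frequency" if cumulative > BUS_UPGRADE_THRESHOLD else "within capacity"
    print(f"{year}: {cumulative} cumulative units -> {action}")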

The need for near-impossible accuracy

Will Squires works for Atkins as Associate Director for Digital. He has focused his career on cities and infrastructure and is the project lead for the delivery of the PLD; his team designed, built and deployed the technology platform for the GLA. He says there is a nice example from one of the kick-off discovery sessions they ran for the project.

He explains, “A major utility provider’s head of forward planning was there. Speaking to him, he made the comment that utilities are required, as part of their regulatory commitment, to forecast within 2% the capacity they expect to supply to an area. For utility companies that’s largely driven by the number of homes, and he said they have to be able to predict, I think, 10 to 15 years in the future. They’re expected to be within a really, really tight margin of error. But actually, during the period [of 20 years] there had been 17 different housing ministers, with different levels of authority and different ideas for how and where homes should be built.

“So if you take this idea of 17 different housing ministers over the same time period, while you’re meant to deliver within 2 or 3% of this sort of fixed idea of what’s going to be built where and how you’re going to supply it, you start to see how the present system – which relies on paper, documentation and deeply codified technical knowledge – begins to fall down. And you can start to imagine how clearer access to that information, and the ability to pull that information out and do different things with it – bringing it into a utility company’s corporate systems, for instance, and allowing it to be compared against the capacity and improvement plans they take to the regulator – is a nice use case that sits outside… the more sort of close-to-home example of a citizen wanting to know what’s going to be built on the site opposite.”

Under the system as it stood, planners were collecting and codifying information from just 8% of the planning applications that were submitted. The swathes of information were simply too vast.

And between the 32 boroughs of London, each with their own terminology, process requirements and data formats… comparing between areas was impossible without a lot of time spent manually validating different sets of data. In short, the various subtleties and requirements across boundaries meant that planners had been working at an impossible task.

Will explains, “When you’re a public body trying to make decisions, the burden of processing that information – of digging through all the documents to produce summary statistics – is quite significant. There are a lot of companies who act in this space, who effectively scrape or read documentation and provide a measurable fraction of what PLD will provide. But there’s also this point that a lot of remarks and theories and ideas, particularly about the housing industry, are often very hard to quantify. You can read it in any newspaper in the country: we’re meant to deliver this number of homes, we deliver this number of homes. Planning is quite complex.”

The first step is building the initial infrastructure to bring this data to the fore – what Will describes as getting the plumbing right.

“What could this data allow us to do to model equality? What could this data allow us to do to understand the case for building a new pocket park somewhere, in a way that is much more repeatable? It allows us to make better decisions because the burden of collecting that evidence is much lower. So we can spend more of our time thinking about good things to do, as opposed to acquiring data to understand if those good things are good.”

Peter adds that the underpinning principle of this is having an indisputable baseline of facts. “By having this data set, you have a level of facts, and everybody can say that that’s a fact, because it’s collected in a clean, robust manner. Whereas at the moment the planning system often relies a lot on individuals’ research and opinions. We’re hopeful that a project like this will enable us to move to a space where the planning industry can operate based on a shared evidence base – a series of facts to start the discussion. In the long term, for me professionally, that would speed up the planning system, because it will speed up delivery, and because, as Will points out, for the first time you can actually very cleanly identify where the delays in the process are. We often get a lot of criticism in the press, and at all stages of the planning system, that planning is a bureaucratic system and is slowing the world down. Well, actually, this might unlock that discussion a bit further.”

Not least because Peter is a strong believer in bureaucracy and knows that it saves lives. What can be done with the PLD depends ultimately on smarter, younger minds than ours.

Will adds, “It will soon be open to a broader public, including the disruptive tech providers in that space as well, so any number of small start-ups or research academics can start to play with it. I was having a conversation with my alma mater, the Centre for Advanced Spatial Analysis at UCL, about how we can get four or five students playing with this, because ultimately that’s the sort of thing I dreamed about when I was there: how could I have a really great data set that described the entire city and its planning functions? My old head of department, who kindly tolerates me as an honorary research associate, is sort of exploding with excitement at the moment. Academics are keen to get their hands on this, because who knows what you’re going to find in there. Peter and I can hypothesise a little bit, but I’m really, really looking forward to being comprehensively bamboozled by what somebody cleverer than I am does with it.”

The horror of cold facts

Peter thinks that an early realisation might be one of horror – the moment we realise what misconceptions we have been working under, and that this data has been relied upon at different times over the years to make decisions. But secondly, planners will gain some quite detailed and different insights into how the city is changing, and into whether policy should become more dynamic or, conversely, less dynamic, because the trends we can now see give us that bigger insight. Take, for example, the housing numbers conversation: the figures will be on a screen in the reception of City Hall. We will know how many schemes have consent and how many of them are being built out at any time. That will take away a lot of the suggestions of under- or over-supply of housing, because you can see the figures instantly, and it might just move the conversation on.

And further than this, Will says that there will be less room for less-than-honest interpretation of the rules.

Meanwhile, Peter highlights the cold, hard reality of data and fact: “There are lots of risks attached to this, as there are whenever you release data that wasn’t previously available. Some of the obvious ones are around politics: what if this data set creates a narrative that’s different to the ones our political leaders have previously used? The second is around accountability. Making an open data set, by virtue of its existence, means that residents can access it, and residents may hold us as professionals accountable for the change that’s happening, or not happening, as a result. And then the third, just as a practical piece, is a massive data ethics risk. When you make data available, whilst it’s not personal data, it is data that may have an impact on people’s lives, or may enable you to interpret what’s happening in someone’s life. There are some really strong ethical questions about its availability. So we’ve been carefully managing that journey throughout the project, to try and understand where the risks attached to each of those stand. And there are a number of other risks. You can imagine, heaven forbid, a scenario where one borough suddenly discovers as a result of this data set that the floorspace being created by residential extensions within the borough – a dataset that’s never existed anywhere before – is having such a significant impact that they need to worry about their infrastructure, their school capacity, their hospital capacity, and all of those things suddenly become quite terrifying. So there are all sorts of questions around accountability there as well.”

Peter and the GLA have taken a robust approach to this, firstly by not pulling any personal data, and secondly by consulting data ethics specialists in academia. This is a new world and warrants caution, but it seems that the most sensitive data is already out there and is not being shared anew.

In the end, this isn’t a story about an underlying technology. It’s about how systems shift and shape around it. People get very excited about the all-new technology… the 3D, 4D, 5D, 6D, 7D model… but it’s actually about what that model allows you to do differently.

Will concludes, “There’s an example from the National Underground Asset Register project that we’re working on. This is a project that seeks to connect all of the underground asset networks of the country together, so that when people are out digging for the pipes outside your house, they don’t hit a hidden gas main. Now, there’s a big part about data and technology to all of that. There’s also a big part about what that does to the business process of the guy on site. He’s not looking at paper anymore; he’s looking at a piece of technology. How does that change the sort of skills you need in that workforce, the sort of people who will work on these projects, the sort of things they will look at and maybe find exciting? The impact of what you use the digital twin for, and how it affects the world of today, is often the most important part of this, and the Planning London Datahub is a nice example. We are talking about rewiring a system that has been defined through legislation over the decades, but hasn’t really realised the benefit of modern technology. And it’s exactly the same with the National Underground Asset Register project: the men and women who are outside conducting works on the pipes outside your home are not necessarily harnessing the latest and greatest technology. It’s how we change these processes that affect the built environment, through the medium of digital twins, that excites me.”
