Gaming tools and engineering

Gaming engines have been developed to support one of the world’s biggest entertainment sectors. But, as tools to create real time simulations, they can also be used by engineers, to explain ideas in public consultations, and to bring together work developed in established engineering software.

When putting together a planning proposal, engineers will typically use drawings. These are well understood and easy to use for engineers and planning officers. But—as AtkinsRéalis discovered when working on a flood management project on the UK south coast—they can be confusing to the public.

“Cockwood Harbour is a little town in England, near Exeter,” says Matt Dunlop, senior immersive designer at AtkinsRéalis. “It’s a beautiful tourist town. But it floods quite often, which is expensive to repair, and impacts transport and tourism in the area. There’s a key transport link, a railway, running through the town.”

AtkinsRéalis was working on a plan to build a new harbour wall, and supporting the planning proposal for it. However, respondents to the consultation were concerned about how this would affect the character of the pretty coastal town.

“If you tell a lovely English seaside town, that depends on tourism, ‘We need to build a 12 foot wall around your bay to stop this flooding issue’, then that’s terrifying,” says Dunlop. “In most people’s minds, that would be heard as, ‘OK, that’s our tourism gone. No one’s going to want to come to our harbour to see a 12 foot concrete wall’.”

In a consultation, part of the challenge is to understand your audience. “You’re talking to a small town, with people that aren’t necessarily technically minded in the world of engineering,” says Dunlop. “And you’re trying to convey the point of how important this seawall is to protect their properties and their towns, but also how it won’t impact tourism.”

In the initial stages of the consultation, the material provided did not meet the needs of all these stakeholders. “Initially, they were approached with a load of 2D drawings, showing them the height of the old wall, and the new height of the wall,” says Dunlop. “They were asked, ‘Doesn’t that look fine?’ To which the public said, ‘We don’t understand.’”

As Tom Greener, principal real time developer at AtkinsRéalis, says, “There was a bit of a misunderstanding, from the engineering drawings, that a massive wall was going to be built that would block off the harbour.”

Locals were rightly worried about the impact on their local economy. “But in reality, a large amount of this wall was going below the ground,” says Greener. “The actual wall wasn’t going to change too much in its height from where it was.”

Bringing ideas to life

The AtkinsRéalis creative design team decided they needed to make use of new tools to show the town what the changes would really look like. “We actually used real time technology to build the entire town,” says Dunlop. “We had the entire area of Cockwood Harbour in a real time model, with a little slider at the top of the screen—we’re trying to keep this as simple as possible. The slider at the top showed normal conditions, and then the flooding that had happened in the last 10 years. When you clicked that toggle, it raised the water level, from the normal level to the flooding level, and all the houses in the area flooded. And you can see that by toggling this on and off.”
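The logic behind that flood toggle is easy to sketch. Below is a minimal, hypothetical version in Python; the building names and elevations are invented for illustration, not project data:

```python
# Hypothetical sketch of the flood toggle: each building has a floor
# elevation (metres above datum), and raising the water level marks
# every building whose floor sits below the new waterline as flooded.
# All names and figures here are invented for illustration.
HOUSES = {
    "harbour cottage": 1.2,
    "quayside pub": 1.8,
    "hillside house": 6.5,
}

def flooded(water_level_m):
    """Return the buildings whose floor elevation is below the waterline."""
    return sorted(name for name, floor in HOUSES.items()
                  if floor < water_level_m)

normal = flooded(0.8)  # a typical tide: nothing floods
flood = flooded(2.5)   # a flood-level tide: low-lying buildings flood
```

Sliding the toggle amounts to calling the same check with a different water level, then re-colouring the affected buildings in the scene.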

That showed what might happen if the flood defences weren’t installed. But what would happen to the town’s tourist economy if they were? The creative design team put in place a virtual viewpoint, showing one key aspect of the town.

“There’s a nice little pub that sits along the harbour, people like to sit there, drink, watch the boats, watch the trains go by,” says Greener. “There was a big comment that asked, ‘What’s my view from the pub going to be like?’” 

If visitors have seen a nice Instagram post of the harbour, they might stop by for a pint, or even stay in town. But what if the view were blocked by the harbour wall?

“They might not do that,” says Greener. “They might say, ‘Well, if I can’t go enjoy a nice pint in the sun and watch the boats go by, I’m not going to go to that pub.’ We hadn’t thought about that. We hadn’t thought about that in design, and we didn’t think that was going to be a question that was asked.”

By using Unity, the team were able to quickly respond to this unanticipated question. “Later that day, ahead of a consultation the next day, we put a viewpoint from that pub,” says Greener. “So they could directly go there, they could view what it was going to be like, from that point. That answered a lot of questions for people, immediately.”

An approach like this allows engineers and project owners to respond quickly to public questions. They can ease the logistics of consultations, and make them more flexible. And they can even shape their presentations to meet the needs of different users.

“We can build accessibility directly into these applications,” says Greener. “If you’ve got questions from people with physical disabilities—colour blindness is a great example of this—we can build various systems into these applications to accommodate this, much more so than you can by, say, putting a poster up.”

Gaming heritage

The tools that Greener and Dunlop used on this project have wide applicability in engineering, but were originally developed for the world of gaming.

“It’s the same underlying technology that runs through them,” says Greener. “What we’re using games engines for is the renderer and the physics system. Games have put a lot of money into that over the last 20-odd years. And about seven, eight years ago, there was a bit of a realisation that you can utilise that same technology to drive their use in architecture and engineering.

“What we started to do is then utilise that technology, but in a slightly different way. At their core, there’s not a lot of difference. It’s all what the developers, what the visualizers, put into those engines to give you output. Games are focused a lot more on the fun driven narrative, whereas what we do within the architecture and engineering sectors is a lot more practically driven.”

Jack Strongitharm is a solution engineer with Unity, the gaming engine which AtkinsRéalis uses for many of its simulations. He’s no coder, but he first came across the technology as a user. “I had to build a tradeshow event exhibit, to make people interested in what I was doing at the time. And I came across Unity as a great way to take content, which was 3D scans at the time, but also CAD models, and build a VR treadmill. So something to get into, walk around and experience, see it in virtual reality. And it all came together with Unity so easily, because I could just learn how to do it on things like YouTube.”

The tools Jack used are designed for community development. Enthusiastic amateurs can learn to use them online, and quickly get to work creating.

“What you’re really greeted with is just kind of an empty world,” says Dunlop. “It’s actually set up for you just as an environment that you can start to interact with. We’d normally start sourcing three main data sets that we’d bring in. Let’s say that we’re building a school on a brownfield site. We’d bring in the context area: satellite imagery, like Google Maps, where you have all the elevation data, and then you have aerial photography layered over it.

“We then talk to the architects and the mechanical engineers, and bring in their 3D school design that would have been produced in something like Rhino, Bentley, Revit, or AutoCAD.

“Finally, we would potentially add all the trees and the shrubbery for the area. This needs to be accurate to the correct species, the height and the radius. So we would then talk to the landscape team—who normally works in 2D, top down—to start understanding what kind of vegetation they need placed. We could bring those designs in, and then place those around the area.”

It’s almost as simple as pasting text into a Word document, or adding a layer in Photoshop. “You right click, Import, and import the 3D model,” says Dunlop. “You can just drag it into the screen, and you’ll see your 3D object pop up in that world.”

But these tools are not just used to create a static environment. They can incorporate simulations of real world physics, and have elements interact over time. Once, developers would have to define these rules each time, and make them work on whatever platform they wanted to use. Now, the engines provide that groundwork ready-made.

That has real benefits for designers like Dunlop. “In the past, you could be spending months or even years designing how lighting systems work and how a shadow is created from that lighting system: where the rays of the sun come down, how cloud systems would work, and then how an interface would work, how you could talk to operating systems like Windows and Mac, iPads and Android, and that could cost you hundreds of thousands of pounds, just to develop that initial framework. What the real time engines have done is they’ve created that framework already.”

The physical rules developed to make gaming fun, or challenging, can find uses in much more practical applications. “You might use a wind model in a game, to show how an arrow flies,” explains Greener. “It puts resistance, it puts pressure on it, one way or another, it makes it drop faster. We’re going to use that to steer water, to make trees blow, in a more realistic fashion.”
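Greener’s arrow example can be written down in a few lines. The sketch below is a generic quadratic-drag projectile model, not the engine’s actual physics code, and all the numbers are illustrative:

```python
# A generic quadratic-drag projectile model: the kind of rule a game
# engine applies to an arrow, and that can equally steer water or sway
# trees. Illustrative only; not engine code.
def simulate_range(v0x, v0y, drag=0.0, dt=0.01, g=9.81):
    """Euler-integrate a projectile until it returns to ground level;
    return the horizontal distance travelled (metres)."""
    x, y, vx, vy = 0.0, 0.0, v0x, v0y
    while True:
        speed = (vx ** 2 + vy ** 2) ** 0.5
        # Drag opposes the velocity vector, growing with speed squared.
        vx -= drag * speed * vx * dt
        vy -= (g + drag * speed * vy) * dt
        x += vx * dt
        y += vy * dt
        if y <= 0.0:
            return x

no_drag = simulate_range(30.0, 30.0)               # ideal, airless flight
with_drag = simulate_range(30.0, 30.0, drag=0.05)  # air resistance: drops short
```

The same resistance term that makes an arrow fall short in a game becomes, with different coefficients, wind load on water or vegetation in an engineering scene.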

Similar tools can be used to model fluid dynamics. In a game, that might make it more fun to captain a pirate ship. On a project like Cockwood Harbour, that can show how floods will interact with harbour defences. Immersive designers can model how trees will change through the year, or how noise from a motorway will pass through or be blocked by different barriers. 

They can even model how human bodies will be affected by their environment. In a game, that might set the gamer the challenge of working out how much water to carry—and how many other resources or tools they then have room in their bag for—before heading out to explore a desert. In the real world, it could impact how designers consider shade in an urban planning project.

“We were looking at a project for shade,” says Greener. “We were dealing with 40° plus temperatures, and it was looking at how a person’s potential health could be impacted over a journey, depending on what facades are in place on this building. So we would programme the AI to favour shade. It might be a longer walk, but actually they are spending 80% of their time in the shade. And therefore their health parameter that we programmed in would stay much higher, whereas an AI that was programmed to go, ‘Nope, I’m just gonna get there as fast as I can’, would actually get there about five seconds faster. But their health metric that we programmed in would be much lower, because they’ve literally just walked all this distance in 40 degree heat.”
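The shade-seeking behaviour Greener describes can be approximated with an ordinary weighted shortest-path search, where shaded street segments are made cheaper than sunlit ones. The street graph, lengths and discount factor below are invented for illustration:

```python
import heapq

# Hypothetical street graph: node -> [(neighbour, length_m, shaded)].
# All nodes and lengths are invented for illustration.
EDGES = {
    "A": [("B", 100, False), ("C", 60, True)],
    "B": [("D", 50, False)],
    "C": [("D", 120, True)],
    "D": [],
}

def route(start, goal, shade_discount=1.0):
    """Dijkstra search; shaded edges cost length * shade_discount."""
    queue = [(0.0, start, [start])]
    best = {}
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if best.get(node, float("inf")) <= cost:
            continue
        best[node] = cost
        for nxt, length, shaded in EDGES[node]:
            w = length * (shade_discount if shaded else 1.0)
            heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
    return None

def sun_exposure(path):
    """Metres walked in direct sun along a path."""
    total = 0
    for a, b in zip(path, path[1:]):
        for nxt, length, shaded in EDGES[a]:
            if nxt == b and not shaded:
                total += length
    return total

fastest = route("A", "D")                    # minimises raw distance
shady = route("A", "D", shade_discount=0.5)  # favours shaded edges
```

Discounting shaded edges flips the agent from the shorter sunlit route to a longer route with no direct sun, which is exactly the trade-off between journey time and the health parameter in Greener’s example.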

Shared assets and digital twins

Much of the development work on simulation tools like this is done by an enthusiast community. Fans design assets, and upload them to be shared or sold. But the libraries created by these gaming fans are not exactly ideal for engineers.

“You get a lot of gun racks, you get a lot of rundown spaceships, and stuff like that, because that’s what the games industry uses,” says Greener. “You’re not going to find London Paddington on an asset store somewhere that’s perfectly ready for you to walk around. But luckily, we’ve got that in-house here, at AtkinsRéalis. We’ve got all the different engineers, the architects, all the different disciplines that make up the company, to give us those models. They’ve got them, they’ve built them for the projects they need, so we can effectively add additional functionality and an additional use case to assets that are already being produced by the company.”

At the same time, the professional software that AtkinsRéalis engineers use is increasingly being designed around open or interoperable file formats. That lets the creative design team find new ways to aid collaboration between global teams. 

“There is definitely a waking-up in the large software providers to the idea that they need to start opening up their formats,” says Dunlop. “Nvidia, with Omniverse, is pushing a 3D format that’s completely open to any software. And they’re encouraging software providers like Bentley and Autodesk to use that file format as well.”

“The idea is to kind of complement that technology and take content from those design products, and then bring them alive in our game engine,” says Strongitharm.

To reap the most rewards from the virtual assets and data they already have, engineers and clients need to think about how they might be used as soon as they are created.

“It offers a lot more value, if you get in day one on one of these assets, and follow the lifecycle of the asset,” says Greener. “As the asset grows, the dataset grows, the applications grow. When we engage with our clients early, explaining this technology to them, that helps them get the most out of their digital assets.”

Simulation in engineering has moved from 3D CAD models, to BIM and digital twins, which incorporate additional data into project designs, to what Strongitharm calls ‘operational digital twins’, which live alongside the project. “The operational digital twin is the next level. That’s where we’ve had the most interest, actually, and it’s obviously quite bespoke to everybody’s use case. People see a game engine as a way to provide a bespoke application, because their requirements don’t live in a software product they can buy on the open market.”

Immersive simulation in the real world

Tools like Unity were first designed to bring this sort of interoperability and platform neutrality for game developers. As they have developed, so has hardware like AR/VR glasses, and, at the same time, the ability to use a ‘thin client’ model, where applications run in the cloud and stream to a low power device like a phone. That opens up the potential for their use in the real world.

“Let’s say you’re a gas engineer, or water engineer, or you’re just resurfacing a road,” says Dunlop. “At the moment, it can be quite hard to actually understand exactly where the pipes are. And often, you’re going off previous reports and documentation, which could actually be slightly dated, or wrong, quite frankly. Now imagine if you could put on a headset that could tie into the infrastructure databases that show where all that piping and cabling is. If you stood on that road, with your team around you, you could put your headset on and see the depth and exact locations of the cabling, or the water pipes.”

The technology allows for training in dangerous environments that doesn’t just explain the risks, but shows them to users and allows them to practise how they will respond in an emergency. 

“We’ve built response procedures for nuclear power plants in virtual reality,” says Dunlop. “It’s a huge cost just to get into the sites of some of these nuclear reactors, and a huge amount of paperwork. And in some of them, you need to suspend certain services while they’re in there. Now, you have to train people on how to deal with these environments and what to do when it goes wrong. But what we found extremely beneficial is you can build that environment in a digital realm, and you let them go through that multiple times.”

Understanding how a physical environment changes, how new hazards develop over time, is vital to training emergency workers. But in a public place—like a football stadium, or underground station—planners must also consider the behaviour of crowds.

A crowd isn’t an undifferentiated mass. It is a collection of individuals. Some of them will read the warning notices. Others will stay calm under pressure. But many will be confused and panicked. AI is enabling crowds to be modelled in ways that reflect that they are made up of individuals.

“With new AI systems, where the crowds are independently thinking for themselves, they’re looking at, ‘What’s the best thing to do? What rules have I got? And how likely am I to break those rules?’” says Greener. “An example of this could be a fire drill in an office building. Everyone knows, you don’t use the elevators, you go down the fire escapes, you walk calmly, everyone in single file out the door. But we can start looking at AI and ask, ‘Well, actually no, what happens if there is a fire? What percentage of people are likely to break protocol? If 5% of people break protocol, does that actually lead 20% of people to break protocol, because the 5% are panicking, and they need to adapt to that?’ We can ask if these scenarios that have been thought out, that make perfect sense on paper, are realistic in a real world scenario.”
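The core of that question, whether a small share of rule-breakers drags others with them, can be sketched as a simple two-wave model. The 5% and 20% figures below are the illustrative rates from Greener’s example; everything else, including the agent count, is invented:

```python
import random

# Hypothetical two-wave sketch of panic contagion in an evacuation:
# a small share of agents break protocol on their own, and their
# visible panic pushes further agents over their own threshold.
# Rates are illustrative, after Greener's 5% / 20% example.
def evacuation_breakers(n_agents=1000, base_rate=0.05,
                        contagion_rate=0.20, seed=42):
    """Return (initial breakers, total breakers after contagion)."""
    rng = random.Random(seed)
    # First wave: agents who break protocol independently.
    initial = sum(rng.random() < base_rate for _ in range(n_agents))
    # Second wave: compliant agents who panic on seeing the first wave.
    panicked = sum(rng.random() < contagion_rate
                   for _ in range(n_agents - initial))
    return initial, initial + panicked

before, after = evacuation_breakers()
```

A fuller crowd model would iterate the contagion step until it stabilises and give each agent its own threshold, but even this two-wave version shows how a handful of panicking agents can multiply through a crowd.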

The next step for AI, Jack says, might not be within the simulation itself. Instead, tools like ChatGPT can now be used to do much of the initial work of developing the bespoke logic needed for engineering applications.

Video streaming allows immersive simulations to be run on lightweight apps. But new web standards are going further, allowing these to run directly in a browser like Chrome. And tools like LIDAR sensors on phones, along with increasingly lightweight and cheap VR and AR glasses, will allow users to view simulations in the real world.

Together, these technologies will make the use of real time simulations a routine part of professional life. The game now is to find new ways to use these tools.
