Adrenaline is running high as you snake your way through the top-secret research facility. At the end of a long corridor, you duck into a lab filled with black-topped tables. Something isn’t right, but you aren’t sure what it is. You take a moment to look around and notice that half the room is missing air hoods. You look up, hit a button on your game controller, and the ceiling peels away as if it were the skin of an onion. Someone forgot to add several key parts of the mechanical system into the model.


This scenario is but one of many that a design or construction professional might encounter while using a video game to experience a virtual construction project. The operative word here is “experience.” The construction industry has widely adopted 3D tools such as building-information models (BIMs), but those models are still experienced in flat, 2D form when viewed on a screen or printed on paper. Video games are taking these models to a new level of immersion.

The first-person shooter games of the 1990s that parents, politicians and social activists came to despise—such as Doom, Half-Life and Quake—paved the way for more sophisticated games whose engines now serve as the backbone of virtual-reality design and construction tools. And the unruly hooligans who played them when they should have been studying? Many are now hacking into their childhood toys to make the construction industry a better and more exciting place to work.

Driven by so-called video-game “engines,” or the software that runs in the background, the setup usually includes a map of the project model, with textures and visual elements that render and update in real time as you “walk” through the scene. Clash detection, which helps construction teams identify potential problems before work is executed in the real world, is a feature already built into most video-game engines.
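That built-in capability is really the engine’s collision and overlap detection, which teams can repurpose as a rough clash check once the model is imported. A minimal sketch in Unreal Engine C++, assuming the BIM elements arrive as actors tagged by discipline (the tag and function names below are hypothetical, not part of any vendor workflow described here):

```cpp
// Rough clash check built on Unreal's overlap queries. "Structure" is a
// hypothetical tag applied to structural elements during model import.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"

void ReportClashes(AActor* DuctRun)
{
    // Ask the physics scene for every actor whose collision volume
    // overlaps this duct run's geometry.
    TArray<AActor*> Overlapping;
    DuctRun->GetOverlappingActors(Overlapping);

    for (AActor* Other : Overlapping)
    {
        if (Other->ActorHasTag(FName("Structure")))
        {
            UE_LOG(LogTemp, Warning, TEXT("Possible clash: %s vs. %s"),
                   *DuctRun->GetName(), *Other->GetName());
        }
    }
}
```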

Skilled gamers and attention to detail are often required to make this possible, but the learning curve is flattening daily. A user might start by exporting a Revit model into Autodesk Maya to enhance the 3D look and then add rich textures created in Photoshop. Then, the file might run through the Unreal or Unity game engines. Instead of carrying sawed-off shotguns and searching for bad guys to blow up, the design and construction user is armed with visual tools that allow her to experience the model as a virtual, physical mock-up.

The time needed to render is no longer a limiting factor. “We can pump out a two-minute animation probably in a couple of hours,” says Lucas Richmond, senior creative studio manager for Gilbane Building Co. “The traditional way [took] two to three days.” Portability is also not a problem. Perhaps the best part? The costs are insanely low. When we caught up with Richmond at the ENR FutureTech conference last fall in New York City, he demonstrated projects on an Alienware laptop. With a price tag of up to $4,000, that’s the most expensive piece of the puzzle.



Video-game engines can be downloaded for free, and once a model is imported into the game, you can walk through the scene on a monitor. But if you want to “reach out and grab things,” as Richmond likes to say, you’ll also need a VR headset.
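The “reach out and grab things” part is mostly a small piece of engine scripting layered on the headset’s motion controllers. A minimal sketch in Unreal C++, assuming the player pawn carries a UMotionControllerComponent for the hand (the function and variable names are illustrative, not Richmond’s setup):

```cpp
// Grab and release pieces of the imported model with a VR motion controller.
#include "CoreMinimal.h"
#include "MotionControllerComponent.h"
#include "Components/PrimitiveComponent.h"

UPrimitiveComponent* GrabNearestPiece(UMotionControllerComponent* Hand)
{
    // Whatever model geometry overlaps the hand right now is grabbable.
    TArray<UPrimitiveComponent*> Nearby;
    Hand->GetOverlappingComponents(Nearby);
    if (Nearby.Num() == 0)
    {
        return nullptr;
    }

    // Parent the piece to the controller so it follows the user's hand.
    UPrimitiveComponent* Piece = Nearby[0];
    Piece->AttachToComponent(Hand, FAttachmentTransformRules::KeepWorldTransform);
    return Piece;
}

void ReleasePiece(UPrimitiveComponent* Piece)
{
    if (Piece)
    {
        Piece->DetachFromComponent(FDetachmentTransformRules::KeepWorldTransform);
    }
}
```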

This July, a $600 consumer version of the Oculus Rift headset will be on the shelves. It will include a sensor, a remote, cables and an Xbox controller. Other headsets are here or coming. Google Cardboard is an open-source VR headset that can be had for a few dollars. It turns your smartphone into a stereoscopic viewer, and construction suppliers tell ENR they plan to give away such low-tech swag at trade shows to drive interest in VR. Additionally, Samsung’s new Galaxy S7 and S7 Edge will pop into the $100 Samsung Gear VR headset, released last fall. Even social-media companies such as Facebook plan to release VR capability. What does all this mean for construction? Game on.


How Immersion Drives Decisions

Ricardo Khan, senior director for project solutions at Mortenson Construction, Minneapolis, says the company has used immersive virtual-reality tools on more than a dozen projects. He says jobs best suited for such tools are large, complex facilities with tight time constraints and spaces that generate revenue. Types include health care, sports, convention centers, hospitality and higher education, although Mortenson is starting to use the technology on smaller projects, such as those under $30 million, he adds.

At present, advanced visualization is most beneficial for driving early decisions in design that optimize construction, including the use of modular elements. “It wasn’t really the construction phase that was the pain point. If we were to take this gaming technology and move it into the design phase—since we were doing so much work based on negotiated price—if we could help the designers make their decisions super early, it would help us,” Khan says.

Once those decisions are made early enough, “it increases the opportunity for construction optimization,” Khan adds. “It helps with modularization, and the more time we have to plan and procure our material and help our trades understand what we are doing, the better.”

Everyone on a design team believes the “beautiful rendering,” as Khan calls it, is the most important factor that drives a decision. “But when you look at something on a screen or on printed paper, you are interpreting something flat. Interpretation drives confusion, and confusion drives cost. We are trying to eliminate that, using this technology,” Khan says. “We need to be able to show our customers the full scene, so they can walk around in it and turn around and look the other way.”

After experiencing the design in a computer-assisted virtual environment (CAVE), the donor of the Pegula Ice Arena at Penn State University realized that the railing around the rink was at a height that would block the spectators’ view of the skaters. By changing the rail height before—rather than after—installation, the design team calculates that VR saved $475,000. 

Even so, some experts are quick to point out that CAVE immersion has its limitations, its chief constraint being lack of portability. “Why do you need to build this huge room when you can have a laptop or an Oculus?” Richmond says. “You really don’t gain much [more] walking in a 6-foot square than sitting at an Xbox.”

Gaming tech is not just for building projects: Difficult industrial turnarounds and civil projects are finding ways to use these tools. Last summer, a laser-scanning firm working for a confidential owner contracted with Chicago-based startup VIATechnik LLC to bring a point-cloud scan of a nuclear power plant into the Unity game engine. The problem was how to manipulate a 70-ft-long replacement reactor vessel through the plant without breaking other components or requiring extensive demolition.

“Before we gave them the functionality to test multiple paths, we also gave them the original task,” says Danielle Dy Buncio, president and CEO of VIATechnik. “They didn’t really trust the plan 100%, which is why they engaged with us.” In this situation, it wasn’t necessary to model the game in super-high definition. But the result was valuable because the original plan would have run the vessel into a structural column.

“The reactor vessel would have gotten stuck,” Dy Buncio explains. “Every day the plant is shut down is a million dollars.”
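VIATechnik’s work was done in Unity, and the article does not describe its implementation; as an illustration of the underlying idea, the clearance test maps onto the shape sweeps that major engines expose. A minimal sketch in Unreal-style C++, approximating the vessel as a box swept between hand-picked waypoints (the function name, waypoints and dimensions are illustrative):

```cpp
// Check one candidate rigging path by sweeping a vessel-sized box between
// waypoints and reporting the first blocking piece of plant geometry.
// The sweep keeps the box axis-aligned, so rotation through turns is ignored.
#include "CoreMinimal.h"
#include "Engine/World.h"
#include "Engine/EngineTypes.h"
#include "WorldCollision.h"

bool IsPathClear(UWorld* World, const TArray<FVector>& Waypoints,
                 const FVector& VesselHalfExtents)
{
    const FCollisionShape VesselBox = FCollisionShape::MakeBox(VesselHalfExtents);

    for (int32 i = 0; i + 1 < Waypoints.Num(); ++i)
    {
        FHitResult Hit;
        if (World->SweepSingleByChannel(Hit, Waypoints[i], Waypoints[i + 1],
                                        FQuat::Identity, ECC_WorldStatic, VesselBox))
        {
            UE_LOG(LogTemp, Warning, TEXT("Leg %d blocked by %s"), i,
                   Hit.GetActor() ? *Hit.GetActor()->GetName() : TEXT("scene geometry"));
            return false;
        }
    }
    return true;
}
```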

Besides logistics, safety is another important arena for games. Trying to find a place for video-game tools in its toolbox, Skanska USA is developing new safety applications for the underlying technology. “The truth is, if you give [project team members] the model in Navisworks, they might not use it,” says Albert Zulps, virtual design and construction director for Skanska Buildings USA. “But give it to them in a gaming engine—even with preset views—it breaks down the barriers. Gaming engines offer that.”

Zulps is spearheading Skanska USA’s dive into video games, with a focus on getting multiple stakeholders on projects to try out models in a virtual space. “It’s a common interface today, something like WASD [the keyboard keys used to move in PC games] and a mouse to move around,” he says. “If you can play Minecraft or Doom, getting into a model can feel very intuitive. It’s no longer someone else driving. You’re immersed in that space and can form your own interpretations of the design model.” Skanska recently hired a computer programmer skilled in the Unity 3D game engine, which it plans to use to re-create jobsite accident scenarios to help educate workers during safety training.
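The navigation itself is little more than standard first-person input handling. A minimal sketch in Unreal C++ (Skanska’s work is in Unity; one language is used for all the examples here), assuming a hypothetical AModelWalkthroughPawn whose movement component and “MoveForward”/“MoveRight”/“Turn”/“LookUp” axis mappings are set up in the project:

```cpp
// ModelWalkthroughPawn.h (hypothetical) -- declaration of the pawn.
#pragma once
#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "ModelWalkthroughPawn.generated.h"

UCLASS()
class AModelWalkthroughPawn : public APawn
{
    GENERATED_BODY()
public:
    virtual void SetupPlayerInputComponent(UInputComponent* Input) override;
private:
    void MoveForward(float Value);
    void MoveRight(float Value);
};

// ModelWalkthroughPawn.cpp -- WASD + mouse bindings.
#include "Components/InputComponent.h"

void AModelWalkthroughPawn::SetupPlayerInputComponent(UInputComponent* Input)
{
    Super::SetupPlayerInputComponent(Input);
    // W/S and A/D drive the two movement axes; the mouse drives the look axes.
    Input->BindAxis("MoveForward", this, &AModelWalkthroughPawn::MoveForward);
    Input->BindAxis("MoveRight", this, &AModelWalkthroughPawn::MoveRight);
    Input->BindAxis("Turn", this, &APawn::AddControllerYawInput);
    Input->BindAxis("LookUp", this, &APawn::AddControllerPitchInput);
}

void AModelWalkthroughPawn::MoveForward(float Value)
{
    // A movement component (e.g., UFloatingPawnMovement, created in the
    // constructor, omitted here) turns this input into actual motion.
    AddMovementInput(GetActorForwardVector(), Value);
}

void AModelWalkthroughPawn::MoveRight(float Value)
{
    AddMovementInput(GetActorRightVector(), Value);
}
```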

Recalling a construction accident in which a worker was killed by a piece of falling steel, Zulps says Skanska initiated a global safety stand-down to go over the details and show a few renderings. The future addition of playable scenarios could fundamentally change safety training. “Now, we can build the incident setting accurately in Revit [and] bring it into Unity 3D. And because we have the game engine’s physics enabled, we can ‘play through’ the scenario.” Initiatives like this take the pre-shift toolbox talk to a new level, allowing workers to walk through situations virtually and learn how their decisions lead to different outcomes.
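The “physics enabled” piece largely amounts to handing the element over to the engine’s rigid-body simulation and listening for the impact. A minimal sketch, again in Unreal C++ rather than the Unity 3D toolchain Skanska describes, assuming a hypothetical AIncidentScenario actor that owns the steel member’s mesh component and declares OnSteelHit as a UFUNCTION in its header:

```cpp
// Play through a dropped-steel scenario: let physics decide where the
// member lands and record the impact for the training debrief.
#include "CoreMinimal.h"
#include "Components/StaticMeshComponent.h"
#include "GameFramework/Actor.h"

void AIncidentScenario::BeginPlay()
{
    Super::BeginPlay();
    // Generate hit events when the member strikes anything in the scene.
    SteelMember->SetNotifyRigidBodyCollision(true);
    SteelMember->OnComponentHit.AddDynamic(this, &AIncidentScenario::OnSteelHit);
}

void AIncidentScenario::ReleaseSteelMember()
{
    // Hand the member to the rigid-body simulation so gravity and
    // collisions, not a canned animation, determine the outcome.
    SteelMember->SetSimulatePhysics(true);
}

void AIncidentScenario::OnSteelHit(UPrimitiveComponent* HitComp, AActor* OtherActor,
                                   UPrimitiveComponent* OtherComp,
                                   FVector NormalImpulse, const FHitResult& Hit)
{
    UE_LOG(LogTemp, Warning, TEXT("Steel member struck %s at %s"),
           *OtherActor->GetName(), *Hit.ImpactPoint.ToString());
}
```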

The immersive experience could help save lives. “If we have an accident on the job, it’s one thing to tell everybody someone got hurt. It’s another thing to show what happened in a visual model and show the root cause,” says Bill Flemming, president, Skanska Buildings USA. “We’ve found that, with workers, it’s much better to show a video simulation of what occurred.”


The New ‘Mixed Reality’

The next big thing in construction is a shift away from just staring at screens all day, says Aviad Almagor, director of the mixed-reality program at Trimble. “With construction, we still consume data in a very simple way—with a 2D screen,” he says. “With mixed reality, we have the opportunity to take data out of the screen and present it in the world in context.”

“Mixed reality”—another term for augmented reality (AR)—allows users to map images onto objects in their field of vision while wearing the technology-laden DAQRI construction helmet or other specially designed visors. Another advanced version of this technology is the yet-unreleased Microsoft HoloLens, which is being developed as a consumer product so that users can play video games such as Minecraft with the game’s graphics mapped onto the world around them. HoloLens also promises to be a tool for business.

VR hasn’t yet reached the trades, Khan says, but he expects that AR will. “What will affect the craft worker is wearable technologies and the integration of augmented-reality safety gear,” Khan says. “Imagine a craft worker with a smart helmet or vest that drives improved awareness of their environment, where sensors track particles in the air, sound decibels, moving vehicles or fall hazards. Now, combine that with AR, and the worker can pull up work instructions in the place of work through [a] visor and overlay the instructions where he or she is going to work.”

Additionally, Khan believes craft work will begin to change when development kits hit the market for devices such as DAQRI. “We have been waiting for almost 10 years on AR, and now it’s time,” Khan says.

Microsoft sees potential for HoloLens in multiple fields and, with Trimble, is developing it as a design and construction tool. “Imagine I can take a project’s data and put it on the table beside me and walk around it—and bring partners and stakeholders in to look at it,” says Almagor. “I can bring this content into the physical world—take it and project it into the meeting room, where we can all walk around it and discuss it.”

Integrating the technology into the construction workflow could go well beyond the design process. At the moment, HoloLens is limited to working within controlled indoor environments, and it has trouble projecting its images onto the visor in direct sunlight. But as these technical challenges are addressed, Almagor envisions mixed reality reaching a point at which workers on the jobsite will “see” the design plans overlaid onto the actual building locations. “If I need to pour concrete on this floor, I want to make sure the sleeves for the pipes are located in the right place,” he says.

Heads-up displays and intelligent labeling of environments are features already found in video games, and users familiar with these interfaces would be able to adapt quickly to these new workflows, says Almagor. “Some customers we’ve talked to want to move to a hyper-reality environment. They want to see all the data: wind simulations, energy simulations—go way beyond the design model,” he notes.

The mixed-reality technology in development by Microsoft and other vendors still lacks that level of information density, but Almagor reports that more than a few of Trimble’s construction-sector customers are already eager to be able to play with their models in real environments. “The technology is mature enough right now to begin delivering benefits to the customer,” he says. “But a lot of the people who experience the current tools—after an initial positive reaction—what they say is, ‘I want more.’ ”