Photo by Steve Hill for ENR
Todd Wynne (left) and Joe Williams from Rogers-O'Brien Construction prepare to demonstrate their invention, which blends plan sets into navigable, digital maps.

Attendees at ENR's FutureTech East conference, held in New York City in early October, explored the industry's future and got an inside look at a midsize construction firm's innovation culture, including an invention some are calling the Google Maps of construction.

Rogers-O'Brien Construction is testing on its projects software created by Joe Williams, director of technology, and Todd Wynne, construction technology manager. The two realized that by stitching plan sheets together edge to edge and layering in contextual data, such as multiple trades on a single map, their system, called Project Atlas, could bring an entire multibuilding campus project to life on an iPad in a Google Maps-like fashion. As the user zooms in, the view progressively delivers greater detail, down to dimensions and fixture data.
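
How Project Atlas works under the hood has not been disclosed; the Python sketch below is only a rough illustration of the general tile-pyramid idea behind Google Maps-style viewers, not Rogers-O'Brien's implementation. The assumption it encodes: stitched plan sheets are pre-rendered at several zoom levels, and the viewer fetches just the tiles covering the visible area, swapping in finer tiles, and more detail, as the user zooms in. The function name and tile size are hypothetical.

```python
# Illustrative sketch only; Project Atlas's internals are not public.
# A "tile pyramid" renders the stitched plan set at multiple zoom levels,
# so the viewer only loads the tiles covering the current viewport.

TILE_SIZE = 256  # pixels per square tile (a common web-mapping convention)

def tiles_for_view(x_min, y_min, x_max, y_max, zoom):
    """Return (zoom, col, row) keys for the tiles covering a viewport.

    Coordinates are 'world' pixels at zoom level 0; each zoom level
    doubles the resolution, so higher zoom means more, finer tiles.
    """
    scale = 2 ** zoom
    col_min = int(x_min * scale // TILE_SIZE)
    col_max = int(x_max * scale // TILE_SIZE)
    row_min = int(y_min * scale // TILE_SIZE)
    row_max = int(y_max * scale // TILE_SIZE)
    return [(zoom, c, r)
            for c in range(col_min, col_max + 1)
            for r in range(row_min, row_max + 1)]

# Zoomed out: the whole campus plan fits in a single coarse tile.
print(len(tiles_for_view(0, 0, 255, 255, zoom=0)))   # 1 tile
# Zoomed in: the same extent breaks into many fine tiles, where
# dimension and fixture detail would be drawn.
print(len(tiles_for_view(0, 0, 255, 255, zoom=3)))   # 64 tiles
```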

Project Atlas has now been spun off as a separate entity, with Williams and Wynne as majority owners, Williams said. Rogers-O'Brien is fully supportive and holds a small stake, he added, but the tool is not yet market-ready.

Project Atlas has been in rapid development for a year. New features debuted at FutureTech, including real-time integration with an indoor-positioning beacon system from Red Point Positioning and a 360° room view. Wynne acknowledged, though, that keeping the plan set in a Project Atlas view current is a challenge and still has to be done by "brute force."

One of the inventors' goals is to automate the 2D plan-stitching-and-assembly process. The most efficient route, they say, would be to generate the map directly from a 3D BIM, but 2D plan sheets remain the required format on many projects.

Human-Computer Partners

Brad Hardin, chief technology officer for Black & Veatch, spoke earlier that day on machine learning. He said continuing improvements in computer processing speed and the plunging cost of storage have brought us to the point where technology is no longer a limiting factor in developing systems that can independently research engineering problems and offer optimized solutions. Black & Veatch is a strategic partner with the team developing business applications for IBM's Watson cognitive computing system. Hardin said he doesn't like to apply the term "artificial intelligence" to cognitive computing because "intelligence is intelligence. When it is captured by a machine, it is still our intelligence."

Hardin proposed that humans and computers should partner, each doing what it does best: cognitive computers can observe, interpret, evaluate and offer optimized alternatives, while humans spend less time researching and calculating and more time addressing questions with creativity and imagination.

Hardin said the human brain can store about 10 to 100 terabytes of data, a figure based on its roughly 100 billion neurons, each with about 1,000 connections. "If you extrapolate that to data points, it's about 1 trillion data points, or about 100 terabytes of data. We are pretty high-end on the storage side."

However, humans are weaker on the processor side. "Our processors work on more of a kilohertz speed rather than megahertz," Hardin said. "That means there is significantly more processing time than a computer." An advanced computer such as Watson can read a million books per second, but humans are good at creative thought, seeing patterns, planning logic and pulling together abstract concepts.