Photos of construction projects are as common as hardhats and work boots on a jobsite these days.

But what if construction crews could use photos and videos to assemble an updated 3D building information model of their project in seconds, then use the resulting overlay to check the building's progress against a 4D BIM schedule?

That's just one example of the new applications bubbling up from a hybrid 4D augmented reality (4DAR) platform developed by construction engineering professors and students at Virginia Polytechnic Institute and State University. The technology is creating buzz among innovators and attracting developers, thanks to its ability to merge jobsite data with virtual BIM data in near real time and with high accuracy.

Developed by Prof. Mani Golparvar-Fard, Prof. Jules White and research student Hyojoon Bae, the technology has been licensed from Virginia Tech to PAR Works Inc., a start-up launched in August by Golparvar-Fard and White and funded by a $1-million investment from Boston venture firm Allied Minds. The platform works with algorithms developed by the researchers that let smart-device owners use images to retrieve customized information tagged to any building, product, street scene or object.

They call the platform the Mobile Augmented Reality System (MARS). Unlike other augmented-reality applications, which work well only outdoors where satellite reception is available, the technology needs no GPS, wireless signal or markers for localization. "Generating a physical model from construction-site imagery is computationally expensive and can take hours," the researchers wrote in an award-winning research paper on the technology.

"Producing a physical model from a set of construction photographs requires non-linear multi-dimensional optimization as well as exhaustive matching of the photographs in the data set. A specific aim of our proposed work is to overcome these challenges, speeding up overall time of 3D reconstruction and localization by developing and optimizing enhanced structure from motion algorithms," they note. Golparvar-Fard says MARS provides augmented-reality overlays at millimeter-level accuracy on images or other data types.
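The "structure from motion" reconstruction the researchers describe builds a 3D point cloud by matching features across photographs and recovering camera poses and point positions. A minimal sketch of one core step, triangulating a single 3D point from its projections in two calibrated views via the direct linear transform, is shown below. This is an illustration of the general technique, not PAR Works' code; the toy camera matrices and point are invented for the example.

```python
# Sketch of two-view triangulation, a core step in structure from motion:
# given two camera projection matrices and a point's pixel coordinates in
# each view, recover the point's 3D position. Illustrative only.
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from two 3x4 projection matrices (P1, P2)
    and the point's 2D image coordinates in each view (x1, x2),
    using the direct linear transform (DLT)."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)   # null-space vector minimizes |A X|
    X = Vt[-1]
    return X[:3] / X[3]           # dehomogenize to 3D coordinates

# Two toy cameras: one at the origin, one shifted 1 unit along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Project a known 3D point into both views, then recover it.
X_true = np.array([0.5, 0.2, 4.0])
h = np.append(X_true, 1.0)
x1 = (P1 @ h)[:2] / (P1 @ h)[2]
x2 = (P2 @ h)[:2] / (P2 @ h)[2]

print(triangulate(P1, P2, x1, x2))  # recovers [0.5, 0.2, 4.0] up to precision
```

Real structure-from-motion pipelines repeat this over thousands of matched features and many camera pairs, then refine everything jointly, which is the "non-linear multi-dimensional optimization" the researchers cite as the bottleneck.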

The overlay of information may not be in real time, but it's close enough. The interval between uploading an image to the server and augmenting it with context information, such as BIM data, is one to four seconds. The platform's precision in rendering the data over the image sets a new benchmark.

"This should address some of the challenges of using AR for the construction industry and can significantly improve field reporting systems that benefit from iPads, such as Autodesk [BIM 360 Field]," says Golparvar-Fard, who is settling in as a faculty member at University of Illinois at Urbana-Champaign after developing the platform at Virginia Tech. The company says the platform's bread-and-butter use is converting 2D files to 3D point-cloud models. From there, the possibilities grow as thousands of workers take images of a jobsite and integrate them with BIM data.

"A lot of our technique is how you support that [overlay] efficiently and quickly," says White. "The overlays can have any kind of information attached to [them]. It can be a video, it can be another photograph, it could be information from a BIM model or a schedule for a contractor.

Each overlay essentially has an identifier associated with it that you could attach to any [type] of information you want," he adds.

John Serafini, vice president and general manager at Allied Minds, says PAR Works is looking to attract more developers of AR apps for the MARS platform by holding an application development contest. The developer whose AR app is judged best (even out of this world) gets $25,000 in cash and a featured spot at the South by Southwest (SXSW) conference in March.
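White's description of an overlay, an identifier that any kind of information can hang off of, can be sketched as a simple data structure. This is purely illustrative; the class and field names below are hypothetical, not PAR Works' actual API.

```python
# Sketch of the overlay idea White describes: each overlay carries an
# identifier, and arbitrary information -- a video, a photo, a BIM
# element, a schedule entry -- can be attached under it. All names
# here are hypothetical, not PAR Works' API.
from dataclasses import dataclass, field
from typing import Any

@dataclass
class Overlay:
    overlay_id: str                    # identifier for this overlay
    region: tuple                      # image region (x, y, width, height)
    attachments: dict = field(default_factory=dict)

    def attach(self, kind: str, payload: Any) -> None:
        """Associate any type of information with this overlay."""
        self.attachments[kind] = payload

# Tag a column in a jobsite photo with BIM, schedule and media data.
col = Overlay("column-B2", region=(420, 310, 60, 200))
col.attach("bim_element", {"type": "IfcColumn", "mark": "B2"})
col.attach("schedule", {"activity": "Pour concrete", "finish": "2013-03-15"})
col.attach("video", "inspection_walkthrough.mp4")

print(sorted(col.attachments))  # ['bim_element', 'schedule', 'video']
```

Keying everything to one identifier per overlay is what lets a contractor's app swap in whichever attachment the field worker needs at the moment, a schedule one day, an inspection video the next, without changing the overlay itself.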