The Harvard Business Review’s depiction in a 2012 article of data science as the “sexiest job of the 21st century” may strike some as an odd pairing, but the field’s gain in corporate status and its allure to current and future practitioners have been clear in many business sectors.

Now, with vastly improved tools to collect and store enormous amounts of data from project stakeholders, and a growing need to analyze and interpret that data for public-infrastructure project management and planning, construction-sector companies and software firms are embracing data science—and those who are good at it—in new ways.

The U.S. alone could face a shortage of as many as 190,000 people with deep analytical skills, according to a 2014 follow-up Review article citing a study by the McKinsey Global Institute. "The study also found a looming need for over 1.5 million managers and analysts who understand big data and how to apply it to business operations," the article authors contended.

“The rise of data mining for engineering purposes is fast becoming the solution for better project delivery and enhanced asset performance,” says Aidan Mercer, government and utilities marketing director for software vendor Bentley Systems. “For governments, it is seen as the solution to improving the planning and design phases by offering options and alternative ways to streamline processes, recognize efficiency gains and provide a better return on investment for the public.”

Randy Smith, facilities management director at Gilbane Building Co., says that large corporate and government entities are using big-data analytical tools to assess their facility infrastructure data in the cloud. Tools such as IBM Watson Analytics, along with real-time data from “smart building” sensors, allow analysis of facility-use data over time that can be incorporated into top management decisions about future leases and capital investment in new facilities, he says, noting that the U.S. Patent Office found, based on data analysis, that it could cut its building square footage dramatically and still effectively support its mission. "IBM Watson Analytics will tell you what to look for. It has a smart, intuitive sense of how to analyze your facilities data," he says, emphasizing that the right information must be provided to the cloud.

Carlos H. Caldas, a University of Texas-Austin construction management professor who has researched the field, admits that, in construction, “a lot of data is lost. There is a huge potential to use more."


Big Data, Big Benefits

For a growing number of public-sector asset owners, unlocking the secrets of “Big Data” is critical “as federal and state agencies are challenged with shrinking budgets and eroding infrastructure,” says Steve Barber, a vice president of Michael Baker International.

The firm recently completed a year-long, $7-million project to collect millions of data points from each of 8,623 traffic signals, which are owned and managed locally across Pennsylvania, the only state in which this occurs. The intersection data, mapped in three-dimensional point clouds, will populate a new statewide transportation department platform that will standardize now-disparate maintenance and design practices.

It also will “establish a real-world test bed” for advances in vehicle-to-infrastructure technology on which the firm is collaborating with Carnegie-Mellon University, says Robert Hanson, Michael Baker senior vice president.

“Data really enables us to put together the puzzle of what is going on … so we can make decisions not based on subjective input from the field,” says Burcu Akinci, the university’s engineering school associate dean for research and codirector of its Smarter Infrastructure Incubator. “It’s an exciting time for engineering and construction to move toward making data-driven decisions.”

The efforts are making a difference for owners. Rocky Kearney, deputy director of New Mexico’s Public School Facilities Authority, estimates that the authority’s use of a cloud platform, hosted by software firm e-Builder, has saved it $200 million in maintenance and capital costs for the $19.5 billion in assets of its 89 districts. He says the system can track multiple funding sources and allows change-order tracking in real time. “It stopped a lot of litigation,” says Kearney. “We can run a report and know all costs.”

Use of Big Data “creates a lasting digital asset that improves the way [physical] assets are managed over their life cycle,” says Chris Bell, e-Builder vice president.

The city of Toronto’s transportation services group created a new Big Data innovation team in 2015 “to leverage emerging transportation data sets, in particular GPS probe-based data sources,” Scott Fraser, group program manager, told a Transportation Research Board gathering last month in Washington, D.C.

The U.S. Dept. of Energy’s Los Alamos National Laboratory in New Mexico standardized on a Locus Technologies cloud-based platform to manage environmental compliance and monitoring for multiple stakeholders of a nearly 40-sq-mile site where radioactive and chemical contamination occurred during more than seven decades of nuclear-weapon production and research.

“It’s one of the most massive monitoring programs on Earth, but all information is available in real time to the public, with no password required,” says Neno Duplan, Locus Tech CEO. “Scalable databases can now analyze billions of records via self-learning algorithms and package the insights for immediate use.” Nita Patel, DOE’s environmental data program manager at the site, says the extensive data, once heavily siloed, now can be accessed “seamlessly without depending on data stewards.”

But Bentley’s Mercer notes that, with more government caution about having data “housed in new, unfamiliar environments,” providers and users are ramping up security.

That concern was a major factor in the Port Authority of New York & New Jersey’s extreme vetting of e-Builder’s “government cloud security solution” to manage data for the revamp of LaGuardia Airport, says Dareen Salama, deputy project controls manager for its chief consultant, STV. The project now is estimated at $7 billion. Salama points to data flowing from more than 40 involved contractors “and at least 800 people requiring real-time access to information.” The management information system scaled up from a pilot the agency is using for the $1-billion Bayonne Bridge renovation in New Jersey.

The LaGuardia redevelopment program "was new in so many ways," says Emanuel Ciminiello, Port Authority senior resident engineer. "As a public-private partnership, we needed to ensure that traditional systems of recordkeeping and document control protocols were still maintained; however, we understood also the need for real time platforms that would allow us to present information to all users ... to allow construction to maintain schedule and not interfere with the operation of the airport." 

Even so, firms struggle to absorb changes in traditional practice and adjust to Big Data’s scale, which has introduced “petabyte”—defined as equal to 1 million gigabytes—into the construction-sector lexicon.

“In essence, the engineering workflow changes from the well-established, step-by-step process, which involves stage gates and a lead engineer that can take days or weeks, to a full turnkey product processed and delivered within hours,” says Michael Baker’s Hanson. “Processing happens faster, and most engineering project data is temporal.”

George Fink, vice president of engineering firm MBP, a consultant on the Bayonne project, says needed skills “require a combination of creativity, organization and the ability to present complicated information in an easy-to-understand manner. The old method of solving a problem and coming up with a specific right answer is not that skill set.”

Proponents see a mastery of advancing Big Data analysis as a competitive advantage. “It already has become a differentiating service,” says Christopher Sherry, chief operations officer at design firm Merrick & Co. “To handle the massive data sets that we work with, much of our 3D data-processing software had to be custom-developed in-house,” he says. “Being able to identify potential issues sooner is often an unforeseen benefit of faster processing throughput. We have scaled our capabilities exponentially over the years with the use of disruptive technologies.”

Sherry points to the firm’s work, along with that of other engineers, on a multiyear project for the U.S. Geological Survey to map the entire country with Light Detection and Ranging technology (LIDAR) to boost infrastructure planning. “The data volume for this effort is expected to be between seven and nine petabytes,” he says.
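For a rough sense of that scale, the unit arithmetic is a minimal sketch using the decimal definition cited earlier (one petabyte equals 1 million gigabytes); the 10-TB drive size below is an illustrative assumption, not a figure from the project:

```python
# Illustrative unit arithmetic only, using the decimal (SI) convention
# cited in the article: 1 petabyte = 1 million gigabytes.
GIGABYTE = 10**9               # bytes
TERABYTE = 10**3 * GIGABYTE    # 1 TB = 1,000 GB
PETABYTE = 10**6 * GIGABYTE    # 1 PB = 1,000,000 GB

low_pb, high_pb = 7, 9         # USGS LIDAR estimate quoted above, in PB

# Express the estimate in gigabytes...
print(low_pb * PETABYTE // GIGABYTE)           # 7000000
# ...and as a count of hypothetical 10-TB drives (assumed drive size).
print(high_pb * PETABYTE // (10 * TERABYTE))   # 900
```

Even at the low end, that is seven million gigabytes, which is why firms such as Merrick describe custom-built processing pipelines rather than off-the-shelf tools.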

“The upshot is a tailwind for construction firms that embrace new technology, but it requires growing new capabilities to adapt to the rapidly changing software landscape,” says Patrick Copeland, a former Google Big Data executive who now is vice president of research and development at IT vendor PlanGrid, which says its cloud-based platform has been used on more than 500,000 construction projects and stores over 50 million sheets of digital blueprints.

Adds e-Builder’s Bell: “Today’s engineering and construction firms are at an inflection point.”


Transformation

But the industry’s shift is more than apparent. “Five years ago, our executive search efforts for senior manager or business-development positions had zero reference to Big Data, algorithms or dashboards,” says Mick Morrissey, principal of management consultant Morrissey-Goodale. “Now, some 50% of our searches at this level nationally are looking for experience and competencies in these areas.”

Contracting giant Bechtel has adopted the approach, last year creating a Big Data & Analytics Center of Excellence at its Reston, Va., base “to access and harness this volume of information, learn from it and use the knowledge gained to transform the way we operate and compete,” says CIO Carol Zierhoffer, noting productivity challenges and industry impacts from new technologies such as virtual and augmented reality, digital fabrication and the internet of things.

The effort is staffed by 11 full-time employees, including data and business analysts and engineers of various disciplines. “These roles and skill sets are new to Bechtel’s organizational structure, as we have put greater emphasis on change management and integration,” Zierhoffer says.

To date, the center has relied on outside contractors to do its data mining, “but we are currently transitioning to a self-performing model,” she says. The firm hired its first staff data scientist just this month and will bring on more this year.

In its job posting on multiple online job-recruitment sites, Bechtel says: “If you are passionate about Big Data analytics and are willing to embrace the challenge of dealing with several petabytes of data on a daily basis, we want to hear from you. We need you to use our multiple platforms to turn this information into valuable insights and help us to predict the likelihood of safety incidents in our jobsites, analyze productivity trends and better manage construction indirect cost, among many other challenges.”

Merrick’s Sherry emphasizes that coming changes are “not in the destructive way that many fear.” He says new data types and methods such as LIDAR can boost workers’ capabilities and efficiencies. “The use of LIDAR over the past 15 years has created new opportunities for surveyors, rather than displacing them,” says Sherry, noting that more are also adding drone-pilot licenses to their résumés. “We do more now with our highly skilled employees than ever before,” he contends.


Communication

Even so, some see disconnects in the digital acceleration. “Some industry experts argue that we may need to have different genes to work on construction projects in the digital era,” says David Jeong, an associate professor in Iowa State University’s Dept. of Civil, Construction and Environmental Engineering. He pointed to the concern at the recent Transportation Research Board session that engineers and tech experts “use almost completely different languages.”

Jeong sees a solution in the next generation, “who recognize this changing environment.”

He says his program soon will offer a new department specialty area, intelligent infrastructure engineering, that will allow students “to deeply understand and learn Big Data, machine learning and sensor technologies to more efficiently design, construct and operate civil infrastructure systems.” Jeong says similar specialties have started at the University of California-Berkeley, University of Michigan and New York University, among others.


“Some argue we may need to have different genes to work on construction projects in the digital era.”

David Jeong, Iowa State University


Carnegie-Mellon’s Akinci says her program “embeds components of data into the curriculum. Students study advanced systems in acquiring data, managing data, mining data and how to visualize it.” She adds, “We don’t teach specifics. We teach the underlying principles. The technology is evolving so fast that engineers need to know how to adopt whatever technology comes their way.”

In a test program of its Big Data platform, PlanGrid this coming May will expand a pilot outreach effort to universities and other users that will include time with on-site specialists.

Training now is underway at Stanford University, the University of Florida-Gainesville, Texas A&M University and Honolulu Community College, as well as at several building trades’ sites and a high school in Indiana, says Emily Tsitrian, the software firm’s director of consulting. With “over 20% of PlanGrid’s team” being AEC-sector veterans, “we understand the pain points of our customers,” she contends.

“As other industries have discovered, the value of Big Data is … how different sources can be combined and the insights they can deliver,” company vice president Copeland adds.

“It’s time for the construction industry to embrace its calling for digitalization,” e-Builder’s Bell points out. “Technology has revolutionized the manufacturing and financial services sectors, and they have never looked back.”