Concordia researcher hopes to use big data to make pipelines safer

November 27, 2019

Oil and gas pipelines have become a polarizing issue in Canada, but supporters and detractors alike can agree that the safer they are, the better.

Unfortunately, integrity and health are ongoing and serious problems for North America's pipeline infrastructure. According to the US Department of Transportation (DOT), there have been more than 10,000 pipeline failures in that country alone since 2002. Complicating safety efforts are the cost and labour required to monitor the health of the thousands of kilometres of pipelines that criss-cross Canada and the United States.

In a recent paper in the Journal of Pipeline Systems Engineering and Practice, researchers at Concordia and the Hong Kong Polytechnic University look at the methodologies currently used by industry and academics to predict pipeline failure -- and their limitations.

"In many of the existing codes and practices, the focus is on the consequences of what happens when something goes wrong," says Fuzhan Nasiri, associate professor in the Department of Building, Civil and Environmental Engineering at the Gina Cody School of Engineering and Computer Science.

"Whenever there is a failure, investigators look at the pipeline's design criteria. But they often ignore the operational aspects and how pipelines can be maintained in order to minimize risks."

Nasiri, who runs the Sustainable Energy and Infrastructure Systems Engineering Lab, co-authored the paper with his PhD student Kimiya Zakikhani and Hong Kong Polytechnic professor Tarek Zayed.

Safeguarding against corrosion

The researchers identified five failure types: mechanical, the result of design, material or construction defects; operational, due to errors and malfunctions; natural hazard, such as earthquakes, erosion, frost or lightning; third-party, meaning damage inflicted either accidentally or intentionally by a person or group; and corrosion, the deterioration of the pipeline metal due to environmental effects on pipe materials and acidity of oil and gas impurities. This last one is the most common and the most straightforward to mitigate.

Nasiri and his colleagues found that the existing academic literature and industry practices around pipeline failures have yet to fully exploit available maintenance data. They believe the massive amounts of pipeline failure data available via the DOT's Pipeline and Hazardous Materials Safety Administration can be used in the assessment process as a complement to manual in-line inspections.

Predictive models, built on decades' worth of data covering everything from pipeline diameter to metal thickness, pressure, average temperature change, and the location and timing of failures, could reveal failure patterns. These could be used to streamline the overall safety assessment process and reduce costs significantly.

"We can identify trends and patterns based on what has happened in the past," Nasiri says. "And you could assume that these patterns could be followed in the future, but need certain adjustments with respect to climate and operational conditions. It would be a chance-based model: given variables such as location and operational parameters as well as expected climatic characteristics, we could predict the overall chance of corrosion over a set time span."
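The "chance-based model" Nasiri describes maps a segment's attributes (age, wall thickness, soil conditions, temperature swings) to a probability of corrosion failure over a set horizon. A minimal sketch of the idea is a logistic model; note that the feature names and coefficients below are entirely hypothetical placeholders, not values fitted to the DOT/PHMSA data the researchers reference:

```python
import math

# Hypothetical coefficients -- illustrative only, NOT fitted to real PHMSA data.
COEFFS = {
    "intercept": -4.0,
    "age_years": 0.05,           # older pipelines tend to corrode more
    "wall_thickness_mm": -0.15,  # thicker walls resist corrosion
    "soil_ph": -0.30,            # lower pH (more acidic soil) raises risk
    "avg_temp_change_c": 0.10,   # larger temperature swings raise risk
}

def corrosion_failure_chance(features: dict) -> float:
    """Logistic model: chance of a corrosion failure over the chosen time span."""
    z = COEFFS["intercept"] + sum(COEFFS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# One hypothetical pipeline segment.
segment = {
    "age_years": 40,
    "wall_thickness_mm": 8.0,
    "soil_ph": 5.5,
    "avg_temp_change_c": 20.0,
}
p = corrosion_failure_chance(segment)
print(f"Estimated corrosion failure chance: {p:.1%}")
```

In practice the coefficients would be estimated from historical failure records and then, as Nasiri notes, adjusted for the climate and operating conditions expected over the pipeline's remaining life.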

He adds that these models would ideally be consistent and industry-wide, and so transferable in the event of a change in pipeline ownership -- and that research like his could influence industry practices.

"Failure prediction models developed based on reliability theory should be realistic. Using historical data (with adjustments) gets you closer to what actually happens in reality," he says.

"They can close the gap of expectations, so both planners and operators can have a better idea of what they could see over the lifespan of their structure."

Concordia University
