THE INTERNATIONAL
PIPELINE RISK MANAGEMENT
FORUM
November 8-9, 2023 | Hyatt Regency Houston West | HOUSTON
plus Training Courses
November 6-7, 2023 | HOUSTON
For program-related questions, please contact Ben Stroman: bstroman@clarion.org, +1 713 359 0016.
1
Risk-Informed Decision Making for Physical Asset Management
The authors introduce a unified framework for risk-informed decision-making in physical asset management. The method combines quantitative risk measures, risk tolerance criteria, and cost/benefit analysis to prioritize projects. It classifies risks into intolerable, ALARP, and broadly tolerable categories using a risk matrix. The necessity for risk treatment is determined for intolerable and ALARP risks, followed by quantifying the risk reduction benefits. Discounted cash flow analysis is used for cost-benefit evaluations. The method's applicability is demonstrated through a numerical example.
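As a rough illustration of how discounted cash flow analysis can be folded into such a prioritization, the sketch below computes the net present value of a hypothetical risk-treatment project; all cash flows, rates, and risk-reduction values are assumed for illustration and are not taken from the paper.

```python
# Minimal sketch: net present value of a risk-treatment project, assuming
# hypothetical annual risk-reduction benefits and a flat discount rate.

def npv(cash_flows, rate):
    """Discount a list of annual cash flows (year 0 first) to present value."""
    return sum(cf / (1.0 + rate) ** year for year, cf in enumerate(cash_flows))

def treatment_npv(capital_cost, annual_risk_reduction, annual_opex, years, rate):
    """NPV of a treatment: up-front cost, then yearly (benefit - opex)."""
    flows = [-capital_cost] + [annual_risk_reduction - annual_opex] * years
    return npv(flows, rate)

# Example: a $500k dig program avoiding an expected $120k/yr of risk exposure.
print(round(treatment_npv(500_000, 120_000, 10_000, years=10, rate=0.08)))
```

Under these assumed numbers the treatment has a positive NPV, so it would rank ahead of alternatives whose risk-reduction benefit does not recover their cost.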
2
What the Pipeline Industry Should Know about Probabilistic Risk Assessment and How PRA Is Different
Guidance issued by PHMSA in February 2020 provided an evaluation of methods that can be used for determining pipeline risk. Based on that study of risk methodologies, the Probabilistic Risk Assessment (PRA) approach was deemed the “Best Practice” for its ability to support the full range of decision-making needed to ensure safe operations and manage integrity. The value of PRA lies in its ability to evaluate the complex interactions in an engineered design and the performance of its systems to define a wide range of potential outcomes.
The purpose of this presentation is to deconstruct the PRA approach with the goal of advancing the practice of PRA and highlighting the value derived from it. The application of PRA to the assessment and management of risk originated almost 50 years ago to address the risk associated with the operation of nuclear power plants.
Applying a PRA approach to risk assessment and management provides an operator with the flexibility to select appropriate, effective, and cost-effective solutions to ensure safe operations and effective integrity management.
3
Establishing Quantitative Risk Criteria to Support Integrity Management Decision-Making
Many companies have successfully implemented a risk model, only to be left with the question “Now what?”. It is important to plan for how decision-making will occur based on the risk model output. While integrating a risk model into an existing integrity management plan and/or risk management framework can be straightforward, establishing new risk criteria can be a challenging exercise. This may be even more challenging for companies with pre-established qualitative criteria for facilities that are now dealing with quantitative results for pipelines. This presentation focuses on different approaches to engage senior leaders in establishing risk criteria. Key questions that will be answered include:
4
Implementing Quantitative Risk Analysis to Benefit Pipeline Integrity and Risk Programs
In its Risk Modeling Technical Information Document (TID) of February 2020, PHMSA commented that operators need to improve their risk algorithms to better understand the risk and threat levels present on their pipelines. The following statement from the TID executive summary suggests that potential inherent weaknesses need to be addressed through improved modeling:
“The Pipeline and Hazardous Materials Safety Administration (PHMSA) is issuing this report to highlight the strengths and limitations for pipeline risk models, and to support improvements in Gas Transmission and Hazardous Liquid pipeline risk models. Operators establish risk models to address risk and improve safety within their respective pipeline systems. Pipeline risk models are a foundational part of the assessment of operational pipeline risk. Federal pipeline safety integrity management (IM) regulations require pipeline operators to use risk assessments. Based on the results of pipeline inspections and failure investigation findings, both the Department of Transportation’s PHMSA and the National Transportation Safety Board (NTSB) have identified general weaknesses in the risk models used by pipeline operators in performing risk assessments for their IM programs.”
To this point, Quantitative Risk Analysis (QRA) and probabilistic algorithms can provide the insight into pipeline asset risk that helps operators overcome the inherent weaknesses of their current and former algorithms. This presentation will discuss the theory behind the development of QRA algorithms and how to implement them for maximum benefit to an integrity and risk program.
5
Framework for Calculating Reliability Benchmarks for Highly Volatile Liquid (HVL) Pipelines
The pipeline industry is increasingly utilizing reliability methods as a basis for decision-making when managing the integrity of pipeline systems. Reliability thresholds for onshore natural gas and low vapor pressure (LVP) liquid pipelines have been published in pipeline standards (e.g., CSA Z662-19, ISO 16708:2006); however, limited precedent exists for defining reliability benchmarks for highly volatile liquids (HVL). To address this gap, Flint Hills Resources (FHR) and Integral Engineering (Integral) developed a framework for calculating reliability benchmarks for onshore HVL transmission pipelines, enabling FHR to evaluate the life-safety performance of their pipelines relative to their industry peers.
This presentation will provide an overview of the framework and methodology for calculating life-safety benchmarks at Average and Top Quartile industry historical performance levels. Elements of the framework and methodology that will be discussed include: the development of risk benchmarks from PHMSA failure data, the use of a product-specific event tree to consider the potential for different hazardous outcomes, the incorporation of detailed hazard area analysis modelling results, and the use of location-specific geospatial analyses to consider population density and surface conditions.
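The sketch below illustrates, in simplified form, how an event tree can roll a rupture frequency up into an individual-risk estimate for comparison against a benchmark; every number in it is an assumed placeholder, not part of the FHR/Integral framework.

```python
# Minimal sketch of a life-safety benchmark comparison, assuming hypothetical
# event-tree branch probabilities and fatality fractions for an HVL release
# (none of these numbers come from the presentation or PHMSA data).

failure_rate_per_mi_yr = 2.0e-4          # historical rupture frequency (assumed)

event_tree = {                            # P(outcome | rupture), assumed values
    "jet_fire":    0.20,
    "flash_fire":  0.15,
    "vapor_cloud": 0.05,
    "no_ignition": 0.60,
}

fatality_given_outcome = {                # P(fatality for a nearby person | outcome)
    "jet_fire":    0.8,
    "flash_fire":  0.5,
    "vapor_cloud": 0.9,
    "no_ignition": 0.0,
}

individual_risk = failure_rate_per_mi_yr * sum(
    p * fatality_given_outcome[o] for o, p in event_tree.items()
)

benchmark = 1.0e-4                        # illustrative threshold, assumed
print(f"individual risk = {individual_risk:.2e} /yr, "
      f"{'meets' if individual_risk <= benchmark else 'exceeds'} benchmark")
```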
6
QRA Consortium for Advancing the Use of Pipeline Quantitative Risk Assessment
A Consortium of five major US gas transmission operators, originally formed by Enbridge and led and facilitated by Dynamic Risk, initiated a collaborative project in late 2022 with the objective of advancing industry use of pipeline quantitative risk assessment (QRA). It is acknowledged that there is a range of modeling approaches and risk measures that, when applied within a QRA framework, can result in different perspectives of safety risks at varying levels of confidence. Uncertainty in risk modelling results can also be driven by data availability and quality; lack of validation of historical results against actual pipeline safety performance; and inconsistent inclusion and valuation of the integrity controls and activities that mitigate threats and consequences. These elements of uncertainty become particularly important when comparing risk and reliability results to established acceptance criteria to make capital decisions for integrity controls and activities, achieve regulatory compliance, and gain acceptance from public stakeholders when required.
Potential use cases for enhanced application of QRA include special permits, class location change, defect management (excavation and re-inspection optimization), mechanical damage management, and Method 6 for MAOP reconfirmation (Gas Mega Rule Part I). Through 2023, the established QRA Consortium has been developing QRA guidelines (a recommended practice) for risk assessment and is evaluating representative pipelines (benchmarking) across a spectrum of threats, pipe types, pipeline conditions, consequence scenarios, and use cases. The focus is on identifying, assessing, and managing sources of uncertainty in order to generate greater stakeholder confidence in risk assessment results and establish consistency in the QRA process. A key component of the project is the engagement of regulatory and industry associations in parallel with the QRA benchmarking studies for input as the project progresses. The presentation will provide an overview of the QRA Consortium work to date, including the following key deliverable areas:
7
Pipeline Risk Assessment Results Visualization – The Essential Elements of Risk Visualization
Pipeline risk assessment plays a crucial role in enabling pipeline operators to make the informed decisions necessary to maintain the integrity of their assets. In the pipeline industry, there is a growing focus on implementing quantitative and probabilistic risk methodologies in an attempt to more accurately reflect real-world risk. The insights available from these more complex risk methodologies have increased drastically; at the same time, however, it has become more difficult for users of the risk results to interpret them accurately and efficiently. Without thoughtful and effective risk visualizations, the full potential of a risk program cannot be realized. In working with many operators, each asking different questions of their risk results, DNV has developed guidelines that highlight some of the essential elements of risk visualization to aid risk experts in crafting their risk visuals. These elements are the most basic of guidelines, but they are essential to avoid the primary pain points of risk consumers.
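As one illustration of the kind of visual such guidelines might encourage, the sketch below plots risk per segment along the pipeline on a log scale, annotates the dominant threat, and marks an assumed review threshold; the data, threat labels, and threshold are all placeholders.

```python
# Minimal sketch of a risk profile along the line with the dominant threat
# highlighted, so a reader sees both where risk is high and why.
# All values below are made-up placeholders.

import matplotlib.pyplot as plt

chainage_mi = [0, 5, 10, 15, 20, 25]
risk = [1e-6, 4e-6, 2.5e-5, 8e-6, 3e-6, 1.2e-6]     # per mile-year, assumed
dominant = ["EC", "EC", "TPD", "TPD", "SCC", "EC"]   # top threat per segment

fig, ax = plt.subplots(figsize=(7, 3))
ax.step(chainage_mi, risk, where="post")
ax.set_yscale("log")
ax.set_xlabel("Chainage (mi)")
ax.set_ylabel("Risk per mile-year")
for x, y, label in zip(chainage_mi, risk, dominant):
    ax.annotate(label, (x, y), textcoords="offset points", xytext=(2, 4), fontsize=8)
ax.axhline(1e-5, linestyle="--", label="review threshold (assumed)")
ax.legend()
plt.tight_layout()
plt.show()
```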
8
Addressing Excess Conservatism in the Assessment of Pipe Joints with Crack-Like Flaws
The MAT-8 fracture model was developed in a PRCI project of the same name in 2015 and was calibrated using elastic-plastic finite element analysis (FEA), improving upon the linear-elastic FEA solutions in the API 579 model. The MAT-8 model has been shown to most accurately model the burst capacity of piping with crack-like axial seam-weld flaws. However, assessments using these models are still often overly conservative in their treatment of crack-like flaws and can significantly underestimate burst pressure. Some sources of conservatism are:
A consortium project was initiated through PRCI to implement MAT-8 into a cloud-based software application. In this Consortium project, each of these sources of conservatism is being addressed with a probabilistic implementation of the model and ongoing research and development into these topics.
In this presentation, we will give an overview of the modelling approaches developed and progress to date. These include:
This includes approximately 1000 J-integral test results. These data have been statistically analyzed by vintage, weld type, crack location, and other key variables.
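A simplified sketch of the probabilistic implementation idea is shown below: uncertain toughness and flaw depth are sampled and pushed through a burst-pressure model to estimate a failure probability. The burst-pressure function here is a crude placeholder, not the MAT-8 equations, and all distributions and pipe properties are assumed.

```python
# Minimal sketch of a probabilistic flaw assessment: sample uncertain toughness
# and flaw depth, feed each sample through a burst-pressure model, and estimate
# the probability that burst pressure falls below MAOP.

import math
import random

def burst_pressure_placeholder(toughness_j, depth_frac, smys_psi=52_000,
                               wall_in=0.312, od_in=24.0):
    """Hypothetical stand-in: flow-stress burst capacity knocked down by flaw
    depth and a toughness-dependent factor. For illustration only."""
    flow_stress = smys_psi + 10_000
    knockdown = (1.0 - depth_frac) * min(1.0, 0.5 + toughness_j / 400.0)
    return 2.0 * flow_stress * wall_in / od_in * knockdown

random.seed(1)
maop = 1_000.0                        # psig, assumed
n, fails = 200_000, 0
for _ in range(n):
    toughness = random.lognormvariate(math.log(120.0), 0.5)   # J, assumed
    depth = min(0.8, random.betavariate(2, 8))                 # a/t, assumed
    if burst_pressure_placeholder(toughness, depth) < maop:
        fails += 1
print(f"estimated P(burst < MAOP) = {fails / n:.2e}")
```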
9
Applications of Machine Learning in Pipeline Integrity Risk and Reliability Assessment
An important step in the management of pipeline integrity risk is to assess the severity of the identified threats to a pipeline. The severity of a threat to pipeline integrity is closely related to the probability of a potential pipeline failure. However, there are cases in which it has been challenging to effectively evaluate the severity of all threats to pipeline integrity, because doing so requires considering both the demand on the pipeline and its resistance to failure. The challenges mainly arise from data acquisition and from the models or equations available for effectively predicting the demand or the resistance. The predictive capability of a model or equation used to assess the severity of a threat is critical to the success of data processing. Machine learning has been widely used to process massive datasets and provides robust predictive capability once the model has been trained and developed with sufficient data.
This presentation discusses two applications of machine learning to pipeline integrity assessment. The first application relates to the assessment of dent severity for pipeline integrity risk management. We will first describe the development of the machine learning models used to characterize dent constraint conditions and predict dent damage indicators. With these machine learning models, we were able to rapidly screen the most severe dents for further detailed analysis and make a final risk-informed decision on threat remediation.
The other application is the prediction of pipeline tensile strain capacity (TSC) for assessing the severity of girth weld flaws in pipelines operating in a geohazard area. The TSC characterizes the resistance to pipeline failure due to unstable crack propagation from girth weld flaws under geomechanical loading conditions. The TSC depends on geometrical parameters, including flaw sizes and pipe geometries, in addition to pipe and weld material properties. It is challenging to define a functional dependence of pipeline TSC on all of these geometrical and material parameters. The talk will focus on the development of a machine learning model for predicting TSC.
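The sketch below shows the general surrogate-modeling pattern: a regressor trained on flaw, geometry, and material inputs to predict TSC. The training data here are generated from a made-up function purely to keep the example self-contained; actual models of this kind are trained on FEA or test data.

```python
# Minimal sketch of a TSC surrogate: train a gradient-boosted regressor on
# synthetic (flaw size, pipe geometry, material property) samples labeled with
# a toy TSC function. For illustration only.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000
flaw_depth = rng.uniform(1.0, 6.0, n)        # mm
flaw_length = rng.uniform(10.0, 100.0, n)    # mm
wall = rng.uniform(8.0, 20.0, n)             # mm
y_t_ratio = rng.uniform(0.80, 0.95, n)       # yield-to-tensile ratio
X = np.column_stack([flaw_depth, flaw_length, wall, y_t_ratio])

# Toy "ground truth": TSC drops with flaw size and Y/T, grows with wall thickness.
tsc = (2.5 * (1 - flaw_depth / wall) * (1 - y_t_ratio) * 10
       * np.exp(-flaw_length / 200.0) + rng.normal(0, 0.02, n))

X_tr, X_te, y_tr, y_te = train_test_split(X, tsc, random_state=0)
model = GradientBoostingRegressor().fit(X_tr, y_tr)
print(f"holdout R^2 = {model.score(X_te, y_te):.3f}")
```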
10
Next Generation Third-Party Damage Fault Tree Model
Pipeline operators often use quantitative risk- and reliability-based methods to assess the risk posed by the threat of third-party mechanical damage, a threat which remains one of the most frequent causes of significant pipeline failures. The application of these methods typically involves use of fault tree analysis models to estimate the hit rate of third-party equipment on pipelines, with most of these models being at least 10 years old. A Joint Industry Project (JIP) was recently conducted to update these models using a collaborative approach based on shared data from six North American pipeline operators.
The next-generation fault tree was developed by performing three primary tasks. The first task updated the structure of the fault tree using previous models as a basis to reflect new damage prevention methods, as well as variations in hit rate between locations and seasons. This new structure also allows for estimation of hits from activities other than surface excavation activities, which have been the focus of most previous fault tree models.
The second task used JIP member and public data to populate the basic events of the fault tree model to reflect pipeline-specific conditions. Key updates were made regarding prevention measure effectiveness and the influence of depth of cover for different activity types. The final task involved validating the model against a representative North American pipeline network to ensure that the hit rates produced are consistent with the hit rates estimated from published pipeline incident records. Findings from this JIP are now informing plans for a second phase, the primary purpose of which is to develop probability-of-failure-given-hit models for the new activity types introduced in the fault tree, including agricultural tilling and drilling. These models will allow pipeline operators to fully incorporate the threat of alternate activity types into rigorous quantitative risk assessments.
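The sketch below shows the basic arithmetic of a hit-rate fault tree: an activity rate is propagated through prevention-failure and depth-of-cover events. The basic-event values are assumed for illustration and are not the JIP's calibrated inputs.

```python
# Minimal sketch of a third-party damage fault-tree calculation: the hit rate
# is an activity rate multiplied through a chain of prevention-failure
# probabilities. All basic-event values are assumed placeholders.

activity_rate = 0.5            # excavations near the line, per mile-year (assumed)

p_no_one_call = 0.15           # excavator fails to use one-call
p_locate_fails = 0.10          # locate/marking fails given one-call made
p_marks_ignored = 0.05         # excavator ignores accurate marks
p_cover_insufficient = 0.30    # equipment reaches pipe depth for this activity

# Pipe is hit if prevention fails (no one-call, OR failed locate, OR marks
# ignored) AND the equipment can reach the pipe.
p_prevention_fails = p_no_one_call + (1 - p_no_one_call) * (
    p_locate_fails + (1 - p_locate_fails) * p_marks_ignored
)
hit_rate = activity_rate * p_prevention_fails * p_cover_insufficient
print(f"estimated hit rate = {hit_rate:.3e} hits per mile-year")
```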
11
Optimizing Risk Decisions with Imperfect Data
In any integrity management program, there are always competing alternatives for any decision, such as whether to dig or not to dig, or to replace or not to replace. If perfect information were always available, like a math problem where everything except the answer is given, risk engineers would be an unnecessary expense. The next best alternative would be an exhaustive corpus of data with frequencies of every outcome and condition combination. However, in all but the most trivial cases this does not exist either, and the engineer is dealt partial, imperfect information in which the true state of nature is uncertain. To deal with this uncertainty in everyday life, people develop heuristics: mental shortcuts that allow us to process information with the least amount of effort and time. But when the probabilities are imprecise and the data imperfect, decisions based on these shortcuts can be fraught with biases and fallacies. All decisions carry some risk that depends on the (uncertain) true state of nature and the potential loss associated with a given course of action. This presentation will demonstrate an innovative application of decision theory that incorporates existing knowledge and potential consequences, and will use it to quantify the trade-offs between competing alternatives in an integrity management program to arrive at a decision that minimizes risk given the state of knowledge.
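The sketch below illustrates the core decision-theoretic step: choose the action that minimizes expected loss over the uncertain true state. The probability and loss values are hypothetical placeholders, not figures from the presentation.

```python
# Minimal sketch of the decision-theory idea: choose the action (dig vs.
# monitor) that minimizes expected loss over an uncertain true state (feature
# is injurious or benign). All values assumed for illustration.

p_injurious = 0.12                       # current belief from imperfect ILI data

losses = {                               # loss ($) by (action, true state)
    ("dig", "injurious"):     150_000,   # dig cost, failure avoided
    ("dig", "benign"):        150_000,   # dig cost spent unnecessarily
    ("monitor", "injurious"): 4_000_000, # expected cost of an in-service failure
    ("monitor", "benign"):    10_000,    # re-inspection / monitoring cost
}

def expected_loss(action):
    return (p_injurious * losses[(action, "injurious")]
            + (1 - p_injurious) * losses[(action, "benign")])

best = min(("dig", "monitor"), key=expected_loss)
print({a: round(expected_loss(a)) for a in ("dig", "monitor")}, "->", best)
```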
12
Addressing Uncertainty in Risk-Informed Decision Making
Uncertainties in risk management are broadly classified into two types: uncertainty arising from inherent randomness (aleatory) and uncertainty arising from lack of knowledge (epistemic). Knowledge uncertainty may arise from several sources during the quantitative risk assessment (QRA) of pipelines.
Three key components of QRA that may be affected by knowledge uncertainty are: 1) threats assessed: emerging threats and threat interactions have unknown or poorly understood mechanisms for triggering an impact on pipeline integrity; 2) failure frequency estimation: lack of data and the simplified models used for failure frequency may affect the estimated frequency by orders of magnitude due to unaccounted variance and significant model errors; and 3) consequence estimation: simplified consequence models, uncertainty regarding consequence model inputs, and the lack of a framework to address second-order consequence effects, such as reputational costs, introduce uncertainty into the quantified consequence. Risk-informed decisions can be enhanced when decision-makers have a full understanding of the estimated risk levels as well as the uncertainties associated with the assessments. The combined effect of multiple sources of knowledge uncertainty could lead to inflated risk if the different types of uncertainty are not distinguished in the quantified risk. For a decision-maker, this lack of distinction conceals the true drivers of risk and implies a level of precision in the risk estimate that may not be warranted in the presence of knowledge uncertainty. It also limits effective decision-making by obscuring the choice between mitigation actions and knowledge-gathering efforts: when assessing the preventative and mitigative actions available to address the risk, the possibility of reducing risk through knowledge gathering and research is not highlighted. Furthermore, if knowledge uncertainty cannot be quantified, as with unknown mechanisms of failure, there is no opportunity to express confidence in the estimated risk, and accurate comparison of different types of risk, such as safety, environmental, operational, and reputational, becomes ineffective.
This presentation outlines the common sources of knowledge uncertainty and approaches to address them. The application of different methods of quantifying knowledge uncertainty will be discussed based on the source of uncertainty: 1) categorical (e.g., classification of coating type), 2) numerical (e.g., specified minimum yield strength of a joint), and 3) models (e.g., multiple probit functions for safety consequences). In addition, an example application to a hypothetical gas pipeline in an urban area, with explicit consideration of knowledge uncertainty in the decision criteria, will be discussed.
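One common way to keep the two uncertainty types separate is a nested Monte Carlo, sketched below with assumed distributions: the outer loop samples the imprecisely known parameter (knowledge uncertainty) and the inner loop samples inherent variability, so the output is a spread of risk estimates rather than a single number.

```python
# Minimal sketch of separating knowledge (epistemic) from inherent (aleatory)
# uncertainty with a nested Monte Carlo. All distributions are assumed.

import random

def failure_prob(growth_mean, n_inner=20_000):
    """Aleatory loop: P(corrosion depth exceeds wall) for a given growth mean."""
    wall, years, fails = 10.0, 25, 0               # mm, assumed
    for _ in range(n_inner):
        growth = random.gammavariate(2.0, growth_mean / 2.0)   # mm/yr variability
        if growth * years > wall:
            fails += 1
    return fails / n_inner

random.seed(0)
# Epistemic loop: the mean growth rate itself is only known imprecisely.
estimates = [failure_prob(random.uniform(0.1, 0.3)) for _ in range(30)]
estimates.sort()
print(f"P(failure): median = {estimates[15]:.3f}, "
      f"spread = {estimates[1]:.3f} to {estimates[28]:.3f}")
```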
13
Consequences – An Empirical Analysis of Common Assumptions
Evaluating the consequence of failure typically involves assumptions, from the pipeline failure mode to the costs associated with projected impacts. Often, geospatial data is used to inform risk models of potential impacts in the area. While formulas that predict the flow/dispersion of commodities, the thermal radiation produced by fires, and the overpressure produced by explosions have typically been validated, many other formulas and assumptions may not have undergone a similar review process. This presentation focuses on using empirical evidence from the PHMSA accident/incident database, overlaid with geospatial data at the accident site, to validate or challenge common assumptions regarding the impacts of a release.
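The sketch below shows the shape of such a comparison: an assumed impact-distance relation evaluated against observed damage extents from incident records. The records and the formula are placeholders standing in for the PHMSA data and the assumption under review.

```python
# Minimal sketch of checking an impact-distance assumption against observed
# damage distances from incident records. The records are made-up placeholders
# standing in for PHMSA accident data joined to site geospatial observations.

import math

def predicted_impact_radius_ft(maop_psi, diameter_in):
    """Illustrative heat-intensity-based radius (same form as the gas PIR)."""
    return 0.69 * diameter_in * math.sqrt(maop_psi)

incidents = [  # (MAOP psi, diameter in, observed farthest damage ft) - hypothetical
    (900, 30, 520),
    (720, 24, 610),
    (1000, 36, 700),
]

for maop, dia, observed in incidents:
    predicted = predicted_impact_radius_ft(maop, dia)
    flag = "within" if observed <= predicted else "EXCEEDS"
    print(f"predicted {predicted:6.0f} ft vs observed {observed} ft -> {flag}")
```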
14
Gas Pipeline Ruptures and Emergency Response Considerations
Transient flow predictions from an ignited gas rupture site depend on rapidly isolating all valves in a valve section. The effect of various actions that can shorten valve closure intervals will be discussed. Unfortunately, the closure response time from detection of a line break to valve closure has less effect on when emergency responders can begin search and rescue inside the potential impact radius (PIR) than the public may expect. Emergency responders need to know when it is safe to begin search and rescue and to begin protecting property. Ignited pipeline ruptures are some of the largest fires these responders will ever have to address. Shutting in the flame by closing the block valves does not immediately extinguish the fire; physics ensures that the high-pressure gas remaining in the pipeline continues to jet out and feed the fire, so the search-and-rescue wait interval is measured in hours. This paper will discuss the effect of valve closure times for isolating different pipeline lengths. Outflow and the related safe levels of thermal radiation will also be discussed. This work is based on materials provided by INGAA members to PHMSA.
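The sketch below gives a rough feel for the time scales involved, using an idealized exponential blowdown of the isolated inventory after valve closure; the segment size, flow rates, and the "safe" outflow level are assumed values, not results from the INGAA material.

```python
# Minimal sketch of why the search-and-rescue wait is measured in hours: after
# the block valves close, the isolated inventory still blows down through the
# rupture. This exponential-decay idealization is not a transient flow model.

import math

segment_volume_ft3 = math.pi * (36 / 12 / 2) ** 2 * 10 * 5280   # 36 in x 10 mi
p0_psia, p_atm = 915.0, 14.7
initial_outflow_scf_s = 9_000.0    # release rate right after isolation (assumed)
safe_outflow_scf_s = 300.0         # flow below which radiation is tolerable (assumed)

# Treat outflow as proportional to pressure, giving exponential decay with a
# time constant set by standard-condition inventory over initial outflow.
inventory_scf = segment_volume_ft3 * p0_psia / p_atm
tau_s = inventory_scf / initial_outflow_scf_s
t_safe_s = tau_s * math.log(initial_outflow_scf_s / safe_outflow_scf_s)
print(f"time constant ~{tau_s / 3600:.1f} h; outflow reaches the 'safe' level "
      f"after ~{t_safe_s / 3600:.1f} h")
```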
15
Risk Analysis to Evaluate Benefits of Rupture Mitigation Valves
When an unintentional gas release occurs, emergency shutdown practices (e.g., isolation of the impacted segment to limit the gas release), along with other emergency response measures (e.g., firefighting and first responders), may reduce the severity of consequences. The application of Rupture-Mitigation Valve (RMV) technology (e.g., automatic shut-off valves and remote-controlled valves) results in more rapid shutdown compared to activation of a manually operated block valve, reducing delays in making the situation safe for firefighters and first responders to access the scene.
In the USA, the Pipeline and Hazardous Materials Safety Administration (PHMSA) published a final rule (effective October 5, 2022) that included updates to regulatory requirements for operators of gas transmission pipelines in 49 CFR § 192.935(c) to perform a risk analysis to determine whether an RMV would be an efficient means of adding protection to a high-consequence area (HCA) in the event of a gas release. The rule also included a new regulation (§ 192.935(f)) which requires that these risk analyses be reviewed by the pipeline operator and annually certified by a senior executive of the company.
This presentation will review a recent RMV risk analysis conducted for a gas pipeline operator to evaluate the potential risk reduction with RMVs compared to manually operated valves in the event of a hypothetical ignited gas pipeline rupture in an HCA. It will be shown how the risk analysis results can inform decision-making for prioritizing mitigative measures and support compliance with PHMSA regulatory requirements.
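As a minimal illustration of the comparison, the sketch below scales the gas fed to the rupture by the isolation time for an RMV versus a manual valve; the closure times and flow rate are assumed placeholders, not values from the analysis reviewed in the presentation.

```python
# Minimal sketch: gas released before isolation scales with closure time, so a
# faster-closing RMV shortens the interval before responders can approach.
# All values assumed for illustration.

line_pack_rate_scf_s = 7_500.0                               # upstream feed (assumed)
closure_time_s = {"RMV": 12 * 60, "manual valve": 75 * 60}   # assumed

released = {name: line_pack_rate_scf_s * t for name, t in closure_time_s.items()}
reduction = 1 - released["RMV"] / released["manual valve"]
print({k: f"{v:.2e} scf before isolation" for k, v in released.items()})
print(f"feed-gas reduction with RMV ~ {reduction:.0%}")
```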
16
Risk Management Learnings and Challenges at PG&E
PG&E's risk management for gas transmission pipelines is based on three models: a consequence area model, a threat identification model, and a risk model. The threat identification model, combined with the consequence area model, determines the pipe populations that require different types of periodic assessment per federal code, while the risk model is used to prioritize assessment and mitigation projects and for the cost-benefit calculations required for California rate case filings.
The presentation will discuss a few topics that PG&E has focused on for the last few years:
17
Efficient System-Wide Risk Assessment of Large Energy Pipeline Networks Leveraging Citizen Development and Cloud Computing
Large energy pipeline networks are complex engineering systems that traverse thousands of miles of terrain and often consist of pipe segments of different designs, vintages, and operating conditions. Quantitative risk assessment of such large pipeline networks therefore poses great challenges to pipeline operators in terms of data integration, data quality control, algorithm development, computational efficiency, and risk validation.
This presentation introduces TC Energy's ongoing effort to modernize its quantitative pipeline risk assessment process, moving from an obsolete PC-based application to a cloud-based solution from a major web service provider. The modernization enables automation of data integration and data quality reporting by leveraging the platform's data ETL (extract, transform, load) functionality, allows more efficient development, testing, and validation of risk analysis algorithms through the adoption of Python-based citizen development, and simplifies risk results reporting and communication. The modern cloud computing platform and Python-based citizen development also enable the use of advanced analytics and the adoption of the latest machine learning algorithms to further improve the risk analysis algorithms.
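The sketch below shows the flavor of a citizen-developed data-quality step that might run ahead of the risk calculation: flagging segments with missing or out-of-range attributes. The column names and thresholds are assumed placeholders, not TC Energy's schema.

```python
# Minimal sketch of a data-quality report before risk calculation: flag
# segments with missing or out-of-range attributes. All names and limits are
# assumed placeholders.

import pandas as pd

segments = pd.DataFrame({
    "segment_id":   ["A-001", "A-002", "A-003"],
    "wall_mm":      [9.5, None, 7.1],
    "maop_kpa":     [8_450, 8_450, 12_000],
    "coating_type": ["FBE", "tape", None],
})

issues = pd.DataFrame({
    "missing_wall":      segments["wall_mm"].isna(),
    "missing_coating":   segments["coating_type"].isna(),
    "maop_out_of_range": ~segments["maop_kpa"].between(1_000, 10_000),
})
report = segments[["segment_id"]].join(issues)
print(report[issues.any(axis=1)].to_string(index=False))
```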
18
Threat Interaction Factor Update and Optimization for the Gas Mega Rule Part 2 Compliance
Threat interaction has generally been modelled using Threat Interaction Factors (TIF), a concept introduced more than a decade ago. The factor is a correction that is added to the calculated probability of failure for a given threat to account for its interaction with another threat. The TIFs are determined from a review of the publicly available PHMSA incident database. Section 192.917 of the Gas Mega Rule Part 2 (RIN-2) requires the analysis of “the likelihood of failure due to each individual threat and each unique combination of threats that interact or simultaneously contribute to risk at a common location.” This requirement has been interpreted by some operators as the need to come up with a TIF for each possible threat combination. In preparation for RIN-2 compliance, the TIFs for natural gas transmission pipelines reported in PHMSA DTPH56-14-H-004, “Improving Models to Consider Complex Loadings, Operational Considerations, and Interactive Threats,” were updated and further processed to achieve better resolution. This presentation will also discuss why only a limited number of threat interactions are meaningful.
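A minimal sketch of the additive-correction idea described above, with placeholder values rather than the updated TIFs:

```python
# Minimal sketch of applying a threat interaction factor (TIF): a correction is
# added to the stand-alone probability of failure where an interacting threat
# is present on the same segment. Values are assumed placeholders.

pof_corrosion = 2.0e-5                                  # per mile-year, stand-alone
tif_correction = {("corrosion", "cracking"): 1.5e-5}    # additive correction (assumed)

def interacting_pof(base_pof, threat, coincident_threat):
    """Add the interaction correction when the other threat is present."""
    return base_pof + tif_correction.get((threat, coincident_threat), 0.0)

# Segment where cracking coincides with external corrosion:
print(f"{interacting_pof(pof_corrosion, 'corrosion', 'cracking'):.2e} per mile-year")
```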
19
Quantitative Risk Assessment of Pure and Blended Hydrogen Pipelines – Challenges and Opportunities
Growing demand for the design and operation of hydrogen and blended hydrogen / natural gas pipelines has introduced both challenges and opportunities for the application of quantitative risk assessment (QRA) to pipeline systems.
Challenges: Unlike conventional pipelines, hydrogen service pipelines do not have an extensive history of operation (and associated integrity data) upon which to base risk assessments. For this reason, QRA of hydrogen pipelines must be based on informed adjustments to QRAs of natural gas pipelines, structural reliability modeling, failure consequence modeling, and experimental data. While there have been significant developments on these fronts recently, there remain technical gaps that require further analysis and testing, including:
Opportunities: Probabilistic QRA methods can provide a means of systematically accounting for the significant uncertainties inherent in hydrogen pipeline operation and provide a quantitative basis to take credit for reductions in those uncertainties as research advances. QRA also provides a means to demonstrate the safety of pipeline operation in the absence of existing prescriptive regulatory guidance regarding hydrogen pipeline design and operation (or when existing regulatory guidance is prohibitively conservative). Further, QRA may be used to quantify the benefits of the mitigative measures (e.g. surface treatments, increased inspection frequencies) and alternative materials (e.g. composites) that are currently being investigated for use in hydrogen pipeline service.
This presentation will describe recent advances made in developing QRA models for pure and blended hydrogen pipelines, discuss opportunities for the application of these models to demonstrate safety and shape research, and identify key research gaps to be filled for this approach to reach its full potential.