In This Issue

Public Meeting on Applying, Interpreting, and Evaluating Data from ILI Devices

Editor’s note: I apologize for the length of the following article, but I think you will find the information very helpful in understanding OPS’s concerns on this topic.

The Office of Pipeline Safety (OPS) is hosting a public meeting to discuss its concerns with how operators are applying, interpreting, and evaluating data acquired from in-line inspection (ILI) devices, and OPS’s expectations for how operators should integrate this data with other information about their pipelines. The meeting will be held Thursday, August 11, 2005, from 8:30 a.m. to 4:30 p.m. in Houston, TX, and is open to all interested parties. The meeting location has not yet been determined and will be announced shortly.

ILI technology has been in use for approximately 20 years and has become the preferred method pipeline operators use to ensure the integrity of their pipeline assets. However, as demonstrated by recent accidents on hazardous liquid and natural gas pipeline systems, some pipelines inspected by ILI devices continue to fail. OPS will share its findings from these accidents and from recent Integrity Management Program (IMP) inspections. OPS needs to determine whether the problem resides in the technology or in the secondary and tertiary stages of ILI data evaluation: data characterization, validation, and mitigation. Specifically, is the problem data analysis, peer review of the technicians involved in data review, lack of common standards for data review, detection thresholds, data validation, or the understanding of each tool’s strengths and weaknesses? A secondary objective of this meeting is for OPS to understand how the government, pipeline operators, standards organizations, and ILI vendors can help improve pipeline assessment using ILI technology. At this public meeting, OPS will also highlight effective practices and use this forum to share them with the public.

The preliminary agenda for this meeting includes briefings on the following topics:

  • OPS’s Experiences with Data Extracted Using ILI Devices
  • OPS Case Studies
  • Hazardous Liquid IMP Inspection Experiences
  • Views of Pipeline Operators
  • Perspective from ILI Vendors
  • Focus of Independent ILI Data Analysts
  • ILI Standards
    • Personnel Qualification and Vendor Reports
    • ILI Flaw Detection Criteria
    • ILI Data Discrimination
    • Field Evaluation of ILI Data: Statistical Sampling, Flaw Thresholds, and Tolerances
    • Contractual Criteria for Defect Reports
  • Next Steps

OPS is concerned about the secondary and tertiary evaluations being performed after ILI data is acquired because of several accidents that have occurred throughout the U.S. in the recent past. According to OPS’s experience, failures have occurred on pipelines inspected by all types of ILI tools. The following are some examples of pipelines that failed relatively soon after the pipelines were inspected, the data was analyzed, and the findings were reported to the pipeline operators:

  • In 1999, a small hazardous liquid pipeline operator used a state-of-the-art tool, yet a “wrinkle with a crack” was mis-characterized as a “T-piece.” A few months later, the pipeline ruptured at the location of this wrinkle. Appurtenances and fittings such as T-pieces are typically welded to the main pipe, yet there were no girth welds on either side of this mis-characterized T-piece.
  • In 2003, a hazardous liquid pipeline that had been inspected only about a year earlier failed in service. OPS’s investigation revealed that general corrosion caused the failure. On analyzing the data, OPS found that the ILI tool had detected some pitting and that the maximum pit depth was reported as less than 50% of the pipe wall. However, a metallurgical analysis of the pipe segment revealed 27 corrosion pits ranging from 18% to 95% wall loss. The pipe failed where the wall loss was 95%.
  • In February 2004, a natural gas pipeline operator launched a geometry pig, but the tool missed a series of wrinkles. One of those wrinkles ruptured. During its post-incident investigation, OPS discovered that other wrinkles in the pipe had been called out as pipe wall thickness changes, even though there were no girth welds adjacent to the locations where the wall thickness supposedly changed.
  • Another hazardous liquid pipeline, inspected seven times with different tools over a span of 10 years, ruptured in 2004. The rupture was determined to have been caused by general corrosion, which had been detected by an ILI tool run before the most recent inspection.
  • In October 2004, a hazardous liquid pipeline operator launched three tools (a geometry pig, a corrosion detection pig, and an axial flaw detection pig) in succession to conduct a baseline assessment and comply with the IMP regulations. About six months after these tools were launched, the pipeline’s seam split.
  • In November 2003, incipient third-party damage caused another hazardous liquid pipeline to rupture just eight months after it was pigged. An investigation revealed several longitudinal scratches and gouges on the pipe surface that went undetected by the ILI device.

OPS has also learned that pipeline operators do not have a consistent, standardized process for evaluating and assessing data extracted by ILI devices. For example, some pipeline operators give ILI vendors, contract field inspection personnel, and company personnel guidance on how to assess ILI data. Others rely entirely on the ILI vendor, actively participate in data extraction themselves, or, if they have in-house expertise, conduct an independent peer review of the ILI data. For corrosion anomalies, pipeline operators use different interaction criteria: some want only the deepest pit reported on each pipe length, while others want all pit depths reported. One pipeline operator directed its ILI vendor to report all anomalies, especially those with indecipherable signatures. OPS believes this is a good practice, although it is not universally applied.
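The practical difference between these reporting criteria is easy to see in a small sketch. The following example uses hypothetical pit data (the joint names, depths, and detection threshold are invented for illustration, not drawn from any OPS case):

```python
# Hypothetical ILI pit records: (pipe joint, pit depth as % wall loss).
pits = [
    ("joint-12", 18), ("joint-12", 42), ("joint-12", 95),
    ("joint-13", 30),
]

# Criterion A: report only the deepest pit on each pipe length.
deepest = {}
for joint, depth in pits:
    deepest[joint] = max(deepest.get(joint, 0), depth)

# Criterion B: report every pit at or above a detection threshold.
threshold = 10  # % wall loss, assumed for this example
all_reported = [(j, d) for j, d in pits if d >= threshold]

print(deepest)            # {'joint-12': 95, 'joint-13': 30}
print(len(all_reported))  # 4
```

Criterion A hides the two shallower pits on joint-12, which is exactly the information an interaction analysis of clustered corrosion would need.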

OPS believes that most of the failures that occurred on pipeline segments inspected with ILI tools could have been prevented with the correct application of the technology. The failures OPS investigated suggest that the larger problem may lie in the human-machine interface during the latter stages of data analysis. Specifically: should the repositories of flaw signatures that ILI vendors use be improved? Should more attention be paid to the peer review of technicians? Is the sample size used to confirm electronic data adequate, or must it be increased? Should the data extraction process be more stringently monitored?

During this public meeting, OPS will seek answers to the following questions:

  • What are operators’ experiences and expectations with the capabilities of ILI technology?
  • Is there a gap in operators’ understanding of the ILI tool data submitted by vendors of this technology?
  • Do ILI technology vendors educate their clients about the limitations of the tool being recommended for the application?
  • What defect detection and reporting criteria are used? Are they developed jointly by the vendor and the pipeline operator?
  • How are tool defect-identification tolerances applied in the reporting criteria?
  • Is there a formal detection, validation, and mitigation process used to evaluate defects? How is it communicated to the pipeline operator?
  • What process is used to arrive at the number of confirmatory digs to corroborate the data extracted by the ILI device?
  • Are the standards developed for ILI technology appropriate for the current state of ILI deployment? Does the guidance meet the needs of a large or small pipeline operator using such technology for the first time?
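On the question of how many confirmatory digs corroborate the ILI data, one common back-of-the-envelope approach (an assumption for illustration only, not an OPS-prescribed method) treats each dig as an independent check of the tool’s calls and asks how many clean checks support a given confidence level:

```python
import math

def digs_needed(max_miss_rate, confidence=0.95):
    """Smallest number of confirmatory digs n such that, if every dig
    matches the ILI call, we can state with the given confidence that
    the tool's true miss rate is below max_miss_rate.

    Solves (1 - max_miss_rate)**n <= 1 - confidence for n.
    """
    return math.ceil(math.log(1 - confidence) / math.log(1 - max_miss_rate))

# e.g., to support a claim that the miss rate is below 10% or 5%
# with 95% confidence:
print(digs_needed(0.10))  # 29
print(digs_needed(0.05))  # 59
```

This sketch covers only the zero-discrepancy case; any dig that contradicts an ILI call would require a different statistical treatment, which is precisely the kind of process question the meeting is meant to surface.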

OPS expects this public meeting to provide information on the following:

  • The technique and criteria used to report defects
  • Information exchange between the ILI vendor and pipeline operator during the secondary and tertiary stages of flaw characterization
  • The currency and adequacy of performance standards for vendors of assessment technologies
  • Sufficiency and relevance of performance standards for ILI assessment technology
  • Stages in data discrimination: Detection, validation, and mitigation