Chapter 9
Data Quality and Management
The importance of data quality and management continues to increase
as we address ever more demanding interpretation challenges and move
irreversibly farther into the workstation environment. In the glare of technological developments enabled by massive growth in computing power, we
must not lose sight of the critical needs to assess data quality as an essential element of every interpretation project and to manage the burgeoning
volumes of data and interpretation products that threaten to overwhelm us.
More than you may realize, these two concerns can affect the quality, timeliness, and ultimately the business value of your interpretation.

Data quality
A discussion of seismic data quality necessarily begins by defining
exactly what is meant by “quality.” In its most general sense, quality is the
degree to which something fulfills its intended purpose. All measures of
seismic data quality are inherently subjective, so it is important to know why
a particular data set was acquired and processed the way it was so as to set
the proper context for assessing its quality. For example, you wouldn’t grade
data from a conventional 3D survey purposely acquired and processed for
deep exploration as poor quality because they aren’t suitable for evaluating shallow drilling hazards. Similarly, you wouldn’t consider data from a
2D high-resolution shallow hazards survey as poor quality because they are
useless for deep exploration (compare Figures 1 and 2). Given the purpose
for a data set, you evaluate quality by the degree to which the data set’s specific characteristics suit that purpose.
Assessing seismic data quality is one of the most important aspects of
your job as a seismic interpreter. It is an expectation that you satisfy and a
requirement that you meet in every interpretation project.

[Figure 1 image: seismic section; vertical axis is time (t), annotated at 1.5 s and 2.5 s.]
Figure 1. Line from a conventional 3D time-migrated data set that was purposely
acquired and processed for use in deep exploration (below 4.0 s). Compare this
line to the 2D high-resolution line in Figure 2 (courtesy WesternGeco).

Your ability to
describe and effectively communicate your evaluation of data quality develops over time as you gain experience, expand your knowledge of seismic
data acquisition and processing, and broaden your exposure to different elements of geology in a wide range of settings and environments.
At times, assessing data quality can seem to be no more than a beauty
contest. In a comparison of several lines from the same area or of the same
line with different processing sequences applied, the one that is most pleasing to the eye or looks most geologic or has the highest signal-to-noise ratio
(S/N — perhaps the most common quantitative measure of data quality)
or is visually dominated by smooth and coherent reflections and sharply
defined faults is judged to have the highest quality. Although this view may
often be correct, it should be taken only after you have reviewed the acquisition and processing history of the data and placed the seismic image into the
context of the geologic setting in which the data reside. After all, beauty can
be only “skin deep,” and data processing has been known to turn the occasional
sow’s ear into a silk purse.

[Figure 2 image: seismic section; vertical axis is time (t), annotated at 1.5 s and 2.5 s.]
Figure 2. Line from a 2D high-resolution (time-migrated) shallow hazards survey
that was purposely acquired and processed to optimize resolution of the shallow
section. Compare this line to the conventional 3D time-migrated line shown in
Figure 1; both lines are located along the same surface track (courtesy BP).

Your view of data quality takes shape and
evolves throughout a project as you look at all of your data. So take care not
to assess and communicate data quality until you have had the opportunity
to view all of the available data. Similar to the perils of the one-line interpretation, assessing data quality based on the view of only one or a small
number of lines from a larger grid or survey can be seriously misleading.
The three primary elements of seismic data quality are detection, resolution, and image fidelity (Schoenberger, 2004, personal communication).
All of the time and effort expended in acquiring and processing seismic data
in one way or another addresses one or more of these elements, with the
intent of providing you, the interpreter, with as near to an ideal representation of the true reflection seismic response to subsurface geology as possible
(refer to the “ideal” seismic response in Figure 1 of Chapter 1). Detection
is the determination by the seismic method that a feature is present, i.e., that it gives
rise to a seismic reflection that is recognizable as signal above the level of
ambient or background noise. The power of a data set to detect subsurface
features of interest is often described in terms of its S/N. Resolution is the
ability of the seismic method to distinguish two features that are close together
(refer to Chapter 6). The so-called resolvable limit, then, for discrete seismic reflectors is the minimum separation for which you can tell that more
than one interface is involved in the observed seismic response (Sheriff,
2002). Finally, image fidelity is the degree to which the processed seismic
response accurately depicts the true subsurface positions and magnitudes of
seismic reflections. It consists of correct focusing and accurate positioning
of reflections.
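To make these definitions concrete, here is a minimal sketch in Python; the
function names, the synthetic trace, and the window choices are illustrative
assumptions, not part of the text. S/N is estimated as the ratio of RMS
amplitudes between an interpreter-chosen signal window and noise window, and
the resolvable limit is approximated by the commonly used quarter-wavelength
(Rayleigh) rule.

    import numpy as np

    def rms(x):
        """Root-mean-square amplitude of a window of samples."""
        return np.sqrt(np.mean(np.asarray(x, dtype=float) ** 2))

    def estimate_snr(trace, signal_window, noise_window):
        """Crude S/N estimate: RMS amplitude of a window judged to contain
        signal divided by RMS amplitude of a window judged to contain only
        ambient noise. Window placement is an interpretive choice."""
        signal = rms(trace[slice(*signal_window)])
        noise = rms(trace[slice(*noise_window)])
        return signal / noise

    def resolvable_limit(velocity, dominant_frequency):
        """Quarter-wavelength (Rayleigh) approximation to the minimum
        separation at which two interfaces can still be distinguished."""
        return velocity / (4.0 * dominant_frequency)

    # Purely synthetic illustration: background noise plus one strong
    # "reflection" centered near sample 925.
    rng = np.random.default_rng(seed=0)
    trace = rng.normal(0.0, 1.0, 2000)
    trace[900:950] += 8.0 * np.hanning(50)

    print(estimate_snr(trace, (875, 975), (0, 500)))  # well above 1: detectable
    print(resolvable_limit(2500.0, 30.0))             # about 20.8 m at 2500 m/s, 30 Hz

Note that both numbers inherit the subjectivity discussed above: the S/N
estimate depends entirely on where you choose to place the windows.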
Usually you assess overall data quality using a range of subjective terms
such as “excellent,” “good,” “fair,” “poor,” and “unusable” (avoid colloquialisms). You should be mindful of and prepared to discuss and illustrate
which of the three elements of data quality dominates in your assessment. It
is good practice in any interpretation project to map areal variations in data
quality, remembering that quality very often will vary with time/depth as
well. In other words, a single data-quality map for a particular project will
not necessarily represent variations in quality and, ultimately, in confidence
of picking from horizon to horizon.
A common product of mapping seismic data quality is shown in Figure 3, in which a green-yellow-red or “traffic light” color sequence is used
to represent quality grades of good-fair-poor. As with any geologic or geophysical map, it is essential to include a legend on a data-quality map to
ensure that color and grade conventions are communicated clearly. You
can construct maps such as Figure 3 by hand or in a workstation by picking
a horizon that reflects different grades of data quality.

[Figure 3 image: map view with traffic-light legend entries Good, Fair, and Poor.]
Figure 3. A “traffic light” data-quality map for an interpretation project area. The
legend shows how the colors correspond to the different grades of data quality.

The workstation
approach involves manually tracking a horizon composed of quality picks
whose values or ranges of values correspond to different grades of data
quality (see Figure 4); the horizon could represent overall data quality or the
quality or confidence associated with picking specific horizons of interest.
This approach requires additional picking effort and concentration on continuous assessment of data quality — well worth the time in terms of value
added to overall interpretation results.
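As a minimal sketch of this workstation approach (Figure 4), the following
Python fragment encodes quality grades as reserved pick values in a
pseudo-horizon and maps each value back to its traffic-light color. The
reserved values, names, and example grid are hypothetical.

    import numpy as np

    # Reserved pick values (in ms) standing in for quality grades; the
    # values are arbitrary, they only need to be distinct so that the
    # color table can separate them.
    GRADE_VALUES = {"good": 100.0, "fair": 200.0, "poor": 300.0}
    GRADE_COLORS = {"good": "green", "fair": "yellow", "poor": "red"}

    def quality_horizon(grades):
        """Turn a 2D array of grade strings into a pseudo-horizon of
        reserved time values, as if the grades had been hand-picked on
        the workstation."""
        return np.vectorize(GRADE_VALUES.get)(grades).astype(float)

    def traffic_light_color(pick_value):
        """The map's color table: reserved pick value -> display color."""
        for grade, value in GRADE_VALUES.items():
            if np.isclose(pick_value, value):
                return GRADE_COLORS[grade]
        return "white"  # no quality pick at this node

    grades = np.array([["good", "good", "fair"],
                       ["good", "fair", "poor"],
                       ["fair", "poor", "poor"]])
    horizon = quality_horizon(grades)
    print(traffic_light_color(horizon[2, 1]))  # -> "red"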
Data-quality maps can be used as stand-alone maps or as overlays to
other maps created during an interpretation project. Do not consider any
interpretation to be complete until it includes an assessment of data quality
in the form of a finished data-quality map(s) and accompanying descriptive text in a final project report. Your conclusions about data quality are a
critical element in characterizing uncertainty and risk for your interpretation and provide a basis for recommendations to acquire or reprocess data.

[Figure 4 image: schematic showing the quality of a “blue” pick, a color table, time (depth) values, and the resulting map view.]
Figure 4. Constructing a data-quality map from quality picks associated with an
interpreted horizon. The quality picks are made as a single horizon whose times
(depths) or time (depth) ranges uniquely correspond to specific quality grades.
The color table for the resulting map is designed to portray the different grades in
a traffic light scheme according to the time (depth) of the quality picks.

As mentioned, grading the quality of a seismic data set is undoubtedly
subjective and depends heavily on your experience and your knowledge of
seismic acquisition and processing. At the same time and in the same way
that you improve correlation skills by looking at as much data as possible,
you also develop the ability to evaluate data quality and effectively communicate the results of a quality evaluation. Simply stated, it is impossible
for anyone who actively seeks to enhance interpretive skills and judgment
to look at too much data.

Data management
Data management involves procedures and practices to help you keep
track of the large quantities of data and interpretation products common
in the oil and gas business. In the modern interpretation setting, data management is primarily an electronic concern. In the preworkstation days,
data management involved activities such as printing and storing maps
and paper sections; losing, misplacing, or physically damaging maps and
paper sections; sharing maps and paper sections with other interpreters;
and nuisances such as running out of a particular shade of colored pencil or
other critical resource. In retrospect, there were recognizable boundaries
in those days — sets of tangible constraints that limited the amount of data
and information a company or individual could amass. In contrast is the
modern workstation environment, in which there appear to be few boundaries, especially in view of the increasing power and efficiency of computers and the seemingly unlimited amount of available electronic storage
space. The quantity of data volumes and derived interpretation products
continues to expand unchecked. It is more important than ever for you to
manage data, that is, to bear responsibility and accountability for what to
keep, where to keep it, how to find it later, and, perhaps most importantly,
what to throw away.
The increasing number of depth-migration projects has placed special
demands on data management. These projects involve
multiple stages of interpretation on several different migration volumes
(sediment floods, salt floods, etc.), and in the most complex settings there
may be migrations with different algorithms in a particular stage of the project. For example, as illustrated in Figure 10 of Chapter 5, there are a number of different algorithms, each with its own technical and cost benefits,
that you might use to address a difficult base-of-salt imaging problem in a
subsalt interpretation project. In such a case, your final base-of-salt surface
could be an amalgamation of horizons picked on several different data volumes into a single surface you would then use for depth migration. If you

Downloaded 17 Feb 2012 to 198.3.68.20. Redistribution subject to SEG license or copyright; Terms of Use: http://segdl.org/

SEG-SEISMIC-11-0601-009.indd 158

05/12/11 2:16 PM

Chapter 9:  Data Quality and Management  159

would ever need to refine or update this surface, you would have to know
which part of it had been picked on which version of migration, which
requires that you properly manage your horizons and migration volumes.
In such a project, it will not always be clear which migrated volume affords
the most accurate base-of-salt image, so your experience often will dictate
which image is most nearly correct (or least incorrect) or most geologically
reasonable in the sense that it is analogous to known geology.
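One way to keep that bookkeeping honest is to carry a provenance grid
alongside the amalgamated surface. The sketch below (Python with NumPy) is one
hypothetical implementation; the volume names, source IDs, and override order
are invented for illustration.

    import numpy as np

    # Hypothetical migration volumes on which parts of the base-of-salt
    # horizon were picked.
    SOURCES = {1: "kirchhoff_saltflood", 2: "beam_sedflood", 3: "rtm_final"}

    def amalgamate(picks):
        """Merge partial horizons into one surface plus a provenance grid.

        picks: list of (source_id, grid) pairs, where grid is a 2D depth
        array with NaN wherever the horizon was not picked on that volume.
        Later entries override earlier ones where both have picks."""
        shape = picks[0][1].shape
        depth = np.full(shape, np.nan)
        source = np.zeros(shape, dtype=int)  # 0 means "never picked"
        for source_id, grid in picks:
            mask = ~np.isnan(grid)
            depth[mask] = grid[mask]
            source[mask] = source_id
        return depth, source

    a = np.array([[1000.0, np.nan], [np.nan, np.nan]])
    b = np.array([[np.nan, 1200.0], [1100.0, np.nan]])
    depth, source = amalgamate([(1, a), (3, b)])
    print(depth)   # the merged surface
    print(source)  # 1 or 3 where picked, 0 where never picked

After merging, SOURCES[source[i, j]] answers the question a later refinement
would ask: on which migration volume was node (i, j) picked?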
Recommended practices for good data management are independent
of specific workstation systems. Most commercial workstation systems
include basic data-management functionality that varies only in specifics
from system to system. The most important practice might well be self-restraint
in creating and maintaining only those data volumes and products that contribute
measurably to a project and that serve as parent entities from which you can
recreate other volumes and products as necessary; this, of course, depends on
the frequency with which the child entities are used and the amount of time it
takes to recreate them. The practice also involves communication with other
interpreters on the same project to avoid needless duplication of data and
products.*

*See Herron (2001) for a more detailed discussion of seismic data management,
with particular reference to the pitfalls involved in failure to manage multiple
data sets effectively and the inability to conduct an integrated interpretation
based on multiple data sets.
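The keep-or-recreate judgment described above can be reduced to a simple cost
comparison. The following Python sketch is one hypothetical way to encode it;
the Product fields, the threshold, and the rule itself are illustrative
assumptions rather than a prescribed policy.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Product:
        """One stored volume or product, with the bookkeeping needed to
        decide whether it earns its keep. All fields are illustrative."""
        name: str
        parent: Optional[str]      # entity it can be recreated from
        hours_to_recreate: float   # effort to rebuild it from its parent
        uses_per_month: float      # how often anyone actually loads it

    def keep(product, threshold_hours_per_month=1.0):
        """Illustrative retention rule: parents are always kept; a child
        is kept only when recreating it on demand would cost more time
        per month than an (arbitrary) threshold."""
        if product.parent is None:
            return True
        return (product.hours_to_recreate * product.uses_per_month
                > threshold_hours_per_month)

    nearstack = Product("ewfa04_nearstack", parent="ewfa04",
                        hours_to_recreate=2.0, uses_per_month=0.25)
    print(keep(nearstack))  # False: cheap to rebuild and rarely used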

Nomenclature systems
Effective management of an interpretation project requires that, from
the outset, you establish and maintain nomenclature systems for data sets
and interpretation products. These systems must be intuitive, rigorous in the
sense that they must contain core information without fail, and flexible or
expandable to accommodate information that is unique to the type of data
or product being managed. A well-designed system standardizes nomenclature for any type of data or product that can be categorized: trace data files,
arbitrary lines extracted from 3D volumes, horizons, faults, mapping files,
wells, and culture files (such as lease outlines and political boundaries).
Within a system, you should subdivide any particular category in whatever
way facilitates information storage or retrieval. It goes without saying that
you should thoroughly document all nomenclature conventions and link
this documentation to its project, even if there are corporate standards for
nomenclature, in the same way and for the same reason that every map has
a scale bar, north arrow (or some azimuth reference), and legend.

A hypothetical nomenclature system for interpreted horizons in a workstation project might read as follows:
Each horizon name must have a minimum number of core elements,
each delimited by an underscore (_). These elements must appear in the
following order in each unique horizon name:
• Project identifier
• Numerical designation for the approximate geologic age of the horizon
  (this number increases with the age of the interpreted horizon and
  enables the horizon list to be displayed in order of horizon age)
• Color assigned to the horizon
• Named lithologic unit, where applicable
• Biostratigraphic age of the horizon
• Name of the trace data file on which the horizon was interpreted (for
  individual horizons only; does not apply to merged horizons)
• Initials of the interpreter who picked the horizon
Following this convention, the components for a horizon named
X2005_0600_green_A_pl20_ewfa04_DH would be defined as in Table 1.

Table 1. Components in a hypothetical nomenclature system for an interpreted
horizon named X2005_0600_green_A_pl20_ewfa04_DH.

Term     Definition or meaning
X2005    Exploration project that began in the year 2005
0600     Numerical designation for the horizon
green    Color assigned to the interpreted horizon
A        Named lithologic unit (in this case, the designated “A” sand)
pl20     Biostratigraphic age of the horizon (in this case, lower Pliocene)
ewfa04   Name of trace data file on which the horizon was interpreted (in this
         case, the final gained version of a PSTM data volume with a sample
         rate of 4 ms)
DH       Interpreter’s initials

In our hypothetical system, a process such as interpolation, filtering, or
attribute extraction applied to an original (parent) horizon would be specified
in the horizon name after the designation of the biostratigraphic age or the
trace data file but before the interpreter’s initials. In our example, the
parent horizon smoothed with a 5 × 5 equally weighted spatial filter would be
X2005_0600_green_A_pl20_ewfa04_sm5x5eq_DH.
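Such a convention is easy to enforce programmatically. Here is a minimal
sketch in Python, assuming the core elements of Table 1; the helper names and
the parsing rule for processing suffixes are illustrative, not part of the
hypothetical system itself.

    CORE_FIELDS = ["project", "age_number", "color", "lith_unit",
                   "bio_age", "trace_file", "interpreter"]

    def make_horizon_name(project, age_number, color, lith_unit,
                          bio_age, trace_file, interpreter, process=None):
        """Compose a horizon name under the hypothetical convention:
        core elements joined by underscores, with any processing step
        (e.g., 'sm5x5eq') inserted just before the interpreter's initials."""
        parts = [project, age_number, color, lith_unit, bio_age, trace_file]
        if process:
            parts.append(process)
        parts.append(interpreter)
        return "_".join(parts)

    def parse_horizon_name(name):
        """Split a horizon name back into its elements; anything between
        the trace-file element and the final initials is treated as a
        processing step. Raises if core elements are missing."""
        parts = name.split("_")
        if len(parts) < len(CORE_FIELDS):
            raise ValueError(f"'{name}' lacks required core elements")
        fields = dict(zip(CORE_FIELDS[:6], parts[:6]))
        fields["process"] = "_".join(parts[6:-1]) or None
        fields["interpreter"] = parts[-1]
        return fields

    name = make_horizon_name("X2005", "0600", "green", "A",
                             "pl20", "ewfa04", "DH", process="sm5x5eq")
    assert name == "X2005_0600_green_A_pl20_ewfa04_sm5x5eq_DH"
    print(parse_horizon_name(name)["process"])  # -> sm5x5eq

A parser like this also lets a project audit its existing horizon lists for
names that violate the convention.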
Although it is common practice to name trace data files carefully in
2D and 3D workstation projects, many interpreters do not recognize the
importance of using consistent and intuitive nomenclature for arbitrary lines
extracted from 3D volumes. You should assign names to these lines in such
a way that another interpreter can easily grasp the purpose for which the
line was created. For example, if an arbitrary line is a well-tie line, then
somewhere in the name for that line should be the name(s) of the well(s) to
which the line is tied. At the same time, you should specify the orientation
of an arbitrary line in X-Y-space whenever possible. You can easily do this
by naming the line according to its compass direction. For instance, if an
arbitrary line is extracted in the northeast to southwest direction, then its
name should include the characters N45E or S45W to indicate its azimuth.
Compass directions are preferred over the standard abbreviations NE or SW
because often the line direction cannot be described by these abbreviations,
e.g., N60E or S30W. The operating philosophy in nomenclature schemes is
to establish and maintain order and consistency in your data management —
to eliminate obfuscation, if you will.
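If your workstation can export the endpoint coordinates of an arbitrary line,
the quadrant-bearing token can be computed rather than estimated by eye. A
minimal sketch in Python, assuming map coordinates with x increasing east and
y increasing north; the function name and conventions are invented for
illustration.

    import math

    def bearing_token(x0, y0, x1, y1):
        """Quadrant-bearing token (e.g., 'N45E', 'S30W') for a line with
        the given endpoint map coordinates."""
        azimuth = math.degrees(math.atan2(x1 - x0, y1 - y0)) % 360.0
        if azimuth <= 90.0:
            return f"N{azimuth:.0f}E"
        if azimuth <= 180.0:
            return f"S{180.0 - azimuth:.0f}E"
        if azimuth <= 270.0:
            return f"S{azimuth - 180.0:.0f}W"
        return f"N{360.0 - azimuth:.0f}W"

    print(bearing_token(0.0, 0.0, 1000.0, 1000.0))  # -> N45E
    print(bearing_token(0.0, 0.0, -500.0, -866.0))  # -> S30W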
The importance of systems such as this cannot be overstated. The story
is told of an interpreter who arrived at a new posting and asked which of the
many interpretations in his new project was the best or latest or least incorrect. No one could answer this question, and the previous interpreter had
left no decipherable records (a problem in itself). So the interpreter did what
almost everyone else would do: Judging that it would take longer to evaluate all of the existing interpretations than to do his own, he proceeded with
his own interpretation. The unfortunate result was to further entangle the
project with more picked horizons and faults, many of which were probably
duplications of existing work.
This was not lack of data management; rather, it was data mismanagement. Eventually, data mismanagement has a negative business impact
that may not become evident until many years and interpreters later. The
excuses that “It doesn’t matter because some day this project will have no
value” or “Don’t worry, we can get by” are unacceptable because poor data
management is habit forming, and its effects can be latent, harmful, and
unpredictable.
