
SEP-152 (2014)

Download: Book (pdf)

Full waveform inversion

Efficient and robust waveform-inversion workflow: tomographic FWI followed by FWI (pdf)

[SRC]
Biondo Biondi and Ali Almomin
In many important practical cases, when FWI convergence is uncertain because of a lack of low-frequency data and/or long offsets, we would like to employ waveform-inversion methods, such as tomographic FWI (TFWI), that offer more robust convergence. However, the additional computational cost can be a serious deterrent from applying TFWI to the full bandwidth of the data. As an alternative, we propose the following cost-effective TFWI+FWI workflow: 1) TFWI applied to the low frequencies in the data, 2) FWI applied to the high frequencies starting from the model estimated by TFWI. We tested TFWI+FWI on two synthetic datasets computed from the Marmousi 2 model, comparing the results obtained by full-bandwidth TFWI with models obtained by the proposed workflow when TFWI was applied to data low-passed to a maximum of 10 Hz. The new workflow converged to very accurate models even when conventional FWI with frequency continuation failed. Further cost savings can be achieved by switching from TFWI to FWI before TFWI reaches full convergence. Depending on the number of TFWI iterations, the quality of the final model is negatively affected, but it can still be satisfactory.
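
A minimal sketch of the proposed two-stage workflow is given below, assuming the inversion engines are available as callables; run_tfwi and run_fwi are hypothetical placeholders, and only the low-pass split of the data and the model hand-off between the stages are shown.

    # Sketch of the TFWI+FWI cascade (Python/NumPy/SciPy).
    # run_tfwi() and run_fwi() are hypothetical placeholders for the actual engines.
    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def lowpass(data, fmax_hz, dt):
        """Zero-phase low-pass along the time (last) axis."""
        sos = butter(6, fmax_hz, btype="low", fs=1.0 / dt, output="sos")
        return sosfiltfilt(sos, data, axis=-1)

    def tfwi_plus_fwi(data, dt, m0, run_tfwi, run_fwi, fmax_tfwi=10.0):
        data_low = lowpass(data, fmax_tfwi, dt)   # stage 1 input: <= 10 Hz
        m_tfwi = run_tfwi(data_low, m0)           # robust tomographic FWI
        return run_fwi(data, m_tfwi)              # stage 2: full-band FWI, warm start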


Preconditioned tomographic full waveform inversion by wavelength continuation (pdf)

[SRC]
Ali Almomin and Biondo Biondi
Tomographic full waveform inversion (TFWI) provides a seismic-data inversion framework that is immune to cycle-skipping problems. This is achieved by extending the wave equation and adding a spatial or temporal axis to the velocity model. For computational efficiency, the inversion is performed using a nested scheme. However, TFWI requires a large number of iterations because of its slow convergence rate. We analyze the Born and tomographic operators and find two major sources of this slow convergence. The first source is kinematic artifacts in the extended model due to a biased AVA behavior of the acoustic two-way wave equation. The second source is the early imposition of short-wavelength updates in the velocity model that are difficult to move afterward. We provide two modifications of the nested inversion scheme of TFWI that mitigate these sources of slow convergence. The first modification is preconditioning of the extended model to balance the AVA behavior of the acoustic wave equation. The second modification is imposing wavelength continuation by filtering the gradient in the outer loop. We test the new algorithm on synthetic examples. The results of the modified algorithm on the BP model show a great improvement in convergence rate while maintaining the high accuracy of TFWI.
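
The wavelength-continuation step can be illustrated with a small sketch: the outer-loop gradient is passed through a spatial low-pass filter whose cut-off is relaxed as the iterations proceed, so long-wavelength updates are imposed first. The Gaussian smoother below is an assumed stand-in for the paper's actual filter.

    # Sketch of wavelength continuation by filtering the outer-loop gradient.
    from scipy.ndimage import gaussian_filter

    def continued_gradient(grad, outer_iter, n_outer, sigma_max=40.0, sigma_min=2.0):
        """Smooth the gradient heavily at first, less so in later outer iterations."""
        frac = outer_iter / max(n_outer - 1, 1)
        sigma = (1.0 - frac) * sigma_max + frac * sigma_min   # in grid points
        return gaussian_filter(grad, sigma=sigma)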

 
Joint full-waveform inversion of time-lapse seismic data sets (pdf)

[SRC]
Musa Maharramov and Biondo Biondi
We present a technique for reconstructing subsurface model changes from time-lapse seismic survey data using full-waveform inversion (FWI). The technique is based on simultaneously inverting multiple survey vintages, with regularization of the model difference. In addition to the fully simultaneous FWI that requires the solution of a larger optimization problem, we propose a simplified cross-updating workflow that can be implemented using the existing FWI tools. The proposed methods are demonstrated on synthetic examples, and their robustness with regard to repeatability issues is compared to alternative techniques, such as parallel, sequential, and double-difference methods.

 
Shape optimization using the FWI objective function for salt body segmentation (pdf)

[SRC]
Taylor Dahlke, Biondo Biondi, and Robert Clapp
Level set methods can provide a sharp interpretation of the salt body by defining the boundary as an isocontour of a higher dimensional implicit representation, and then evolving that surface to minimize the Full Waveform Inversion (FWI) objective function. We can take advantage of the fact that the implicit surface update gradient is based on the tomographic update gradient, and utilize it to update the background velocity concurrently with the salt boundary. Using this approach on synthetic examples, we can achieve reasonable convergence both in terms of the residual L2 norm, as well as the evolution of the salt boundary and background velocity towards the true model.

 
Revisiting absolute amplitude matching in waveform inversion (pdf)

[SRC]
Xukai Shen
In waveform inversion, the high resolution of the inversion results is usually attributed to absolute amplitude matching. Such high resolution is particularly attractive in complex geological settings where conventional ray-based methods fail to deliver enough resolution. In this paper, I reexamine the importance of absolute amplitude matching in waveform inversion. With enough illumination angles, absolute amplitude matching does not bring additional resolution, yet it makes the inversion results extremely sensitive to absolute amplitude mismatches. I illustrate this with acoustic inversions of non-acoustic data.

Anisotropy and attenuation

Rock physics constrained anisotropic WEMVA: Part I - Theory and synthetic test (pdf)

[SRC]
Yunyue (Elita) Li, Robert Clapp, Biondo Biondi, and Dave Nichols
We present a regularization scheme that utilizes available rock physics data to better constrain anisotropic wave-equation migration velocity analysis (WEMVA) and to better resolve the ambiguity among the anisotropic parameters. In addition to a spatial covariance that constrains the spatial correlation of each VTI parameter individually, we propose a cross-parameter covariance at each subsurface location to link the VTI parameters. This regularization scheme has two significant effects on the updates of the VTI parameters. First, instead of spreading the updates evenly along the wavepath, the regularization term allows more updates in the regions where the models are highly uncertain. Second, the regularization term brings extra information for parameter updates from the correlation with the other parameters. These improvements help the inversion converge faster and yield VTI models that are more consistent with the underlying geological and lithological assumptions. We demonstrate these improvements on a synthetic dataset.

 
Artifact reduction in pseudo-acoustic modeling by pseudo-source injection (pdf)

[SRC]
Musa Maharramov
In this work I propose a framework for deriving fast finite-difference algorithms for the numerical modeling of acoustic wave propagation in anisotropic media. The framework is demonstrated in the case of transversely isotropic media, for which I have implemented and tested a kinematically accurate fast finite-difference modeling method. I demonstrate that a significant reduction of the shear artifacts is achieved by the proposed technique in comparison to similar kinematically accurate finite-difference methods. I describe alternative artifact-free but computationally more expensive spectral methods as well.

 
Rock physics constrained anisotropic WEMVA: Part II - Field data test (pdf)

[SRC]
Yunyue (Elita) Li, Robert Clapp, Biondo Biondi, and Dave Nichols
We test our rock physics constrained anisotropic WEMVA methodology on a Gulf of Mexico dataset. Based on the well logs and the previously inverted lithological interpretations, we perform stochastic rock physics modeling to sample the possible ranges of the anisotropic parameters. These modeling results are then summarized by an average model and a cross-parameter covariance matrix under the multivariate Gaussian assumption. When inverting the surface seismic data using the anisotropic WEMVA method, we start from the average model and regularize the inversion using the geological dips and the cross-parameter covariance. The preliminary results show improvements in the migrated image with higher resolution and better definition of the dipping sedimentary layers around the salt in the shallow region. Further iterations are needed to better resolve the VTI model and to properly focus the image at depth.

 
Removing shear artifacts in acoustic anisotropic wave propagation (pdf)

[SRC]
Huy Le, Stewart A. Levin, Robert G. Clapp, and Biondo Biondi
We present a method for removing the shear-wave artifacts that occur in anisotropic modeling under the acoustic approximation. Our method is based on an eigenvalue decomposition of the differential operator in the wavenumber domain. Application to a homogeneous orthorhombic medium shows that the shear-wave artifacts are removed completely. The accuracy of the resulting operator is also investigated in different media. We find that as the degree of anisotropy decreases, a more compact operator can be used to achieve an acceptable level of accuracy. We apply the proposed method to a heterogeneous model by using the Lloyd algorithm to select a number of references and to compute a table of operators.

 
Wave-equation migration Q analysis (WEMQA) from Angle-Domain Common Image Gather (ADCIG) (pdf)

[SRC]
Yi Shen
To produce a reliable Q model, I present a new method for wave-equation migration Q analysis in angle-domain common image gathers, and develop two ways of choosing the reference images for the objective function: one using the zero angle of each angle gather and one using the zero angle of the reference angle gather. Two synthetic tests of this method demonstrate its ability to retrieve a model with Q anomalies, especially when using the zero angle of the reference angle gather. Compared with Q analysis on the stacked image, Q estimation using pre-stack gathers can obtain a higher resolution result and mitigate the side lobe problems that arise in the stacked gather.

Imaging and inversion

Preliminary results of iterative 1D imaging with the hybrid penalty function (pdf)

[SRC]
Mandy Wong and Antoine Guitton
The hybrid penalty function (HPF) varies smoothly from l1 to l2, thus offering opportunities for (1) robust optimization when non-Gaussian noise is present in the data and (2) sparseness regularization when blocky or spiky models are needed. Using a solver designed to minimize the HPF and a 1D migration operator, both properties are tested and the inversion results are compared with those obtained with the l2 norm. The HPF yields sparse, noise-free reflectivity series. The choice of the parameters controlling the sparseness behavior remains difficult. More realistic 2D and 3D examples should follow.
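
One common form of such a hybrid penalty, given here only as an illustration (the paper's exact definition may differ), is quadratic for residuals much smaller than a threshold eps and linear for residuals much larger than it:

    # Sketch of a smooth l2-to-l1 hybrid penalty and its gradient.
    import numpy as np

    def hybrid_penalty(r, eps):
        """~ r**2/(2*eps**2) for |r| << eps (l2-like); ~ |r|/eps for |r| >> eps (l1-like)."""
        return np.sqrt(1.0 + (r / eps) ** 2) - 1.0

    def hybrid_penalty_gradient(r, eps):
        """Derivative with respect to the residual r."""
        return (r / eps ** 2) / np.sqrt(1.0 + (r / eps) ** 2)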


What Bayes sayes (pdf)

[SRC]
Stewart A. Levin
Bayesian approaches have been applied to many challenges in the Earth Sciences, including earthquake characterization, well log correlation, pollution monitoring, and reservoir history matching. These approaches provide a completely rational and mechanical means for incrementally improving almost any initial prior probability distribution towards the actual distribution as new information is presented. Indeed, in this very report the Stanford Exploration Project is tackling uncertainty in seismic inversion by including additional, probabilistic information from rock physics models. In this report, I explicate the method and some of its limitations, in particular the value of a good prior when, as seems inevitable, we have only a meager supply of new information, concluding that expertise really counts.
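
A toy numerical example of the incremental updating referred to above, with invented numbers: a discrete prior over two hypotheses is sharpened each time a new observation and its likelihood arrive.

    # Toy Bayesian update: prior -> posterior, one observation at a time.
    import numpy as np

    prior = np.array([0.5, 0.5])                  # P(H0), P(H1): uninformative prior
    likelihoods = [np.array([0.3, 0.7]),          # P(obs 1 | H0), P(obs 1 | H1)
                   np.array([0.4, 0.6])]          # P(obs 2 | H0), P(obs 2 | H1)

    posterior = prior.copy()
    for lik in likelihoods:
        posterior = posterior * lik               # Bayes' rule, unnormalized
        posterior /= posterior.sum()              # renormalize
    print(posterior)                              # approx. [0.22, 0.78]: weight shifts to H1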

Signal and image processing

Automatic default for hyperbolic softclip (pdf)

[SRC]
Jon Claerbout
The hyperbolic penalty function leads us to gain residuals r, initially data d, by gain parameter g in the softclip function h'(d) = gd/√(1 + g²d²), producing output in the range ±1, convenient for viewing data and for scaling in an optimization gradient. Annoyingly, a numerical value of the scaling factor g must be chosen. Personal judgement with a data set here suggests starting with g as the inverse of the 75th percentile of |d| or |r|. From there I explore a method of finding a g that is optimum in the sense of uniformly populating the output range [-1,+1]. A value of g satisfying our intuitive sensibilities was found by minimizing a Jensen inequality involving sums of |r| log(|r|). This suggests an automatic default for the l2-to-l1 transition. I hypothesize that data-fitting iterations will be accelerated by applying softclip to the residual before gradient calculation.
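
A minimal sketch of the softclip gain and the suggested starting default g = 1/(75th percentile of |d|); the Jensen-inequality search for an optimal g is not reproduced here.

    # Hyperbolic softclip h'(d) = g*d / sqrt(1 + g^2 d^2), output in (-1, +1).
    import numpy as np

    def default_gain(d, pct=75.0):
        """Starting default: inverse of the 75th percentile of |d| (or |r|)."""
        return 1.0 / np.percentile(np.abs(d), pct)

    def softclip(d, g=None):
        g = default_gain(d) if g is None else g
        return g * d / np.sqrt(1.0 + (g * d) ** 2)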

 
Shot-gather angle-domain noise filtering in RTM (pdf)

[SRC]
Mandy Wong, Biondo Biondi, and Shuki Ronen
Unwanted noise in the reverse-time migration image is filtered by discriminating in the prestack subsurface angle domain. For each shot image gather, we use a specific set of angles to perform the filtering. This angle-range restriction can vary with shot, depth, and midpoint location. We find that this noise-filtering method can help alleviate migration artifacts or crosstalk noise in the image. A field example shows that prestack angle-domain noise filtering is very useful in least-squares reverse-time migration.
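
The angle-range restriction can be sketched as a simple mask applied to each angle gather before stacking; the gather layout (depth by angle) and the angle limits below are illustrative assumptions, not the paper's implementation.

    # Sketch of an angle-domain pass filter for one shot image gather.
    import numpy as np

    def angle_pass(gather, angles_deg, amin_deg, amax_deg):
        """gather: array (n_depth, n_angle); zero out angles outside [amin, amax]."""
        keep = (angles_deg >= amin_deg) & (angles_deg <= amax_deg)
        filtered = gather.copy()
        filtered[:, ~keep] = 0.0
        return filtered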

 
Downward continuation of Mars SHARAD data (pdf)

[SRC]
Stewart A. Levin and Fritz Foss
Shallow Subsurface Radar (SHARAD) data from the Mars Reconnaissance Orbiter are acquired approximately 300 kilometers above the Martian polar icecap. In this report we detail how to adapt seismic 3D poststack downward continuation to allow construction of the data that would have been recorded a short distance above the Martian surface, thereby saving significant computational time and storage in subsequent imaging and analysis of the shallow polar subsurface.
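
The core of the adaptation is an ordinary constant-velocity phase-shift downward continuation of the poststack section, sketched below in the frequency-wavenumber domain (2-D for brevity, radar wave speed v ~ c above the surface). Sign conventions and the treatment of evanescent energy are simplified and may differ from the actual implementation.

    # Sketch of poststack phase-shift downward continuation by a depth step dz.
    import numpy as np

    def downward_continue(section, dt, dx, dz, v):
        """section: array (nt, nx) of zero-offset traces; exploding-reflector convention."""
        nt, nx = section.shape
        D = np.fft.fft2(section)                        # to (omega, kx)
        w = 2 * np.pi * np.fft.fftfreq(nt, dt)
        kx = 2 * np.pi * np.fft.fftfreq(nx, dx)
        W, KX = np.meshgrid(w, kx, indexing="ij")
        kz2 = (2.0 * W / v) ** 2 - KX ** 2              # two-way vertical wavenumber^2
        prop = kz2 > 0                                  # drop evanescent components
        KZ = np.sqrt(np.where(prop, kz2, 0.0))
        D = np.where(prop, D * np.exp(1j * np.sign(W) * KZ * dz), 0.0)
        return np.real(np.fft.ifft2(D))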

Velocity estimation

Efficient velocity model evaluation with multiple shots (pdf)

[SRC]
Adam Halpert
An efficient method for quickly testing velocity models can be useful in the model-building workflow, especially if several discrete models are under consideration. Previous demonstrations of a scheme using Born-modeled wavefields have been limited by the requirement to image only sparsely-sampled locations in order to avoid crosstalk artifacts. Alternatively, performing multiple experiments in order to "fill in" a larger proportion of the image can overcome this limitation, at the expense of computational complexity. Tests on both synthetic and field data indicate that this may be a worthwhile tradeoff, especially since even a multi-shot approach is still much less expensive than a standard migration of the full dataset.

 
Residual-moveout-based WEMVA: a WAZ field data example. Part I (pdf)

[SRC]
Yang Zhang and Biondo Biondi
In our previous reports (SEP--147 and SEP--149), we laid the theoretical foundation of residual-moveout-based wave-equation migration velocity analysis and presented test results on the synthetic 2-D BP model. In this paper, we report our efforts in applying this method to an industry-scale 3-D marine WAZ data set --- E-Octopus III in the Gulf of Mexico. The 3-D field data pose many challenges for our implementation, including irregular geometry, abnormal traces, complex 3-D salt geometry and, more importantly, a huge data volume and large domain dimensions. To overcome these hurdles, we apply careful data regularization and preprocessing, and employ a target-oriented inversion scheme, focusing on the update of sediment velocities in a subsalt region. This target-oriented scheme significantly reduces the computational cost, allowing us to keep the total computational load manageable on our academic cluster. Our preliminary result shows that, even though the angles of illumination on the subsalt sediments are very limited (not exceeding 25 degrees) in this data set, the moveout on the angle gathers is still measurable and therefore can be used for the RMO-based WEMVA update.

 
Multiple realizations of equiprobable velocity fields using sequential Gaussian simulation (pdf)

[SRC]
Guillaume Barnier and Robert Clapp
For a given seismic dataset, there may be many equally reasonable seismic velocity models that could be used as input into imaging algorithms such as migration. Small variations in velocity models can have a substantial impact on the seismic image structure that is produced, and eventually interpreted. Here, we focus our study on building a geostatistical workflow that will later enable us to better assess the impact of velocity uncertainty on seismic image structures. Using sequential Gaussian simulation, we produce a range of equiprobable and geologically consistent realizations of RMS velocity fields.
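
As a simplified stand-in for sequential Gaussian simulation, the sketch below draws several equiprobable realizations of a 1-D Gaussian velocity perturbation by Cholesky factorization of an assumed exponential covariance; it is practical only on small grids and the parameter values are purely illustrative.

    # Draw equiprobable Gaussian velocity realizations (Cholesky method, small grids).
    import numpy as np

    n, dx, corr_len, sigma, v_mean = 200, 10.0, 300.0, 100.0, 2000.0   # illustrative
    x = np.arange(n) * dx
    C = sigma ** 2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    L = np.linalg.cholesky(C + 1e-6 * np.eye(n))        # small jitter for stability
    rng = np.random.default_rng(seed=0)
    realizations = v_mean + L @ rng.standard_normal((n, 5))   # five fields, m/s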

Accelerated imaging

Demigration and image space separation of simultaneously acquired data (pdf)

[SRC]
Chris Leader and Biondo Biondi
Separating simultaneously acquired seismic data is the link between more efficient acquisition and conventional imaging techniques. Acquiring multiple source locations concurrently, without waiting for full energy dissipation, can provide cheaper and denser acquisition. However, to integrate with current production-scale imaging it is necessary to separate these data into their conventionally acquired equivalent state. Many algorithms give successful separation, but all stringently require random source sampling in time and space. Herein, an image-domain transformation is used to isolate and remove noise from overlapping shots for both randomly delayed and linearly delayed simultaneous data; an inverse transform is then used to recover separated, conventional data. Results show that this process is not dependent on a well-constrained velocity model if the extended image space is used to preserve data kinematics.

 
Phase-encoded inversion with randomised sampling (pdf)

[SRC]
Chris Leader and Robert Clapp
Inverse imaging and full-waveform inversion can be accelerated using phase encoding. By combining subsets of these data a series of super shots can be created, reducing the dimensionality of the problem. For certain geometries this can lead to more efficient data-space residual reduction, relative to, say, unencoded least-squares reverse time migration. Using an iteration-dependent random subset of these super-shots reduces the cost of each iteration while preserving the macroscopic convergence characteristics of conventional phase-encoding. Consequently, the cost of the system as a whole is significantly reduced while favourable residual reduction is maintained.
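
The encoding itself can be sketched in a few lines: groups of shot records are blended into super shots with random polarities and time delays, and each iteration draws a fresh random subset of super shots. The encoding details below are illustrative, not the exact scheme used in the paper.

    # Sketch of phase encoding with randomized super-shot subsets.
    import numpy as np

    def make_supershot(shots, max_shift, rng):
        """shots: array (n_shots, nt, nx).  Returns one blended record (nt, nx)."""
        n_shots, nt, nx = shots.shape
        blended = np.zeros((nt, nx))
        for s in shots:
            sign = rng.choice([-1.0, 1.0])              # random polarity
            shift = int(rng.integers(0, max_shift))     # random delay in samples
            blended[shift:, :] += sign * s[:nt - shift, :]
        return blended

    def random_subset(supershots, k, rng):
        """Pick k super shots without replacement for the current iteration."""
        idx = rng.choice(len(supershots), size=k, replace=False)
        return [supershots[i] for i in idx]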


Simultaneous source separation as a sequence of coherency pass operations (pdf)

[SRC]
Daniel Blatter and Chris Leader
In conventional seismic data acquisition, sufficient time must elapse between seismic sources to prevent interference. As a result, conventional surveys suffer from a trade-off between higher cost and insufficient spatial sampling. Seismic surveys acquired with multiple sources activated concurrently offer the potential advantages of denser sampling combined with reduced cost. Before conventional algorithms can image the data clearly, however, the data must first be deblended. Some deblending algorithms rely on a sequence of operations, including filters, transforms, mutes, and even data-domain sorts, which use some criterion to remove interfering energy with minimal harm to the desired signal. The effectiveness of data deblending using these operations depends on the amount of interfering energy, the spatial proximity of the interference, the timing of the interfering sources, and other factors. This paper demonstrates that an effective deblending algorithm makes clever use of a number of filters, mutes, transforms, and data sorts to almost completely remove interfering energy.

 
Compression for effective memory bandwidth use in forward modeling (pdf)

[SRC]
Eileen Martin
A common bottleneck in seismic imaging is moving data. Lossy compression during wave propagation simulations may be a useful tool for decreasing the amount of data that must be moved. In the future, hardware compression may be added between main memory and cache memory, so I explore the application of the fpzip algorithm to compress small sets of data at each time step of acoustic wave propagation. To make the most of modern architectures, I investigate the use of wavelet compression of entire wave fields before writing to disk. The experiments presented show limited promise for using fpzip in seismic imaging and potential for using wavelet and curvelet compression when writing to disk.
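
The wavelet-compression experiment can be sketched with PyWavelets (assumed to be available; fpzip itself is not shown): decompose a 2-D wavefield snapshot, hard-threshold all but the largest coefficients, and reconstruct before writing to disk.

    # Sketch of lossy wavelet compression of a wavefield snapshot.
    import numpy as np
    import pywt

    def compress_snapshot(snapshot, wavelet="db4", level=3, keep=0.05):
        coeffs = pywt.wavedec2(snapshot, wavelet, level=level)
        flat, slices = pywt.coeffs_to_array(coeffs)
        thresh = np.quantile(np.abs(flat), 1.0 - keep)    # keep the largest 5%
        flat = pywt.threshold(flat, thresh, mode="hard")
        coeffs = pywt.array_to_coeffs(flat, slices, output_format="wavedec2")
        return pywt.waverec2(coeffs, wavelet)             # lossy reconstruction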

Elastic modeling and passive seismic

High-frequency surface and body waves from ambient noise cross-correlations at Long Beach, CA (pdf)

[SRC]
Jason P. Chang, Sjoerd de Ridder, and Biondo Biondi
The density and duration of the Long Beach, California passive seismic array make it well suited for exploiting ambient noise to estimate inter-station Green's functions at frequencies (3--9 Hz) that are well beyond the microseism band. Using the ambient-noise cross-correlation technique, we are able to recover and identify Rayleigh-wave and body-wave energy at these frequencies. From virtual-source gathers along a line of receivers running north-south, we find that Rayleigh-wave energy is primarily generated by traffic noise on busy streets and Interstate 405. Group-velocity dispersion images and analysis of group arrival time as a function of virtual source-receiver azimuth suggest that this Rayleigh-wave energy has the potential to be used as input to a 2D transmission tomography workflow. By summing over different virtual source-receiver azimuths, we determine the direction from which the body-wave energy is arriving. By looking at correlation results from various patches of virtual source-receiver pairs, we find that body-wave energy is strongest in certain parts of the array.
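
The basic cross-correlation step can be sketched as follows: matching time windows of two station records are correlated in the frequency domain and stacked to approximate the inter-station Green's function. Preprocessing such as whitening, temporal normalization, and band-passing to 3--9 Hz is omitted.

    # Sketch of ambient-noise cross-correlation between two stations.
    import numpy as np

    def noise_crosscorrelation(trace_a, trace_b, win_len):
        n_win = min(len(trace_a), len(trace_b)) // win_len
        nlag = 2 * win_len - 1
        stack = np.zeros(nlag)
        for i in range(n_win):
            a = trace_a[i * win_len:(i + 1) * win_len]
            b = trace_b[i * win_len:(i + 1) * win_len]
            spec = np.fft.rfft(a, nlag) * np.conj(np.fft.rfft(b, nlag))
            stack += np.fft.fftshift(np.fft.irfft(spec, nlag))  # zero lag at the center
        return stack / max(n_win, 1)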

 
Stochastic variability of velocity estimates using eikonal tomography on the Long Beach data set (pdf)

[SRC]
Taylor Dahlke, Gregory Beroza, Jason Chang, and Sjoerd de Ridder
In this work, we create spatial fields of distributions for the local wave speed. To get the wave speed parameter for these distributions we use cross-correlations from a Long Beach region data set of ambient seismic noise recordings to perform eikonal tomography for phase velocity. We use Delaunay triangles as our basis for discretizing the model and represent the velocity model as a distribution for each triangle. We take advantage of the fact that our data provides extensive numbers of correlations for each receiver to build distributions of phase velocity for each triangle. From these distributions we generate maps of average velocity, which clearly show fault lines that are known to traverse the survey.

 
Body-wave extraction and tomography at Long Beach, CA, with ambient-noise interferometry (pdf)

[SRC]
Nori Nakata, Jason P. Chang, and Jesse F. Lawrence
We retrieve diving P waves by applying seismic interferometry to ambient-noise records observed at Long Beach, California, and invert the travel times of these waves to estimate the 3D P-wave velocity structure. The ambient noise is recorded by a dense and large network, which has about 2500 receivers with 100 m average spacing. In contrast to surface-wave extraction, body-wave extraction is much harder because body-wave energy is generally much weaker than surface-wave energy at the regional scale (maximum offset is ~10 km). For travel-time tomography, we need to extract body waves at each pair of receivers separately. Therefore, we employ two post-correlation filters to reject noisy signals (which are unusable for body-wave tomography). The first filter rejects traces based on low P-wave correlation with the stack of all traces at that distance. The second filter measures coherent energy between all retained traces and suppresses incoherent noise in each trace. With these filters, we can reconstruct clear body waves from each virtual source. Then we estimate 3D P-wave velocities from these waves with travel-time tomography. The velocities show high-resolution structure.


High-order elastic finite-difference modeling (pdf)

[SRC]
Gustavo Alves and Biondo Biondi
For many years, short-offset data have been a cornerstone of reflection seismic imaging and of amplitude-estimation methods such as Amplitude versus Offset (AVO). However, longer offsets have become increasingly available due to new acquisition geometries and to a greater emphasis on refraction seismics, stimulated in part by inverse methods like Full Waveform Inversion (FWI). We focus here on some of the limitations encountered in short-offset versus long-offset data, specifically reflectivity estimation for higher reflection angles. We then turn our attention to a finite-difference implementation of the elastic two-way wave equation, which is a necessary modeling step for more accurate amplitude estimation. Finally, we implement a finite-difference scheme that is 10th order in space and 2nd order in time and show a few propagation examples.
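
For the simpler 1-D constant-density acoustic case (a stand-in for the elastic system discussed above), a scheme that is 10th order in space and 2nd order in time can be sketched with the standard central-difference weights for the second derivative:

    # Sketch: 10th-order-in-space, 2nd-order-in-time acoustic update (1-D).
    import numpy as np

    # Central-difference weights for d2/dx2, offsets 0..5 (10th-order accurate).
    C = np.array([-5269.0 / 1800, 5.0 / 3, -5.0 / 21, 5.0 / 126, -5.0 / 1008, 1.0 / 3150])

    def d2dx2(p, dx):
        d2 = C[0] * p
        for k in range(1, 6):
            d2 = d2 + C[k] * (np.roll(p, k) + np.roll(p, -k))   # wraps at the edges
        return d2 / dx ** 2

    def step(p_prev, p_cur, v, dx, dt):
        """Leapfrog update for p_tt = v^2 p_xx (v may vary with position)."""
        return 2.0 * p_cur - p_prev + (v * dt) ** 2 * d2dx2(p_cur, dx)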


Six-component seismic land data acquired with geophones and rotation sensors: wave-mode separation using 6C SVD (pdf)

[SRC]
Ohad Barak, Priyank Jaiswal, Sjoerd de Ridder, John Giles, Robert Brune, and Shuki Ronen
In a three dimensional world there are six degrees of freedom: three linear displacements and three rotations. Current multi-component acquisition systems have geophones or accelerometers that provide the linear motion, and hydrophones that provide the pressure, but without rotations the data are incomplete. We acquired a small seismic survey recording all six components. We deployed three-component geophones and three-component rotation sensors measuring the pitch, roll, and yaw. We measured the pitch independently with closely-spaced geophones to validate the rotation-sensor data. We compare the pitch measured by two independent methods and find that they fit after instrument designature. We then demonstrate that the data provided by rotation sensors have additional value because they can be used in a singular-value decomposition analysis to identify and separate ground roll and body waves.
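
The separation idea can be sketched as an ordinary SVD applied to a window of the six-component record: the most coherent part is rebuilt from the leading singular vectors and the remainder is its complement. Which wave mode the leading part captures is data-dependent, and the paper's specific 6C SVD formulation is not reproduced here.

    # Sketch of SVD-based separation on a six-component data window.
    import numpy as np

    def svd_separate(window, rank=1):
        """window: array (6, n_samples).  Returns (coherent part, remainder)."""
        U, s, Vt = np.linalg.svd(window, full_matrices=False)
        coherent = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]
        return coherent, window - coherent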

Miscellaneous

SEPHTML5: HTML5 and interactive visualization (pdf)

[SRC]
Robert G. Clapp
The growing move to a cloud-based, server-dominant processing paradigm poses a problem for viewing and interacting with seismic data because of X11's high latency and large bandwidth requirements. HTML5's flexibility, support for strong client-server paradigms, and low-latency design allow for a platform-independent solution to visualizing seismic data. SEPHTML5 is written on top of HTML5 for interacting with seismic data. I present two applications, a 3-D viewer and a multi-slice comparison tool, to highlight the potential of an HTML5 approach to viewing seismic data.

 
SEPVector: a C++ inversion library (pdf)

[SRC]
Eileen Martin, Robert G. Clapp, Huy Le, Chris Leader, and Dave Nichols
SEPVector is a library of C++ classes, methods, and simple interfaces for solving geophysical inverse problems. From the beginning this library was designed to allow users with relatively little coding expertise to work on heterogeneous computer architectures, a feature that is becoming increasingly critical to modern library development. Verification of the code features is done through a thorough set of unit tests. Although it is written in C++, Fortran users can easily learn to use the SEPVector interface through a series of simple examples included in the package.

 
Seg2Mat: SEG-2 to MATLAB file converter (pdf)

[SRC]
Stewart A. Levin
To support SEG-2 data import into MATLAB®, I have modified a SEG-2 converter from the Colorado School of Mines Center for Wave Phenomena Seismic Unix package to convert SEG-2 files to corresponding MATLAB files.

 
CESLib: an object library for building scalable inversion applications (pdf)

[SRC]
Musa Maharramov
Application of alternative inversion techniques to practical problems depends on our ability to quickly adapt existing algorithms to different optimization methods, model and data spaces, operators and boundary conditions. This paper discusses a new object-oriented Fortran library for computational earth sciences (CESLib). I describe scalable model-space and operator hierarchies, and an optimization abstraction mechanism that are implemented in the library, and demonstrate a specific application to joint time-lapse inversion. In particular, I demonstrate how using the implemented object framework reduced the amount of time and effort in converting a single-model full-waveform inversion application into a simultaneous inversion package without affecting low-level code for computationally-intensive processing.

Publication Date
April, 2014