Oil Exploration: Researchers Invent New Way To See Underground

March 16, 1998

BERKELEY, CA. -- It's one thing to strike oil, another to get it all out of the ground. Reservoirs are often prematurely abandoned because of "oil behind the hole" -- pockets of hydrocarbons that wells completely miss. Better ways of seeing what's down there are crucial to efficient use of the nation's energy reserves.

A new method for making images of the subsurface has been devised by Ki Ha Lee of the Department of Energy's Lawrence Berkeley National Laboratory. Lee and his colleagues in the Earth Sciences Division use data generated with electromagnetic (EM) energy in a novel way to visualize targets ranging from gas and oil reservoirs thousands of meters deep to plumes of pollutants in shallow soil.

"Electromagnetic energy tends to propagate without loss in free space, but in the ground the story is very different," Lee says. "The ground attenuates the signal severely, especially at the higher frequencies. Instead of propagating like a wave in space, an EM field diffuses into the ground almost like heat does through a solid medium."
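The attenuation Lee describes can be quantified with the standard skin-depth formula, which gives the depth at which a plane EM wave in a uniform conductor decays to 1/e of its surface amplitude. A minimal sketch (the resistivity and frequency values are illustrative, not from the article):

```python
import math

MU0 = 4e-7 * math.pi  # magnetic permeability of free space (H/m)

def skin_depth(resistivity_ohm_m, frequency_hz):
    """Standard skin depth: the depth at which a plane EM wave in a
    uniform conductor decays to 1/e of its surface amplitude."""
    return math.sqrt(resistivity_ohm_m / (math.pi * frequency_hz * MU0))

# Illustrative (hypothetical) values: conductive clay vs. resistive oil sand
for rho in (1.0, 10.0):          # resistivity, ohm-m
    for f in (10.0, 1000.0):     # frequency, Hz
        print(f"rho={rho:5.1f} ohm-m  f={f:6.0f} Hz  "
              f"skin depth = {skin_depth(rho, f):7.1f} m")
```

At 1 kHz the signal penetrates only tens of meters into conductive ground, which is why methods like Lee's rely on low frequencies.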

Traditional EM imaging techniques involve transmitting signals, then acquiring and processing the data to map the distribution of subsurface electrical resistivity. Signals can be sent between boreholes, between the surface and the borehole, or from surface to surface. Signals vary in strength according to the electrical resistivity of the material through which they travel; oil-bearing sand, for example, can be ten times as resistive as surrounding clay or carbonates.

Unfortunately, differences in resistivity can be detected confidently only at close range. Lee compares the process of detecting diffuse EM signals to "measuring the temperature" at various points underground. Away from the measurement points the spatial resolution of resistivity is usually limited, but drilling boreholes close together for better resolution is prohibitively expensive.

Although Lee uses low-frequency, diffusive EM data sent between boreholes, his method allows better images to be constructed at greater range, significantly reducing drilling costs. He employs a clever mathematical transformation to convert diffusive EM signals into "wavefields"; the resistivity image is constructed from the transformed wavefield's travel times.

Examples of wavefields include familiar seismograms, which make use of earthquakes or other earth-shaking events, and sonar, which is based on underwater sound waves. Radar works in a similar way, using electromagnetic waves which travel well in air or vacuum but can penetrate only a little way into the ground. All depend on signal travel time to construct images.

In converting low-frequency EM data to wavefield form, Lee transforms the real-time electromagnetic fields into mathematical equivalents that are easier to manipulate. He can then apply familiar methods for using time-dependent data to generate tomograms -- maps of two-dimensional slices -- of the geological structures between the transmitters and receivers.
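The article does not state the transformation itself. One published form of such a diffusion-to-wave mapping expresses the measured diffusive field h(t) as an integral over a fictitious wavefield u(q) in a pseudo-time variable q; the sketch below (with a hypothetical pulse as the wavefield) applies that kernel numerically to show how a sharp wavefield arrival smears into a broad diffusive response -- which is why travel times are picked in the transformed domain:

```python
import numpy as np

def wave_to_diffusion(q, u, t):
    """Map a wavefield u(q) to a diffusive field h(t) via the kernel
    h(t) = 1/(2*sqrt(pi*t^3)) * integral_0^inf q*exp(-q^2/(4t))*u(q) dq.
    (One published form of such a mapping; an assumption here, since
    the article does not give the formula.)"""
    dq = q[1] - q[0]
    h = np.empty_like(t)
    for i, ti in enumerate(t):
        kernel = q * np.exp(-q**2 / (4.0 * ti)) / (2.0 * np.sqrt(np.pi * ti**3))
        h[i] = np.sum(kernel * u) * dq   # simple Riemann-sum quadrature
    return h

q = np.linspace(0.0, 20.0, 2001)       # pseudo-time axis of the wavefield
u = np.exp(-((q - 5.0) / 0.3) ** 2)    # sharp hypothetical arrival at q = 5
t = np.linspace(0.1, 30.0, 300)        # real diffusion time
h = wave_to_diffusion(q, u, t)

print("wavefield arrival at q =", q[np.argmax(u)])
print("diffusive response peaks near t =", round(t[np.argmax(h)], 1))
```

The crisp pulse in q becomes a slow, featureless hump in t; inverting the mapping, as Lee's method does, restores arrivals sharp enough to time.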

In a typical use of the technique, a receiver is held steady in one borehole and a transmitter is moved down a different borehole to different locations, at each of which it sends out a signal. Next the receiver is moved to a new fixed position; the transmitter again transmits from different locations.

The result is a series of signals emitted at various depths, each of which is also detected at various depths. Rays can be drawn connecting each transmitter point with each receiver point, representing discrete paths of the signals; from the travel times along these raypaths the image is constructed.

In a heterogeneous medium the raypaths are usually curved, because the travel time between a given transmitter-receiver pair depends on the distribution of resistivity along the path -- the higher the resistivity, the faster the travel time. To construct an image, all the raypaths are combined as if generated at the same time. Thus the distribution of resistivity between the boreholes is reconstructed as a tomogram representing local geological conditions.
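The reconstruction step can be illustrated with a toy travel-time tomography problem. This sketch uses straight rays (a simplification -- the article notes that real raypaths curve in heterogeneous ground) on a small hypothetical grid between two boreholes: it builds a matrix of ray-segment lengths per cell, generates synthetic travel times from a "true" slowness model, and recovers the model by least squares:

```python
import numpy as np

NX, NZ = 4, 4                # cells across and down (hypothetical grid)
WIDTH, DEPTH = 40.0, 40.0    # borehole separation and imaged depth (m)

def path_lengths(z_src, z_rec, n_steps=400):
    """Approximate length of a straight ray inside each grid cell,
    by sampling the ray finely and binning the samples into cells."""
    lengths = np.zeros(NX * NZ)
    xs = np.linspace(0.0, WIDTH, n_steps)
    zs = np.linspace(z_src, z_rec, n_steps)
    seg = np.hypot(WIDTH, z_rec - z_src) / (n_steps - 1)
    for x, z in zip(xs, zs):
        ix = min(int(x / WIDTH * NX), NX - 1)
        iz = min(int(z / DEPTH * NZ), NZ - 1)
        lengths[iz * NX + ix] += seg
    return lengths

# "True" model: uniform slowness with one fast (resistive) target cell
slowness = np.full(NX * NZ, 1.0)
slowness[2 * NX + 1] = 0.5

# Every transmitter depth paired with every receiver depth, as in the text
depths = np.linspace(2.0, 38.0, 8)
L = np.array([path_lengths(zs, zr) for zs in depths for zr in depths])
times = L @ slowness                      # synthetic travel times

est, *_ = np.linalg.lstsq(L, times, rcond=None)
print("recovered slowness at target cell:", round(est[2 * NX + 1], 3))
```

In practice the raypaths would be recomputed iteratively as the resistivity estimate improves, since the bending of each ray depends on the model itself.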

Lee and his colleagues first tested the new method in a homogeneous medium, a block of graphite in the laboratory. The success of these tests led to construction of a full-scale transmitter, which has been field-tested at the University of California's Richmond Field Station. A combined transmitter-receiver field test is planned this year with the help of Western Atlas Logging Services in Houston. A final data-processing phase will follow.

Only about a fourth of the oil and gas in known reservoirs is recovered, and some 40 percent of the nation's reserves reside in already-discovered reservoirs. The new imaging technique developed by Lee and his colleagues will make it easier to find what's been bypassed. The same methods can be used to characterize environmental problems and monitor their remediation.

The Berkeley Lab is a U.S. Department of Energy national laboratory located in Berkeley, California. It conducts unclassified scientific research and is managed by the University of California.

DOE/Lawrence Berkeley National Laboratory
