
energy R&D, on the move: shale gas


George P. Shultz on the US shale gas revolution:

Recent advancements in horizontal drilling and hydraulic fracturing of shale formations have changed the landscape of US oil and gas production. There were over 48,000 oil and gas wells drilled in the United States alone in 2012. Fifty percent of these new wells were horizontal wells, most of them drilled in unconventional oil and gas plays, and over 95 percent of them were hydraulically fractured. The result of all this is now well known: dramatically increased domestic production of natural gas. These subsurface innovations, enabled in part by new technologies such as downhole imagery, microseismic imaging, and slick-water fracturing, have both driven down natural gas prices and strengthened the contributions of the gas sector itself to the US economy.

Lower natural gas prices have led to significant environmental improvements. For the first time in decades, the US electric grid in mid-2012 was supplied by approximately equal shares of gas- and coal-fired power generation, though coal’s share recently increased in part due to rising gas prices. Gas-for-coal substitution of course is attractive from a climate change perspective (and, in fact, according to several recent studies, coal-to-gas switching dwarfs the marginal contributions of renewable energy), but it also has huge, and largely overlooked, local environmental and health advantages.


Oil and Gas - Available Today


When you have an advanced technology available, the policy decisions become much easier.

— Julio Friedmann, Lawrence Livermore National Laboratory chief energy technologist


photos: DOE/flickr

Claiming credit for delivering the US shale gas boom has become an energy policy parlor game, and the truth is that the boom has many fathers. Mitchell Energy’s tenacious trial-and-error experimentation with new and unproven field techniques persevered despite years of subpar returns. Serendipity—high gas prices, well-timed supply contracts, and convenient geologies—allowed this experimentation to continue long enough for costs to be gradually driven down. The United States’ property-rights regime, almost unique in the world in its aggressive assignment of private rights over mineral resources, offered an exit opportunity to compensate for early investment risks in a sector in which operational advances often spill over. A synergistic corporate acquisition combined key know-how—Mitchell’s slick-water fracturing techniques with Devon Energy’s horizontal drilling capabilities—making the process economic in gas basins beyond Texas’s Barnett. And existing gas-gathering and pipeline infrastructure, built over decades, ultimately provided ready legs to achieve today’s shale gas production scale.

Crucially, many of these contributions were enabled by years of industry-, consumer-, and government-sponsored research and collaboration. Some scientific and technical contributions were dead ends, and others took decades before their value was fully recognized. But without this R&D, permits for siting liquefied natural gas (LNG) import terminals—and not today’s backlog of applications for hotly contested export licenses—would likely still be on top of the US energy policy agenda.

Related: Resources for the Future on shale gas R&D through the years →

Related: An oral history of shale gas R&D from the Breakthrough Institute →

Hear professor Mark Zoback describe shale gas technology development at a 2012 Stanford Energy Seminar


IT-enabled “smart” oil fields

Data-hungry oil and gas companies have long been early adopters of IT. The first digital signal-filtering methods, developed at MIT in the 1950s, were used to improve underwater seismic surveys. Adoption of computing technology took off in the 1960s with the transition from vacuum tubes to transistors, as the industry migrated to digital systems. Texas Instruments grew out of a geophysical services consultancy, oil majors commissioned custom IBM machines for decades, and today BP operates the largest commercial supercomputer.

The past decade, however, has seen a sea change not just in computing hardware, but also in the software and supporting operational capability available to oil producers. So-called “integrated asset management” systems—or more simply, smart oil fields—now make the mountains of computational data (drilling logs, seismic maps, well flow models, production levels, and equipment status) available in real time to project engineers in the field or distant control centers. The result? Increased recovery rates, improved safety, less downtime, and more efficient capital deployment. Chevron, for example, uses integrated IT today to standardize reservoir operations across its hundreds of fields globally that use waterflooding-based recovery.
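To make the pattern concrete, here is a minimal Python sketch of the core idea: heterogeneous, time-stamped field data merged into a latest-value view that an engineer in the field or a distant control center can query. Every name and field below is hypothetical, not Chevron’s or any vendor’s actual schema.

# A minimal sketch (hypothetical names, not any vendor's actual system) of the
# core idea behind an "integrated asset management" view: heterogeneous field
# data merged into time-stamped snapshots that can be queried from anywhere.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class WellSnapshot:
    well_id: str
    timestamp: datetime
    production_bpd: float   # production level, barrels per day
    choke_open_pct: float   # equipment status
    drilling_log_ref: str   # pointer into the drilling-log store
    seismic_map_ref: str    # pointer into the seismic archive

class FieldDashboard:
    """Latest-value cache: one snapshot per well, updated as data arrives."""
    def __init__(self):
        self._latest: dict[str, WellSnapshot] = {}

    def ingest(self, snap: WellSnapshot) -> None:
        current = self._latest.get(snap.well_id)
        if current is None or snap.timestamp > current.timestamp:
            self._latest[snap.well_id] = snap

    def underperforming(self, threshold_bpd: float) -> list[str]:
        """Wells whose last reported production fell below a threshold."""
        return [w.well_id for w in self._latest.values()
                if w.production_bpd < threshold_bpd]

dash = FieldDashboard()
dash.ingest(WellSnapshot("W-101", datetime.now(timezone.utc), 850.0, 60.0,
                         "logs/W-101", "maps/blk-7"))
print(dash.underperforming(threshold_bpd=1000.0))   # -> ['W-101']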

The multidisciplinary advances that enable smart oilfield systems have drawn from both industry and academia. At the University of Southern California, for example, researchers in computer science and electrical engineering have described novel databases for better on-site access to existing field scenario analyses. Such systems—developed in partnership with industry—even allow field engineers to update predictive models in real time based on actual well production. Elsewhere, USC researchers in informatics, with original funding from DARPA and the US Air Force, developed a system, then called TILES, for overlaying semantic and text-based information onto geospatial maps; that research has since grown into commercially available geographic information system software used across industries.
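As a rough illustration of the TILES concept, the Python sketch below indexes free-text annotations by standard Web Mercator map tile so they can be overlaid on geospatial imagery. The tile math is the common slippy-map formula; the class and method names are invented here and are not the actual TILES or USC interfaces.

import math
from collections import defaultdict

def latlon_to_tile(lat: float, lon: float, zoom: int) -> tuple[int, int]:
    """Standard slippy-map conversion from WGS84 coordinates to tile indices."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y

class AnnotatedMap:
    def __init__(self, zoom: int = 12):
        self.zoom = zoom
        self._notes: dict[tuple[int, int], list[str]] = defaultdict(list)

    def annotate(self, lat: float, lon: float, text: str) -> None:
        self._notes[latlon_to_tile(lat, lon, self.zoom)].append(text)

    def notes_at(self, lat: float, lon: float) -> list[str]:
        return self._notes[latlon_to_tile(lat, lon, self.zoom)]

m = AnnotatedMap()
m.annotate(32.85, -97.29, "Barnett shale: pad B-14 flagged for refrac review")
print(m.notes_at(32.85, -97.29))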


Oil and Gas - Near at Hand


There is a need to understand the fundamental physical and chemical mechanisms controlling flow from the nanoscale to the basin scale. . . . This kind of research is going on in a number of universities around the country, and the question of how much gas we’re actually going to be able to recover boils down to nanoscale processes that are not yet well constrained. We have these large maps. We draw circles around large basins. But actually on an individual well basis, we don’t understand these processes well enough to answer the question of whether these wells are going to last five years or twenty-five years. This research is absolutely essential. . . . The private sector is very heavily involved in all these things, and it’s a combined responsibility to move the ball forward.

— Mark Zoback, Stanford University Benjamin M. Page professor of geophysics 


Downhole electromagnetic monitoring of proppants 

UT Austin

A UT Austin researcher has shown that strategically timing successive fracturing operations in a shale formation can greatly enhance fracture efficiency.

Hydraulic fracturing in shale formations induces a large number of microseismic events in the rock surrounding the production zone. These induced fractures are so narrow that they generally do not contain proppant from the main hydraulic fracture and therefore do not produce gas. Induced, unpropped fractures do, however, result in a “stress shadow” that affects the direction and extent of propagation of subsequent fractures. One unwanted consequence of this is the propagation of subsequent fractures into the unpropped fractures induced earlier, leading to waste of frac fluid and proppant.

University of Texas professor Mukul Sharma has demonstrated that strategically timing spatially contiguous fracturing operations over a few hours reduces the spatial extent of the stress shadow and, therefore, fracture interference. Timing fractures in this way ensures more efficient fracture stages without the use of specialized tools and maximizes reservoir exposure through fracturing, allowing more efficient drainage at no increase in cost.
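The timing logic can be illustrated with a toy calculation. The sketch below is not Sharma’s model: it simply assumes the stress shadow from a just-created fracture relaxes exponentially (an assumption made purely for illustration) and solves for the shortest delay that keeps interference below a chosen threshold.

import math

def stress_shadow_psi(t_hours: float, initial_psi: float = 500.0,
                      relax_hours: float = 2.0) -> float:
    """Assumed exponential relaxation of the induced stress perturbation."""
    return initial_psi * math.exp(-t_hours / relax_hours)

def earliest_safe_delay(threshold_psi: float, initial_psi: float = 500.0,
                        relax_hours: float = 2.0) -> float:
    """Smallest delay (hours) after which the shadow drops below threshold."""
    return relax_hours * math.log(initial_psi / threshold_psi)

# With these illustrative numbers, waiting ~3.2 hours between contiguous
# stages keeps the residual shadow under 100 psi.
print(f"{earliest_safe_delay(threshold_psi=100.0):.1f} h")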

Sharma has also developed a method based on a novel downhole electromagnetic (EM) tool that images the propped portion of induced fractures. His EM logging method provides a way to accurately determine the distribution of proppant throughout the induced fracture network, unlike current techniques such as microseismic monitoring, which only detect locations of shear failure. The tool can map propped fractures up to several hundred feet from the wellbore in both open-hole and cased-hole completions, and it is the only way to map propped fractures, at a much lower cost than existing diagnostic methods.
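A back-of-envelope sketch shows why a mapping range of several hundred feet is plausible. If the propped, electrically contrasting region produces a dipole-like EM anomaly whose amplitude falls off roughly as one over distance cubed, the detectable range follows from the signal-to-noise ratio. All figures below are assumptions for illustration, not measurements from Sharma’s tool.

def max_mapping_range_ft(source_amp: float, noise_floor: float,
                         ref_distance_ft: float = 10.0) -> float:
    """Distance at which a 1/r^3 anomaly, with amplitude source_amp measured
    at ref_distance_ft, decays to the receiver noise floor."""
    return ref_distance_ft * (source_amp / noise_floor) ** (1.0 / 3.0)

# An anomaly 10,000x the noise floor at 10 ft stays detectable to ~215 ft.
print(f"{max_mapping_range_ft(source_amp=1e4, noise_floor=1.0):.0f} ft")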

Credit: Carey W. King for the University of Texas at Austin Energy Institute, 2013

Novel membrane reduces fracing water consumption 

UT Austin

Researchers at the University of Texas have developed a process that reduces the amount of fresh water used in hydraulic fracturing, recycling up to 50 percent more water than existing techniques.

Flowback and produced water from oil and gas production are toxic. They contain oil, salt, and an array of minerals and chemicals that must be removed before the water can be returned to the environment or reused in further drilling. One of the biggest problems in the oil field is what to do with this water—treat and reuse it, or inject it into disposal wells? Cost-effective membranes for water treatment can avoid both new water acquisition and the need to dispose of produced water.

Professor Benny Freeman’s state-of-the-art filtration system, developed at UT, takes produced water and pumps it through two specially coated membrane filters. The first is coarse and removes oils, chemicals, and large contaminants like rock particles. The second uses reverse osmosis, the same technology used to desalinate seawater, to remove salts and minerals. The result is a supply of recycled water ready to be injected back underground to free more rock-bound pockets of oil and gas. Cleaner water typically yields a more efficient fracing process, as much as 50 percent more efficient with these coated membranes. Freeman perfected his membrane coatings while trying to increase the efficiency of filtration systems that clean oily water from naval ship engines.
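A simple series mass balance shows how modest gains at each stage compound into a large increase in recycled water. The recovery fractions below are illustrative assumptions, not measured values from Freeman’s system.

def recycled_fraction(prefilter_recovery: float, ro_recovery: float) -> float:
    """Fraction of produced water returned as reusable frac water when a
    coarse prefilter and a reverse-osmosis stage operate in series."""
    return prefilter_recovery * ro_recovery

baseline = recycled_fraction(prefilter_recovery=0.70, ro_recovery=0.50)
coated = recycled_fraction(prefilter_recovery=0.90, ro_recovery=0.60)
print(f"baseline: {baseline:.0%} recycled, coated: {coated:.0%}")
print(f"relative gain: {coated / baseline - 1:.0%}")   # ~54% more water reused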

The new coating, polydopamine, is hydrophilic, meaning that water is attracted to the filter while oil and other contaminants are rejected. The filter thereby resists clogging, or “fouling.” Membranes coated with polydopamine increase water throughput and thus use less energy. They are also easier to clean and last up to twice as long, and they are now in the process of being commercialized.

Credit: Carey W. King for the University of Texas at Austin Energy Institute, 2013

Ambient seismic oil field-monitoring technology

Stanford

Research out of Stanford University could help oil companies improve recovery techniques and let drillers more effectively monitor existing oil fields in the North Sea.

“Many fields worldwide have shown problems in the overburden,” the rock above an oil reservoir that must withstand the pressures of production, says Sascha Bussat, a Norwegian researcher for the energy company Statoil. Traditionally, oil companies monitor and manage this problem by relying on expensive and time-consuming active seismic surveys, in which boats float out into the great, stormy maritime expanse with guns full of compressed air, which blast sound miles deep into the seabed.

Stanford graduate student Sjoerd de Ridder aims to bridge the data disconnect presented by big-ticket, active seismic testing by offering a more affordable model: continuously recording fainter seismic waves naturally occurring at the ocean floor as water moves along the crust of the earth. These passive, or ambient, seismic tests could better pinpoint where and when expensive surveys should optimally be used. The value of de Ridder’s ambient seismic approach lies in harvesting data in real time and for almost no additional cost above normal operations. “That makes any value equation good,” he says.
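The core operation behind such passive monitoring is cross-correlating long noise recordings from pairs of sensors: the lag of the correlation peak approximates the travel time between them, and changes in that travel time track changes in the subsurface. The Python sketch below demonstrates the idea on synthetic data; it is an illustration of the technique, not de Ridder’s processing code, and real workflows add spectral whitening, band-passing, and stacking over many days.

import numpy as np

rng = np.random.default_rng(0)
fs = 100.0                # samples per second
true_delay_s = 0.35       # travel time between the two sensors

noise = rng.standard_normal(10_000)   # 100 s of synthetic ambient noise
rec_a = noise
rec_b = np.roll(noise, int(true_delay_s * fs)) + 0.5 * rng.standard_normal(noise.size)

# The peak of the cross-correlation sits at the inter-sensor travel time.
xcorr = np.correlate(rec_b, rec_a, mode="full")
lags = np.arange(-rec_a.size + 1, rec_b.size)
print(f"estimated travel time: {lags[np.argmax(xcorr)] / fs:.2f} s")   # ~0.35 s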

BP, the operator of Norway’s Valhall oil field, has been moving to drive ambient seismic research forward. When de Ridder began diving further into mapping Valhall’s oil deposits in 2008, scientists had known about ambient seismology for decades, but they had not yet applied it to monitoring oil reserves. De Ridder spent the summer of 2012 in Norway, where initial tests proved successful, and he expected that, within a year, the proper infrastructure would be in place to monitor seismic signals in the North Sea 24/7.

Credit: Julia Barrero for Stanford Peninsula Press, 2013

Professor Mukul Sharma, UT Austin

Professor Benny Freeman, UT Austin

Assistant Professor Sjoerd de Ridder, University of Science and Technology of China

Hear professor John Deutch describe the findings of the Secretary of Energy Advisory Board Shale Gas Subcommittee in a 2011 talk at the MIT Energy Initiative


Oil and Gas - On the Horizon


The potential to increase recovery from existing fields—so no new surface infrastructure—is about 5–15 percent enhanced recovery. Subsurface micro and nanosensors—a third remote sensing platform, different from well logging and seismic—are being developed to address this EOR target.

— Scott Tinker, Bureau of Economic Geology director at the University of Texas at Austin 


Millimeter-wave directed energy drills 

MIT

Accessing critical resources such as geothermal energy and natural gas is an expensive, energy-intensive, messy process with today’s drilling technology. But researchers have been looking into a more elegant approach. Instead of grinding rock to bits, they would use a continuous beam of energy to vaporize it and then blow out the tiny particles that form with a high-pressure stream of injected gas. Using a device borrowed from nuclear fusion research, Paul P. Woskov, of MIT’s Plasma Science and Fusion Center, has vaporized rock this way for the first time, confirming the feasibility of his proposal to use energy beams rather than drill bits to access underground energy resources.
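A back-of-envelope calculation suggests the beam powers involved: the required power is simply the rock mass removed per second times the specific energy needed to vaporize it. Every figure below is a rough assumption supplied for illustration, not one of Woskov’s published numbers.

import math

rho = 2700.0            # rock density, kg/m^3 (granite-like, assumed)
e_s = 25e6              # specific energy to heat and vaporize rock, J/kg (assumed)
diameter_m = 0.10       # 10 cm borehole (assumed)
advance_m_per_hr = 1.0  # penetration rate (assumed)

area_m2 = math.pi * (diameter_m / 2) ** 2
mass_rate = rho * area_m2 * (advance_m_per_hr / 3600.0)   # kg of rock per second
print(f"required beam power: ~{mass_rate * e_s / 1e3:.0f} kW")   # ~150 kW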

For industry audiences, a key source of skepticism about the proposed approach is the absence of “drilling muds”—fluids that are carefully tailored to both remove debris and strengthen the walls by plugging up pores in the rock and providing back pressure that counters inward pressures on the hole. “Deep drilling without drilling mud is unheard of,” says Woskov. However, he suggests that the glassy, or “vitrified,” walls in his system could be strong enough not only to keep the hole open during drilling, but also to withstand the extreme pressures on the finished borehole with no added cement or metal liner.

“I think we have the potential to revolutionize drilling, but it’s a completely new approach that will throw out a lot of conventional wisdom. That’s the problem,” says Woskov. He likens it to the 1940s, when the aircraft industry was working with mechanical rotary engines and propeller technologies. When the jet engine came along, it was a completely new approach that took away all the mechanics and worked on a jet stream of air. Says Woskov, “It’s the same thing. Drilling technology is in the mechanical age right now, and we want to move it into the jet age”—in this case, the age of directed energy.

Credit: Nancy Stauffer, ©Massachusetts Institute of Technology, used with permission, 2012

Carbon dioxide storage linked to enhanced hydrocarbon production  

UT Austin

A variety of research programs at the University of Texas are looking at the geologic storage of carbon dioxide (CO2) in hydrocarbon-producing formations.

For example, the US Gulf Coast is well known for its oil-producing offshore rigs, but it also has unique geology that can store large quantities of CO2 more than a mile underground in hot, salty fluid. The distinguishing feature of this brine is that it is saturated with dissolved methane. Bringing the brine to the surface with extraction wells allows for the recovery of vast amounts of geothermal energy, and once CO2 is injected into the brine, it forces out significant amounts of methane. The CO2-laden brine can then be sent back down for permanent storage. Recent calculations by professor Steven Bryant show that enough deep brine exists along the US Gulf Coast to store one-sixth of the country’s CO2 emissions and to meet one-sixth of its demand for natural gas annually.
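To put the one-sixth figures in perspective, the arithmetic below uses round 2013-era numbers that are approximations supplied here for illustration, not Bryant’s actual inputs.

us_co2_gt_per_yr = 5.4     # approx. US energy-related CO2 emissions, Gt/yr
us_gas_tcf_per_yr = 26.0   # approx. US natural gas consumption, Tcf/yr

print(f"CO2 stored:    ~{us_co2_gt_per_yr / 6:.1f} Gt per year")    # ~0.9 Gt
print(f"methane yield: ~{us_gas_tcf_per_yr / 6:.1f} Tcf per year")  # ~4.3 Tcf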

Other research, led by UT research scientist Katherine Romanak at the Gulf Coast Carbon Center, has developed a new soil gas method for characterizing near-subsurface CO2. By discriminating between CO2 that potentially leaked from an injection location and CO2 generated in situ by natural processes, this method can identify CO2 that has leaked from deep geologic storage reservoirs into the shallow subsurface. Whereas current CO2 concentration-based methods require years of background measurements to quantify variability of naturally occurring geologic CO2, this new approach examines chemical relationships between in situ N2, O2, CO2, and CH4 to promptly distinguish a leakage signal. The ability to measure in this way, without the need for background measurements, could decrease uncertainty in leakage detection at a low cost. Trials have already been performed as part of the investigation into CO2 seepage from the Weyburn enhanced oil-recovery project in Canada.
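The underlying reasoning can be sketched in a few lines. In-situ biologic respiration consumes oxygen roughly mole for mole as it produces CO2, so measured CO2 well in excess of the atmospheric-oxygen deficit points to an exogenous, potentially leaked, source. The threshold and decision rule below are illustrative simplifications, not Romanak’s published criteria.

ATM_O2_PCT = 20.9   # atmospheric oxygen, percent by volume

def co2_origin(co2_pct: float, o2_pct: float, tol_pct: float = 1.0) -> str:
    """Classify a soil-gas sample from its CO2 and O2 concentrations."""
    respiration_co2 = ATM_O2_PCT - o2_pct    # CO2 explainable by respiration
    excess = co2_pct - respiration_co2
    if excess > tol_pct:
        return "possible exogenous CO2 (investigate)"
    return "consistent with in-situ respiration"

print(co2_origin(co2_pct=4.0, o2_pct=17.5))   # excess 0.6 -> in-situ
print(co2_origin(co2_pct=9.0, o2_pct=19.0))   # excess 7.1 -> investigate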

Credit: Carey W. King for the University of Texas at Austin Energy Institute, 2013

Paul Woskov, MIT

Katherine Romanak, UT Austin

Professor Steven Bryant, UT Austin

What are the game changers that are inherently “left field”? Have we not covered something that we don’t see—or that is coming around the bend but it’s not quite there yet—that will create discontinuities in our energy landscape and trajectory?
— Arun Majumdar, Google vice president for energy*