
Research

Our research spans a broad spectrum, but its key ingredients are an applied nature, aiming for practical impact, while relying on foundational development in data-scientific methods. Our publication list provides an overview of our various academic contributions. In addition, we work closely with Earth Resources companies and government entities to define specific collaborative sponsored projects. Some recent company projects are listed below, as well as our current student research topics.

 

  • Groundwater management in Denmark (Danish Government)
  • Production planning for a structurally complex reservoir in Libya (Wintershall)
  • Reservoir appraisal for a deep-water turbidite reservoir from seismic data in West Africa (Hess)
  • Designing uranium contaminant remediation in the USA (SE3)
  • Big data predictive analytics for optimizing shale reservoirs (Anadarko, Repsol, NGI)
  • Assessment of low-enthalpy geothermal systems for heating buildings (Belgian government)
  • A strategy for rapid & realistic forecasting of reservoir performance using process-based models (ENI)
  • Direct forecasting and uncertainty updating at the appraisal stage without building new models (Chevron)
  • Implicit dynamic uncertainty for automation of data fusion & prediction in mineral resources evaluation and planning (BHP)
  • Multiple-point geostatistics and Bayesian evidential learning for gas reservoir management (Edison)

 

Examples of student research topics

Assessing oil migration pathways in petroleum systems

In oil exploration, significant uncertainty exists on the pathways of oil migration. To assess such uncertainty on the accumulation outcomes in a petroleum system, direct modeling of oil generation and migration is applied. As a case study, we consider CO2 vs. oil accumulations within the Campos basin. Marcelo Silka

Automated uncertainty quantification of geological model using Bayesian Evidential Learning

Uncertainty quantification is at the heart of decision making during reservoir appraisal. Based on Bayesian Evidential Learning (BEL), this project aims to develop an automated framework for uncertainty quantification of the geological model during reservoir appraisal. Under this framework, when new wells are drilled, multiple components of the geological model are updated jointly and automatically by means of a sequential decomposition that follows geological rules. During the updating, we extend direct forecasting to perform joint model uncertainty reduction using the new well observations, thus enabling the automation. This allows geological model uncertainty to be updated without conventional model rebuilding, which significantly reduces turnaround time. David Zhen Yin

Bayesian evidential learning

The ultimate goal of collecting data, building models and making predictions is to make an informed decision. In the subsurface realm, such decisions are subject to considerable uncertainty. I research a new framework termed “Bayesian Evidential Learning” (BEL) that streamlines the integration of these four components: data, model, prediction, decision. This idea is published in a new book, “Quantifying Uncertainty in Subsurface Systems” (Wiley-Blackwell, 2018), and applied to five real case studies in oil/gas, groundwater, contaminant remediation and geothermal energy. BEL is not a method, but a set of principles derived from Bayesianism that lead to the selection of relevant methods to solve real decision problems. In that sense, BEL focuses on decision-focused data collection and model building. One of the important contributions of BEL is that it is a data-scientific approach that circumvents complex inversion modeling, such as history matching and dynamic data integration, and instead relies on machine learning from Monte Carlo, combined with falsification of priors. The case studies illustrate how modeling time can be reduced from months to days, making the approach practical for large-scale implementations. Components of BEL are global sensitivity analysis, Monte Carlo, model falsification, prior elicitation and data-scientific methods that reflect the stated principles of its Bayesian philosophy. Jef Caers

Book: Quantifying Uncertainty in Subsurface Systems
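As an illustration of the data-scientific flavor of BEL, the sketch below shows one of its ingredients, direct forecasting, in a deliberately simplified form: data and prediction variables are simulated by Monte Carlo over the prior, both are reduced in dimension, and a statistical regression replaces inversion. All arrays, dimensions and the linear-Gaussian regression are illustrative assumptions, not the implementation used in the book or the case studies.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Monte Carlo over the prior: each row is one prior realization.
# d_sim: simulated data responses, h_sim: simulated prediction variables (both fabricated here).
n = 500
d_sim = rng.normal(size=(n, 50))
h_sim = d_sim[:, :5] @ rng.normal(size=(5, 20)) + 0.1 * rng.normal(size=(n, 20))

# Reduce both variables to a few components.
pca_d, pca_h = PCA(n_components=5), PCA(n_components=5)
D = pca_d.fit_transform(d_sim)
H = pca_h.fit_transform(h_sim)

# Linear-Gaussian regression of prediction scores on data scores (a stand-in for
# the canonical-correlation step often used in direct forecasting).
beta, *_ = np.linalg.lstsq(D, H, rcond=None)
resid_cov = np.cov((H - D @ beta).T)

# "Observed" data: here simply a held-out prior realization, for illustration only.
d_obs = pca_d.transform(d_sim[:1])

# Posterior samples of the prediction variable, obtained without inverting for a model.
h_scores = d_obs @ beta + rng.multivariate_normal(np.zeros(H.shape[1]), resid_cov, size=200)
h_posterior = pca_h.inverse_transform(h_scores)
print(h_posterior.shape)   # (200, 20): an ensemble of posterior forecasts
```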

Building informative priors and fast simulation methods for subsurface flow applications

Uncertainty quantification of flow in the subsurface is traditionally hampered by (i) a lack of use of prior knowledge in geostatistical simulation methods and (ii) the repeated solution of costly reservoir flow equations. Computational sediment transport models and an abundance of inexpensive remote sensing data represent two potentially important sources of information about depositional patterns. By building a database of prior knowledge and combining it with modern statistical and machine learning methods, we aim to create fast and realistic simulation methods that leave less room for subjective decisions about uncertainty. Erik Nesvold

Combining data science methods with flow simulation in shale resources development

There are two general approaches to uncertainty quantification in petroleum reservoirs: data-driven modelling and reservoir simulation. In shale reservoirs, data-driven modelling has become favored for two main reasons. Firstly, the large number of wells drilled leads to large data volumes and a need for rapid decision making. Secondly, the physical processes of hydraulic fracturing and shale production, and their connection to natural fractures, are poorly understood. In this research, I want to determine the compatibility of data-driven modelling and direct reservoir simulation results in shale reservoirs. Do we get the same answers for well placement and for identifying the factors most important for production using the two different methods? Does forecast quality suffer without reservoir simulation? As a result, I would like to create a synthesis of the two methods, combining the ability of data-driven modelling to handle large datasets with the better physics of reservoir simulation. We have started by applying the data-driven approach to the Niobrara, Eagle Ford and Duvernay plays; the next step is reservoir simulation for particular wells. Alexander Bakay

Integrating Geostatistical Modeling with Machine Learning for Production Forecast in Shale Reservoirs: Case Study from Eagle Ford

Conditioning categorical models to hard data using a Gibbs sampling of a truncated multi-variate Gaussian model

From a set of categorical model realizations that do not necessarily match the data, how do we generate categorical model realizations that fully honor the data? This research proposes an efficient method to generate posterior categorical model realizations that match the data exactly, starting from prior categorical model realizations generated by any method. The proposed method is free of variogram modeling and relies on signed distance functions, principal component analysis, and Gibbs sampling. Francky Fouedijo
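A minimal sketch of the conditioning idea, under strong simplifying assumptions: prior realizations are represented by signed distances on a small 1-D grid, their mean and covariance define a multivariate Gaussian, and a Gibbs sampler draws each cell from its Gaussian conditional, truncated to the correct sign wherever a hard datum fixes the facies. The toy prior, the 1-D grid and the omission of the PCA step are illustrative choices, not the actual implementation.

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(1)

# Prior signed-distance realizations on a small 1-D grid (a stand-in for any
# categorical prior converted to signed distances: z > 0 means facies A, z < 0 facies B).
n_real, n_cells = 300, 40
x = np.arange(n_cells)
prior = np.array([np.sin(0.2 * x + rng.uniform(0.0, 2.0 * np.pi))
                  + 0.3 * rng.normal(size=n_cells) for _ in range(n_real)])
mu = prior.mean(axis=0)
cov = np.cov(prior.T) + 1e-6 * np.eye(n_cells)     # regularized prior covariance
prec = np.linalg.inv(cov)                          # precision matrix used in the Gibbs sweeps

# Hard data: facies A observed at cell 5, facies B observed at cell 30.
hard = {5: +1, 30: -1}

# Gibbs sampling: each cell is drawn from its Gaussian conditional given all other
# cells; at hard-data cells the conditional is truncated to the correct sign.
z = mu.copy()
for sweep in range(200):
    for i in range(n_cells):
        cond_var = 1.0 / prec[i, i]
        cond_std = np.sqrt(cond_var)
        cond_mu = z[i] - cond_var * (prec[i] @ (z - mu))   # conditional mean of cell i
        if i in hard:
            a, b = (0.0, np.inf) if hard[i] > 0 else (-np.inf, 0.0)
            z[i] = truncnorm.rvs((a - cond_mu) / cond_std, (b - cond_mu) / cond_std,
                                 loc=cond_mu, scale=cond_std, random_state=rng)
        else:
            z[i] = rng.normal(cond_mu, cond_std)

facies = np.where(z > 0, "A", "B")                 # one conditioned realization
print(facies[5], facies[30])                       # honors the hard data: A, B
```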

Groundwater management in Denmark

High-quality drinking water is an important Earth resource, but global change, in terms of climate, energy needs, population and agriculture, will put considerable stress on fresh-water supplies. Denmark provides a unique case because of a clear formulation of governmental objectives, the appropriate definition and involvement of the various stakeholders, the collection of high-quality data such as airborne EM data, the specific nature of the geological heterogeneity, and the general desire to do it the “right” way: leveraging science and community involvement into a sustainable future water supply. Our project, in collaboration with scientists in Denmark, looks at how decision analysis, uncertainty quantification and the assessment of data collection methods can help local authorities manage the system, balancing the need for water against protecting the environment. Jef Caers, Celine Scheidt, Troels Vilhelmsen.

Book: Quantifying Uncertainty in Subsurface Systems

Learning from big data for rapid uncertainty quantification of exploratory ore deposits

Mineral resources forecasting requires integrating drill-hole and geophysical data with the guidance of geological knowledge. However, in an exploration setting, the cost of collecting more data in a specific local area can often be high. Uncertainty models built solely on sparse local data have very limited predictive power. Would it be possible to learn from a large number of already explored sites and rapidly build predictive models in a local area with sparse data? To answer this question, we need to develop efficient statistical inference and modeling techniques. We take advantage of recent developments in computer vision and computer graphics, namely level set methods, to efficiently generate local uncertainty models constrained by data and geological rules, with parameters learned from big data. We aim to reduce the cost (time, money and labor) of mineral exploration with this approach. Liang Yang

Mapping incised-valley-fill deposits in the Central Valley

The Central Valley, where one third of all U.S. produce is grown, relies heavily on groundwater to sustain its agriculture. As climate changes and water resources become more scarce, there is an enhanced need for characterization of groundwater resources. To improve our understanding of groundwater recharge pathways in the Central Valley aquifers, I am mapping the highly permeable incised-valley-fill deposits in the Central Valley and studying their shapes and connectivity structure. To do this, I quantitatively integrate models of sediment deposition, well data, and time-domain transient airborne electromagnetic data in a Bayesian framework to create geologically-constrained hydrologic models. These high-quality hydrogeologic models can be used to devise strategies for aquifer recharge, pumping, and management. Alex Miltenberger

Monitoring low-enthalpy heating systems using time-lapse ERT

Low-enthalpy geothermal systems are increasingly used for the climatization (heating/cooling) of buildings, in an effort to reduce the carbon footprint of this type of energy use. The main idea is to use the subsurface, whether rock or soil, saturated or unsaturated, as a heat source or as a heat sink (for cooling). The design as well as the monitoring of a shallow geothermal system, like many other subsurface applications, requires a multi-disciplinary approach involving several fields, such as geology, hydrogeology, physics, chemistry, hydraulic engineering design and economics. Characterizing heat flow, temperature changes and their effect on the ambient environment requires characterizing geological heterogeneity together with the combined fluid flow and heat transport. In this research, we focus on the use of a specific method, Electrical Resistivity Tomography (ERT), and its time-lapse variety to characterize temperature and its changes under shallow geothermal exploitation and monitoring. We develop methods that allow for efficient characterization of the temperature change during production, which is important in the design of heat pumps. Jef Caers, Thomas Hermans

paper: Uncertainty Quantification of Medium‐Term Heat Storage From Short‐Term Geophysical Experiments Using Bayesian Evidential Learning
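As a hedged illustration of why time-lapse ERT carries temperature information: pore-water electrical conductivity increases roughly linearly with temperature (on the order of 2% per °C), so, all else being equal, the ratio of inverted bulk conductivities between a monitoring and a baseline survey maps to a temperature change. The coefficient below is a generic textbook value; any real application requires site-specific calibration and accounting for other causes of conductivity change.

```python
import numpy as np

alpha = 0.02   # assumed fractional conductivity change per deg C (to be calibrated per site)

def temperature_change(sigma_monitor, sigma_baseline, alpha=alpha):
    """Estimate dT from the ratio of inverted bulk conductivities (illustrative only)."""
    return (np.asarray(sigma_monitor) / np.asarray(sigma_baseline) - 1.0) / alpha

# Example: a cell whose inverted conductivity rose from 20 mS/m to 22 mS/m.
print(temperature_change(0.022, 0.020))   # ~5 deg C of warming
```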

Optimization of decision-making in the context of CO2 sequestration monitoring

Assuring the security of CO2 storage in the subsurface is key to successful geological sequestration and long-term emissions reduction. For safe and efficient long-term storage of CO2 in geological formations, adequate monitoring of the state of the CO2 is needed to prevent leakage from the storage reservoir. My research involves the optimization of the design of CO2 monitoring strategies using Markovian modelling and Monte Carlo-based sensitivity analysis, as well as the application of Value of Information (VOI) analysis to carbon storage monitoring. It aims at improving decision making under geological uncertainty and uncertainty in the state of the sequestered CO2. Clothilde Venereau
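The sketch below illustrates the Value of Information logic on a toy leak/no-leak decision: VOI is the difference between the expected value of the best decision made with the (imperfect) monitoring outcome and the best decision made without it. All probabilities and costs are invented for illustration and are not tied to any actual storage site.

```python
# Toy VOI calculation for a monitoring decision (all numbers are made up).
p_leak = 0.1                       # prior probability of CO2 leakage
cost_leak_unmitigated = 100.0      # loss if a leak goes undetected (arbitrary units)
cost_mitigation = 20.0             # cost of intervening
sensitivity, specificity = 0.9, 0.95   # assumed reliability of the monitoring survey

# Value without information: choose the better of "always intervene" vs "never intervene".
v_prior = max(-cost_mitigation, -p_leak * cost_leak_unmitigated)

# Value with imperfect information: preposterior analysis over the two survey outcomes.
p_pos = sensitivity * p_leak + (1 - specificity) * (1 - p_leak)
p_leak_given_pos = sensitivity * p_leak / p_pos
p_leak_given_neg = (1 - sensitivity) * p_leak / (1 - p_pos)
v_pos = max(-cost_mitigation, -p_leak_given_pos * cost_leak_unmitigated)
v_neg = max(-cost_mitigation, -p_leak_given_neg * cost_leak_unmitigated)
v_info = p_pos * v_pos + (1 - p_pos) * v_neg

voi = v_info - v_prior             # maximum price worth paying for the survey
print(round(voi, 2))
```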

Optimization under uncertainty for enhanced geothermal systems

Enhanced Geothermal Systems (EGS) generate electricity by creating an artificial subsurface heat-exchange circuit. Cold fluid is injected into the subsurface and heats up as it circulates through hot rocks in a hydraulically fractured reservoir. The fluid is then pumped out of production wells and powers turbines to generate electricity. There are many uncertain parameters when creating an EGS, including the stress field, the location and orientation of pre-existing fractures, the temperature field, and the physical mechanisms governing fracture creation. Therefore, the reservoir's response to stimulation and operation cannot be predicted with certainty. My research explores the optimization of EGS design parameters, such as the spatial configuration of the laterals and the number of fracturing stages, given the subsurface uncertainty. Ahinoam (Noe) Pollack
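A minimal sketch of optimization under uncertainty in this spirit: candidate designs are scored against an ensemble of subsurface scenarios and ranked by expected (or risk-adjusted) performance. The scenario variables, the design space and the stand-in simulator below are all invented for illustration; a real study would call a reservoir/thermal simulator instead.

```python
import numpy as np

rng = np.random.default_rng(2)

# Uncertain subsurface scenarios: here just (permeability multiplier, temperature at depth).
n_scenarios = 200
scenarios = np.column_stack([rng.lognormal(0.0, 0.5, n_scenarios),
                             rng.normal(200.0, 20.0, n_scenarios)])   # deg C

def net_value(design, scenario):
    """Stand-in for an expensive EGS simulation; the response surface is fabricated."""
    n_stages, well_spacing = design
    perm_mult, temp = scenario
    flow = perm_mult * n_stages / (1.0 + 0.01 * well_spacing)          # fake flow response
    drawdown = 0.002 * flow * well_spacing                             # fake thermal drawdown
    return flow * max(temp - 80.0 - drawdown, 0.0) - 5.0 * n_stages    # fake net value

# Candidate designs: (number of fracturing stages, well spacing in meters).
designs = [(n, s) for n in (5, 10, 20) for s in (200.0, 400.0, 600.0)]
expected = {d: np.mean([net_value(d, sc) for sc in scenarios]) for d in designs}
p10 = {d: np.percentile([net_value(d, sc) for sc in scenarios], 10) for d in designs}

best = max(designs, key=lambda d: expected[d])   # or rank by a risk measure such as P10
print(best, round(expected[best], 1), round(p10[best], 1))
```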

Predicting favorable locations for geothermal development

Development of renewable geothermal energy resources is hindered by high exploration risk, due to limited economic profitability and high subsurface uncertainty. To reduce exploration risk, conventional methodologies have sought to predict geothermal resources using a spatial aggregation of data thought to be indicators of geothermal potential. However, this approach is problematic because it requires a large catalogue of known geothermal resources, which does not exist. In our approach, based on Bayesian Evidential Learning, uncertainty is reduced by directly learning how data, such as shallow temperature wells or geophysical data, relate to temperatures deeper in the subsurface. Noah Athens

paper: Sensitivity of Temperature Predictions in Basin-Scale Hydrothermal Models

Quantifying uncertainty in geological boundary modeling for mineral resource estimation: application to a porphyry copper deposit

Mineral resource evaluation aims to accurately predict the grades and tonnages of a mineral deposit that will be exploited during a specified time frame. Such a goal requires the definition of geological domains and their associated boundaries. The locations of these geological boundaries are uncertain and can be one of the primary sources of risk in a mining project; because of that, they must be assessed. The definition of geological domains and their associated boundaries is a first-order decision and will influence all subsequent steps in the mineral resource evaluation process. Thus, a poor definition of geological domains and their boundaries can deteriorate the accuracy of the resource estimates. This research proposes an uncertainty quantification method to assess and visualize the uncertainty of 3D geological boundaries. The proposed method is fully automated, reproducible, data-driven, geology-driven, free of variogram modeling, and accounts for the uncertainty in the parameters. It draws on methods from geostatistics, statistics, machine learning, and data science, and is applied to a porphyry copper deposit. Francky Fouedijo

Quantifying uncertainty on flow and transport of nitrate using geophysical data in the Danish aquifer system

Excessive application of fertilizers in agriculture has become the main cause of increasing nitrate concentrations in groundwater. In Denmark, nitrate presence is highly influenced by the heterogeneity of geological structures, hydraulic parameters, and nitrate reduction potential at the field scale. Therefore, instead of imposing nitrate reduction regulation uniformly, spatially targeted regulation at the local agricultural scale becomes a promising management strategy: meeting the reduction target effectively while maximizing economic benefit. This research proposes an automated workflow to jointly quantify the global and spatial uncertainty of nitrate reduction predictions at the local agricultural scale, using groundwater flow and nitrate transport as forward models. Uncertainty is constrained by multiple sources of observed data, including high-quality ground-based tTEM data and hydrological and geochemical information on nitrate distribution. The expected results of this project will provide more rigorous uncertainty quantification for practical and quantitative decision making in targeted nitrate management. Lijing Wang

Quantifying uncertainty using level sets with stochastic motion

In many geoscience applications, prediction requires building 3D models that are complex and cumbersome. As a result, often a single model, or some small variation of it, is built. One of the main bottlenecks lies in the creation of a high-resolution grid (millions to billions of cells) to provide enough detail and to constrain to data. Uncertainty is provided by generating many realizations, sampled from some posterior distribution. Built into this common notion of modeling lies a fundamental inefficiency: building many accurate high-resolution models, each one of them being a poor approximation of actual reality (the truth). In this research, we provide a fundamentally new view on the same problem. Consider some geological geometry (ore body, reservoir, fault) whose uncertainty needs to be quantified based on geological rules and data. Instead of modeling these geometries and their uncertainty with many high-resolution models drawn by Monte Carlo, we model their uncertainty by means of motion (velocity), representing uncertainty as a 3D velocity field treated as a stochastic process. To model 3D surfaces (e.g. faults, horizons, bodies) and their uncertainty, we integrate this stochastic velocity into the level set equation. Level sets are an ideal way to represent mathematically complex surfaces without explicit grid representations. This idea is applied to uncertainty quantification for groundwater systems, mineral resources and oil/gas reservoirs. Jef Caers, Liang Yang
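A minimal numerical sketch of the idea, assuming a 2-D grid and a spatially uncorrelated stochastic normal velocity: the boundary is carried implicitly as the zero level set of a signed-distance function phi, and each integration of the level set equation, d(phi)/dt + v * |grad(phi)| = 0, with a different random velocity yields a different plausible geometry. The grid size, velocity statistics and the simple explicit update are illustrative choices only.

```python
import numpy as np

rng = np.random.default_rng(3)

n, dt, n_steps = 100, 0.5, 50
y, x = np.mgrid[0:n, 0:n]
phi = np.sqrt((x - 50.0) ** 2 + (y - 50.0) ** 2) - 20.0   # signed distance to a circular body

def grad_norm(phi):
    """|grad(phi)| by central differences (a full implementation would use an upwind scheme)."""
    gy, gx = np.gradient(phi)
    return np.sqrt(gx ** 2 + gy ** 2)

for _ in range(n_steps):
    v = 0.1 + 0.2 * rng.normal(size=phi.shape)   # stochastic normal velocity field
    phi = phi - dt * v * grad_norm(phi)          # explicit level set update

body = phi < 0                                   # one realization of the perturbed geometry
print(body.sum(), "cells inside the boundary")
```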

Recognition of sub-resolution stacking patterns from seismic data

Geological modeling and characterization of energy reservoirs relies greatly on the interpretation of seismic data because of its high spatial availability. Owing to many factors, including subsurface complexity, heterogeneity and limited seismic resolution, accurate description of the subsurface geological configuration is a major challenge. This research focuses on the quantitative interpretation of individual seismic traces and the extraction of information relevant to the recognition of various geological parasequences. The proposed workflow includes stochastic modeling of stratigraphic patterns involving thin, sub-resolution beds, and the subsequent application of statistical pattern recognition techniques to deduce those patterns from their acoustic responses. Riyad Muradov
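A toy version of the proposed workflow, with every number invented for illustration: thin-bed reflectivity series are generated stochastically for two stacking-pattern classes (reflection strength increasing vs decreasing with depth), convolved with a wavelet to obtain sub-resolution synthetic traces, and a simple classifier is trained to recognize the pattern from the trace alone.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)

def ricker(t, f=30.0):
    """Ricker wavelet of peak frequency f (Hz)."""
    a = (np.pi * f * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

dt = 0.002                                         # 2 ms sampling
wavelet = ricker(np.arange(-0.064, 0.064, dt))

def synthetic_trace(coarsening_up):
    """Stack of thin (sub-resolution) beds: sparse reflectivity convolved with the wavelet."""
    n_beds = rng.integers(8, 15)
    amps = np.linspace(0.02, 0.10, n_beds)         # reflection-strength trend with depth
    if not coarsening_up:
        amps = amps[::-1]
    refl = np.zeros(200)
    pos = np.sort(rng.choice(np.arange(20, 180), size=n_beds, replace=False))
    refl[pos] = amps
    return np.convolve(refl, wavelet, mode="same")

labels = rng.integers(0, 2, size=1000)
traces = np.array([synthetic_trace(bool(c)) for c in labels])

clf = LogisticRegression(max_iter=2000).fit(traces[:800], labels[:800])
print("hold-out accuracy:", clf.score(traces[800:], labels[800:]))
```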

Seismic reservoir property estimation and uncertainty quantification with statistical learning techniques

Seismic reservoir characterization has conventionally necessitated estimation of subsurface elastic properties. Inverse methods employed to such an end generally run into theoretical, numerical and computational complexities given the high dimensional nature of seismic data. The research question we pose: is such a step warranted when the end goal is reservoir characterization? We employ statistical learning techniques to learn the implicit relationship between seismic data and reservoir properties. Such learning is subsequently exploited for property estimation and uncertainty quantification without an explicit seismic inversion step. Applications include reservoir net-to-gross estimation, geo-modeling parameter estimation and reservoir facies estimation from seismic data. Anshuman Pradhan

Uncertainty quantification and global sensitivity analysis for reactive transport models, application to uranium remediation

In addressing subsurface contamination, it is important to understand the efficacy of remediation practices. Do these practices achieve what they are intended to? How can we monitor this by means of sensors or geophysics? A major challenge is addressing the complexities of both geological heterogeneity and the uncertainties about the biogeochemical system. In this research, we developed a general workflow to address such questions and demonstrate it on the uranium remediation experiment at the Rifle site, Colorado. Groundwater contamination caused by uranium mill tailings left over after the Cold War is a pressing environmental concern in the United States. In Colorado, attention has been given to a number of hazardous sites adjacent to the Colorado River because of elevated concentrations of contaminants that can be harmful to young-of-year fish using backwater channels as habitat during late summer. Over the last several years, acetate injection has been proposed and tested at the Rifle pilot site to examine the effectiveness of in-situ bioremediation. Our research addresses how both geological and geochemical (reaction) uncertainties can be integrated to quantify the efficacy of acetate injection as a remediation practice. Jef Caers, Kate Maher
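A simplified sketch of a distance-based global sensitivity analysis in this spirit: the forward model is run for Monte Carlo samples of the uncertain inputs, the responses are clustered, and each parameter is scored by how much its cluster-conditional distribution departs from its prior distribution. The parameter names, the stand-in response and the specific distance measure are all illustrative assumptions, not the actual reactive-transport study.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)

# Monte Carlo samples of uncertain inputs (fabricated stand-ins for reactive-transport parameters).
n = 500
params = {"permeability": rng.lognormal(0.0, 1.0, n),
          "acetate_rate": rng.uniform(0.1, 1.0, n),
          "porosity": rng.uniform(0.2, 0.4, n)}
X = np.column_stack(list(params.values()))

# Stand-in forward response, e.g. residual uranium concentration after remediation.
response = 1.0 / (X[:, 0] * X[:, 1]) + 0.1 * rng.normal(size=n)

# Cluster the responses; sensitive parameters differ between clusters.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(response.reshape(-1, 1))

def sensitivity(values, clusters):
    """Mean absolute difference between cluster-conditional and prior quantiles, scaled by spread."""
    q = np.linspace(0.05, 0.95, 19)
    prior_q = np.quantile(values, q)
    dists = [np.mean(np.abs(np.quantile(values[clusters == c], q) - prior_q))
             for c in np.unique(clusters)]
    return np.mean(dists) / np.std(values)

for name, values in params.items():
    print(name, round(sensitivity(values, clusters), 3))
```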

Dissertation: Holistic strategies for prediction uncertainty quantification of contaminant transport and reservoir production in field cases