
Research

Our research spans a broad spectrum, but its key ingredients are an applied nature, aiming for practical impact, while relying on foundational development in data-scientific methods. Our publication list provides an overview of our various academic contributions. In addition, we work closely with Earth Resources companies and government entities to define specific collaborative sponsored projects. Some recent company projects are listed below, as well as our current student research topics.

 

  • Groundwater management in Denmark (Danish Government)
  • Production planning for a structurally complex reservoir in Libya (Wintershall)
  • Reservoir appraisal for a deep-water turbidite reservoir from seismic data in West Africa (Hess)
  • Designing uranium contaminant remediation in the USA (SE3)
  • Big data predictive analytics for optimizing shale reservoirs (Anadarko, Repsol, NGI)
  • Assessment of low-enthalpy geothermal systems for heating buildings (Belgian government)
  • A strategy for rapid & realistic forecasting of reservoir performance using process-based models (ENI)
  • Direct forecasting and uncertainty updating at the appraisal stage without building new models (Chevron)
  • Implicit dynamic uncertainty for automation of data fusion & prediction in mineral resources evaluation and planning (BHP)
  • Multiple-point geostatistics and Bayesian evidential learning for gas reservoir management (Edison)

 

Current Research

Bayesian evidential learning

The ultimate goal of collecting data, building models and making predictions is to make an informed decision. In the subsurface realm, such decisions are subject to considerable uncertainty. I research a new framework termed “Bayesian Evidential Learning” (BEL) that streamlines the integration of these four components: data, model, prediction and decision. The idea is published in a new book, “Quantifying Uncertainty in Subsurface Systems” (Wiley-Blackwell, 2018), and applied to five real case studies in oil/gas, groundwater, contaminant remediation and geothermal energy. BEL is not a method, but a set of principles derived from Bayesianism that lead to the selection of relevant methods to solve real decision problems. In that sense, BEL focuses on decision-focused data collection and model building. One of the important contributions of BEL is that it is a data-scientific approach that circumvents complex inversion modeling, such as history matching and dynamic data integration, and instead relies on machine learning from Monte Carlo simulation combined with prior falsification. The case studies illustrate how modeling time can be reduced from months to days, making the approach practical for large-scale implementations. Components of BEL include global sensitivity analysis, Monte Carlo, model falsification, prior elicitation and data-scientific methods that reflect the stated principles of its Bayesian philosophy. Jef Caers

Book: Quantifying Uncertainty in Subsurface Systems
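
As a flavor of the falsification step in BEL, the sketch below (illustrative only; the forward model, dimensions and threshold are stand-ins, not taken from any of the case studies) draws Monte Carlo samples from a prior, simulates the data each model would produce, and tests whether the observed data falls inside the prior-simulated data cloud:

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_data(m):
    # hypothetical forward model mapping a subsurface model vector to a
    # low-dimensional data response (stands in for a flow/geophysics simulator)
    return np.array([m.sum(), (m ** 2).sum()]) + rng.normal(0, 0.1, 2)

# Monte Carlo on the prior: sample models, simulate their data responses
prior_models = rng.normal(size=(1000, 5))
prior_data = np.array([forward_data(m) for m in prior_models])

# falsification: is the field observation consistent with the prior-simulated
# data cloud? (a simple Mahalanobis-distance check; threshold illustrative)
d_obs = np.array([0.5, 5.0])
mu, cov = prior_data.mean(0), np.cov(prior_data.T)
dist = np.sqrt((d_obs - mu) @ np.linalg.inv(cov) @ (d_obs - mu))
print("prior falsified" if dist > 3.0 else "prior consistent with d_obs")
```

Only when the prior survives such a test does BEL proceed to learn the data-prediction relationship from the same Monte Carlo ensemble.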

Building informative priors and fast simulation methods for subsurface flow applications

Uncertainty quantification of flow in the subsurface is traditionally hampered by (i) the limited use of prior knowledge in geostatistical simulation methods and (ii) the repeated solution of costly reservoir flow equations. Computational sediment transport models and an abundance of inexpensive remote sensing data represent two potentially important sources of information about depositional patterns. By building up a database of prior knowledge and combining it with modern statistical and machine learning methods, we aim to create fast and realistic simulation methods that leave less room for subjective decisions about uncertainty. Erik Nesvold

Characterizing heterogeneity of resource plays

Successful exploitation of mature resource plays relies heavily on data acquired from the hundreds to thousands of wells already drilled. Emerging plays, however, lack such spatial data coverage, so purely data-oriented analysis is not possible. This research focuses on characterizing the heterogeneity of organic-rich mudrocks at different scales using integrated analysis of core laboratory data, well logs, and seismic. Current studies focus on a quantitative seismic interpretation workflow to obtain geochemical properties. Rock physics templates are used as a prior to construct multiple realizations. Mustafa Al Ibrahim

Combining data science methods with flow simulation in shale resources development

Two approaches exist to uncertainty quantification in petroleum reservoirs in general: data-driven modelling and reservoir simulation. In shale reservoirs, data-driven modelling has become prominent for two reasons. Firstly, the large number of wells drilled leads to large data volumes and a need for rapid decision making. Secondly, the physical processes of hydraulic fracturing and shale production, and their connection to natural fractures, are poorly understood. In this research, I would like to determine the compatibility of data-driven modelling and direct reservoir simulation results in shale reservoirs. Do we get the same answers for well placement and for identifying the factors most important to production using the two different methods? Does forecast quality suffer without reservoir simulation? As a result, I would like to create a synthesis of the two methods, combining the ability of data-driven modelling to handle large datasets with the better physics of reservoir simulation. The application will use datasets provided by Repsol. Alexander Bakay

Global sensitivity analysis for multi-phase flow in the presence of spatially distributed parameters

Sensitivity analysis is performed alongside uncertainty quantification to estimate how response uncertainty is apportioned among uncertain model parameters. Model uncertainty can come from either global or spatially distributed parameters. The challenges in conducting sensitivity analysis and uncertainty quantification are the high dimensionality of model variables and the large computational cost of flow simulations. In addition, uncertainty needs to be updated as more data are obtained. This research proposes to first quantify the sensitivity of both global and spatially distributed parameters by applying dimensionality reduction. Surrogate models are then used to update both model and response uncertainty with high computational efficiency. The results can be used for informed decision making. Jihoon Park

paper: DGSA: A Matlab toolbox for distance-based generalized sensitivity analysis of geoscientific computer experiments

code: https://github.com/SCRFpublic/DGSA
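
The DGSA toolbox linked above is written in Matlab; a minimal Python sketch in the spirit of the distance-based idea (cluster models by their responses, then compare within-cluster parameter distributions to the prior; the ensemble and sensitivity measure are illustrative) looks like this:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# hypothetical ensemble: three input parameters and a scalar response per model;
# the response is constructed so that parameter 0 dominates it
params = rng.uniform(size=(500, 3))
response = params[:, 0] + 0.1 * rng.normal(size=500)

# 1. cluster the models by their responses (distances between responses)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    response.reshape(-1, 1))

# 2. per parameter, compare the within-cluster empirical CDF to the prior CDF
grid = np.linspace(0, 1, 101)

def ecdf(x):
    return np.searchsorted(np.sort(x), grid, side="right") / len(x)

for j in range(params.shape[1]):
    prior_cdf = ecdf(params[:, j])
    d = max(np.abs(ecdf(params[labels == k, j]) - prior_cdf).mean()
            for k in range(3))
    print(f"parameter {j}: sensitivity ~ {d:.3f}")  # parameter 0 scores highest
```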

Groundwater management in Denmark

High-quality drinking water is an important Earth Resource, but global change, in terms of climate, energy needs, population and agriculture, will put considerable stress on fresh-water supplies. Denmark provides a unique case because of a clear formulation of governmental objectives, the appropriate definition and involvement of the various stakeholders, the collection of high-quality data such as airborne EM data, the specific nature of geological heterogeneity, and the general desire to do it the “right” way: leveraging science and community involvement into a sustainable future water supply. Our project, in collaboration with scientists in Denmark, looks at how decision analysis, uncertainty quantification and the assessment of data collection methods can help local authorities in the management of the system, balancing the need for water versus protecting the environment. Jef Caers, Celine Scheidt, Troels Vilhelmsen.

Book: Quantifying Uncertainty in Subsurface Systems

Immersive visualization techniques for communicating geological uncertainty

Geological processes are studied, modeled, and communicated to inform decision making for research, education, or strategic planning of earth resources. Geologists often use technologies that inherently limit true three-dimensional representation, which may hinder understanding of, communication of, and collaboration on often complex and sensitive models. Immersive visualization techniques have evolved over the years, but have been widely avoided due to their cost, size, and relative unreliability compared to traditional visualization, such as two-dimensional representation via computer screens. With the advent of cheap, portable, and powerful head-mounted displays, true three-dimensional immersion can be more easily achieved. We aim to relieve potential decision paralysis and improve the efficacy of communicating uncertainty across the entire decision-making process via collaborative, immersive modeling and visualization. Tyler Hall

Video: VRGE: An Immersive Visualization System for Modeling and Analysis in the Geosciences

Joint quantification of spatial and global uncertainty: application to mitigating agricultural run-off in Denmark

In quantifying uncertainty in a spatially distributed system, one needs to deal with both spatial uncertainty and global model uncertainty. Spatial uncertainty is due to the presence of spatially varying properties (e.g. grade, porosity, hydraulic conductivity). In addition, global quantities, such as the variogram, training image and fluid properties, are uncertain as well. The current practice in geostatistics is to treat global parameters as deterministic variables. In this work, our goal is to develop a probabilistic framework for quantifying both global and spatial uncertainty in earth models jointly. We first use Bayes' rule to obtain the posterior distribution of the global parameters. Given global parameters sampled from this posterior, we then conduct geostatistical simulation with local and global decompositions. We apply this work in the context of mitigating contamination from agriculture in Denmark. The complex heterogeneity of the Danish groundwater system requires that we include both the spatial variability of the buried-valley system and conceptual uncertainty together with other global parameters. Model uncertainty is reduced by two forms of geophysical data: local-scale and regional-scale TEM. Lijing Wang
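
A minimal sketch of such hierarchical sampling, with the variogram range standing in for the uncertain global parameter (the posterior, covariance model and sizes are illustrative assumptions, not the Danish case setup, and the full local/global decomposition is omitted):

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 200)

def simulate_field(range_):
    # unconditional Gaussian simulation with an exponential covariance,
    # via Cholesky factorization (small nugget for numerical stability)
    cov = np.exp(-np.abs(x[:, None] - x[None, :]) / range_)
    L = np.linalg.cholesky(cov + 1e-8 * np.eye(len(x)))
    return L @ rng.normal(size=len(x))

# hypothetical posterior over the global parameter (variogram range),
# e.g. as obtained by Bayes' rule from regional-scale geophysical data
posterior_ranges = rng.lognormal(mean=np.log(0.2), sigma=0.3, size=50)

# hierarchical sampling: draw the global parameter first, then the field
realizations = np.array([simulate_field(r) for r in posterior_ranges])
print(realizations.shape)  # (50, 200): joint global + spatial uncertainty
```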

Learning from big data for rapid uncertainty quantification of exploratory ore deposits

Mineral resources forecasting requires integrating drill-hole and geophysical data under the guidance of geological knowledge. However, in an exploration setting the cost of collecting more data in a specific local area can often be high. Uncertainty models built solely on sparse local data have very limited predictive power. Would it be possible to learn from a large number of already explored sites and rapidly build predictive models for a local area with sparse data? To answer this question, we need to develop efficient statistical inference and modeling techniques. We take advantage of recent developments in computer vision and computer graphics, namely level set methods, for efficiently generating local uncertainty models constrained by data and geological rules, with parameters learned from big data. We aim to reduce the cost (time, money and labor) of mineral exploration with this approach. Liang Yang

Modeling subsurface heterogeneity with surface processes data from flume experiments

Stochastic simulation of stratigraphy is now possible thanks to computer graphics methods and the availability of dense datasets such as those produced by flume experiments. The main questions that remain open are: 1) to what extent do these methods reproduce the real phenomena? and 2) how to calibrate the model inputs with data such as LiDAR point clouds and videos (i.e. sequences of images) collected in the tank? Possible answers to these questions are relevant for modeling natural resources and for acting proactively upon natural disasters. Julio Hoffimann

Video: Building 3D models with Image Quilting

researchgate: Julio's papers on machine learning and AI
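
For intuition, here is a heavily simplified, one-row sketch of the image-quilting idea referenced above (the random candidate search, binary training image and all sizes are illustrative; the actual method handles full 2D/3D overlaps and optimal boundary cuts):

```python
import numpy as np

rng = np.random.default_rng(8)

# hypothetical binary training image (stands in for a flume-derived facies map)
ti = (rng.random((200, 200)) < 0.3).astype(float)

patch, overlap = 20, 5
out = np.zeros((100, 100))

def best_patch(target_strip):
    # scan random training-image patches for the one whose left overlap
    # region best matches what is already simulated (sum of squared diffs)
    best, best_err = None, np.inf
    for _ in range(200):
        i = rng.integers(0, ti.shape[0] - patch)
        j = rng.integers(0, ti.shape[1] - patch)
        cand = ti[i:i + patch, j:j + patch]
        err = ((cand[:, :overlap] - target_strip) ** 2).sum()
        if err < best_err:
            best, best_err = cand, err
    return best

# raster-scan quilting along one row (full 2D/3D also uses a top overlap)
step = patch - overlap
for col in range(0, out.shape[1] - patch + 1, step):
    strip = out[:patch, col:col + overlap]
    out[:patch, col:col + patch] = best_patch(strip)

print(out[:patch].mean())  # facies proportion of the quilted row
```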

Monitoring low-enthalpy heating systems using time-lapse ERT

Low-enthalpy geothermal systems are increasingly used for climatization (heating/cooling) of buildings, in an effort to reduce the carbon footprint of this type of energy use. The main idea is the use of the subsurface, whether rocks or soils, saturated or unsaturated, as a heat source or a heat sink (for cooling). The design as well as the monitoring of shallow geothermal systems, like many other subsurface applications, requires a multi-disciplinary approach involving several fields such as geology, hydrogeology, physics, chemistry, hydraulic engineering design and economics. Characterizing heat flow, temperature changes and their effect on the ambient environment requires characterizing geological heterogeneity and coupled fluid and heat flow. In this research, we focus on the use of a specific method, Electrical Resistivity Tomography (ERT), and its time-lapse variety, to characterize temperature and its changes under shallow geothermal exploitation and monitoring. We develop methods that allow for efficient characterization of the temperature change during production, which is important in the design of heat pumps. Jef Caers, Thomas Hermans

paper: Uncertainty Quantification of Medium‐Term Heat Storage From Short‐Term Geophysical Experiments Using Bayesian Evidential Learning
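
As a back-of-the-envelope illustration of why time-lapse ERT informs temperature: pore-water electrical conductivity increases roughly linearly with temperature in commonly used empirical petrophysical relations. Under the (strong) simplifying assumption that temperature is the only time-varying property, a resistivity ratio maps directly to a temperature change (all numbers below are illustrative):

```python
import numpy as np

# empirical relation (coefficients illustrative): conductivity increases
# roughly linearly with temperature,
# sigma(T) = sigma(T_ref) * (1 + alpha * (T - T_ref)), alpha ~ 0.02 per degC
alpha, T_ref = 0.02, 10.0

def temperature_from_ratio(rho_t, rho_0, T0=T_ref):
    # time-lapse ERT: resistivity ratio -> temperature, assuming fluid
    # conductivity is the only property changing between the two surveys
    return T0 + (rho_0 / rho_t - 1.0) / alpha

rho_baseline = 100.0                      # ohm.m before heat injection
rho_monitor = np.array([95.0, 90.0, 85.0])  # ohm.m during monitoring
print(temperature_from_ratio(rho_monitor, rho_baseline))
```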

Optimization under uncertainty for enhanced geothermal systems

Enhanced Geothermal Systems (EGS) generate electricity by creating an artificial subsurface heat exchange circuit. Cold fluid is injected into the subsurface and heats up as it circulates though hot rocks in a hydraulically fractured reservoir. The fluid is then pumped out of production wells and powers turbines to generate electricity. There are many uncertain parameters when creating an EGS, including: the stress field, the location and orientation of pre-existing fractures, the temperature field, and the physical mechanisms governing fracture creation. Therefore, the reservoir's response to stimulation and operation cannot be predicted with certainty. My research explores the optimization of EGS design parameters, such as the spatial configurations of the laterals, the number of fracturing stages, etc., given the subsurface uncertainty. Ahinoam (Noe) Pollack

Practice of Bayesian evidential learning in reservoir uncertainty quantification

Fast and reliable subsurface reservoir prediction is critical for decision-making processes, especially in the oil and gas industry. However, conventional workflows based on causal analysis can be time-consuming due to iterative model re-construction and forward modelling. A statistical learning approach (“Bayesian Evidential Learning”) is therefore used as an alternative paradigm to the conventional workflows. It uses statistical learning to directly infer the relationship between the data and reservoir model predictions. This statistical relationship, along with the observed data, produces a probabilistic estimate of the reservoir predictions under a Bayesian framework, while offering the flexibility to properly quantify the prediction uncertainty. This research is conducted in collaboration with Chevron. David Zhen Yin
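
A minimal sketch of this direct data-to-prediction learning step (the synthetic ensemble, PCA dimensions, and the linear-regression-plus-residuals posterior are illustrative simplifications of the full workflow):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)

# hypothetical prior ensemble: simulated data (e.g. production history)
# and the target prediction (e.g. future recovery), one row per model
data = rng.normal(size=(500, 100))
pred = data[:, :3].sum(1, keepdims=True) + 0.3 * rng.normal(size=(500, 1))

# reduce the data to a few components, then learn data -> prediction
d_low = PCA(n_components=5).fit_transform(data)
reg = LinearRegression().fit(d_low, pred)

# posterior forecast at the observed data: regression mean + residual spread
resid = pred - reg.predict(d_low)
d_obs_low = d_low[0]  # stand-in: field data would be projected the same way
mean_hat = reg.predict(d_obs_low[None]).item()
posterior = mean_hat + rng.choice(resid.ravel(), 1000)
print(posterior.mean(), posterior.std())
```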

Predicting favorable locations for geothermal development

Development of renewable geothermal energy resources is hindered by high exploration risk due to limited economic profitability and high subsurface uncertainty. To reduce exploration risk, conventional methodologies have sought to predict geothermal resources using spatial aggregation of data thought to be indicators of geothermal potential. However, this approach is problematic because it requires a large dataset of catalogued geothermal resources, which does not exist. In our approach, based on Bayesian Evidential Learning, uncertainty is reduced by directly learning how data, such as shallow temperature wells or geophysical data, relate to temperatures deeper in the subsurface. Noah Athens

paper: Sensitivity of Temperature Predictions in Basin-Scale Hydrothermal Models

Quantifying uncertainty using level sets with stochastic motion

In many geoscience applications, prediction requires building 3D models that are complex and cumbersome. As a result, often a single model, or some small variation of it, is built. One of the main bottlenecks lies in the creation of a high-resolution grid (millions to billions of cells) to provide enough detail and to constrain to data. Uncertainty is provided by generating many realizations, sampled from some posterior distribution. Built into this common notion of modeling lies a fundamental inefficiency: building many accurate high-resolution models, each one of them a poor approximation of actual reality (the truth). In this research, we provide a fundamentally new view on the same problem. Consider some geological geometry (ore, reservoir, fault) whose uncertainty needs to be quantified based on geological understanding, rules and data. Instead of modeling these geometries and their uncertainty with many high-resolution models drawn by Monte Carlo, we model their uncertainty by means of motion (velocity), representing uncertainty as a 3D velocity field treated as a stochastic process. To model 3D surfaces (e.g. faults, horizons, bodies) and their uncertainty, we integrate this stochastic velocity into the level set equation. Level sets are an ideal way to represent mathematically complex surfaces without explicit grid representations. This idea is applied to uncertainty quantification for groundwater systems, mineral resources and oil/gas reservoirs. Jef Caers, Liang Yang, Peter Achtziger-Zupančič, David Hyde, Ron Fedkiw
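
A minimal 2D sketch of the core idea (the stochastic velocity model, grid size and step count are all illustrative): a signed-distance-like function is advected by a random normal velocity via the level set equation, and the ensemble of zero level sets over many velocity histories represents geometric uncertainty:

```python
import numpy as np

rng = np.random.default_rng(4)

# signed-distance-like initial surface: a circle on a 2D grid
n = 128
y, x = np.mgrid[0:n, 0:n] / n
phi = np.sqrt((x - 0.5) ** 2 + (y - 0.5) ** 2) - 0.25

def smooth_noise(n, k=8):
    # blocky but spatially correlated random field (illustrative stand-in
    # for a properly modeled stochastic velocity)
    z = rng.normal(size=(k, k))
    return np.kron(z, np.ones((n // k, n // k)))

# evolve the zero level set under phi_t + v |grad phi| = 0,
# discretized with a simple explicit step
dt = 0.5 / n
for _ in range(20):
    v = 0.5 * smooth_noise(n)          # stochastic velocity field
    gy, gx = np.gradient(phi)
    phi -= dt * v * np.hypot(gx, gy) * n  # |grad phi| in physical units
    # each step perturbs the geometry; repeating this over many random
    # velocity histories yields an ensemble of perturbed surfaces

print((phi < 0).mean())  # fraction of the domain inside the perturbed body
```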

Recognition of sub-resolution stacking patterns from seismic data

Geological modeling and characterization of energy reservoirs relies greatly on the interpretation of seismic data due to its high spatial availability. Owing to many factors, including subsurface complexity, heterogeneity and limited seismic resolution, an accurate description of the subsurface geological configuration is a major challenge. This research is focused on quantitative interpretation of individual seismic traces and extraction of information relevant to the recognition of various geological parasequences. The proposed workflow includes stochastic modeling of stratigraphic patterns involving thin sub-resolution beds, and the subsequent application of statistical pattern recognition techniques to deduce those patterns from their respective acoustic responses. Riyad Muradov
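
A minimal sketch of the forward part of such a workflow (the bed statistics, impedance values and wavelet frequency are illustrative): stack random sub-resolution beds, form the reflectivity series, and convolve with a wavelet to obtain the acoustic response from which the stacking pattern must be recognized:

```python
import numpy as np

rng = np.random.default_rng(5)

def ricker(f, dt, n=101):
    # zero-phase Ricker wavelet of peak frequency f (Hz)
    t = (np.arange(n) - n // 2) * dt
    a = (np.pi * f * t) ** 2
    return (1 - 2 * a) * np.exp(-a)

# stochastic stack of thin beds: alternating sub-resolution layers
dt = 0.001                                    # 1 ms sampling
thick = rng.integers(1, 4, size=60)           # bed thickness in samples
imp = np.repeat(rng.choice([4.0e6, 6.0e6], size=60), thick)  # impedance

# reflectivity from impedance contrasts, then a 30 Hz synthetic trace
refl = np.diff(imp) / (imp[1:] + imp[:-1])
trace = np.convolve(refl, ricker(30.0, dt), mode="same")
print(trace.shape)  # acoustic response of a sub-resolution stacking pattern
```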

Seismic reservoir property estimation and uncertainty quantification with statistical learning techniques

Seismic reservoir characterization has conventionally necessitated estimation of subsurface elastic properties, given that seismic data is the elastic response of the subsurface. Inverse methods employed to this end, however, generally run into theoretical, numerical and computational complexities, exacerbated by the high-dimensional nature of seismic data. The research question we pose is: is such a step warranted when the end goal is reservoir characterization? We employ statistical learning techniques to learn the implicit relationship between seismic data and reservoir properties. This learning is subsequently exploited for property estimation and uncertainty quantification, circumventing the limiting component of seismic inversion. Anshuman Pradhan

Stochastic fracture network simulation constrained by geophysical data

Fracture networks are important features of physical systems such as enhanced geothermal systems and unconventional oil & gas reservoirs. While fracture network models exist, incorporating real data in such simulations remains a challenge. Data may come in the form of seismic, microseismic, boreholes, surface outcrops, and more. This research focuses on developing methods for modeling fracture networks stochastically while constraining to observed geophysical data, primarily in the form of seismic and microseismic information. The stochastic approach allows for outcome uncertainty to be quantified and the geophysical data helps constrain the realizations to create realistic models. Alex Miltenberger
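
A toy sketch of data-constrained stochastic fracture simulation (the marked-point-process priors and the nearest-event acceptance rule are illustrative stand-ins for proper conditioning to seismic and microseismic data):

```python
import numpy as np

rng = np.random.default_rng(7)

# hypothetical microseismic event cloud (map view, metres)
events = rng.normal(loc=[500, 500], scale=[120, 60], size=(300, 2))

def sample_fracture():
    # marked-point-process style fracture: centre, orientation, length
    centre = rng.uniform(0, 1000, size=2)
    strike = rng.vonmises(mu=0.0, kappa=4.0)   # preferred orientation
    length = 20.0 * rng.pareto(2.5) + 20.0     # power-law length distribution
    return centre, strike, length

# stochastic simulation constrained by the geophysical data: keep a
# candidate fracture only if it lies near the microseismic cloud
network = []
while len(network) < 100:
    centre, strike, length = sample_fracture()
    if np.min(np.linalg.norm(events - centre, axis=1)) < 50.0:
        network.append((centre, strike, length))

print(len(network), "fractures accepted")
```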

Uncertainty quantification and global sensitivity analysis for reactive transport models, application to uranium remediation

In addressing subsurface contamination, it is important to understand the efficacy of remediation practices. Do these practices achieve what they are intended for? How can we monitor this by means of sensors or geophysics? A major challenge is to address complexities in both geological heterogeneity and uncertainties about the biogeochemical system. In this research, we developed a general workflow to address such questions and demonstrate it on the uranium remediation experiment at the Rifle site, Colorado. Groundwater contamination caused by uranium mill tailings left over from the Cold War era is a vital environmental concern in the United States. In Colorado, attention has been given to a number of hazardous sites adjacent to the Colorado River because of elevated concentrations of contaminants that can be harmful to young-of-year fish using backwater channels as habitat during late summer. Over the last several years, acetate injection has been proposed and tested at the Rifle pilot site to examine the effectiveness of in-situ bio-remediation. Our research addresses how both geological and geochemical (reaction) uncertainties can be integrated to quantify the efficacy of acetate injection as a remediation practice. Jef Caers, Kate Maher

Dissertation: Holistic strategies for prediction uncertainty quantification of contaminant transport and reservoir production in field cases

Using level sets with stochastic motion for uncertainty visualization and risk assessment for mineral resources

In the mining industry, resource forecasting and uncertainty estimation of deposits are usually performed based on a single best-fit geological model and associated best-guess limits. These detailed models demand high computational power and a large human workforce. Our new approach to uncertainty quantification relies on stochastic but geologically meaningful perturbations of level sets in 3D space, with uncertainty treated as an additional dimension through stochastic motion. This is a computationally effective method, allowing the visualization and assessment of uncertainty without modeling multiple scenarios explicitly. To further reduce the computational demand, general rules for local grid optimization are developed which honor data availability and maintain the models' statistics and uncertainties. The tools are developed and their functionalities tested on a range of real-world examples covering mining of copper, coal, iron and other commodities. Peter Achtziger-Zupančič

Value of information of time-lapse seismic data in reservoir development

Though time-lapse seismic data is frequently used in reservoir development to monitor fluid saturation changes resulting from production, it comes at a high cost that needs to be justified. Value of information (VOI) is a decision-analytic metric that can be used for this purpose. The VOI is computed before actually collecting the data and is a measure of the additional value that the information could bring in a particular decision situation. VOI analysis using a conventional model-driven approach is computationally very expensive because posterior models have to be built for every possible dataset. My research involves VOI analysis using a data-driven approach called simulation-regression, in which prior models are used to build a statistical relationship between the data and the prospect values, which is then used to directly predict the values given the data. Geetartha Dutta

paper: Simulation–Regression Approximations for Value of Information Analysis of Geophysical Data
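
A minimal sketch of the simulation-regression idea (two decision alternatives, a synthetic ensemble and plain linear regressions are illustrative simplifications): regress prospect value on simulated data for each alternative, then compare the expected value of the best alternative with and without the data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(6)

# prior ensemble: hypothetical prospect values under two alternatives
# (develop vs. walk away) and the time-lapse data each model predicts
n = 2000
truth = rng.normal(size=n)                      # latent reservoir quality
value = np.c_[5 * truth - 1, np.zeros(n)]       # develop payoff vs. nothing
data = truth[:, None] + 0.5 * rng.normal(size=(n, 4))  # noisy "seismic" data

# value without data: pick the alternative with the best prior mean
v_prior = value.mean(0).max()

# simulation-regression: learn E[value | data] for each alternative, then
# average the per-sample best alternative over the simulated data ensemble
cond = np.column_stack([
    LinearRegression().fit(data, value[:, a]).predict(data)
    for a in range(2)
])
v_with_data = np.maximum(cond[:, 0], cond[:, 1]).mean()

print("VOI ~", v_with_data - v_prior)  # positive: the data has decision value
```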