In general, real-field HM workflows can be highly computationally demanding, mostly
on account of time-consuming forward reservoir simulations. Furthermore, when a
full-fledged uncertainty analysis of the high-resolution geological model is addressed
through the generation of multiple (i.e., sometimes on the order of hundreds of) static model
realizations, the multi-iteration AHM workflows may become prohibitively expensive. The
QuantUM AHM module addresses the issue of computational efficiency in two ways: a) it
takes full advantage of parallel execution of the VIP® and/or Nexus® reservoir simulators
wherever multiple CPU cores are available, and b) it distributes the computational load
via a standard job-submission protocol wherever multi-node computational resources are
available.
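As a purely illustrative sketch of the parallel-dispatch idea (not the actual QuantUM submission protocol), the following Python snippet distributes independent forward runs over local CPU cores; run_simulation and the run_forward_sim command are hypothetical placeholders for a single VIP/Nexus forward run:

from concurrent.futures import ProcessPoolExecutor
import subprocess

def run_simulation(realization_id):
    # Hypothetical wrapper: launch one forward simulation for a given static
    # model realization and return its exit status.
    return subprocess.call(["run_forward_sim", f"realization_{realization_id}.dat"])

def simulate_ensemble(n_realizations, n_workers):
    # Distribute the ensemble of independent forward runs over the available cores.
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(run_simulation, range(n_realizations)))

# e.g., 50 static model realizations on 8 cores:
# statuses = simulate_ensemble(50, 8)

On multi-node clusters the same pattern applies, with the local pool replaced by jobs submitted through the cluster's standard batch-submission protocol.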
The proposals generated by the Metropolis-Hastings sampler of the two-step MCMC-based
inversion workflow are very likely positively correlated; therefore, the convergence
diagnostics ought to be governed by the estimators averaged over the ensemble of
realizations. The QuantUM AHM workflow implements the maximum entropy test (Full et al.,
1983), where the (negative) entropy, S, of the sampled stationary (posterior) distribution is
defined as the expected value of the logarithms of the posterior terms of the objective function,
p(m|d). Further mathematical derivations of the entropy, S, and its variance,
implemented as convergence measures in the AHM workflow, are given in Maučec et al. (2007)
and Maučec et al. (2011a). Selected results of the QuantUM AHM workflow validation are shown
in Fig. 12, with additional information available in Maučec et al. (2011a) and Maučec et al. (2011b):
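As a minimal numerical sketch (assuming the log-posterior values of the sampled realizations are already collected in an array; the rigorous derivation is given in the cited references), the two convergence measures can be estimated as ensemble averages:

import numpy as np

def entropy_measures(log_posterior):
    # log_posterior: array of ln p(m|d) values, one per sampled model realization.
    # The (negative) entropy S is estimated as the ensemble average of the
    # log-posterior terms; its sample variance serves as the second convergence measure.
    s = np.mean(log_posterior)
    s_var = np.var(log_posterior, ddof=1)
    return s, s_var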
The behavior of the (negative) entropy, S (Fig. 12a), and of the objective function, defined as
the logarithm of the transition probability of the two-step Metropolis-Hastings sampler (Fig.
12b), demonstrates the convergence rate of the MCMC sequence, with a burn-in period
of approximately 750 samples (i.e., the total number of processed samples is 1,500, the
product of 50 model realizations and 30 MCMC iterations).
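For illustration only (placeholder array, hypothetical names), the burn-in bookkeeping implied by the numbers above looks as follows:

import numpy as np

n_realizations, n_iterations = 50, 30             # 50 x 30 = 1,500 processed samples
burn_in = 750                                     # approximate burn-in from Fig. 12a/b

# log_posterior_chain: hypothetical (n_iterations, n_realizations) array of ln p(m|d)
log_posterior_chain = np.zeros((n_iterations, n_realizations))   # placeholder data

flat_chain = log_posterior_chain.reshape(-1)      # samples in processing order
stationary = flat_chain[burn_in:]                 # keep only the post-burn-in samples
s_estimate = stationary.mean()                    # entropy estimator over the stationary part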
A comparison of dynamic well production responses, here defined as the water-cut
curves, calculated with the prior (Fig. 12c) and posterior (Fig. 12d) model realizations
demonstrates the efficiency of the water-cut misfit reduction between the simulated and
observed data. To illustrate the point, the production response of one of the wells
with the most pronounced production dynamics over the 10-year period is depicted;
such non-monotonic behavior is usually the most challenging to match.
The HM workflow demonstrates a significant reduction in the discrepancy of the mean
dynamic response (Fig. 12e), calculated over the ensemble of 50 history-matched models,
with respect to the observed well water-cut curve, as well as an impressive reduction
in the ensemble-averaged variance (Fig. 12f) of the history-matched responses with respect
to the observed production curves.
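To make these diagnostics concrete, a small illustrative sketch (hypothetical array names, not the QuantUM implementation) of the quantities compared in Figs. 12c-12f:

import numpy as np

def ensemble_diagnostics(simulated_wct, observed_wct):
    # simulated_wct: (n_models, n_timesteps) water-cut curves of the 50 realizations.
    # observed_wct:  (n_timesteps,) measured water-cut curve.
    mean_response = simulated_wct.mean(axis=0)
    # Discrepancy of the ensemble-mean response w.r.t. the observed curve (cf. Fig. 12e).
    mean_discrepancy = np.abs(mean_response - observed_wct).mean()
    # Ensemble-averaged squared deviation of the responses from the observations (cf. Fig. 12f).
    ensemble_variance = ((simulated_wct - observed_wct) ** 2).mean()
    return mean_discrepancy, ensemble_variance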
Figs. 12g and 12h depict log-permeability maps for one realization of the top layer of the
Brugge fluvial reservoir, corresponding to the prior (i.e., initial, not history-matched) and
posterior (i.e., history-matched) models, respectively. The areas where the history-
matching algorithm attempts to reconcile the static model with the dynamic data, by
connecting spatially separated high-permeability regions to facilitate fluid flow as
governed by the calculated streamline sensitivities, are highlighted.
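The localization idea can be sketched in a loose, purely illustrative form (not the actual update used by the algorithm) as a sensitivity-weighted perturbation of the log-permeability field:

import numpy as np

def localized_logperm_update(log_perm, proposed_update, sensitivity):
    # log_perm:        2D log-permeability map of one layer (e.g., the Brugge top layer).
    # proposed_update: perturbation proposed by the sampler, same shape as log_perm.
    # sensitivity:     streamline-based sensitivity map; large values mark cells whose
    #                  permeability strongly affects the production-data misfit.
    weight = sensitivity / (np.abs(sensitivity).max() + 1e-12)    # normalize to [0, 1]
    # Apply the update preferentially where the dynamic data are sensitive, which tends
    # to connect high-permeability regions along the dominant flow paths.
    return log_perm + weight * proposed_update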
2.6 Quantification of reservoir production forecast uncertainty
By their nature, probabilistic history-matching workflows use multiple equally probable
but non-unique realizations of geological models that honor prior spatial constraints and