Abstract

We have developed a support vector regression (SVR) accelerated variant of the distributed derivative-free optimization (DFO) method using the limited-memory BFGS Hessian updating formulation (LBFGS) for subsurface field-development optimization problems. The SVR-enhanced distributed LBFGS (D-LBFGS) optimizer is designed to effectively locate multiple local optima of highly nonlinear optimization problems subject to numerical noise. It operates on both single- and multiple-objective field-development optimization problems. The basic D-LBFGS DFO optimizer runs multiple optimization threads in parallel and uses the linear interpolation method to approximate the sensitivity matrix of simulated responses with respect to optimized model parameters. However, this linear approximation is less accurate and slows down convergence. In this paper, we implement an effective variant of the SVR method, namely ε-SVR, and integrate it into the D-LBFGS engine in synchronous mode within the framework of a versatile optimization library inside a next-generation reservoir simulation platform. Because ε-SVR has a closed-form predictive formulation, we analytically calculate the approximated objective function and its gradients with respect to the input model variables subject to optimization. We investigate two different methods to propose a new search point for each optimization thread in each iteration through seamless integration of ε-SVR with the D-LBFGS optimizer. The first method estimates the sensitivity matrix and the gradients directly using the analytical ε-SVR surrogate and then solves an LBFGS trust-region subproblem (TRS). The second method applies a trust-region search LBFGS method to optimize the approximated objective function using the analytical ε-SVR surrogate within a box-shaped trust region. We first show that ε-SVR provides accurate estimates of gradient vectors on a set of nonlinear analytical test problems.
We then report the results of numerical experiments conducted using the newly proposed SVR-enhanced D-LBFGS algorithms on both synthetic and realistic field-development optimization problems. We demonstrate that these algorithms operate effectively on realistic nonlinear optimization problems subject to numerical noise. We show that both SVR-enhanced D-LBFGS variants converge faster and thereby provide a significant acceleration over the basic implementation of D-LBFGS with linear interpolation.
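The abstract's key mechanism is that an ε-SVR surrogate has a closed-form prediction, so its gradient with respect to the input variables can be written analytically rather than estimated by interpolation. The following sketch illustrates this idea for an RBF-kernel ε-SVR expansion; the support vectors, dual coefficients, bias, and kernel width below are hypothetical stand-ins for what a trained ε-SVR model would provide, not values from the paper, and the gradient is checked against central finite differences.

```python
import numpy as np

# Hedged sketch of the analytical-gradient idea for an RBF-kernel eps-SVR
# surrogate. The closed-form prediction is
#   f(x) = sum_i alpha_i * exp(-gamma * ||x - x_i||^2) + b
# so the exact gradient with respect to x is
#   grad f(x) = sum_i alpha_i * (-2 * gamma) * (x - x_i) * K(x_i, x).

def svr_predict(x, support_vectors, alpha, b, gamma):
    """Closed-form eps-SVR surrogate prediction at point x."""
    diffs = x - support_vectors                       # (n_sv, d)
    k = np.exp(-gamma * np.sum(diffs**2, axis=1))     # (n_sv,)
    return alpha @ k + b

def svr_gradient(x, support_vectors, alpha, b, gamma):
    """Analytical gradient of the surrogate with respect to x."""
    diffs = x - support_vectors
    k = np.exp(-gamma * np.sum(diffs**2, axis=1))
    return -2.0 * gamma * (alpha * k) @ diffs         # (d,)

# Hypothetical trained-model quantities (placeholders for illustration).
rng = np.random.default_rng(0)
sv = rng.normal(size=(8, 3))      # support vectors
alpha = rng.normal(size=8)        # dual coefficients
b, gamma = 0.5, 0.7
x = rng.normal(size=3)            # query point

g = svr_gradient(x, sv, alpha, b, gamma)

# Verify the analytical gradient against central finite differences.
eps = 1e-6
fd = np.array([
    (svr_predict(x + eps * np.eye(3)[j], sv, alpha, b, gamma)
     - svr_predict(x - eps * np.eye(3)[j], sv, alpha, b, gamma)) / (2 * eps)
    for j in range(3)
])
assert np.allclose(g, fd, atol=1e-6)
```

In the paper's setting, gradients obtained this way feed either the sensitivity-matrix estimate for the LBFGS trust-region subproblem (first method) or a trust-region search over the surrogate itself (second method).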

Details

Title
A machine-learning-accelerated distributed LBFGS method for field development optimization: algorithm, validation, and applications
Author
Alpak, Faruk 1 ; Gao, Guohua 2 ; Florez, Horacio 2 ; Shi, Steve 2 ; Vink, Jeroen 3 ; Blom, Carl 3 ; Saaf, Fredrik 2 ; Wells, Terence 3 

1 Shell Woodcreek Campus, Shell International Exploration and Production Inc., Houston, USA 
2 Shell Global Solutions (US) Inc., Houston, USA 
3 Shell Global Solutions B.V., Den Haag, Netherlands 
Pages
425-450
Publication year
2023
Publication date
Jun 2023
Publisher
Springer Nature B.V.
ISSN
1420-0597
e-ISSN
1573-1499
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2814155137
Copyright
© The Author(s), under exclusive licence to Springer Nature Switzerland AG 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.