
Abstract

The transient electromagnetic (TEM) method is a crucial tool for subsurface exploration, providing essential insights into the electrical resistivity structures beneath the Earth’s surface. Traditional forward modeling approaches, such as the finite-difference time-domain (FDTD) method and the finite-element method (FEM), are computationally intensive, limiting their practicality for real-time, high-resolution, or large-scale investigations. To address these challenges, we present Deep-TEMNet, an advanced deep learning framework specifically designed for two-dimensional TEM forward modeling. Deep-TEMNet integrates the U-Net architecture with a tailored two-dimensional long short-term memory (2D LSTM) module, allowing it to effectively capture complex spatial-temporal relationships in TEM data. The U-Net component enables high-resolution spatial feature extraction, while the 2D LSTM module enhances temporal modeling by processing spatial sequences in two dimensions, thereby improving the representation of electromagnetic field dynamics over time. Trained on high-fidelity FEM-generated datasets, Deep-TEMNet achieves exceptional accuracy in reproducing electromagnetic field distributions across diverse geological scenarios, with a mean squared error of 1.34 × 10⁻⁶ and a root mean square percentage error of 0.002373019. The framework runs more than 150 times faster than conventional FEM solvers, with an average inference time of just 3.26 s. Extensive validation across varied geological conditions highlights Deep-TEMNet’s robustness and adaptability, establishing its potential for efficient, large-scale subsurface mapping and real-time data processing. By combining U-Net’s spatial resolution capabilities with the sequential processing strength of the 2D LSTM module, Deep-TEMNet significantly improves computational efficiency and accuracy, positioning it as a valuable tool for geophysical exploration, environmental monitoring, and other applications that require scalable, real-time TEM analysis readily integrated into remote sensing workflows.
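To make the architecture summarized above concrete, the following is a minimal PyTorch-style sketch of a hybrid U-Net with a convolutional 2D LSTM cell in its bottleneck, mapping a resistivity image to a short sequence of field snapshots. This is not the authors' released code; the layer widths, the number of unrolled time steps, and the way the recurrent state feeds the decoder are illustrative assumptions.

import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    # Convolutional LSTM cell: all gates are computed with 2D convolutions,
    # so the hidden state keeps a spatial layout (one common way to realize
    # a "2D LSTM"; the paper's exact formulation may differ).
    def __init__(self, in_ch, hid_ch, k=3):
        super().__init__()
        self.gates = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, k, padding=k // 2)

    def forward(self, x, state):
        h, c = state
        i, f, o, g = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True))

class UNetConvLSTM(nn.Module):
    # Tiny U-Net whose bottleneck is unrolled over time with a ConvLSTM,
    # mapping one resistivity image to a sequence of field snapshots.
    def __init__(self, in_ch=1, out_ch=1, base=16, steps=8):
        super().__init__()
        self.steps = steps
        self.enc1 = conv_block(in_ch, base)        # high-resolution features
        self.enc2 = conv_block(base, 2 * base)     # coarse bottleneck features
        self.pool = nn.MaxPool2d(2)
        self.lstm = ConvLSTMCell(2 * base, 2 * base)
        self.up = nn.ConvTranspose2d(2 * base, base, 2, stride=2)
        self.dec = conv_block(2 * base, base)      # after skip concatenation
        self.head = nn.Conv2d(base, out_ch, 1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        h = torch.zeros_like(e2)
        c = torch.zeros_like(e2)
        outs = []
        for _ in range(self.steps):                # unroll the time axis
            h, c = self.lstm(e2, (h, c))
            d = self.dec(torch.cat([self.up(h), e1], dim=1))
            outs.append(self.head(d))
        return torch.stack(outs, dim=1)            # (batch, time, out_ch, H, W)

# Example: one 64x64 resistivity model -> 8 field snapshots
model = UNetConvLSTM()
print(model(torch.randn(1, 1, 64, 64)).shape)      # torch.Size([1, 8, 1, 64, 64])

The sketch mirrors the division of labor described in the abstract: the encoder-decoder path with a skip connection preserves spatial resolution, while the recurrent bottleneck carries the evolution of the electromagnetic field across time steps.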

Details

Business indexing term
1009240
Title
Deep-TEMNet: A Hybrid U-Net–2D LSTM Network for Efficient and Accurate 2.5D Transient Electromagnetic Forward Modeling
Author
Qu, Zhijie 1; Gao, Yuan 1; Kang, Xing 2; Zhang, Xiaojuan 3

1 Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100094, China; [email protected] (Z.Q.); [email protected] (Y.G.); Key Laboratory of Electromagnetic Radiation and Sensing Technology, Chinese Academy of Sciences, Beijing 100190, China; School of Electronic, Electrical and Communication Engineering, University of Chinese Academy of Sciences, Beijing 100049, China
2 State Key Laboratory of Space-Earth Integrated Information Technology, Beijing 100095, China; [email protected]; Beijing Institute of Satellite Information Engineering, Beijing 100095, China
3 Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100094, China; [email protected] (Z.Q.); [email protected] (Y.G.); Key Laboratory of Electromagnetic Radiation and Sensing Technology, Chinese Academy of Sciences, Beijing 100190, China
Publication title
Remote Sensing
Volume
17
Issue
2
First page
264
Publication year
2025
Publication date
2025
Publisher
MDPI AG
Place of publication
Basel
Country of publication
Switzerland
Publication subject
e-ISSN
2072-4292
Source type
Scholarly Journal
Language of publication
English
Document type
Journal Article
Publication history
 
 
Online publication date
2025-01-13
Milestone dates
2024-11-12 (Received); 2025-01-09 (Accepted)
   First posting date
13 Jan 2025
ProQuest document ID
3159535658
Document URL
https://www.proquest.com/scholarly-journals/deep-temnet-hybrid-u-net-2d-lstm-network/docview/3159535658/se-2?accountid=208611
Copyright
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Last updated
2025-01-25
Database
ProQuest One Academic