
Abstract

Mobile Edge Computing (MEC) is a computational paradigm that brings resources closer to the network edge to provide fast and efficient computing services for Mobile Devices (MDs). However, MDs are often constrained by limited energy and computational resources, which are insufficient to handle large numbers of tasks. These limitations have driven the adoption of Wireless Power Transfer (WPT) and Energy Harvesting (EH) as a potential solution, in which electrical energy is transmitted wirelessly, harvested by MDs, and converted into usable power. This paper considers a wireless-powered MEC network employing a binary offloading policy, in which the computation task of each MD is either executed locally or fully offloaded to an edge server (ES). The objective is to optimize binary offloading decisions under dynamic wireless channel conditions and energy-harvesting constraints. To this end, an Energy-Harvesting Reinforcement Learning-based Offloading Decision Algorithm (EHRL) is proposed. EHRL integrates Reinforcement Learning (RL) with Deep Neural Networks (DNNs) to dynamically optimize binary offloading decisions, obviating the need for manually labeled training data and for repeatedly solving complex optimization problems. To enhance the offloading decision-making process, the algorithm incorporates the Newton-Raphson method for fast and efficient optimization of the computation rate under energy constraints. In parallel, the DNN is trained with the Nadam optimizer (Nesterov-accelerated Adaptive Moment Estimation), which combines the benefits of Adam and Nesterov momentum to improve convergence speed and training stability. The proposed algorithm addresses the dual challenges of limited energy availability at MDs and the need for efficient task offloading to minimize latency and maximize computational performance. Numerical results validate the superiority of the proposed approach, demonstrating significant gains in computation performance and time efficiency over conventional techniques and making real-time, optimal offloading design viable even in fast-fading environments.
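To make the pipeline described in the abstract concrete, the sketch below (Python; not the authors' code) illustrates one plausible reading of the EHRL loop: a DNN trained with Nadam emits relaxed offloading scores for each MD, a quantizer expands them into candidate binary offloading modes, and the best candidate both drives the decision and serves as the training label, which is what removes the need for externally labeled data. The layer sizes, the quantize heuristic, and the rate_oracle surrogate are illustrative assumptions; only the use of Nadam and of a Newton-Raphson inner step comes from the abstract.

```python
import numpy as np
import tensorflow as tf

N = 10  # number of mobile devices (illustrative)

# DNN mapping a channel-gain vector to relaxed offloading scores in (0, 1),
# trained with Nadam as the abstract specifies. Layer sizes are assumptions.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(N,)),
    tf.keras.layers.Dense(120, activation="relu"),
    tf.keras.layers.Dense(80, activation="relu"),
    tf.keras.layers.Dense(N, activation="sigmoid"),
])
model.compile(optimizer=tf.keras.optimizers.Nadam(learning_rate=1e-3),
              loss="binary_crossentropy")

def quantize(scores, k=3):
    """Turn relaxed scores into k candidate binary offloading modes:
    direct rounding plus single-bit flips at the most ambiguous entries."""
    base = (scores >= 0.5).astype(int)
    candidates = [base]
    for idx in np.argsort(np.abs(scores - 0.5))[:k - 1]:
        flipped = base.copy()
        flipped[idx] ^= 1
        candidates.append(flipped)
    return candidates

def newton_raphson(f, fprime, x0, tol=1e-8, max_iter=50):
    """Generic Newton-Raphson root finder, standing in for the paper's fast
    inner optimization of the computation rate under energy constraints.
    E.g., newton_raphson(lambda x: x**2 - 2.0, lambda x: 2.0*x, 1.0)
    converges to 1.41421..."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

def rate_oracle(h, mode):
    """Hypothetical per-mode rate evaluator; in the paper the corresponding
    subproblem would be solved via Newton-Raphson. A toy surrogate keeps
    the sketch runnable end to end."""
    return float(np.sum(h * mode) + 0.5 * np.sum(h * (1 - mode)))

# One RL-style iteration: observe channel gains, generate candidate modes,
# keep the best, and train the DNN on that self-generated label.
rng = np.random.default_rng(0)
h = rng.rayleigh(size=(1, N)).astype("float32")   # fading channel gains
scores = model.predict(h, verbose=0)[0]
best = max(quantize(scores), key=lambda m: rate_oracle(h[0], m))
model.fit(h, best[np.newaxis].astype("float32"), epochs=1, verbose=0)
```

Because the label is the best mode found at each channel realization rather than a precomputed optimum, this style of loop amortizes the cost of the underlying optimization over time, which is consistent with the abstract's claim of real-time viability under fast fading.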

Details

Title
Energy-Harvesting Reinforcement Learning-based Offloading Decision Algorithm for Mobile Edge Computing Networks (EHRL)
Publication title
PLoS One; San Francisco
Volume
20
Issue
11
First page
e0336903
Number of pages
19
Publication year
2025
Publication date
Nov 2025
Section
Research Article
Publisher
Public Library of Science
Place of publication
San Francisco
Country of publication
United States
e-ISSN
1932-6203
Source type
Scholarly Journal
Language of publication
English
Document type
Journal Article
Publication history
Milestone dates
2025-05-04 (Received); 2025-10-31 (Accepted); 2025-11-26 (Published)
ProQuest document ID
3276035657
Document URL
https://www.proquest.com/scholarly-journals/energy-harvesting-reinforcement-learning-based/docview/3276035657/se-2?accountid=208611
Copyright
© 2025 Bayoumi et al. This is an open access article distributed under the terms of the Creative Commons Attribution License: http://creativecommons.org/licenses/by/4.0/ (the “License”), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Last updated
2025-11-27
Database
ProQuest One Academic