

Abstract

With recent developments in wearable technology, the possibility of continuously monitoring stress from physiological signals has attracted much attention. Early detection of stress can improve healthcare by reducing the detrimental effects of chronic stress. Machine Learning (ML) models for healthcare systems require adequate user data to track health status; however, privacy concerns limit the data that can be shared, making it challenging to deploy Artificial Intelligence (AI) models in the medical domain. This research aims to preserve the privacy of patient data while classifying wearable-based electrodermal activity. We propose a Federated Learning (FL) approach using a Deep Neural Network (DNN) model. For experimentation, we use the Wearable Stress and Affect Detection (WESAD) dataset, which includes five data states: transient, baseline, stress, amusement, and meditation. We transform the raw dataset into a form suitable for the proposed methodology using the Synthetic Minority Oversampling Technique (SMOTE) and min-max normalization. In the FL-based technique, each client trains the DNN on its data individually, and the global model is updated after receiving the model updates from the two clients. To reduce the over-fitting effect, every client repeats the analysis three times. Accuracy, Precision, Recall, F1-score, and Area Under the Receiver Operating Characteristic Curve (AUROC) are evaluated for each client. The experimental results show the effectiveness of the FL-based technique on a DNN, reaching 86.82% accuracy while also preserving the privacy of patient data. Using the FL-based DNN model on the WESAD dataset improves detection accuracy compared to previous studies while also protecting the privacy of patient data.
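The training scheme the abstract describes — min-max normalization of features, local training on each of two clients, and server-side averaging of the model updates over several rounds — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic data, the single logistic unit standing in for the DNN, and all hyperparameters (learning rate, epoch counts, round count) are assumptions chosen only to keep the example short and runnable.

```python
import numpy as np

def min_max_normalize(X):
    # Min-max scaling of each feature to [0, 1], as in the paper's pre-processing.
    mn, mx = X.min(axis=0), X.max(axis=0)
    return (X - mn) / (mx - mn + 1e-12)

def train_client(weights, X, y, lr=0.5, epochs=20):
    # Local update on one client's private data; a logistic unit stands in
    # for the paper's DNN to keep the sketch self-contained.
    w = weights.copy()
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))   # sigmoid predictions
        w -= lr * X.T @ (p - y) / len(y)     # gradient step on the log-loss
    return w

def federated_average(client_weights):
    # Server-side federated averaging of the clients' parameter vectors;
    # raw data never leaves a client, only model updates are shared.
    return np.mean(client_weights, axis=0)

# Synthetic stand-in for WESAD features (hypothetical data, two classes).
rng = np.random.default_rng(0)
raw = rng.normal(size=(200, 4))
y = (raw[:, 0] + raw[:, 1] > 0).astype(float)        # stress / no-stress label
X = np.hstack([min_max_normalize(raw), np.ones((200, 1))])  # append bias column

w = np.zeros(5)
for _ in range(3):  # a few communication rounds
    updates = [
        train_client(w, X[:100], y[:100]),   # client 1 trains locally
        train_client(w, X[100:], y[100:]),   # client 2 trains locally
    ]
    w = federated_average(updates)           # server aggregates

accuracy = float(np.mean(((X @ w) > 0) == y))
```

Each round, both clients start from the shared global weights, train on their own partition, and return only parameters for averaging — the privacy-preserving property the abstract refers to.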

Details

Title
Wrist-Based Electrodermal Activity Monitoring for Stress Detection Using Federated Learning
Author
Almadhor, Ahmad 1; Sampedro, Gabriel Avelino 2; Abisado, Mideth 3; Abbas, Sidra 4; Kim, Ye-Jin 5; Khan, Muhammad Attique 6; Baili, Jamel 7; Cha, Jae-Hyuk 5

1 Department of Computer Engineering and Networks, College of Computer and Information Sciences, Jouf University, Sakaka 72388, Saudi Arabia; [email protected]
2 Faculty of Information and Communication Studies, University of the Philippines Open University, Los Baños 4031, Philippines; [email protected]; Center for Computational Imaging and Visual Innovations, De La Salle University, 2401 Taft Ave., Malate, Manila 1004, Philippines
3 College of Computing and Information Technologies, National University, Manila 1008, Philippines; [email protected]
4 Department of Computer Science, COMSATS University, Islamabad 45550, Pakistan
5 Department of Computer Science, Hanyang University, Seoul 04763, Republic of Korea; [email protected] (Y.-J.K.); [email protected] (J.-H.C.)
6 Department of Computer Science, HITEC University, Taxila 47080, Pakistan; [email protected]
7 College of Computer Science, King Khalid University, Abha 61413, Saudi Arabia; Higher Institute of Applied Science and Technology of Sousse (ISSATS), Cité Taffala (Ibn Khaldoun) 4003 Sousse, University of Sousse, Sousse 4000, Tunisia
First page
3984
Publication year
2023
Publication date
2023
Publisher
MDPI AG
e-ISSN
1424-8220
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2806610981
Copyright
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.