Abstract

Sleep arousals are brief intrusions of wakefulness into sleep. Excessive sleep arousals are associated with sympathetic activation, non-restorative sleep, and daytime sleepiness. Currently, sleep arousals are annotated mainly by human experts who manually inspect 30-second epochs (recording pages), which requires considerable time and effort. Here we present a deep learning approach for automatically segmenting sleep arousal regions in polysomnographic recordings. Built on an architecture that ‘translates’ input polysomnographic signals into sleep arousal labels, the algorithm ranked first in the “You Snooze, You Win” PhysioNet Challenge. We also devised an augmentation strategy that randomly swaps similar physiological channels, which notably improved prediction accuracy. Our algorithm delineates sleep arousal events quickly and accurately, processing a full sleep recording in about 10 seconds. This computational tool could substantially streamline the scoring process in clinical settings and accelerate studies of the impact of arousals.
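To illustrate the channel-swapping augmentation mentioned above, the following is a minimal sketch rather than the authors' code. It assumes the polysomnogram is stored as a (channels, samples) NumPy array and that the indices of interchangeable channel pairs, such as two EEG derivations, are known in advance; all names, indices, and parameters here are hypothetical.

import numpy as np

# Illustrative channel layout; the real channel order depends on the dataset.
# Only pairs of physiologically similar channels should be listed here.
SWAPPABLE_PAIRS = [(0, 1),  # e.g., two EEG derivations
                   (5, 6)]  # e.g., left and right EOG channels

def augment_by_channel_swap(signals, p=0.5, rng=None):
    """Randomly swap similar channels of a (n_channels, n_samples) recording.

    Each pair in SWAPPABLE_PAIRS is exchanged independently with probability p.
    Returns a copy; the input array is left unchanged.
    """
    rng = np.random.default_rng() if rng is None else rng
    out = signals.copy()
    for i, j in SWAPPABLE_PAIRS:
        if rng.random() < p:
            out[[i, j]] = out[[j, i]]
    return out

# Example: a synthetic 8-channel, 10-second recording sampled at 200 Hz.
x = np.random.randn(8, 10 * 200).astype(np.float32)
x_aug = augment_by_channel_swap(x, p=0.5)

Applied on the fly during training, such a swap presents the network with plausible variants of each recording, which is one way an augmentation of this kind can improve robustness to channel-to-channel variability.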

Li and Guan present a deep learning approach for automatically segmenting sleep arousal regions in polysomnographic recordings. The algorithm, which won an open competition, enables fast and accurate delineation of sleep arousal events and could aid the scoring process in clinical studies.

Details

Title
DeepSleep convolutional neural network allows accurate and fast detection of sleep arousal
Author
Li, Hongyang 1; Guan, Yuanfang 1

Department of Computational Medicine and Bioinformatics, University of Michigan, Ann Arbor, USA (GRID:grid.214458.e) (ISNI:0000000086837370)
Publication year
2021
Publication date
2021
Publisher
Nature Publishing Group
e-ISSN
2399-3642
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2475029530
Copyright
© The Author(s) 2021. This work is published under https://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.