Abstract
This study presents the Hierarchical Context-Aware Transformer (HCAT), a new model for analyzing unstructured healthcare data that addresses key challenges of medical text. The proposed model combines a hierarchical structure with context-aware mechanisms to process healthcare documents at both the sentence and document levels. HCAT incorporates domain knowledge through a specialized attention module and uses a composite loss function that optimizes classification accuracy while encouraging domain adaptation. Quantitative experiments show that HCAT outperforms Bi-LSTM and BERT baselines for sentence representation. The model attains 92.30% test accuracy on medical text classification while maintaining high computational efficiency: batch processing takes about 150 ms and memory consumption is about 320 MB. The hierarchical architecture captures long-range dependencies in clinical narratives, while the context-aware layers support a deeper understanding of medical language. Precision and recall are especially important in healthcare applications; the model achieves a precision of 91.8% and a recall of 93.2%. These results indicate that HCAT represents meaningful progress in processing healthcare data and provides a practical approach for real-world extraction of medical information from unstructured text.
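The abstract describes the architecture only at a high level: a sentence-level encoder feeding a document-level encoder, with a domain-aware attention component on top. The sketch below is a minimal PyTorch illustration of that two-level design; the class name, dimensions, mean-pooling of tokens into sentence vectors, and the learned attention vector standing in for the domain-specific attention module are all assumptions for illustration, not details taken from the paper.

```python
import torch
import torch.nn as nn

class HCATSketch(nn.Module):
    """Illustrative two-level encoder in the spirit of HCAT:
    a sentence-level transformer feeds a document-level transformer,
    and a learned attention vector (a stand-in for the paper's
    domain-specific attention module) pools sentence states.
    All hyperparameters are hypothetical placeholders."""

    def __init__(self, vocab_size=30000, d_model=256, n_heads=8,
                 n_sent_layers=2, n_doc_layers=2, n_classes=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        sent_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        doc_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.sent_encoder = nn.TransformerEncoder(sent_layer, n_sent_layers)
        self.doc_encoder = nn.TransformerEncoder(doc_layer, n_doc_layers)
        # Hypothetical domain-aware attention: a learned query that
        # scores each sentence representation for weighted pooling.
        self.domain_query = nn.Parameter(torch.randn(d_model))
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, token_ids):
        # token_ids: (batch, n_sentences, sent_len)
        b, s, t = token_ids.shape
        x = self.embed(token_ids.view(b * s, t))           # embed every sentence
        x = self.sent_encoder(x)                           # sentence-level context
        sent_repr = x.mean(dim=1).view(b, s, -1)           # pool tokens -> sentence vectors
        doc = self.doc_encoder(sent_repr)                  # document-level (long-range) context
        attn = torch.softmax(doc @ self.domain_query, -1)  # domain-aware sentence weights
        pooled = (attn.unsqueeze(-1) * doc).sum(dim=1)     # weighted document vector
        return self.classifier(pooled)

model = HCATSketch()
logits = model(torch.randint(0, 30000, (2, 10, 32)))  # 2 docs, 10 sentences, 32 tokens each
print(logits.shape)  # torch.Size([2, 5])
```

The composite loss described in the abstract (classification accuracy plus a domain-adaptation term) would be implemented on top of these logits, e.g. a cross-entropy term combined with an auxiliary domain objective; its exact form is not specified in the abstract.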