Please use this address to cite this document:
http://dlibrary.univ-boumerdes.dz:8080/handle/123456789/7128
Title: A deep neural network approach to QRS detection using autoencoders
Author(s): Belkadi, Mohamed Amine; Daamouche, Abdelhamid; Melgani, Farid
Keywords: ECG; Deep learning; Stacked autoencoder; QRS detection
Publication date: 2021
Publisher: Elsevier
Series/Issue: Expert Systems with Applications, Vol. 184 (2021)
Abstract: Objective: In this paper, a stacked autoencoder deep neural network is proposed to extract the QRS complex from raw ECG signals without any conventional feature extraction phase. Methods: A simple architecture was trained extensively on many datasets to ensure the generalization of the network at inference. Results: The proposed method achieved a QRS detection accuracy of 99.6% on more than 1,042,000 beats, which is competitive with state-of-the-art QRS detectors. Moreover, it produced a Detection Error Rate of only 0.82% on six unseen datasets containing more than 1,470,000 beats, confirming the high performance of our method in detecting QRS complexes. Conclusion: Stacked autoencoder neural networks are very effective for QRS detection. At inference, our algorithm processes 1,042,309 beats in less than 25.32 s, which compares favorably with state-of-the-art deep learning methods. Significance: The stacked autoencoder is an efficient tool for QRS detection that could replace conventional systems and help practitioners make fast and accurate decisions.
URI/URL: DOI: 10.1016/j.eswa.2021.115528 ; https://www.sciencedirect.com/science/article/abs/pii/S0957417421009362 ; http://dlibrary.univ-boumerdes.dz:8080/handle/123456789/7128
ISSN: 0957-4174
Collection(s): Publications Internationales
File(s) in this document:
There are no files associated with this document.
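
As a complement to the abstract above, the sketch below shows what a stacked autoencoder for QRS detection might look like in code. It is only a minimal illustration: the layer sizes, the 180-sample window length, the sigmoid detection head, and the joint reconstruction-plus-classification loss are all assumptions for the example and are not taken from the paper.

```python
# Minimal, hypothetical sketch (PyTorch) of a stacked autoencoder for QRS detection.
# Layer sizes, window length, and the training objective are illustrative assumptions.
import torch
import torch.nn as nn

WINDOW = 180  # assumed length (in samples) of a raw ECG window centred on a candidate beat


class StackedAutoencoderQRS(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: stacked fully connected layers compressing the raw window.
        self.encoder = nn.Sequential(
            nn.Linear(WINDOW, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
        )
        # Decoder mirrors the encoder and reconstructs the input window
        # (the reconstruction objective drives the unsupervised part of training).
        self.decoder = nn.Sequential(
            nn.Linear(32, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, WINDOW),
        )
        # Detection head: one probability per window (QRS present or not).
        self.classifier = nn.Sequential(nn.Linear(32, 1), nn.Sigmoid())

    def forward(self, x):
        code = self.encoder(x)
        return self.decoder(code), self.classifier(code)


if __name__ == "__main__":
    model = StackedAutoencoderQRS()
    x = torch.randn(8, WINDOW)               # batch of 8 raw ECG windows (dummy data)
    y = torch.randint(0, 2, (8, 1)).float()  # dummy QRS / no-QRS labels
    recon, prob = model(x)
    loss = nn.MSELoss()(recon, x) + nn.BCELoss()(prob, y)
    loss.backward()
    print(recon.shape, prob.shape, float(loss))
```

In practice the detection decision would be read from the classifier output per sliding window; how the authors actually slice, pre-train, and threshold is described in the paper itself, not in this sketch.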
All documents in DSpace are protected by copyright, with all rights reserved.