Please use this address to cite this document:
http://dlibrary.univ-boumerdes.dz:8080/handle/123456789/6025
Title: | Word-Spotting approach using transfer deep learning of a CNN network |
Author(s): | Benabdelaziz, Ryma; Gaceb, Djamel; Haddad, Mohammed |
Keywords: | Word-Spotting approach using transfer deep learning of a CNN network |
Publication date: | 2020 |
Publisher: | IEEE |
Series/Number: | 2020 1st International Conference on Communications, Control Systems and Signal Processing (CCSSP) |
Abstract: | Convolutional Neural Networks (CNNs) are deep learning models trained to automatically extract the most discriminating features directly from an input image for use in visual classification tasks. CNNs have recently attracted considerable interest thanks to their effectiveness in many computer vision applications (medical imaging, video surveillance, biometrics, pattern recognition, OCR, etc.). Transfer learning is an optimization method that reuses a pretrained network to speed up and improve training on a new, related task or dataset. In this paper, we propose a new approach to handwritten word retrieval based on deep learning and transfer learning. We compare the performance of two types of features obtained through transfer learning: features extracted from a pre-trained model and features extracted from a fine-tuned network. Experiments are performed using six different CNN architectures and three similarity measures on the pre-segmented Bentham dataset of the ICDAR competition. The obtained results demonstrate the effectiveness of the proposed approach compared to the existing methods evaluated in this competition. |
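For orientation, the abstract's core idea (reusing a pretrained CNN as a frozen feature extractor and ranking word images by a similarity measure) can be illustrated with the minimal Python sketch below. The ResNet-18 backbone, the 224x224 input size, ImageNet preprocessing, cosine similarity, and the embed/rank helper names are illustrative assumptions for this sketch only; they are not the specific architectures, preprocessing, or similarity measures evaluated in the paper.

    # Minimal sketch of transfer-learning-based word spotting, assuming a
    # generic pretrained CNN (torchvision's ResNet-18) used as a frozen
    # feature extractor; the paper's actual setup is not reproduced here.
    import torch
    import torch.nn.functional as F
    from torchvision import models, transforms
    from PIL import Image

    # Load a pretrained backbone and drop its classification head so the
    # network outputs one feature vector per word image.
    backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    backbone.fc = torch.nn.Identity()
    backbone.eval()

    # Standard ImageNet preprocessing; word images are resized to the
    # backbone's expected input size (an assumption of this sketch).
    preprocess = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    def embed(image_path: str) -> torch.Tensor:
        """Return an L2-normalized feature vector for one word image."""
        img = Image.open(image_path).convert("RGB")
        with torch.no_grad():
            feat = backbone(preprocess(img).unsqueeze(0))
        return F.normalize(feat, dim=1).squeeze(0)

    def rank(query_path: str, candidate_paths: list[str]) -> list[tuple[str, float]]:
        """Rank candidate word images by cosine similarity to the query."""
        q = embed(query_path)
        scores = [(p, float(torch.dot(q, embed(p)))) for p in candidate_paths]
        return sorted(scores, key=lambda s: s[1], reverse=True)

The fine-tuned variant compared in the paper would, in a sketch like this, additionally retrain some or all of the backbone's layers on the target word-image dataset before extracting features.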
URI/URL: | https://ieeexplore.ieee.org/document/9151583 http://dlibrary.univ-boumerdes.dz:8080/handle/123456789/6025 |
ISSN: | 1985-5540 |
Collection(s): | Communications Internationales |
All documents in DSpace are protected by copyright, with all rights reserved.