Please use this URL to cite this document: http://dlibrary.univ-boumerdes.dz:8080/handle/123456789/8676
Title: Arabic speech recognition using recurrent neural networks
Author(s): RABIAI, Zakaria; DAHIMENE, A. (supervisor)
Keywords: Neural networks (Computer science); Neural network
Publication date: 2018
Abstract: The purpose of this project is to implement an end-to-end automatic speech recognition system using recurrent neural networks. Arabic, the fifth most spoken language in the world, was chosen as the system's main language. Arabic has long been left out of such projects because of its complexity, its uniqueness, and the lack of free, suitable corpora; with new algorithms emerging in the field of speech recognition, however, such as connectionist temporal classification, it is becoming feasible to build high-performing automatic speech recognition systems from unsegmented corpora. The development of the project covers basic digital signal processing, an exploration of the phonetic properties of the Arabic language, the adaptation of a general corpus to fit the purpose of the project, feature extraction, and a brief study of recurrent neural networks and their performance in such a system. The full system, with its various parts, is implemented in Python and TensorFlow. Different models inspired by the literature are trained and tested on the Arabic speech corpus, leading to the selection of a final model with the lowest word error rate, 35.23%. The results encourage a more in-depth exploration of speaker-independent, robust Arabic speech recognition systems.
Description: 35 p.
URI/URL: http://dlibrary.univ-boumerdes.dz:8080/handle/123456789/8676
Collection(s): Computer
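The abstract credits connectionist temporal classification (CTC) with making unsegmented corpora usable for training. As a purely illustrative sketch, and not the thesis code, the following Python/TensorFlow fragment shows how a bidirectional recurrent encoder and tf.nn.ctc_loss fit together; the feature dimension, the 37-symbol character inventory, and the random stand-in data are assumptions made for the example.

```python
# Minimal CTC training objective sketch (illustrative, not the thesis code):
# a bidirectional LSTM maps feature frames to per-frame class scores, and
# tf.nn.ctc_loss aligns them with unsegmented integer-encoded transcripts.
import tensorflow as tf

num_features = 13      # e.g. 13 MFCC coefficients per frame (assumption)
num_classes = 37       # hypothetical Arabic character inventory incl. space
batch, max_time = 4, 100

# Random stand-ins for real feature frames and target transcripts.
features = tf.random.normal([batch, max_time, num_features])
labels = tf.ragged.constant([[5, 12, 3], [7, 1], [2, 2, 9, 4], [6]],
                            dtype=tf.int32).to_sparse()

# Bidirectional LSTM encoder with a per-frame linear projection on top.
encoder = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(128, return_sequences=True))
projection = tf.keras.layers.Dense(num_classes + 1)  # +1 for the CTC blank

logits = projection(encoder(features))  # shape [batch, time, num_classes+1]

loss = tf.nn.ctc_loss(
    labels=labels,
    logits=logits,
    label_length=None,                   # not needed for sparse labels
    logit_length=tf.fill([batch], max_time),
    logits_time_major=False,
    blank_index=-1)                      # last class acts as the CTC blank
print("mean CTC loss:", tf.reduce_mean(loss).numpy())
```

In a real pipeline the random tensors would be replaced by feature frames and transcripts drawn from the Arabic speech corpus, and decoding at test time would use a CTC decoder such as tf.nn.ctc_beam_search_decoder before the word error rate is measured.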
File(s) in this item:
All documents in DSpace are protected by copyright, with all rights reserved.