UMBB Institutional Repository > Master 2 Theses > Institute of Electrical and Electronic Engineering > Computer
Please use this address to cite this document:
http://dlibrary.univ-boumerdes.dz:8080/handle/123456789/11795
Title: DMRI Anatomical Constrained Tractography using proposed tissue maps
Author(s): Saoudi, Mazigh; Abdelli, Mohand Saïd; Cherifi, Dalila (Supervisor)
Keywords: DMRI (Diffusion Magnetic Resonance Imaging); tractography algorithms
Publication date: 2021
Abstract: Diffusion Magnetic Resonance Imaging (DMRI) is a technique used to map the brain's internal anatomy in vivo. The images obtained from DMRI can be passed to reconstruction techniques that estimate the diffusion direction of the fibres at each voxel; this information can then be used to build a 3D model of the brain fibres with tractography algorithms.
One inherent limitation of tractography is determining the accurate streamline termination point. To address this issue, researchers in the field proposed Anatomically Constrained Tractography (ACT), a technique that uses prior knowledge of brain anatomy to constrain the generated streamlines to be biologically coherent. To achieve this, partial volume estimates (PVE) of the different brain tissues are needed; these are generally segmented from T1-weighted images, which have good contrast, especially between the White Matter (WM) and Grey Matter (GM) areas.
The contribution of this project is to propose an alternative way to generate the needed PVEs without using a T1 image. Using diffusion tensor (DT) measures such as Fractional Anisotropy (FA) and Mean Diffusivity (MD), we were able to extract the different brain tissue masks. Our results were promising: the extracted masks are close to the PVE maps provided with the test dataset.
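The tissue-mask idea described in the abstract can be illustrated with a small sketch: computing FA and MD from per-voxel diffusion-tensor eigenvalues and thresholding them into rough WM/GM/CSF masks. The eigenvalue data and the thresholds (0.2 for FA, 2.0e-3 mm²/s for MD) are illustrative assumptions for this toy example, not the values or the exact procedure used in the thesis.

```python
import numpy as np

def mean_diffusivity(evals):
    """MD: average of the three diffusion-tensor eigenvalues."""
    return evals.mean(axis=-1)

def fractional_anisotropy(evals):
    """FA = sqrt(3/2) * sqrt(sum((l_i - MD)^2)) / sqrt(sum(l_i^2)), in [0, 1]."""
    md = mean_diffusivity(evals)[..., None]
    num = np.sum((evals - md) ** 2, axis=-1)
    den = np.sum(evals ** 2, axis=-1)
    return np.sqrt(1.5 * np.where(den > 0, num / den, 0.0))

# Illustrative per-voxel eigenvalues (mm^2/s): anisotropic (WM-like),
# isotropic low diffusivity (GM-like), isotropic high diffusivity (CSF-like).
evals = np.array([
    [1.7e-3, 0.3e-3, 0.2e-3],
    [0.9e-3, 0.8e-3, 0.8e-3],
    [3.0e-3, 3.0e-3, 3.0e-3],
])

fa = fractional_anisotropy(evals)
md = mean_diffusivity(evals)

# Hypothetical thresholds, chosen only for this sketch: high FA -> WM,
# high MD (and low FA) -> CSF, the remainder -> GM.
wm_mask  = fa > 0.2
csf_mask = (md > 2.0e-3) & ~wm_mask
gm_mask  = ~wm_mask & ~csf_mask
```

In practice the eigenvalues would come from fitting a tensor model to the DMRI data (e.g. with a library such as DIPY), and the binary masks would be refined into partial volume maps before being handed to the ACT framework.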
Description: 56 p.
URI/URL: http://dlibrary.univ-boumerdes.dz:8080/handle/123456789/11795
Collection(s): Computer
File(s) in this document:
All documents in DSpace are protected by copyright, with all rights reserved.