Visualizing articulation disorders
Certain speech disorders result from poor positioning of the tongue. Yet sufferers cannot see the organ in question, which complicates rehabilitation work by speech therapists. To overcome this problem, French researchers at GIPSA-lab (CNRS/Université Grenoble Alpes/Grenoble-INP) and Inria Grenoble Rhône-Alpes have developed a system that lets patients visualize the movements of their own tongue. These movements are captured by an ultrasound probe placed under the jaw, processed by a machine-learning algorithm, and displayed to the user in real time on a 3D avatar. A simplified version of the system has already been validated in the laboratory with healthy speakers. It is currently being tested with patients who have undergone surgery on their tongue.
© T. Hueber / GIPSA-Lab
An example of animations of the tongue model of GIPSA-Lab's articulatory talking head, driven from ultrasound images using the Integrated Cascaded Gaussian Mixture Regression algorithm, for the sequences [ata] (top) and [uku] (bottom).
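The caption above names the team's Integrated Cascaded Gaussian Mixture Regression algorithm, which maps ultrasound image features to tongue-model parameters. As a rough illustration of the underlying idea, here is a minimal sketch of plain Gaussian mixture regression (not the cascaded variant used by the researchers): a Gaussian mixture is fitted on joint input–output vectors, and the prediction is the mixture of per-component conditional means. The toy one-dimensional data below is purely hypothetical and stands in for real ultrasound features and articulatory parameters.

```python
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

def gmr_predict(gmm, x, dx):
    """E[y | x] under a GMM fitted on joint [x, y] vectors.

    gmm : fitted GaussianMixture with covariance_type="full"
    x   : input vector of length dx
    dx  : dimensionality of the input part of the joint vector
    """
    K = gmm.n_components
    dy = gmm.means_.shape[1] - dx
    resp = np.zeros(K)              # responsibility of each component for x
    cond_means = np.zeros((K, dy))  # per-component conditional mean of y
    for k in range(K):
        mu_x = gmm.means_[k, :dx]
        mu_y = gmm.means_[k, dx:]
        S = gmm.covariances_[k]
        Sxx, Sxy = S[:dx, :dx], S[:dx, dx:]
        resp[k] = gmm.weights_[k] * multivariate_normal.pdf(x, mu_x, Sxx)
        # Gaussian conditioning: mu_y + Syx Sxx^-1 (x - mu_x)
        cond_means[k] = mu_y + Sxy.T @ np.linalg.solve(Sxx, x - mu_x)
    resp /= resp.sum()
    return resp @ cond_means

# Toy regression problem: recover y = sin(x) from noisy samples.
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 500)
y = np.sin(x) + 0.05 * rng.normal(size=500)
gmm = GaussianMixture(n_components=8, covariance_type="full", random_state=0)
gmm.fit(np.column_stack([x, y]))
pred = gmr_predict(gmm, np.array([1.0]), dx=1)
```

In the real system, `x` would be a feature vector extracted from an ultrasound frame and `y` the control parameters of the 3D tongue avatar; the cascaded variant chains several such regressions, but the conditional-mean computation shown here is the common core.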