Abstract
The authors have developed a vocalization training system for the auditory impaired using a talking robot. The training system mainly consists of a talking robot that has mechanical vocal organs like those of a human. With an adaptive learning strategy based on auditory feedback control, the robot autonomously learns vocalization and then reproduces speech articulation from input sounds. In the previous system, the robot's speech-learning algorithm was constructed using a self-organizing neural network (SONN), which combines a self-organizing map (SOM) with a neural network (NN). However, improper maps were occasionally generated during speech articulation learning. To solve this problem, a new algorithm introducing two three-dimensional SOMs, called a dual-SOM, was employed for the autonomous learning of the robotic articulations. By applying the robot and its properties, we have constructed an interactive training system. The training is divided into two approaches: one uses the talking robot to show the shape and motion of the vocal organs, and the other uses a topological map to present differences in the phonetic features of a trainee's voice. In this study, the construction of the training system is first described together with the autonomous learning of robotic vocalization using the dual-SOM algorithm, and then the analysis of the speech training progress is presented based on the phonetic features and the mechanical vocal articulations.
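The core of the learning strategy above is the self-organizing map, which clusters input vectors (here, phonetic features) onto a topology-preserving grid. The following is a minimal, generic SOM training sketch in Python for illustration only; it is not the authors' dual-SOM implementation, and all function names, grid sizes, and decay schedules are assumptions.

```python
import math
import random

def train_som(data, grid_w=4, grid_h=4, dim=2, epochs=50,
              lr0=0.5, sigma0=2.0, seed=0):
    """Train a tiny 2-D self-organizing map (SOM) on `data`.

    Each grid node holds a weight vector. For every input vector the
    best-matching unit (BMU) and its grid neighbours are pulled toward
    the input, with the learning rate and neighbourhood radius
    decaying over the epochs (a generic SOM sketch, not the paper's
    three-dimensional dual-SOM).
    """
    rng = random.Random(seed)
    # node (x, y) -> weight vector, randomly initialised in [0, 1)
    weights = {(x, y): [rng.random() for _ in range(dim)]
               for x in range(grid_w) for y in range(grid_h)}
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)            # linearly decaying rate
        sigma = sigma0 * (1.0 - t / epochs) + 0.5  # shrinking radius
        for v in data:
            # best-matching unit: node closest to v in weight space
            bmu = min(weights, key=lambda n: sum(
                (w - x) ** 2 for w, x in zip(weights[n], v)))
            for n, w in weights.items():
                # Gaussian neighbourhood function on the grid
                d2 = (n[0] - bmu[0]) ** 2 + (n[1] - bmu[1]) ** 2
                h = math.exp(-d2 / (2.0 * sigma ** 2))
                for i in range(dim):
                    w[i] += lr * h * (v[i] - w[i])
    return weights

def best_matching_unit(weights, v):
    """Return the grid position of the node closest to vector v."""
    return min(weights, key=lambda n: sum(
        (w - x) ** 2 for w, x in zip(weights[n], v)))
```

In the paper's setting, one such map would organize acoustic features while a second maps the robot's articulatory parameters, so that a trainee's voice can be located on the map and compared with target vocalizations.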
©2011 by Walter de Gruyter Berlin New York
Articles in the same Issue
- Editorial
- Disability, virtual reality, ArtAbilitation and music
- Reviews
- Customising games for non-formal rehabilitation
- Aphasic theatre or theatre boosting self-esteem
- Warriors’ Journey: a path to healing through narrative exploration
- CaDaReMi. An educational interactive music game
- Extending body and imagination: moving to move
- Original Articles
- Making music with images: interactive audiovisual performance systems for the deaf
- An infrared sound and music controller for users with specific needs
- Sound=Space Opera: choreographing life within an interactive musical environment
- Cognitive effects of video games on old people
- Providing disabled persons in developing countries access to computer games through a novel gaming input device
- Voice articulatory training with a talking robot for the auditory impaired
- Using augmented reality to support the understanding of three-dimensional concepts by blind people
- Augmented reality application for the navigation of people who are blind
- Case Report
- Unintentional intrusive participation in multimedia interactive environments
- Listening to complexity: blind people’s learning about gas particles through a sonified model