- Title:
- Multimodal Perceptual Training for Improving Spatial Auditory Performance in Blind and Sighted Listeners
- Authors:
-
Bălan, O.
Moldoveanu, A.
Moldoveanu, F.
- Relations:
- https://bibliotekanauki.pl/articles/177275.pdf
- Publication date:
- 2015
- Publisher:
- Polska Akademia Nauk. Czytelnia Czasopism PAN
- Keywords:
-
front-back confusions
HRTF
sound localization
training
virtual auditory environment
- Description:
- The use of individualised Head Related Transfer Functions (HRTFs) is a fundamental prerequisite for an accurate rendering of 3D spatialised sounds in virtual auditory environments. HRTFs are transfer functions that define the acoustical basis of auditory perception of a sound source in space and are frequently used in virtual auditory displays to simulate free-field listening conditions. However, they depend on the anatomical characteristics of the human body and vary significantly among individuals, so using the same HRTF dataset for all users of a system will not offer the same level of auditory performance to each of them. This paper presents an alternative approach based on the use of non-individualised HRTFs combined with procedural learning, training, and adaptation to the altered auditory cues. We tested the sound localisation performance of nine sighted and visually impaired people before and after a series of perceptual (auditory, visual, and haptic) feedback-based training sessions. The results demonstrate that our subjects significantly improved their spatial hearing under altered listening conditions (such as the presentation of 3D binaural sounds synthesised from non-individualised HRTFs), the improvement being reflected in higher localisation accuracy and a lower rate of front-back confusion errors.
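A minimal sketch of the binaural rendering step referred to in the abstract: a mono source is filtered with the left/right head-related impulse responses (HRIRs, the time-domain counterparts of HRTFs) of a generic, non-individualised HRTF set. The test signal, placeholder HRIRs, and function names below are illustrative assumptions, not material from the paper.

```python
# Illustrative sketch (not from the paper): rendering a 3D binaural sound by
# convolving a mono signal with a left/right pair of HRIRs.
import numpy as np
from scipy.signal import fftconvolve

def render_binaural(mono: np.ndarray, hrir_left: np.ndarray, hrir_right: np.ndarray) -> np.ndarray:
    """Convolve a mono signal with left/right HRIRs; returns (samples, 2) binaural audio."""
    left = fftconvolve(mono, hrir_left)
    right = fftconvolve(mono, hrir_right)
    return np.stack([left, right], axis=-1)

# Hypothetical usage with placeholder HRIRs; a real system would load the measured
# HRIR pair for the desired azimuth/elevation from a (non-individualised) HRTF set.
fs = 44100
t = np.arange(fs) / fs
tone = 0.5 * np.sin(2 * np.pi * 1000 * t)        # 1 s, 1 kHz test tone
hrir_l = np.zeros(256); hrir_l[10] = 1.0         # placeholder: earlier, louder arrival
hrir_r = np.zeros(256); hrir_r[25] = 0.8         # placeholder: later, quieter arrival
binaural = render_binaural(tone, hrir_l, hrir_r) # interaural time/level cues emerge
```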
- Source:
-
Archives of Acoustics; 2015, 40, 4; 491-502
ISSN 0137-5075
- Appears in:
- Archives of Acoustics
- Content provider:
- Biblioteka Nauki