Abstract
Algorithms for improving speech intelligibility in hearing prostheses can degrade the spatial quality of the audio signal. To investigate the influence of such algorithms on distance perception and localization, a system for virtually rendering arbitrary static acoustic scenes has been developed (Müller et al., 2010). With this system, the localization of sounds processed by a hearing aid algorithm can be compared with that of unprocessed sound sources. The existing virtual acoustics system has been extended to present more realistic dynamic scenes and to compensate for head movements of test subjects.