
Akustik-Blog

Category: ‘Virtual Reality’

Paper published: Exploring auditory selective attention shifts in virtual reality: An approach with matrix sentences

26 November 2025 | by

We are happy to share that our paper “Exploring auditory selective attention shifts in virtual reality: An approach with matrix sentences” has been published in the Journal of the Acoustical Society of America:
https://doi.org/10.1121/10.0039864

In this study, we explored voluntary shifts of auditory selective attention in complex and more naturalistic acoustic environments. To move beyond earlier paradigms based on single-word stimuli, we introduced unpredictable German matrix sentences to simulate more realistic listening conditions.

Overall, the results were comparable to those of previous versions of the paradigm, but no strong reorienting effect emerged. Interaction patterns still indicate that shifting auditory attention is more demanding than maintaining it, and that preparing attention benefits performance, as reflected in decreasing reaction times for later target onsets.

This approach contributes a paradigm for investigating auditory perception and attention in dynamic room acoustic environments, helping to close the gap between laboratory setups and real-world listening.

This work was created by Carolin Breuer and Janina Fels and was funded by the German Research Foundation (DFG) as part of the Priority Program SPP2236 AUDICTIVE.

Paper published: Exploring cross-modal perception in a virtual classroom: the effect of visual stimuli on auditory selective attention

14 November 2025 | by

A new paper, “Exploring cross-modal perception in a virtual classroom: the effect of visual stimuli on auditory selective attention”, has been published in Frontiers in Psychology as part of the Research Topic “Crossing Sensory Boundaries: Multisensory Perception Through the Lens of Audition”: https://doi.org/10.3389/fpsyg.2025.1512851

In a virtual classroom environment, we investigated how visual stimuli influence auditory selective attention. Across two experiments, congruent and incongruent pictures modulated performance during an auditory attention task: concurrent visual input increased response times overall, and incongruent pictures led to more errors than congruent ones. When visual stimuli preceded the sounds, the timing mattered: positive priming emerged at 500 ms, but semantic inhibition of return at 750 ms.

These results highlight that cross-modal priming differs from multisensory integration, and that temporal dynamics between modalities substantially shape attentional behaviour.

This work was a collaboration between Carolin Breuer, Lukas Jonathan Vollmer, Larissa Leist, Stephan Fremerey, Alexander Raake, Maria Klatte and Janina Fels.

It was funded by the Priority Programme SPP2236 AUDICTIVE and the Research Training Group (RTG) 2416 – MultiSenses, MultiScales, both of which are funded by the German Research Foundation (DFG).

Many thanks to everyone involved.

Paper published: The influence of complex classroom noise on auditory selective attention

8 October 2025 | by

We are very glad to share that our paper “The influence of complex classroom noise on auditory selective attention”, based on the bachelor’s thesis of Robert Schmitt and co-authored by our group, has just been published in Scientific Reports: https://www.nature.com/articles/s41598-025-18232-2

It has been a real pleasure to supervise this work and to see it evolve into a full publication. In the study, we examined how plausible classroom noise affects auditory selective attention in a virtual reality classroom environment.

Our results show higher error rates in the auditory attention task under complex noise conditions, as well as increased perceived listening effort when the background contained intelligible speech. They thus underline the importance of studying realistic and complex acoustic scenarios to gain more reliable and valid insights into auditory perception.

The paper was developed within the ECoClass-VR project, part of the DFG Priority Program SPP 2236 AUDICTIVE on Auditory Cognition in Interactive Virtual Environments. More information at www.spp2236-audictive.de.

We hope these findings contribute to advancing our understanding of auditory attention in complex, real-world listening situations.

Many thanks to the co-authors Robert Schmitt, Larissa Leist, Stephan Fremerey, Alexander Raake, Maria Klatte, and Janina Fels, and to the AUDICTIVE community for the inspiring collaboration and support.

Paper published: Audiovisual angle and voice incongruence do not affect audiovisual verbal short-term memory in virtual reality

28 August 2025 | by

Visual abstract: Ermert et al. 2025, PLOS ONE.

We are happy to announce that our paper “Audiovisual angle and voice incongruence do not affect audiovisual verbal short-term memory in virtual reality” by Cosima A. Ermert, Manuj Yadav, Jonathan Ehret, Chinthusa Mohanathasan, Andrea Bönsch, Torsten W. Kuhlen, Sabine J. Schlittmeier, and Janina Fels has just been published in PLOS ONE.

Virtual reality is increasingly used in research to simulate realistic environments, potentially increasing ecological validity. However, the visual component in virtual reality can affect participants, especially if there are incongruencies between auditory and visual information. In this study, we investigated verbal short-term memory under two types of audiovisual incongruence: an angle incongruence, where the perceived position of the sound source did not match that of the visual representation, and a voice incongruence, where two virtual agents switched voices. The task was presented either on a computer screen or in virtual reality. We found no effect of the incongruencies or the display modality (computer screen vs. virtual reality), highlighting the complexity of audiovisual interaction.

This research is part of the priority program AUDICTIVE and was a cooperation with researchers from the Visual Computing Institute and the Work and Engineering Psychology group at RWTH Aachen University.

Read the full article here.