Electrical Engineering and Information Technology

First AUDICTIVE Conference 2023

September 7th, 2023

Professor Janina Fels © Gottfried Behler, Janina Fels and Alexander Raake

A conference that acted as a booster shot for basic research aiming to bridge the gap between auditory perception and cognition on the one hand and virtual reality on the other.

With recent developments in hardware and software technologies, audiovisual virtual reality (VR) has reached a high level of perceptual plausibility, which makes it possible to overcome the limitations of simple laboratory situations. This creates good conditions for exploring, in a controlled way, the ability to interact with a complex audiovisual scene – as a representation of an authentic life experience, for example in a classroom, an open-plan office, or a complex outdoor communication situation – depending on acoustic, visual, and other contextual factors. Topics of the conference and subjects of ongoing research include the applicability of the resulting scientific findings in everyday environments, as well as their feedback into quality enhancement of interactive audiovisual virtual environments and into quality assessment methods at the interface of the two disciplines.
In response to these multidisciplinary challenges, Professor Janina Fels from the Chair of Hearing Technology and Acoustics at RWTH Aachen University invited researchers from the fields of acoustics, cognitive psychology, and computer science to the first AUDICTIVE Conference from June 19 to 22, 2023. The goal was to foster interdisciplinary collaboration in basic research and to enable synergy effects that cannot be achieved by any single discipline.

Currently, research efforts are mostly conducted separately within individual scientific research communities. This prevents cognitive psychology and acoustics from fully exploiting the enormous potential of VR to test and extend their existing theories in the more realistic, rich, and interactive virtual environments that can be created with the current state of the art in VR technology. At the same time, VR research can benefit from knowledge of auditory perception and cognition to understand the key quality criteria that must be met to optimize user perception and cognitive performance, as well as subjective experience and (social) presence in a virtual environment. Given the added value of combining the collaboration and research methods of the three disciplines, research in the areas of hearing, auditory cognition, and VR is expected to be elevated to a much higher level.

Seated audience. © Gottfried Behler, Janina Fels and Alexander Raake

The conference presentations offered fascinating insights into the future of human-computer interaction, auditory perception and virtual reality:

Professor Barbara Shinn-Cunningham of Carnegie Mellon University elaborated in her presentation on how our brain perceives and evaluates the auditory world around us: it uses the interaction between voluntary top-down attention and involuntary bottom-up attention to focus on a speaker while still processing new sound sources in the environment. She explored how peripheral and central factors combine to determine communicative success, which is influenced by expected and unexpected sounds in everyday environments, and focused on the cortical networks that mediate competition for attention.

In his talk, Professor Frank Steinicke from the University of Hamburg presented the exciting development of the fusion of extended reality (XR) and artificial intelligence (AI). He emphasized how these technologies enable seamless transitions between real and virtual worlds and have the potential to create deeply immersive experiences.
Although today’s immersive technology is still decades away from the ultimate representation, the shortcomings of human perception, cognition, and motor skills can be exploited to bend reality and enable compelling immersive experiences. His talk presented several XR illusions that bring us closer to the ultimate fusion of intelligence and reality.

Professor Alexandra Bendixen from TU Chemnitz spoke about her research in auditory perception and sensing. She explained how she creates scenes with multiple possible interpretations and monitors listeners’ perception to investigate the factors that stabilize auditory perception in ambiguous scenes. Recent combinations of auditory multistability with eye tracking have provided new insights into the interplay between auditory and visual multistability, with implications for our general understanding of scene analysis across all senses. When the logic of psychophysiological measurement is flipped, brain responses associated with sensory predictions can be used to evaluate certain aspects of virtual reality (VR), such as the appropriateness of VR latencies.

More information about the AUDICTIVE Conference is available on the official website.
