{"id":23639,"date":"2026-02-09T12:00:00","date_gmt":"2026-02-09T11:00:00","guid":{"rendered":"https:\/\/blog.rwth-aachen.de\/itc\/?p=23639"},"modified":"2026-02-03T09:48:28","modified_gmt":"2026-02-03T08:48:28","slug":"kuenstliche-intelligenz-trifft-virtuelle-realitaet","status":"publish","type":"post","link":"https:\/\/blog.rwth-aachen.de\/itc\/en\/2026\/02\/09\/kuenstliche-intelligenz-trifft-virtuelle-realitaet\/","title":{"rendered":"Artificial Intelligence Meets Virtual Reality"},"content":{"rendered":"<div class=\"twoclick_social_bookmarks_post_23639 social_share_privacy clearfix 1.6.4 locale-en_US sprite-en_US\"><\/div><div class=\"twoclick-js\"><script type=\"text\/javascript\">\/* <![CDATA[ *\/\njQuery(document).ready(function($){if($('.twoclick_social_bookmarks_post_23639')){$('.twoclick_social_bookmarks_post_23639').socialSharePrivacy({\"txt_help\":\"Wenn Sie diese Felder durch einen Klick aktivieren, werden Informationen an Facebook, Twitter, Flattr, Xing, t3n, LinkedIn, Pinterest oder Google eventuell ins Ausland \\u00fcbertragen und unter Umst\\u00e4nden auch dort gespeichert. 
N\\u00e4heres erfahren Sie durch einen Klick auf das <em>i<\\\/em>.\",\"settings_perma\":\"Dauerhaft aktivieren und Daten\\u00fcber-tragung zustimmen:\",\"info_link\":\"http:\\\/\\\/www.heise.de\\\/ct\\\/artikel\\\/2-Klicks-fuer-mehr-Datenschutz-1333879.html\",\"uri\":\"https:\\\/\\\/blog.rwth-aachen.de\\\/itc\\\/en\\\/2026\\\/02\\\/09\\\/kuenstliche-intelligenz-trifft-virtuelle-realitaet\\\/\",\"post_id\":23639,\"post_title_referrer_track\":\"Artificial+Intelligence+Meets+Virtual+Reality\",\"display_infobox\":\"on\"});}});\n\/* ]]> *\/<\/script><\/div><p><div id=\"attachment_23645\" style=\"width: 310px\" class=\"wp-caption alignright\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-23645\" class=\"size-medium wp-image-23645\" src=\"https:\/\/blog.rwth-aachen.de\/itc\/files\/2026\/01\/Virtuelles-Museum_Blog-1-300x200.png\" alt=\"A virtual museum guide presents a painting in a VR museum\" width=\"300\" height=\"200\" srcset=\"https:\/\/blog.rwth-aachen.de\/itc\/files\/2026\/01\/Virtuelles-Museum_Blog-1-300x200.png 300w, https:\/\/blog.rwth-aachen.de\/itc\/files\/2026\/01\/Virtuelles-Museum_Blog-1.png 527w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><p id=\"caption-attachment-23645\" class=\"wp-caption-text\">Source: K\u00fchlem et al., 2025<\/p><\/div><\/p>\n<p>Whether in digital learning spaces or virtual museum tours, virtual reality (VR) makes an important contribution to immersive learning. In this context, users can interact with virtual teachers. At RWTH Aachen University, research is being conducted into how these conversations can be made as realistic and natural as possible.<\/p>\n<p>&nbsp;<\/p>\n<p><!--more--><\/p>\n<p>&nbsp;<\/p>\n<h3><span style=\"color: #00549f;\">Virtual Teachers in Digital Environments<\/span><\/h3>\n<p>Embodied conversational agents (ECAs), i.e., computer-controlled virtual humans, are an important component of many VR applications. 
These include, for example, interactive learning environments in which users can communicate with a virtual teacher, or\u2014as in our example\u2014museum tours in which a digital guide provides information about the various exhibits and answers any questions. In these scenarios, it is crucial that a fluid and natural conversation takes place between the ECAs and the VR users. To achieve this, the ECA must be able to understand the users&#8217; questions and statements, think about appropriate answers, and then communicate these both verbally and non-verbally.<\/p>\n<p>&nbsp;<\/p>\n<h3><span style=\"color: #00549f;\">Conversations Between Users and Virtual Teachers<\/span><\/h3>\n<p>In recent years, much has happened in the area of \u201cthinking\u201d through the use of large language models (LLMs) such as ChatGPT. Instead of a limited number of pre-programmed responses, ECAs can now respond individually to users&#8217; questions and needs through the use of AI. The AI analyzes the user&#8217;s questions and generates an appropriate and context-related response. Finally, this response is converted into spoken language, appropriate lip movements are generated for the ECA, and the response is played back in VR. In this way, the use of AI makes it possible to respond meaningfully to almost all queries and turn ECAs into plausible conversation partners.<\/p>\n<p>&nbsp;<\/p>\n<h3><span style=\"color: #00549f;\">Minimal Response Time for Natural Conversation Flow<\/span><\/h3>\n<p>For the most natural conversation flow possible, it is important that the response times of ECAs are as short as possible. This means that the ECA should respond to the VR user&#8217;s request or statement as quickly as possible and without significant delay. 
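<\/p>
<p>One simple way to keep this pause from feeling long is to start generating the answer in the background while the ECA immediately produces a short verbal filler. The sketch below illustrates the idea only; the function names, the filler text, and the fixed 0.3-second stand-in for generation latency are assumptions made for this example, not the actual RWTH implementation.<\/p>

```python
import queue
import threading
import time

def generate_answer(question: str) -> str:
    # Stand-in for the full answer pipeline (language model plus
    # text-to-speech); the sleep simulates its generation latency.
    time.sleep(0.3)
    return f"Here is some information about {question}."

def respond(question: str, events: list) -> None:
    # Generate the answer on a background thread ...
    result = queue.Queue()
    worker = threading.Thread(target=lambda: result.put(generate_answer(question)))
    start = time.monotonic()
    worker.start()
    # ... and bridge the wait with an immediate filler utterance, so the
    # perceived response time stays close to zero.
    events.append(("filler", "Hmm, let me think...", time.monotonic() - start))
    answer = result.get()  # blocks until generation has finished
    events.append(("answer", answer, time.monotonic() - start))
    worker.join()

events = []
respond("this painting", events)
# events[0] is the near-instant filler; events[1] is the answer roughly 0.3 s later.
```

<p>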
Beyond shortening the actual response time, the perceived response time can be reduced by having the ECA simulate human-like \u201cthinking\u201d while the answer is being prepared. This can be done both verbally, through filler words and sounds, and non-verbally, for example through gestures, emotions, and eye contact.<\/p>\n<p>&nbsp;<\/p>\n<h3><span style=\"color: #00549f;\">Research at RWTH Aachen University<\/span><\/h3>\n<div id=\"attachment_23646\" style=\"width: 310px\" class=\"wp-caption alignright\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-23646\" class=\"size-medium wp-image-23646\" src=\"https:\/\/blog.rwth-aachen.de\/itc\/files\/2026\/01\/Virtuelles-Museum-1-300x71.png\" alt=\"A virtual museum guide presents three different paintings in a VR museum.\" width=\"300\" height=\"71\" srcset=\"https:\/\/blog.rwth-aachen.de\/itc\/files\/2026\/01\/Virtuelles-Museum-1-300x71.png 300w, https:\/\/blog.rwth-aachen.de\/itc\/files\/2026\/01\/Virtuelles-Museum-1-1024x242.png 1024w, https:\/\/blog.rwth-aachen.de\/itc\/files\/2026\/01\/Virtuelles-Museum-1-768x181.png 768w, https:\/\/blog.rwth-aachen.de\/itc\/files\/2026\/01\/Virtuelles-Museum-1-1536x363.png 1536w, https:\/\/blog.rwth-aachen.de\/itc\/files\/2026\/01\/Virtuelles-Museum-1.png 1647w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><p id=\"caption-attachment-23646\" class=\"wp-caption-text\">Source: K\u00fchlem et al., 2025<\/p><\/div>\n<p>At RWTH Aachen University, the Virtual Reality and Immersive Visualization research group is investigating communication between humans and ECAs. Its research uses a virtual museum tour as an example, in which a virtual guide accompanies users through the museum and responds to individual questions. This scenario makes it possible to study realistic interaction between humans and ECAs.<\/p>\n<p>Among other things, the aim is to investigate how the response time of ECAs can be reduced technically. 
An initial demo was already presented at <a href=\"https:\/\/iva.acm.org\/2025\/\">ACM IVA 2025<\/a> in Berlin. The results will also be presented at <a href=\"https:\/\/aivr.science.uu.nl\/2026\/index.html\">IEEE AIxVR<\/a> in Osaka, Japan, in January 2026.<\/p>\n<p>&nbsp;<\/p>\n<hr \/>\n<p>Responsible for the content of this article are <a href=\"https:\/\/www.itc.rwth-aachen.de\/cms\/it-center\/it-center\/profil\/team\/~epvp\/mitarbeiter-pvz-\/?gguid=PER-J63BYGV&amp;allou=1&amp;lidx=1\">Andrea B\u00f6nsch<\/a>, <a href=\"https:\/\/www.itc.rwth-aachen.de\/cms\/it-center\/it-center\/profil\/team\/~epvp\/mitarbeiter-pvz-\/?gguid=PER-89HL5HS&amp;allou=1&amp;lidx=1\">Hedda Faber<\/a>, and <a href=\"https:\/\/www.itc.rwth-aachen.de\/cms\/it-center\/it-center\/profil\/team\/~epvp\/mitarbeiter-pvz-\/?gguid=PER-HS2ABDL&amp;allou=1&amp;lidx=1\">Konstantin W. K\u00fchlem<\/a>.<\/p>","protected":false},"excerpt":{"rendered":"<p>Sorry, this entry is only available in German.<\/p>\n","protected":false},"author":3522,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"c2c_always_allow_admin_comments":false,"footnotes":""},"categories":[1514,310],"tags":[470,620,1026],"class_list":["post-23639","post","type-post","status-publish","format-standard","hentry","category-ki","category-studium-lehre","tag-forschung","tag-ki","tag-vr"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/blog.rwth-aachen.de\/itc\/en\/wp-json\/wp\/v2\/posts\/23639","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blog.rwth-aachen.de\/itc\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blog.rwth-aachen.de\/itc\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blog.rwth-aachen.de\/itc\/en\/wp-json\/wp\/v2\/users\/3522"}],"replies":[{"embeddable":true,"href":"https:\/\/blog.rwth-aachen.de\/itc\/en\/wp-json\/wp\/v2\/comments?post=23639"}],"version-history":[{"count":4,"
href":"https:\/\/blog.rwth-aachen.de\/itc\/en\/wp-json\/wp\/v2\/posts\/23639\/revisions"}],"predecessor-version":[{"id":23682,"href":"https:\/\/blog.rwth-aachen.de\/itc\/en\/wp-json\/wp\/v2\/posts\/23639\/revisions\/23682"}],"wp:attachment":[{"href":"https:\/\/blog.rwth-aachen.de\/itc\/en\/wp-json\/wp\/v2\/media?parent=23639"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blog.rwth-aachen.de\/itc\/en\/wp-json\/wp\/v2\/categories?post=23639"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blog.rwth-aachen.de\/itc\/en\/wp-json\/wp\/v2\/tags?post=23639"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}