IGMR Seminar: Dr. Michael Cashmore – Plan-Based Robot Control in Real-Time
With the talk by Dr. Michael Cashmore from the University of Strathclyde, the virtual IGMR lecture series of the winter semester 20/21 begins. We look forward to an insight into ROSPlan and plan-based robot control in real time.
Wednesday, 2 December 2020, 16:30, via Zoom. Zoom meeting information: https://rwth.zoom.us/j/98454895570?pwd=NkpiSWkyaTJtdWlralJrSUtnMDdDZz09 Meeting ID: 984 5489 5570, Passcode: 186393
The data protection information on the use of Zoom and a guide for participants (students) can be downloaded from the pages of the CLS of RWTH Aachen University.
The seminar will focus on the numerous temporal and numeric challenges that arise in plan execution. If a plan is produced with some flexibility, how can it be executed? In this context, the talk will discuss temporal controllability, robustness envelopes, in-situ replanning, planning concurrently with execution, and deliberation in a system of distributed components in which one's actions can affect other parts of a larger system.
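As a rough illustration of what executing a plan with temporal flexibility involves, the sketch below dispatches actions that carry flexible start windows and falls back to replanning when a window is missed. It is a toy Python example under those assumptions, not ROSPlan code; all names in it (Action, dispatch, execute_action) are hypothetical placeholders.

```python
# Toy dispatcher for a temporal plan whose actions carry flexible start windows.
# Illustration only; NOT ROSPlan code.
import time
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    earliest_start: float  # seconds after plan start
    latest_start: float    # seconds after plan start
    duration: float        # nominal duration

def execute_action(action: Action) -> bool:
    """Stand-in for sending the action to the robot and waiting for success."""
    print(f"executing {action.name} for ~{action.duration:.1f}s")
    time.sleep(min(action.duration, 0.1))  # shortened for the sketch
    return True

def dispatch(plan: list[Action]) -> bool:
    """Dispatch actions while exploiting their temporal flexibility; trigger
    replanning (here: just abort) if an action can no longer start in time."""
    t0 = time.monotonic()
    for action in plan:
        now = time.monotonic() - t0
        if now < action.earliest_start:
            time.sleep(action.earliest_start - now)   # wait until the window opens
        elif now > action.latest_start:
            print(f"{action.name}: start window missed, replanning required")
            return False
        if not execute_action(action):
            return False
    return True

if __name__ == "__main__":
    dispatch([Action("goto_table", 0.0, 2.0, 5.0),
              Action("grasp_cup", 4.0, 8.0, 2.0)])
```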
The events in the winter semester 2020/2021 are held in cooperation with the VDI-GPP working group of the Bezirksverein Aachen.
Haptic feedback system RePlaLink
At IGMR, the haptic feedback system RePlaLink (Reconfigurable Planar Linkage) is being developed. With this system, hand-actuated mechanisms can be haptically simulated and interactively synthesized, which should allow mechanisms with optimal haptic properties to be developed.
People frequently interact with hand-actuated mechanisms in everyday life, e.g., in car doors, furniture doors, reconfigurable furniture, or fitness equipment. The haptic properties of these mechanisms largely determine their perceived quality. The RePlaLink (Reconfigurable Planar Linkage) aims to support their design and development with a haptic feedback system based on virtual prototypes. The haptic simulation and synthesis method lets users directly feel a mechanism's kinematic and kinetostatic properties while operating the system. In addition, users can interactively modify these properties and receive direct haptic feedback. The first video shows the design of the RePlaLink, consisting of a planar five-bar linkage with an additional serial link for the handle.
https://youtu.be/pemrysX4Cr8
The second video shows haptic simulation and synthesis using a kitchen cabinet door as an example.
https://youtu.be/0AqONOv1R5E
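The core of such a haptic simulation is a fast rendering loop that reads the handle motion, evaluates the virtual mechanism model, and commands the resulting torque back to the user. The following Python sketch illustrates this idea for a door with a closing spring and a detent; it is a simplified illustration under these assumptions, not the actual RePlaLink controller, and the device interface is a hypothetical placeholder.

```python
import math

class MockDevice:
    """Stand-in for the real haptic hardware so the sketch runs on its own."""
    def __init__(self):
        self.angle, self.velocity = 0.3, 0.0    # rad, rad/s
    def read_angle(self):
        return self.angle
    def read_velocity(self):
        return self.velocity
    def command_torque(self, tau):
        print(f"commanded torque: {tau:.2f} Nm")

def door_torque(angle: float) -> float:
    """Virtual kinetostatic model: a closing spring plus a soft detent at 90 deg."""
    spring = -0.8 * angle                                   # Nm, pulls the door shut
    detent = -2.0 * math.sin(2.0 * (angle - math.pi / 2))   # Nm, holds the door open
    return spring + detent

def rendering_step(device, damping: float = 0.05) -> None:
    """One cycle of impedance-type haptic rendering (would run at ~1 kHz)."""
    angle = device.read_angle()            # measured door/handle angle [rad]
    velocity = device.read_velocity()      # measured angular velocity [rad/s]
    torque = door_torque(angle) - damping * velocity
    device.command_torque(torque)          # rendered through the linkage's drives

if __name__ == "__main__":
    rendering_step(MockDevice())
```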
Project page:
https://www.igmr.rwth-aachen.de/index.php/de/gt/gt-replalink
Contact:
Mahshid Pour Ebrahimabadi M.Sc.
Exercise with the Fanuc Education Cell in the module Robotic Systems
As part of the practical exercise of the Robotic Systems module (https://www.igmr.rwth-aachen.de/index.php/de/lehrveranstaltungen/rs), the Fanuc Education Cell with the Roboguide software is used in our Robotic Lab.
The students learn the basic operation of a robotic cell and implement a logistics scenario themselves using the Roboguide software. This semester we are also able to continue the practical exercises remotely. For this purpose, the teaching materials have been digitized so that students can simulate the task from their own computers.
Internet page for the course:
https://www.igmr.rwth-aachen.de/index.php/de/lehrveranstaltungen/rs
Contact:
Markus Schmitz
Teaching the KUKA iiwa by manual guidance
The hand guidance of the collaborative robot KUKA iiwa is well suited for programming spatial points. This is possible both within a program and in the robot’s T1 mode.
https://youtu.be/v7D4yknlxJI
The video shows the manual guidance of the KUKA iiwa within a program. The robot can be guided by hand at the flange while the user presses the gray enabling button. After it is released, the program asks whether the current position is correct and should be saved. Any number of additional positions can then be added. At the end of the program, all stored spatial points are traversed in the taught sequence.
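The program logic behind this workflow is a simple record-and-replay pattern: store a pose each time the enabling button is released and the user confirms it, then traverse the stored poses. The sketch below illustrates that pattern in Python with a mock robot interface; the real application is written against the KUKA Sunrise.OS Java API, so every name here is a placeholder, not that API.

```python
class MockRobot:
    """Stand-in robot so the sketch runs without hardware; on the real iiwa these
    calls would go through KUKA Sunrise.OS (this is not that API)."""
    def enable_hand_guiding(self): pass                      # compliant while button held
    def wait_until_button_released(self): pass
    def get_current_pose(self): return (0.5, 0.0, 0.4)       # x, y, z in metres
    def ask_user(self, question): return input(question + " [y/n] ").strip().lower() == "y"
    def move_ptp(self, pose): print(f"moving PTP to {pose}")

def teach_positions(robot) -> list:
    """Guide the robot by hand, then confirm and store a pose after each release."""
    poses = []
    while True:
        robot.enable_hand_guiding()
        robot.wait_until_button_released()
        pose = robot.get_current_pose()
        if robot.ask_user(f"Save position {len(poses) + 1}?"):
            poses.append(pose)
        if not robot.ask_user("Teach another position?"):
            return poses

def replay(robot, poses: list) -> None:
    """Traverse all taught spatial points in the recorded order."""
    for pose in poses:
        robot.move_ptp(pose)

if __name__ == "__main__":
    robot = MockRobot()
    replay(robot, teach_positions(robot))
```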
Contacts:
Robot-Guided Form Scan and Coating
Automatic shape recognition via laser scanner and trajectory planning for coating.
In this project, shapes and their edges are detected by a laser scanner mounted on the robot's end effector. The collected data is synchronized and filtered, and a suitable trajectory is created for coating the inner surface of the forms. Several parameters, such as nozzle speed, spacing and gaps, nozzle size, and outlier handling, can be selected during trajectory planning to obtain a homogeneous coating.
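To give an idea of how such a trajectory can be parameterized, the sketch below generates a raster (zigzag) coating path over a rectangular region from a line spacing and a constant nozzle speed, returning timestamped waypoints. It is a simplified illustration: the actual pipeline works on laser-scan data and arbitrary form geometries, and the rectangular region and all parameter names are assumptions.

```python
import numpy as np

def raster_path(width: float, height: float, spacing: float, speed: float):
    """Return (t, x, y) waypoints of a zigzag path covering a width x height
    region [m] with the given line spacing [m] at constant nozzle speed [m/s]."""
    ys = np.arange(0.0, height + 1e-9, spacing)
    points = []
    for i, y in enumerate(ys):
        xs = (0.0, width) if i % 2 == 0 else (width, 0.0)   # alternate direction
        points.extend([(xs[0], y), (xs[1], y)])
    points = np.asarray(points)
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)   # segment lengths
    t = np.concatenate([[0.0], np.cumsum(seg) / speed])     # time at each waypoint
    return np.column_stack([t, points])

if __name__ == "__main__":
    path = raster_path(width=0.3, height=0.2, spacing=0.02, speed=0.1)
    print(path[:4])   # first few timestamped waypoints
```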
The project was carried out in cooperation with International Partners in Glass Research e.V.
Watch this video on YouTube: https://youtu.be/SAHmeuPKeG4
Contact person:
Camera calibration in SHAREWORK
In the EU project SHAREWORK, we are developing a framework for industry that allows human-robot collaboration even with heavy industrial robots. We are developing camera-based methods for localizing objects in the workspace and deriving process states from them, which in turn can be used in task planning algorithms.
For the camera network, we set up and calibrated four Stereolabs ZED stereo cameras in our hall. Random, checkerboard, ArUco, and ChArUco patterns were used for the calibration. In the end, we managed to calibrate the cameras to sub-pixel accuracy. The video shows some data from our calibration sets. The data is currently being processed, and we hope to show more in a few weeks.
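For readers unfamiliar with the procedure, the following OpenCV sketch shows single-camera calibration from checkerboard images, including the sub-pixel corner refinement step. It only covers the checkerboard part of such a calibration; the actual setup also uses ArUco/ChArUco boards and calibrates the extrinsics of all four ZED cameras, and the board size, square size, and image path below are placeholders.

```python
import glob
import cv2
import numpy as np

BOARD = (9, 6)          # inner corners per row/column (assumption)
SQUARE = 0.025          # square edge length in metres (assumption)

# 3D corner coordinates of the board in its own frame
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

obj_points, img_points = [], []
for path in glob.glob("calib_images/*.png"):                 # placeholder path
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if not found:
        continue
    # refine detected corners to sub-pixel accuracy
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(objp)
    img_points.append(corners)

assert img_points, "no usable calibration images found"
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print(f"RMS reprojection error: {rms:.3f} px")
```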
Watch this video on YouTube: https://youtu.be/8goCBVKaKtU
Project page:
https://www.igmr.rwth-aachen.de/index.php/de/rob/rob-sharework
Contact person:
Robot Companion: A mobile helper in case of need
Robot Companion is a framework for implementing robot tracking systems in a simple and cost-effective way. For this purpose, IGMR develops methods for tracking with different sensors (laser, radar, camera), agile path planning, and actuation.
The current objective of Robot Companion is to provide a robot for emergency rescue. The robot will autonomously follow first responders and enable the transport of materials and equipment, as well as the removal of debris and casualties. A first step towards this vision was implemented with the basic module, which provides methods for tracking with a camera and laser and enables autonomous following of an operator.
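As a rough sketch of the following behaviour, the Python example below keeps a fixed distance behind a tracked operator using a simple proportional controller on distance and bearing. The actual basic module fuses camera and laser tracks; the gains, limits, and function names here are illustrative assumptions only.

```python
import math

def follow_step(target_x: float, target_y: float,
                desired_distance: float = 1.2,
                k_lin: float = 0.8, k_ang: float = 1.5):
    """Compute (linear, angular) velocity commands from the operator position
    given in the robot frame (x forward, y left, in metres)."""
    distance = math.hypot(target_x, target_y)
    bearing = math.atan2(target_y, target_x)
    v = k_lin * (distance - desired_distance)   # drive towards the keep-distance
    w = k_ang * bearing                         # turn towards the operator
    return max(min(v, 0.6), 0.0), max(min(w, 1.0), -1.0)   # clamp to safe limits

if __name__ == "__main__":
    print(follow_step(2.0, 0.3))   # operator 2 m ahead, slightly to the left
```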
The video shows the tracks of the vertical and horizontal trackers, as well as the state of detection (top right). In a tracking test, 100% accuracy was achieved at low speeds.
https://youtu.be/imU8j2zlQrQ
Project website:
https://www.igmr.rwth-aachen.de/index.php/de/rob/rob-comp
Contact:
Cooperative mobile packaging in production lines in the “Internet of Production”
Applications of mobile manipulation in production lines of the "Factory of the Future".
The Internet of Production (IoP) has the vision of enabling cross-domain collaboration in production lines on a new level. The mobilization of robotic agents and resources is essential to the production lines of the Factory of the Future. To enable robotic agents to react to changes in production lines, we at IGMR are developing online motion planning and control strategies for mobile manipulators in dynamic situations.
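As a very condensed illustration of online reaction to a changing environment, the sketch below modifies a mobile base's velocity command with a repulsion term for nearby dynamically detected obstacles (an artificial-potential-style shortcut). It is an assumption-laden toy example, not the planning and control strategies actually developed in the project.

```python
import numpy as np

def commanded_velocity(pos, goal, obstacles, v_max=0.5, influence=1.0, k_rep=0.3):
    """Attractive pull towards the goal plus repulsion from nearby obstacles.
    All positions are 2D numpy arrays in the world frame [m]."""
    direction = goal - pos
    dist_goal = np.linalg.norm(direction)
    v = v_max * direction / max(dist_goal, 1e-6)             # attraction
    for obs in obstacles:
        away = pos - obs
        d = np.linalg.norm(away)
        if d < influence:                                     # only nearby obstacles
            v += k_rep * (1.0 / d - 1.0 / influence) * away / d
    speed = np.linalg.norm(v)
    return v if speed <= v_max else v * (v_max / speed)       # respect speed limit

if __name__ == "__main__":
    v = commanded_velocity(np.array([0.0, 0.0]), np.array([3.0, 0.0]),
                           [np.array([1.0, 0.2])])
    print(v)
```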
Further information on the Internet of Production:
https://www.igmr.rwth-aachen.de/index.php/de/rob/rob-iop
https://www.iop.rwth-aachen.de
Contact person: