Archive for October, 2020
Teaching the KUKA iiwa by manual guidance
Hand guidance of the collaborative KUKA iiwa robot is well suited to programming spatial points. This is possible both from within a running program and in the robot's T1 mode.
https://youtu.be/v7D4yknlxJI
The video shows the manual guidance of the KUKA iiwa within a program. The robot can be guided by hand at the flange while the user presses the gray enabling button. After it is released, the program asks whether the current position is correct and should be saved. Any number of additional positions can then be added. At the end of the program, all stored spatial points are traversed in the taught sequence.
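The teach-and-replay flow described above can be sketched as follows. This is a minimal illustration with hypothetical interfaces (`hand_guide_until_button_released`, `current_pose`, `move_to`, and the dialog callback are invented names); a real iiwa application would implement this against the KUKA Sunrise API in Java.

```python
# Sketch of the teach-and-replay logic; all robot/dialog interfaces here
# are hypothetical stand-ins, not the actual KUKA Sunrise API.

def teach_points(robot, ask_user):
    """Collect hand-guided poses until the user declines to add more."""
    points = []
    while True:
        # Hand guidance is active while the gray enabling button is pressed.
        robot.hand_guide_until_button_released()
        pose = robot.current_pose()
        if ask_user("Save this position?"):
            points.append(pose)
        if not ask_user("Teach another position?"):
            return points

def replay(robot, points):
    """Traverse all taught points in the recorded order."""
    for pose in points:
        robot.move_to(pose)
```

In this sketch, the confirmation dialog is a plain callback so the same logic can run against a simulator or the robot's smartPAD UI.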
Robot-Guided Form Scan and Coating
Automatic shape recognition via laser scanner and trajectory planning for coating.
In this project, shapes and their edges are detected by a laser scanner mounted on the end effector of the robot. The collected data is synchronized and filtered, and a suitable trajectory for coating the inner surface of the forms is generated. Several parameters, such as nozzle speed, pass spacing and gaps, nozzle size, and outlier handling, can be selected during trajectory planning to achieve a homogeneous coating.
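To make the role of the spacing parameter concrete, here is a toy path generator for a rectangular inner surface. The rectangular shape, the zigzag strategy, and all names are assumptions for illustration only, not the project's actual planner, which works on scanned free-form contours.

```python
# Toy coating-path sketch: cover a width x height rectangle with
# back-and-forth passes whose separation is the `spacing` parameter.

def zigzag_path(width, height, spacing):
    """Return (x, y) waypoints of alternating left/right passes."""
    path = []
    y = 0.0
    left_to_right = True
    while y <= height + 1e-9:  # tolerance for float accumulation
        if left_to_right:
            path.append((0.0, y))
            path.append((width, y))
        else:
            path.append((width, y))
            path.append((0.0, y))
        left_to_right = not left_to_right
        y += spacing
    return path
```

Halving `spacing` roughly doubles the number of passes, trading cycle time against coating density; the nozzle speed parameter would then scale the timestamps along these waypoints.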
The project was carried out in cooperation with International Partners in Glass Research e.V.
Watch this video on YouTube: https://youtu.be/SAHmeuPKeG4
Camera calibration in SHAREWORK
In the EU project SHAREWORK, we are developing a framework for industry that allows human-robot collaboration even with heavy industrial robots. We are developing camera-based methods for localizing objects in the workspace and deriving process states from them, which in turn can be used in task-planning algorithms.
For the camera network, we set up and calibrated four Stereolabs ZED stereo cameras in our hall. Random-dot, checkerboard, ArUco, and ChArUco patterns were used for the calibration. In the end, we calibrated the cameras to sub-pixel accuracy. The video shows some data from our calibration sets. The data is currently being processed, and we hope to show more in a few weeks.
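The sub-pixel accuracy claim refers to the reprojection error: known 3D pattern points are projected through the estimated camera model and compared with the detected corners in the image. The following sketch shows this metric for a simple pinhole model in camera coordinates (lens distortion and extrinsics omitted; all numbers are illustrative, not from the actual calibration).

```python
import math

def project(point3d, fx, fy, cx, cy):
    """Pinhole projection of a camera-frame 3D point to pixel coordinates."""
    x, y, z = point3d
    return (fx * x / z + cx, fy * y / z + cy)

def rms_reprojection_error(points3d, detections, fx, fy, cx, cy):
    """Root-mean-square pixel distance between projected model points
    and the corners detected in the image."""
    sq_sum = 0.0
    for p, (u, v) in zip(points3d, detections):
        pu, pv = project(p, fx, fy, cx, cy)
        sq_sum += (pu - u) ** 2 + (pv - v) ** 2
    return math.sqrt(sq_sum / len(points3d))
```

A calibration is commonly called sub-pixel accurate when this RMS value stays below one pixel across the full calibration set; in practice one would compute it with OpenCV's `cv2.calibrateCamera`, which returns the same metric.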
Watch this video on YouTube: https://youtu.be/8goCBVKaKtU
Project page:
https://www.igmr.rwth-aachen.de/index.php/de/rob/rob-sharework
Robot Companion: A mobile helper in case of need
Robot Companion is a framework for implementing robot tracking systems in a simple and cost-effective way. For this purpose, IGMR develops methods for tracking with different sensors (laser, radar, camera), agile path planning, and actuation.
The current objective of Robot Companion is to provide a robot for emergency rescue. The robot will autonomously follow first responders and enable the transport of materials and equipment, as well as the removal of debris and casualties. A first step toward this vision was implemented as the basic module, which provides camera- and laser-based tracking and enables the robot to autonomously follow an operator.
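The follow behavior can be illustrated with a minimal velocity controller: the tracker supplies the distance to the operator, and a proportional law keeps a safety gap. The gains, the gap, and the speed limit below are invented for the sketch and do not reflect the basic module's actual controller.

```python
# Toy "follow the operator" controller: command forward speed from the
# tracked distance, holding a safety gap. All constants are illustrative.

def follow_speed(distance, target_gap=1.5, gain=0.8, max_speed=1.0):
    """Forward velocity command in m/s. Negative commands are clamped to
    zero so the robot never drives toward the operator."""
    v = gain * (distance - target_gap)
    return max(0.0, min(max_speed, v))
```

A real system would add lateral control from the horizontal tracker and stop the robot whenever the detection state reports the operator as lost.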
The video shows the tracks of the vertical and horizontal trackers, as well as the detection state (top right). In a tracking test, 100% detection accuracy was achieved at low speeds.
https://youtu.be/imU8j2zlQrQ
Project website:
https://www.igmr.rwth-aachen.de/index.php/de/rob/rob-comp