CLAIX System Maintenance on 2023-11-27
Dear users of the RWTH compute cluster,
On 2023-11-27, the complete cluster will not be available from 8 am to 12 pm (noon) due to system maintenance.
Kind regards,
Your HPC team
You can track any disruptions or security advisories that may occur due to the aforementioned change in the RWTH-HPC category on our status reporting portal.
Deletion of course rooms of the winter semester 2020/21
In line with the life cycle of course rooms connected to RWTHonline, the course rooms of the winter semester 2020/21 were deleted on November 13, 2023.
Opencast: Upgrade to version 12
Improvements:
- A security issue was closed
- Paella player version 6.5.6 is now the default player
- See all release notes
Release Notes Version 2.7.0 – Locations of defibrillators and restrooms for people with disabilities
Features:
- The learning space map is available in light/dark mode
- Locations of restrooms for people with disabilities and defibrillators are now displayed
Improvements and bug fixes:
- The general “search” behavior has been improved overall and “recent searches” has been added
- “Favorites” icons in the dashboard have been adjusted
End of Apptainer Pilot Phase
We are happy to announce that, after a long pilot phase, we are granting all users full access to use Apptainer containers on the cluster. Containers are virtual environments that allow running an identical software configuration across several systems, e.g., two different HPC systems, and simplify the setup of software that only runs well on other Linux distributions. Apptainer also supports the conversion of Docker images and can thus run a vast variety of existing images with little to no extra effort.
Previously, we only allowed curated container images as part of our software stack and individual images on a per-case basis. Starting today, users can build and run their own container images anywhere on the cluster.
If you are interested in using Apptainer, please take a look at our documentation[1] and read the “Best Practices” section to get started and avoid common problems. As part of our efforts to support containerized workloads in HPC, we will also grow our collection of container images in the module system and provide a set of Claix-specific base images for various scenarios that can be used as a foundation for your own container images.
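For a first impression of the workflow, the following sketch shows how a Docker image can be converted and run with Apptainer. The image name (`ubuntu:22.04`) and the command are hypothetical examples, not Claix-specific recommendations; please consult the documentation linked below for cluster-specific best practices.

```shell
# Pull a Docker image and convert it into an Apptainer image file (SIF).
# "ubuntu:22.04" is just an illustrative example image.
apptainer pull ubuntu.sif docker://ubuntu:22.04

# Run a command inside the container. By default Apptainer runs as the
# calling user and mounts $HOME, which fits HPC file-permission models.
apptainer exec ubuntu.sif cat /etc/os-release
```

The resulting `.sif` file is an ordinary file that can be copied between systems, which is what makes the "identical software configuration across several systems" use case practical.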
Kind regards,
Your HPC Team
[1] https://help.itc.rwth-aachen.de/service/rhr4fjjutttf/article/e6f146d0d9c04d35aeb98da8d261e38b/
Status of Organizations Can Now be Recorded in the Organization Directory
The status of organizations can now be recorded in the organization directory. In addition, entered URLs and e-mail addresses are validated.
Release Notes Version 2.6.0 – New study room map
Features:
- Study room map is now integrated
- Calendar data can now also be used offline
Improvements and bug fixes:
- Filtering options for meals and study rooms have been expanded and improved
- The “No search results” display has been revised
- Several minor bug fixes
Opencast: Migration of streaming database
- Migration from MySQL to PostgreSQL
CLAIX-2018 dialog systems
Due to the high load on the login/dialog nodes affecting their usability, we have decided to reduce the maximum number of usable cores on each login node to four per user. Please note: these login nodes should be used for programming, preparation and minimal post-processing of batch jobs. They are not intended for production runs or performance tests. For longer tests (max. 25 minutes), parallel debugging, compiling, etc., you can use our “devel” partition by adding “#SBATCH --partition=devel” to batch jobs or interactively with “salloc -p devel”.
For all productive jobs, please use our batch system **without** “#SBATCH --partition=devel”. If you want to learn more about the batch system, we invite you to our Slurm introduction.
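As an illustration, a minimal batch script for a short test run on the “devel” partition might look as follows. The job name, task count and executable are placeholders; adjust them to your own workload.

```shell
#!/usr/bin/env bash
# Hypothetical minimal Slurm batch script for a short test on the devel partition.
#SBATCH --partition=devel        # devel partition: short tests only (max. 25 min)
#SBATCH --ntasks=4               # example task count, adjust as needed
#SBATCH --time=00:20:00          # stay within the devel runtime limit
#SBATCH --job-name=devel-test    # placeholder job name

srun ./my_test_program           # replace with your own executable
```

Submit it with `sbatch script.sh`; for productive runs, simply omit the `--partition=devel` line so the job goes to the regular batch partitions.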
You can track any disruptions or security advisories that may occur due to the aforementioned change in the RWTH-HPC category on our status reporting portal.
FastX Server Component Upgraded to Version 3.3.39
The FastX server component installed on the HPC frontend nodes was upgraded to version 3.3.39.
The update contains security enhancements and several bugfixes from which all users benefit when using FastX.
Please ensure to use the latest desktop client if you are using FastX when accessing the cluster.
For more information on how to access the RWTH Aachen Compute Cluster via FastX, please refer to the ITC Help Page.
You can track any disruptions or security advisories that may occur due to the aforementioned change in the RWTH-HPC category on our [status reporting](https://maintenance.rz.rwth-aachen.de/ticket/status/messages/14-rechner-cluster) portal.

