Scientific Computing » News EN

Dear cluster users,

ZIMT has recently acquired an HPE Moonshot system, which, like HorUS, is available to everyone at the university. This system can be used for compute jobs in addition to HorUS.

The difference to HorUS is that the Moonshot hardware is specialized for so-called High Throughput Computing. This term refers to workloads which consist of a large number of small, independent compute jobs (as opposed to High Performance Computing, which describes small numbers of large, interdependent jobs).

The Moonshot system is largely integrated into HorUS and uses a separate SLURM queue (htc) for submitting jobs. Your Home and Workspace directories and most environment modules are the same. You can also submit htc jobs from HorUS and vice versa.
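For illustration, a submission to the new queue could look like the sketch below. Only the partition name (htc) comes from this announcement; the time limit, module name and script name are placeholders you would replace with your own:

```shell
#!/bin/bash
# Sketch of a batch script for the Moonshot HTC queue.
# Only the partition name (htc) is from the announcement; all other
# values (time limit, module, script name) are placeholders.
#SBATCH --partition=htc          # the new Moonshot queue
#SBATCH --job-name=htc-example
#SBATCH --ntasks=1               # HTC jobs are small and independent
#SBATCH --time=00:30:00
#SBATCH --output=htc-example-%j.out

module load python               # check 'module avail' for actual names
srun python my_small_task.py     # placeholder workload
```

You would submit this with sbatch as usual; since the system shares Home and Workspace directories with HorUS, no separate data transfer should be needed.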

If you believe that your computations are suitable for the Moonshot system, we would be glad if you tested it and sent us feedback about your tests (hpc-support@uni-siegen.de).

Please note that the CPU architecture differs from HorUS and we have not tested every module. If you encounter problems with individual applications, please contact us.

Further usage tips can be found here:


We will be adding other systems in the near future and will keep you informed.

Best regards
Your HPC Team

Updated at 10:12 on 10 December 2019 by Jan Steiner

Dear cluster users,

there are still open spots in the next cluster introduction course this Thursday (Nov. 7). If you would like to participate, send an e-mail to jan.steiner@uni-siegen.de. More information here.

Best regards
Your HPC Support Team

Updated at 10:23 on 6 November 2019 by Jan Steiner

Dear cluster users,

the problem has been solved and all partitions are available without restrictions again. We apologize for any inconvenience the restart may have caused.

Best regards,

Your HPC Team

Updated at 9:26 on 28 October 2019 by Jan Steiner

Dear cluster users,
due to a technical problem the management node of the Horus cluster needs to be restarted. This has the following consequences for you:
1. All jobs running at the time of the restart will be killed. The restart will not happen immediately; we will choose the time such that as few jobs as possible are killed (most likely Wednesday afternoon). Unfortunately, a few jobs have very long running times; we will contact the affected users tomorrow. If we do not contact you, your jobs are not affected.
2. All jobs that have been queued but have not yet started (status ‘PD’) will run after the restart. You do not have to do anything here.
3. Queuing new jobs is currently only possible in the Short Queue. We will notify you when queuing jobs is possible again.
The Horus cluster will most likely be fully available again on Thursday. We will inform you as soon as possible.
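To see which of your own jobs fall into categories 1 and 2 above, you can ask SLURM directly; these are standard squeue options, not HorUS-specific:

```shell
# Jobs currently running (category 1): these would be killed by a restart.
squeue -u "$USER" -t RUNNING

# Jobs still pending (category 2, status PD): these survive the restart
# and will start afterwards without any action on your part.
squeue -u "$USER" -t PENDING
```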
Best regards
Your HPC Support Team

Updated at 17:56 on 21 October 2019 by Jan Steiner

Dear cluster users,

the next cluster introduction course will be held (in English) on November 7 and will take all afternoon. You can find more information here.

The course covers the basics for new cluster users. We recommend that every cluster user attend this course at least once. The course takes place at least once per semester, alternating between German and English.

If you would like to participate, please send an e-mail to jan.steiner@uni-siegen.de.

Updated at 9:35 on 15 October 2019 by Jan Steiner

Dear cluster users,

this semester, the lecture “High Performance Computing in the Simulation Sciences” will be held by the chair of Simulation Technology and Scientific Computing (Mechanical Engineering Department). It takes place every Tuesday, 12:00-14:00 c.t., in room PB-A 337.

The lecture will be held in German, although the materials will be in English.

For more information, switch to the German language version of this post or contact monika.harlacher@uni-siegen.de.

Updated at 9:45 on 7 October 2019 by Jan Steiner

Dear cluster users,

the next Linux Introduction Course will take place on September 26 and take all day. It will be held in English.

The basics of Linux will be covered. We recommend that all users who are inexperienced with Linux take this course. The course takes place once per semester, alternating between German and English.

If you want to participate, please send an e-mail to jan.steiner@uni-siegen.de.


Course topics:

  • General Linux concepts
    • Command line
    • Processes, users and permissions
  • Files and directories
    • Wildcards
  • Scripting
    • Environment variables
    • Pipes etc.
  • Other concepts
    • System configuration files
    • Symbolic links and aliases
    • Tips and tricks


Instructor: Jan Steiner
Date: September 26, 2019
Room: H-D 2202
Max. participants:
Target audience: Employees and students who are completely new to Linux

If you do not have your own device, there are desktop clients available.

You do not need access to the HorUS cluster to participate, but you may use your own account if you prefer.


Simply send an e-mail to jan.steiner@uni-siegen.de. Also, please indicate whether you intend to bring your own laptop and which operating system it has.

If you are interested but do not have time on the specified date, please notify us as well. We will hold another course if enough people are interested.

Updated at 14:40 on 3 September 2019 by Jan Steiner

TensorFlow version 1.13.1 has been installed on the HorUS cluster. The software is made available with the command module load tensorflow and is pre-set to be imported, so no import tensorflow statement is necessary. More information can be found here.


  • TensorFlow was installed with its own, separate Python 3.6.5 installation! This means a different set of installed packages. This is also described in more detail on our Tensorflow page.
  • If the module does not show up in the output of module avail, your module cache has not been updated yet (it is refreshed automatically at regular intervals). In that case you can force the latest list with the command module --ignore-cache avail.
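Put together, a first test of the new module could look like this; the final version check is standard TensorFlow usage assumed for illustration, not part of the announcement:

```shell
# Refresh the module list if tensorflow does not show up yet.
module --ignore-cache avail

# Load the module; this brings its own Python 3.6.5 with it.
module load tensorflow

# Optional sanity check that the expected version is active.
python -c 'import tensorflow as tf; print(tf.__version__)'
```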

Updated at 9:54 on 21 May 2019 by Jan Steiner

Matlab has been updated to version 2019a. This version will be launched when typing the matlab command without loading any modules. The 2018a version can still be accessed by loading the module matlab/2018a.

Note: if you use parallel pools or other MDCS features with Matlab 2019a, you need to download the updated settings file here and import it into the Matlab cluster profile manager.
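The two releases can be selected as follows; the -nodisplay and -r flags are standard Matlab command-line options, not site-specific:

```shell
# Default after the update: typing matlab starts 2019a.
matlab -nodisplay -r "disp(version); exit"

# To keep using the previous release, load its module first.
module load matlab/2018a
matlab -nodisplay -r "disp(version); exit"
```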

Updated at 18:55 on 12 April 2019 by Jan Steiner

cmake has been updated on the nodes login1, login2, smp1 and pre1 from version 2.8 to version 3.13.
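The jump from 2.8 to 3.13 also enables the newer command-line workflow; the -S and -B options shown below were introduced in CMake 3.13 itself:

```shell
# Confirm which version a login node provides.
cmake --version

# New in 3.13: configure with explicit source (-S) and build (-B) directories.
cmake -S . -B build
cmake --build build
```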

Updated at 8:55 on 8 April 2019 by Gerd Pokorra