Dear cluster users,

Due to time constraints, we cannot hold the consultation hour today. We apologize for the inconvenience. There will be a normal consultation hour again next week.

Best regards

Your HPC Team

Updated at 11:09 on 26 July 2022 by Gerd Pokorra

Dear cluster users,

In order to reduce the cooling load during the current heat wave, we will shut down the rarely used HTC nodes until further notice. This does not affect the OMNI cluster.

Best regards
Your HPC Team

Updated at 11:49 on 22 July 2022 by Gerd Pokorra

Dear cluster users,

The MATLAB course that was scheduled for tomorrow has been moved to a new date:

Thursday, May 20, 9:00-12:30
Friday, May 21, 9:00-12:30

This means the course has been split across two half-days instead of one full day; the content remains the same, however. There are still enough places available. If you wish to register despite the course being held in German, you can find more details (in German) here. Otherwise, the course will be held in English next semester as usual.

Best regards

Your HPC Team

Updated at 16:25 on 17 May 2021 by Gerd Pokorra

Dear cluster users,

The new university cluster OMNI is available for production as of Monday, March 8th. The new system is several times larger than HoRUS and will replace it by the end of March. Starting April 1st, computations on HoRUS will no longer be possible, but you will still be able to log in to secure your data.

We have summarized all the information on the following page, including how to gain access, how to migrate your data, an overview of the newly available hardware and software, and other changes:

https://cluster.uni-siegen.de/transition/?lang=en

IMPORTANT: Even users who already have HoRUS access must request OMNI access again via Unisim.

You will receive the cluster address in the announcement e-mail sent via the HPC mailing list. If you do not have access yet, and are therefore not yet on the mailing list, the cluster address will be included in the welcome e-mail you receive when you obtain cluster access.

You will find detailed descriptions of the new system and its usage in the menu on the left. Please contact us with any questions or problems at hpc-support@uni-siegen.de.

Your HPC-Team

Updated at 10:14 on 9 March 2021 by Gerd Pokorra

Dear colleagues and students,

Starting August 4, the HPC-Team will offer a weekly consultation hour, where you can bring us all your questions and concerns about scientific computing.

Scientific Computing Consultation Hour
every Tuesday, 2-3 pm (starting August 4)
in room H-D 6211
as well as online via DFNconf https://conf.dfn.de/webapp/conference/979127069

With this invitation we want to reach students and employees who use scientific computing in their work or studies, regardless of their prior knowledge and of whether they currently use ZIMT systems for their computations.

For example, we offer advice and support for the following topics:

  • Networking and knowledge transfer on scientific computing
  • Cluster usage
  • Problem identification and solving
  • Implementation of scientific computing
    • Assistance with the choice of appropriate programming language and software
    • Selection of suitable hardware or central compute systems
    • Software installation and usage
  • Efficient usage of systems and software
  • Workflow optimization (e.g. job management and data processing)
  • Performance analysis and optimization of self-written codes
  • Application for compute time on larger systems (tier 2 and 1)

We hope this service meets your needs.
Of course, you may continue to contact us by email (hpc-support@uni-siegen.de).

Best regards,
Your HPC-Team

Updated at 15:23 on 28 July 2020 by Gerd Pokorra

Dear cluster users,

In the next few months, the University of Siegen will replace the HoRUS cluster with a new and considerably more capable system. According to current plans, the system will go live at the end of April (the initial plan was the end of February, but this is no longer possible, mostly due to delays in the delivery of some components).

Overview of the new system

The system consists of the following compute partitions:

  • 434 nodes in the regular HPC partition, each equipped with AMD EPYC 7452 CPUs and 256 GB of RAM.
  • OpenStack and Kubernetes partitions (8 and 5 nodes, respectively).
  • 2 SMP nodes, each with 4 Intel Xeon 5218 CPUs (Cascade Lake) and 1.5 TB of RAM.
  • 10 GPU nodes with a total of 24 NVIDIA Tesla V100 GPUs (four nodes with 4 GPUs each, two with 2, and four with 1).

The storage capacity is 1 petabyte of primary hard drive space; in addition, there is an SSD burst buffer (32 terabytes) and 48 TB of object storage. The flexible concept allows us to adjust the relative sizes of these storage areas in the future if necessary.

The high-speed interconnect used throughout the cluster is InfiniBand HDR100.

The following information represents the current state of planning and may change until the new cluster starts operating.

What will change about access and login?

Like the HoRUS cluster, the new cluster will be available to all members of the University of Siegen. The same cluster address will continue to be used and will point to the new cluster by the end of the transition phase at the latest.

What will be the name of the new cluster?

The cluster does not have a name yet. We will have a naming contest for it soon.

Will the other compute resources also be replaced by the new cluster?

No, the HPE Moonshot HTC system and the recently acquired NEC SX-Aurora Tsubasa vector system (more information about that in the near future) will continue to be available.

How do I get my data from HoRUS to the new cluster?

During the transition phase, you will have enough time to copy whatever data you need.
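As an illustration, the short Python sketch below copies a results directory from HoRUS to the new cluster with rsync over SSH. The user name, host name, and paths are hypothetical placeholders, not the actual cluster addresses (those will be announced via the HPC mailing list):

    #!/usr/bin/env python3
    # Minimal sketch: copy a directory from HoRUS to the new cluster
    # using rsync over SSH. All names and paths are placeholders.
    import subprocess

    source = "/home/g012345/results/"  # placeholder path on HoRUS
    target = "g012345@new-cluster.uni-siegen.de:/home/g012345/"  # placeholder address

    # -a preserves permissions and timestamps, -v lists the transferred
    # files, and --partial lets an interrupted transfer resume later.
    subprocess.run(["rsync", "-av", "--partial", source, target], check=True)

Running the same command again only transfers files that have changed in the meantime, so a large migration can be done incrementally.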

Which operating system will the new cluster use?

The OS will be CentOS 8 (HoRUS currently uses CentOS 7). The job scheduler used will be SLURM, just like on HoRUS.
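Existing job scripts should therefore largely keep working. As a minimal, hypothetical sketch (the resource values below are placeholders, not recommendations for the new cluster), a job script could look like this; sbatch parses the #SBATCH comment lines regardless of the interpreter named in the shebang, which is why plain Python works here:

    #!/usr/bin/env python3
    #SBATCH --job-name=hello_cluster
    #SBATCH --ntasks=1
    #SBATCH --cpus-per-task=4
    #SBATCH --mem=8G
    #SBATCH --time=00:10:00

    # Minimal SLURM job sketch: report which node the job landed on and
    # how many CPUs the scheduler granted. The resource values above are
    # placeholders only.
    import os
    import socket

    print("Running on node:", socket.gethostname())
    print("CPUs allocated: ", os.environ.get("SLURM_CPUS_PER_TASK", "n/a"))

Submitted with "sbatch jobscript.py", this would run on a single compute node and print where it ran.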

What software will be available on the new cluster?

There will be a collection of commercial and non-commercial software for a variety of purposes, similar to the HoRUS cluster. You will also be able to request the installation of additional software; we will decide this on a case-by-case basis.

If you have any other questions, e-mail us at hpc-support@uni-siegen.de.

Best regards
Your HPC Team

Updated at 11:12 on 26 February 2020 by Gerd Pokorra