Due to the current situation, contact with the HPC team is limited:

  • Please contact us by e-mail at one of the addresses below; primarily, please use our support address.
  • We are currently not available by telephone. However, if you urgently need to talk to us in person, we will be happy to call you back by arrangement.
  • Until further notice, we are not present in the office.

Dr. Stefan Harfst

+49 (0)441 798-3147

W3 1-139

Fynn Schwietzer

+49 (0)441 798-3287

W3 1-139

HPC Support


Carl von Ossietzky Universität Oldenburg
Fakultät V - Geschäftsstelle
Ammerländer Heerstr. 114-118
26129 Oldenburg


Newsletter January-February 2021

This year's first newsletter for our HPC users, from February 2021

Dear HPC users,

Here is our first newsletter of this new, hopefully promising year 2021!

  1. Change to the Job Scheduler: Starting on Feb 15th, we will change a setting of the job scheduler to limit the maximum number of actively running jobs to 250 per Unix group (agname). The reason for this change is that large job arrays consisting of many single-core jobs can be allocated a disproportionate share of the resources, while other jobs have to wait comparatively long. Unfortunately, other modifications to Slurm's scheduling settings had little effect on this problem, so we must now enforce a limit on running jobs. However, the limit will not affect most of you, and it will only lead to a fairer sharing of the HPC resources. More information can be found in an article in the HPC wiki, and of course you can contact us anytime if you have further questions.
  2. HPC Course: The next three-day course, Introduction to HPC, has been scheduled for March 15th to 17th, from 10 am to 5 pm each day. The course will be given online with BigBlueButton, and the agenda for the full course is as follows:

    Day 1 – Introduction to HPC (HPC Cluster, Job Scheduler, …)
    Day 2 – Parallel Programming
    Day 3 – Using Matlab on the HPC Cluster

    Each day targets a different audience, so it is perfectly OK to only participate in the sessions that suit your particular needs. If you are interested, please sign up for the course in Stud.IP. More information about the course is also available in the HPC wiki.
  3. Update of GlobalProtect: The IT Services rolled out an update of the GlobalProtect client this morning. This may cause an error the next time you try to establish a VPN connection, and in the login screen you may see ??? or other symbols where your username (abcd1234) should be entered. The solution is to simply re-enter your username and log in again normally. In case of a problem with your VPN connection, please refer to the web page of the IT Services or contact them.
  4. Dates: Please take note of the following upcoming events.
  5. Software News: The following modules are now available on hpc-env/8.3:
    • OpenFOAM-Plus/2006-gimkl-2019b
    • Geant4/10.06 & 10.06.03
      (different modules with and without GPU and multithreading support; on hpc-env/8.3, check with ml av)
    • GATE/9.0-foss-2019b-Python-3.7.4 & GATE/9.0-fosscuda-2019b-Python-3.7.4
    • makedepend/1.0.6
    • Automake/1.16.2-GCCcore-8.3.0
    • quanteda/2.1.2-foss-2019b-R-4.0.2
    • … and many more packages! So you might want to take a look at our software page, should you need one or two dependency updates.
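Regarding the new per-group job limit announced in item 1: a minimal sketch for checking how many jobs your group currently has running, assuming your Slurm account matches your Unix group and using "aggroup" as a placeholder name:

```shell
# Sketch: check your group's running jobs against the new 250-job limit.
# "aggroup" is a placeholder -- replace it with your own Unix group (agname).
LIMIT=250
if command -v squeue >/dev/null 2>&1; then
  # -A selects the account, -t RUNNING filters running jobs, -h drops the header
  RUNNING=$(squeue -A aggroup -t RUNNING -h | wc -l)
else
  RUNNING=0   # squeue is only available on the cluster itself
fi
echo "running jobs: $RUNNING of $LIMIT allowed"
```

Jobs beyond the limit simply stay pending until a slot in your group frees up; nothing is cancelled.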

The list above might not be complete; you can always search for software with the command “module spider <softwarename>”.
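A minimal sketch of such a search on a cluster login node; the module names are examples taken from this newsletter and may differ in your environment:

```shell
# Sketch: Lmod commands for locating installed software on the cluster.
if command -v module >/dev/null 2>&1; then
  module spider Geant4    # search for Geant4 across all environments
  module avail GATE       # list matching GATE modules ("ml av GATE" for short)
  ON_CLUSTER=yes
else
  ON_CLUSTER=no           # the module command only exists on the cluster
  echo "run this on a cluster login node"
fi
```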

Best wishes and happy computing,
Your HPC support team

(Changed: 2021-04-30)