Clusters at IMAU - UU-IMAU/Earth-System-Modelling GitHub Wiki

Lorenz

Lorenz is the local compute server of the oceanography group at IMAU.

  • The machine has 9 compute nodes with 288 cores in total (32 cores/node)
  • It is characterised by a fast interconnect between the nodes, lots of memory per node and large disk storage per node, which together enable very fast I/O
  • It has a lot of standard software, easy to use via loading of modules
  • The operating system is AlmaLinux and batch scheduling system is SLURM
  • Lorenz is typically used for simulations with models of intermediate complexity, for instance the climate model CLIMBER or the hydrodynamic model Delft3D
  • At IMAU we run the more state-of-the-art models like CESM, which need more compute power (i.e. more cores), on the national supercomputer Snellius. Analysing the output of such models can then be done on Lorenz.
  • The fast I/O also makes Lorenz an ideal machine to run Parcels, a toolbox that enables particle tracking simulations using output from ocean circulation models.
  • The machine does not have GPUs.
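Since Lorenz uses SLURM, jobs are submitted with `sbatch`. Below is a minimal example job script; the module name, executable name and resource requests are placeholders, so check `module avail` and the Lorenz documentation for the actual values.

```bash
#!/bin/bash
#SBATCH --job-name=test_run         # name shown in the queue
#SBATCH --nodes=1                   # Lorenz nodes have 32 cores each
#SBATCH --ntasks=32                 # request one full node
#SBATCH --time=01:00:00             # wall-clock limit (hh:mm:ss)
#SBATCH --output=test_run-%j.out    # %j is replaced by the job ID

# Load required software via modules (module name is an example)
module load netcdf

# Launch the (hypothetical) model executable on the allocated cores
srun ./my_model
```

Submit with `sbatch jobscript.sh`, monitor with `squeue -u $USER`, and cancel with `scancel <jobid>`.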

More information about how to use Lorenz can be found here.

Access can be requested via Michael Kliphuis.

Gemini

Students and staff at IMAU can also make free use of Gemini, a server of the Beta Faculty (Faculty of Science) of Utrecht University.

  • The machine has 12 compute nodes with 452 cores in total (either 20, 32 or 48 cores/node)
  • It is free to use but the machine is shared by many users
  • It has a lot of standard software, easy to use via loading of modules
  • The operating system is Scientific Linux and the batch scheduling system is Sun Grid Engine (SGE)
  • Gemini provides (like Lorenz) a good solution for the development and testing of small programs that don't require huge dedicated computing power
  • The machine has 8 GPUs
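Because Gemini runs SGE rather than SLURM, job scripts use `#$` directives and are submitted with `qsub`. A minimal sketch follows; the parallel environment name, resource limits and program name are assumptions, so check `qconf -spl` and the Gemini documentation for the actual settings.

```bash
#!/bin/bash
#$ -N test_run            # job name
#$ -cwd                   # run the job from the current working directory
#$ -pe mpi 20             # parallel environment and slot count (example values)
#$ -l h_rt=01:00:00       # wall-clock limit (hh:mm:ss)
#$ -o test_run.out        # file for standard output
#$ -e test_run.err        # file for standard error

# Run the (hypothetical) program on the allocated slots
./my_program
```

Submit with `qsub jobscript.sh` and monitor with `qstat -u $USER`.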

If you have a SolisID you already have access. Here is some more information about how to log in.


A wiki by the Parcels group about how to run Parcels on Lorenz and Gemini, together with other useful information, is here.