Attention
ShARC will be decommissioned on 30th November 2023, after which users will no longer be able to access the cluster and any jobs running or queued at that time will be cancelled.
Please see our page of important information about ShARC's decommissioning.
ShARC/SGE Parallel Environments
The SGE parallel environments available on ShARC are listed below:
ShARC SGE Parallel Environments Table
| Parallel Environment Name `<env>` | Parallel Environment Description |
|---|---|
| `smp` | Symmetric multiprocessing or 'Shared Memory Parallel' environment. Limited to a single node and therefore 16 cores on a normal ShARC node. |
| `openmp` | A 'Shared Memory Parallel' environment supporting OpenMP execution. Limited to a single node and therefore 16 cores on a normal ShARC node. |
| `mpi` | Message Passing Interface. Can use as many nodes or cores as desired. |
| `mpi-rsh` | The same as the `mpi` parallel environment, but configured to use RSH instead of SSH for certain software such as ANSYS. |
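A parallel environment is requested in a batch job with the standard SGE `-pe <env> <cores>` option. The following is a minimal sketch of a job script using the `smp` environment; the core count and the script body are illustrative only, not a recommended configuration.

```bash
#!/bin/bash
# Request 4 cores in the 'smp' parallel environment (illustrative value).
# smp jobs are limited to a single node, i.e. at most 16 cores on a normal ShARC node.
#$ -pe smp 4

# SGE sets $NSLOTS to the number of cores actually granted to the job.
echo "Running with $NSLOTS cores"
```

The script is submitted with `qsub` as usual; the same `-pe` syntax applies to the other environments, e.g. `-pe mpi 32`, in which case the job may span multiple nodes.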
Other parallel environments exist for specific purposes but are not listed here; users who require them will be informed directly or signposted in other documentation.
A current list of environments on ShARC can be generated using the `qconf -spl` command.
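For example (a sketch using standard SGE `qconf` queries; the environment name queried in the second command is illustrative):

```bash
# List all parallel environments configured on the cluster
qconf -spl

# Show the full configuration of one environment, e.g. smp
qconf -sp smp
```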