Decommissioned Clusters

Bessemer

Some of the Bessemer hardware.

The Bessemer HPC cluster was retired in October 2025.

Key features:

  • 1,040 CPU cores (Intel Skylake-SP)

  • 5 TB RAM (25 nodes with 192 GB and 1 node with 384 GB)

  • 460 TB Lustre filesystem

  • Ethernet (25 Gb/s) interconnects between worker nodes and to/from storage nodes

  • 4 public GPUs (NVIDIA Tesla V100)

  • Used the Slurm job scheduler / resource manager (a minimal example batch script is sketched after this list)

  • SUSE Liberty Linux 7.x operating system

  • Used EasyBuild to build its software stack
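
For illustration, a job on Bessemer would have been described in a Slurm batch script along the following lines. This is a minimal sketch, not a script taken from Bessemer's documentation; the module and program names are placeholders.

    #!/bin/bash
    #SBATCH --job-name=example        # name shown in the queue
    #SBATCH --ntasks=1                # a single task
    #SBATCH --cpus-per-task=4         # four CPU cores for that task
    #SBATCH --mem=8G                  # 8 GB of RAM
    #SBATCH --time=01:00:00           # one hour wall-clock limit

    # Load software from the EasyBuild-built module stack (placeholder module name)
    module load SomeApplication

    srun ./my_program

The script would have been submitted with sbatch, and Slurm would have held it in the queue until the requested cores, memory and time were available on a worker node.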


ShARC

Some of the ShARC hardware.

The ShARC HPC cluster was retired in November 2023. It was shipped to the Igor Sikorsky Kyiv Polytechnic Institute in Kyiv for its second life.

Key features:

  • 2,024 CPU cores (mixture of Intel Haswell, Intel Broadwell and Intel Skylake-SP over its lifetime)

  • 10 TB RAM (98 nodes with 64 GB, 9 nodes with 256 GB, and 1 node with 384 GB)

  • 669 TB Lustre filesystem

  • Fast Omni-Path (100 Gb/s) interconnects between worker nodes and to/from storage nodes

  • 16 public GPUs (NVIDIA Tesla K80)

  • Used the Son of Grid Engine job scheduler / resource manager (a minimal example batch script is sketched after this list)

  • CentOS 7.x operating system
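
By contrast, ShARC's Son of Grid Engine scheduler took batch scripts with #$ directives, submitted via qsub. The sketch below is a hypothetical example: the rmem resource name and the smp parallel environment were site-specific conventions and are assumptions here, as are the module and program names.

    #!/bin/bash
    #$ -N example             # job name
    #$ -l h_rt=01:00:00       # one hour wall-clock limit
    #$ -l rmem=8G             # 8 GB of real memory (assumed site-specific resource name)
    #$ -pe smp 4              # four cores in a shared-memory parallel environment (assumed name)

    # Load software from the module stack (placeholder module name)
    module load SomeApplication

    ./my_program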


Iceberg

Some of the Iceberg hardware.

The Iceberg HPC cluster was retired in November 2019.

Key features:

  • Approx. 3,500 CPU cores (mixture of AMD, Intel Ivy Bridge and Intel Westmere over its lifetime)

  • 260 TB Lustre filesystem

  • Fast InfiniBand interconnects between worker nodes and to/from storage nodes

  • 16 public GPUs (NVIDIA K40M and NVIDIA M2070)

  • Used the Son of Grid Engine job scheduler / resource manager

  • Scientific Linux 6 operating system