High performance computing cluster

What is High Performance Computing (HPC)?

High Performance Computing (HPC) refers to the practice of aggregating computing power in a way that delivers much higher performance than a typical desktop computer or workstation can provide. This allows analysis of very large data sets and the solving of complex problems in areas such as science, engineering, health and medicine, or business and marketing. The terms High Performance Computing and Supercomputing are used interchangeably.

What areas of research would HPC benefit?

High Performance Computing (HPC) can be utilised in a wide range of research applications, including:

  • Computational chemistry
  • Environmental science & management
  • Genomics
  • Geosciences
  • Materials science
  • Mechanical and structural engineering
  • Molecular biology
  • Physics
  • Proteomics

How do I get access to HPC?

La Trobe researchers are able to access the following HPC facilities at no cost:

  • Gadi: Through La Trobe's partnership with Intersect Australia, La Trobe researchers can access Gadi, the largest supercomputer facility in Australia, managed by the National Computational Infrastructure (NCI). Gadi features 3,200 compute nodes supplied by Fujitsu Australia. It includes 3,000 nodes containing Intel's second-generation Xeon Scalable 'Cascade Lake' processors, with two 24-core CPUs and 192 Gigabytes of RAM per node. Gadi also includes 160 nodes containing 640 Nvidia V100 GPUs, and 50 large-memory 'Cascade Lake' nodes offering 1.5 Terabytes of Intel Optane DC Persistent memory. Users can request computing time on Gadi through Intersect. The detailed user guide on how to get started on Gadi can be accessed here.
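Once an NCI account and a compute allocation are in place, day-to-day work on Gadi is typically done by connecting to its login nodes over SSH and submitting jobs from there. As a minimal sketch (the username is a placeholder; the login hostname is the publicly documented Gadi address, but check the user guide above for current details):

  # Connect to a Gadi login node with your NCI username (placeholder shown)
  ssh abc123@gadi.nci.org.au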

There are also a number of other specialised external supercomputing facilities that researchers can access. Access to these systems often relies on applying for computing time under a competitive merit allocation scheme that is open to all Australian researchers.

Access HPC (La Trobe login required)

What HPC facility is right for me?

There are a number of factors that go into working out which HPC facility you should use. Often this comes down to whether or not the software you need is already available on one of the machines. Other considerations include how much memory you will need, how many cores you will need, and how long your jobs will need to run.

Do you need to use proprietary software?

If you need to use proprietary software, it may be easier and more cost-effective to see if this software is already installed on one of the HPC facilities. Open source software can generally be installed on any HPC facility.
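Most HPC facilities, Gadi included, expose their installed software through an environment modules system, so a quick way to answer this question is to search the module list on the machine you are considering. A minimal sketch (the package name and version are hypothetical placeholders):

  # List all software modules installed on the system
  module avail

  # Search for a specific package, e.g. a hypothetical GROMACS installation
  module avail gromacs

  # Load a particular version into your environment before running it
  module load gromacs/2023.1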
How many cores do you need to access simultaneously?

If you need to use lots of cores at the same time, you may find it easier to get a large allocation on a larger system.

How much memory do you need for your analysis?

If you need lots of RAM, it may be best to run your analysis on a high-memory node.

How long do you need to run your jobs for?

Some HPC facilities impose a maximum time limit on each job in order to ensure equitable use of the system. This limit is known as 'wall time'. If your jobs need to run for a long time, wall time limits may be an important consideration. These core, memory, and wall time requirements are exactly what you specify when submitting a job, as shown in the sketch below.
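To make the questions above concrete, the sketch below shows roughly what a job submission script looks like on a facility that uses the PBS Pro scheduler, as Gadi does. The project code, queue name, and resource figures are placeholders rather than recommendations, and other facilities may use a different scheduler (such as Slurm) with its own syntax:

  #!/bin/bash
  # Minimal PBS Pro job script sketch -- all values below are placeholders
  #PBS -P ab12                 # project code attached to your allocation
  #PBS -q normal               # queue to submit the job to
  #PBS -l ncpus=48             # how many cores the job uses simultaneously
  #PBS -l mem=190GB            # how much memory the analysis needs
  #PBS -l walltime=10:00:00    # wall time limit: maximum run time for the job

  cd $PBS_O_WORKDIR            # run from the directory the job was submitted in
  ./my_analysis                # placeholder for your actual program

The script is submitted with 'qsub' and the job's progress can be checked with 'qstat'.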

Is training available?

Please refer to the Digital Research training program for upcoming HPC courses.

What if I have problems with access?

For any issues with using the service, please contact Information Services (x1500).

If you have HPC issues that cannot be resolved through Information Services, please contact: digitalresearch@latrobe.edu.au