
Using modules

Computations on the cluster often require software to run. Since 'yum' isn't available to users, one option would be to compile the software yourself, but that would cost every user a lot of time and disk space. This is where a centralized module system with Lmod comes into play. UTHPC has a collection of modules installed on a shared filesystem that's available on all of the nodes in the cluster.

Quick start

  • module avail {keyword} - searches for available modules matching a keyword; used without '{keyword}', it lists all available modules.
  • module load {module} - loads the required module into your environment.
  • module list - shows currently loaded modules.
  • module unload {module} - unloads the selected module.
  • module purge - unloads all loaded modules.
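
A typical session using these commands might look like the following (the any/python/3.8.3-conda module is taken from the examples later on this page; your available modules may differ):

```shell
$ module avail python                  # search for Python-related modules
$ module load any/python/3.8.3-conda   # load a specific module version
$ module list                          # confirm what is currently loaded
$ module purge                         # return to a clean environment
```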

Understanding the module system

Loading a module simply prepends new values to your $PATH, $LD_LIBRARY_PATH and similar environment variables. Those values point to the location of the specific software the module provides. In addition, it loads any necessary dependencies as modules.
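
The effect can be sketched in plain POSIX shell. This is only an illustration of the mechanism, not what Lmod literally executes, and the software path below is a made-up example, not a real UTHPC path:

```shell
#!/bin/sh
# Hypothetical install prefix of some software a module would provide.
SOFT_PREFIX="/gpfs/space/software/example-app/1.0"

# A module load effectively prepends the tool's bin directory to PATH
# so its executables are found first ...
export PATH="$SOFT_PREFIX/bin:$PATH"

# ... and its lib directory to LD_LIBRARY_PATH so its shared libraries resolve.
export LD_LIBRARY_PATH="$SOFT_PREFIX/lib:${LD_LIBRARY_PATH:-}"

# The first PATH entry is now the module's bin directory.
first_entry=$(echo "$PATH" | cut -d: -f1)
echo "$first_entry"
```

Unloading a module reverses these changes, which is why module purge cleanly restores your original environment.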

The module avail command

module avail or module av shows you the complete list of available modules on the cluster. Adding a keyword narrows the search: for example, module av r- displays all modules that have 'r-' anywhere in their names. The search is case insensitive.

The module load command

As described before, the module load command reads the module file of the requested software and makes the required changes to the environment. Using it is as simple as module load {module name}/{module version}, though you can also use just module load {module name}, which loads the latest available version. If you have an older version loaded and load a newer one of the same module, the system automatically replaces it, so you won't need to unload anything first. This works in reverse as well.
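
The automatic replacement can be seen in a session like this (the module name and versions are illustrative; the exact wording of Lmod's message may vary between versions):

```shell
$ module load example-app/1.0
$ module load example-app/2.0

The following have been reloaded with a version change:
  1) example-app/1.0 => example-app/2.0
```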

Different namings for some modules

UTHPC has divided the modules into two parts:

  • The first and larger part consists of the modules that UTHPC has built using Spack, which UTHPC uses to automatically build most of the software. These module names are mostly structured as '{module name}/{module version}'. UTHPC is working on making the names more descriptive.

  • The second part is the software that UTHPC admins have compiled by hand, prefixed with broadwell/, zen2/ or any/. An example is the module any/python/3.8.3-conda. The prefixes correspond to different system architectures: any/ works on all nodes, broadwell/ works only on the stages, bfrs and sfrs, and zen2/ works on the ares nodes.

R and python packages as modules

The default R and Python modules built with Spack are as bare bones as you can get, which means they include as few packages as possible. Extra packages can either be installed into an environment in your home directory, or loaded as modules. Package module names are prefixed with py- for Python and r- for R. For example, module load py-numpy enables the numpy package in your environment.
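
Assuming the py-numpy module exists as described, a quick session to load it and verify that the package became available might look like:

```shell
$ module load py-numpy
$ python -c "import numpy; print(numpy.__version__)"
```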

Can't find a needed module, what to do?

If you require software that you are unable to install in your home directory, or you feel that many people would need it, send an email to with the package name, version and, if possible, a link to the source code (a GitHub page, for example). UTHPC admins will then either install it centrally as a module, or in your home directory if needed.

Intel compiler collection

Intel Parallel Studio (ICC) is a module that loads a bit differently from most modules on the UTHPC system. To load and use ICC, first you have to load the module:

module load any/intel_parallel_studio

From there, all of the ICC submodules become visible to your module system, listed under a separate path in module avail:

[user@login1 intel_parallel_studio]# module av

------------------- /gpfs/space/software/cluster_software/modules/intel_parallel_studio_2022.1.0 --------------------
   clck/2021.6.0             compiler32/2022.1.0       icc32/2022.1.0          mpi/2021.6.0
   compiler-rt/2022.1.0      debugger/2021.6.0         init_opencl/2022.1.0    oclfpga/2022.1.0
   compiler-rt32/2022.1.0    dev-utilities/2021.6.0    inspector/2022.1.0      tbb/2021.6.0
   compiler/2022.1.0         icc/2022.1.0              itac/2021.6.0           tbb32/2021.6.0
--- Output omitted ---