CHPC - Research Computing Support for the University

In addition to deploying and operating high performance computational resources and providing advanced user support and training, CHPC serves as an expert team to broadly support the increasingly diverse research computing needs on campus. These needs include support for big data, big data movement, data analytics, security, virtual machines, Windows science application servers, protected environments for data mining and analysis of protected health information, and advanced networking. Visit our Getting Started page for more information.

Change in use of /scratch/local on compute nodes

Effective Tuesday, July 21, on lonepeak

posted 7 July 2020

Access permissions on /scratch/local will be changed so that users can no longer create directories at the top level of /scratch/local. Instead, as part of the SLURM job prolog (run before the job starts), a job-level directory, /scratch/local/$USER/$SLURM_JOB_ID, will be created. At the end of the job, the SLURM job epilog will remove this directory.

Once we confirm there are no issues on lonepeak, we will make the same change on the other clusters (kingspeak, notchpeak, ash, and redwood).
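Under the new scheme, jobs should write temporary files beneath the pre-created per-job directory instead of making their own directory at the top level. A minimal sketch of how a job script might use it (the fallback paths and demo commands here are illustrative, so the sketch can be exercised outside a real SLURM job; on the clusters the prolog and epilog handle the mkdir and cleanup):

```shell
#!/bin/bash
# Inside a real job, SLURM sets $SLURM_JOB_ID and the prolog has already
# created /scratch/local/$USER/$SLURM_JOB_ID. Outside SLURM we fall back
# to a temporary path so this sketch can run anywhere.
JOBID=${SLURM_JOB_ID:-demo}
SCRDIR=${SLURM_JOB_ID:+/scratch/local/$USER/$SLURM_JOB_ID}
SCRDIR=${SCRDIR:-/tmp/scratch-demo/${USER:-user}/$JOBID}

mkdir -p "$SCRDIR"        # on the clusters, the prolog does this step
cd "$SCRDIR"

# ... run the computation, writing temporary files here ...
echo "working in $SCRDIR" > output.txt

# Copy anything worth keeping out of $SCRDIR before the job ends,
# because the epilog removes the whole directory afterward.
cat output.txt

cd /
rm -rf "$SCRDIR"          # on the clusters, the epilog does this step
```

The key point is that results must be copied out of the job-level directory before the job finishes, since the epilog deletes it unconditionally.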


Summer 2020 CHPC Presentation Schedule


CHPC DOWNTIME: 6/9/2020 at 8am - Windows servers (Narwhal, Beehive) and the community MySQL server

Posted May 27th, 2020

CHPC DOWNTIME: Network upgrade - Wednesday, May 20 7:30 to ~8 am

Posted May 14th, 2020

Arbiter now running on redwood general interactive nodes

Posted May 5th, 2020

CHPC DOWNTIME - Science DMZ outage Monday May 4th at 4am

Posted May 1st, 2020

CHPC OUTAGE - /scratch/general/lustre

Posted April 24th, 2020

CHPC ANNOUNCEMENT: Change in Lonepeak Access

Posted: March 31st, 2020

2020 Spring CHPC Newsletter

CHPC ANNOUNCEMENT: CHPC staff working remotely

New PE scratch space - /scratch/general/pe-nfs1

News History...

Clean Coal: Powered by Exascale

By Philip J. Smith and Michal Hradisky, CCMSC

The mission of the Carbon-Capture Multidisciplinary Simulation Center (CCMSC) at the University of Utah is to demonstrate the use of exascale uncertainty quantification (UQ) predictive simulation science to accelerate the deployment of low-cost, low-emission electric power generation to meet growing energy needs in the United States and throughout the world. Its two main objectives, advancing simulation science to exascale with UQ predictivity in real engineering systems and using high-performance computing (HPC) and predictive science to achieve societal impact, are linked by an overarching problem: simulation of an existing 1,000 MW coal-fired ultra-supercritical (USC) boiler and of a design-stage 500 MW oxy-coal advanced ultra-supercritical (AUSC) boiler.

Read the full article in the newsletter

System Status

General Environment

last update: 2020-07-16 11:01:03
General Nodes
system      cores (used/total)   % util.
kingspeak   816/832              98.08%
notchpeak   2880/3164            91.02%
lonepeak    2676/2676            100%
Owner/Restricted Nodes
system      cores (used/total)   % util.
ash         3580/7360            48.64%
notchpeak   5625/5676            99.10%
kingspeak   5672/5672            100%
lonepeak    416/416              100%

Protected Environment

last update: 2020-07-16 11:00:02
General Nodes
system      cores (used/total)   % util.
redwood     28/408               6.86%
Owner/Restricted Nodes
system      cores (used/total)   % util.
redwood     1316/3920            33.57%

Cluster Utilization


Last Updated: 7/7/20