CHPC - Research Computing Support for the University

In addition to deploying and operating high performance computational resources and providing advanced user support and training, CHPC serves as an expert team to broadly support the increasingly diverse research computing needs on campus. These needs include support for big data, big data movement, data analytics, security, virtual machines, Windows science application servers, protected environments for data mining and analysis of protected health information, and advanced networking. Visit our Getting Started page for more information.

CHPC moving from JIRA to ServiceNow for helpdesk issues and incidents

CHPC Presentation Schedule for Spring 2018 now available

Protected Environment Users: Move of home and project/groups spaces scheduled Jan 4th starting at 8am

Downtime on kingspeak, ash, lonepeak and tangent: Thursday, December 21 8am - 4pm

This is the same fix that was put in place on ember on December 14th, which resolved the observed issues. Clusters were back up and scheduling jobs by 4:00 p.m.

/scratch/general/lustre unscheduled downtime - 12/15-18 

  • UPDATE 12/18/17 6:07 p.m. - The /scratch/general/lustre file system is available for use.
  • UPDATE 12/18/17 3:30 p.m. - Hardware replaced and system up, but not ready for use. Watch for notice.

System Issues December 15th, 2017

  • FastX - Resolved 12/15

Posted: December 15th, 2017

CHPC Fall 2017 Newsletter

Tangent users: issue with jobs not starting; some jobs are still not starting successfully

Updated October 24th, 2017 

CHPC on Twitter

News History...

Sea level pressure analysis from the operational High Resolution Rapid Refresh at 1 PM March 14, 2017 with unusually low pressure associated with a major New England snowstorm

Efficient Storage and Data Mining of Atmospheric Model Output

By Brian Blaylock and John Horel, Department of Atmospheric Sciences

Our group … purchased 30TB in CHPC’s pando [archive storage] system to test its suitability for several research projects. We have relied extensively over the years on other CHPC storage media such as the tape archive system and currently have over 100TB of network file system disk storage. However, the pando system is beginning to meet several of our interwoven needs that are less practical using other data archival approaches: (1) efficient expandable storage for thousands of large data files; (2) data analysis using fast retrieval of user selectable byte-ranges within those data files; and (3) the ability to have the data accessible to the atmospheric science research community.
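The byte-range retrieval described in point (2) can be sketched as follows. This is a minimal illustration, assuming the archive serves objects over HTTP and that byte offsets for fields within a file are known (for example, from a sidecar index); the index values, field names, and the `fetch_field` helper are hypothetical, not pando's actual interface:

```python
# Sketch of selective byte-range retrieval from a large archived file,
# so only the bytes for one field are transferred rather than the
# whole object. All names and offsets below are hypothetical.

def range_header(start, end):
    """Build an HTTP Range header requesting bytes start..end inclusive."""
    return {"Range": f"bytes={start}-{end}"}

# Hypothetical sidecar index: field name -> (start_byte, end_byte)
index = {
    "TMP:2 m above ground": (1_000_000, 1_249_999),
    "PRES:surface": (1_250_000, 1_499_999),
}

def fetch_field(url, field, opener):
    """Request just the bytes for one field.

    `opener` is any callable that performs an HTTP GET given a URL and
    a headers dict (e.g. a thin wrapper around urllib.request).
    """
    start, end = index[field]
    return opener(url, headers=range_header(start, end))
```

With an index like this, a full model-output file never has to be downloaded to extract a single variable; the server returns a 206 Partial Content response containing only the requested slice.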

The CHPC pando storage archive has made it possible for us to efficiently archive, access, and analyze a large volume of atmospheric model output. Several researchers outside the University of Utah have already discovered its utility in the short time that the archive has been available.

Read the full article in the newsletter

System Status

last update: 01/21/18 5:42 am

General Nodes
system      cores       % util.
ember       900/972     92.59%
kingspeak   772/824     93.69%
lonepeak    140/1104    12.68%

Restricted Nodes
system      cores       % util.
ash         6468/7224   89.53%
apexarch    status unavailable
ember       1196/1220   98.03%
kingspeak   7272/7328   99.24%
lonepeak    120/380     31.58%

Cluster Utilization

Last Updated: 1/18/18