- Posted December 14th, 2015: SIParCS Program
- Posted November 9th: Blue Waters Internships
- Posted November 6th: REMINDER: Allocation Requests for Winter 2016 are Due December 1st, 2015
- Posted November 3rd, 2015: ATTN: users who samba mount CHPC file systems
- Posted November 1st, 2015: Partial network outage starting Tuesday Nov 3 at 8pm
- Posted October 21, 2015: Introduction to I/O at CHPC
- Posted October 20, 2015: Hybrid MPI-OpenMP Programming
- Posted October 16, 2015: New Warnings about Home and/or Group File System Space
- Posted October 9, 2015: XSEDE HPC Workshop: Big Data
- Posted October 7, 2015: CHPC Data Center Tours: Nov 5 and Dec 3
- Posted September 25, 2015: XSEDE OpenMP workshop reminder
- Posted September 25, 2015: /scratch/ibrix/chpc_gen is at 94% full
- Posted September 22, 2015: NVidia online OpenACC course and free development toolkit
- Posted September 18, 2015: Power Outage: 9/17/2015 from ~8 p.m. until ~ 10 p.m.
- Posted September 11, 2015: CHPC Data Center Tour: October 1st
- Posted September 8, 2015: EMBER Cluster Flushed around 2:30 pm.
- Posted September 8, 2015: DOWNTIME: September 22, 2015 beginning at 6:30 a.m.
- Posted September 3, 2015: ATTENTION: Protected Environment Users
- Posted August 21, 2015: ATTENTION: Protected Environment Users
- Posted August 14, 2015: Issue with CHPC File Server
- Posted August 10, 2015: SC15 Doctoral Showcase – Deadline Extended until August 21
- Posted August 7: Allocation requests for Fall 2015 quarter due September 1, 2015
- Posted July 23rd: Issues with mounting CHPC file systems from Windows and Macs desktops
- Posted July 16th: Change in CHPC /scratch file system scrubbing policy
- Posted July 16th: INSCC Building Power Outage, Wednesday July 22nd from 8 a.m. until 12:30 p.m.
- Posted July 16th: CHPC Downtime: Thursday July 30th from 4 - 6 p.m.
- Posted July 15th: Brief (30 seconds) CHPC network outage: Thursday, July 16 at 7:30 a.m.
- Posted July 6th: OUTAGE for users of CHPC Data Transfer Nodes dtn01 and dtn02 -- July 7, 2015 from 9pm to midnight
- Posted July 6th: Cambridge Structural Database Users: Mercury has been updated to release 3.6
- Posted July 2nd: Partial systems outage at the DDC - July 1st about 5 p.m.
- Posted June 19th: Unplanned power outage, INSCC (bldg 19), June 19th, at 3:30 p.m. for 15 seconds
- Posted June 12th: Rocky Mountain Advanced Computing Consortium (RMACC) HPC Symposium -- Aug 11-13
- Posted June 8th: Upcoming CHPC Presentations
- Posted June 1st: TANGENT DOWNTIME -- Thursday, June 4 at 7:30AM
- Posted May 28th: Intel Intern Position
- Posted May 28th: SDSC Summer Institute 2015: HPC for the Long Tail of Science
- Posted May 21st: New version of FastX -- FastX2 -- now available
- Posted May 14th: CHPC research highlights for CASC brochure
- Posted May 14th: Notice of Sympa email list serve maintenance Sunday May 17, 2-4 p.m.
- Posted May 5th: Brief Network outage on Wednesday, May 5th beginning at 6 a.m.
- Posted May 1st: REMINDER: Allocation Requests for Summer 2015 are Due June 1st, 2015
- Posted April 29th: Unplanned power interruption involving INSCC building: Saturday May 2, 2015 from 8-8:10 a.m.
- Posted April 28th: Unscheduled network outage - 2:52 p.m. April 28th, 2015
- Posted April 21st: DTN Interruptions Complete
- Posted April 21st: Biomedical Modeling Symposium Wednesday (fwd)
- Posted April 21st: FastX issue has been resolved
- Posted April 21st: DTN (Data Transfer Node) Interruptions Tonight 11pm - 1 am
- Posted April 20th: Issue with FastX
- Posted April 20th: /scratch/ibrix/chpc_gen at 97% full
- Posted April 17th: Virtual School of Computational Science and Engineering webcasts at CHPC
- Posted April 15th: Power Outage of INSCC Building - Monday April 20th from 5 - 5:30 a.m.
- Posted April 8th: HPCWire Highlights Thomas Cheatham's Research on Ebola
- Posted April 6th: SLURM Training: April 9th, 1-2 p.m., INSCC Auditorium (Room 110)
- Posted April 6th: 2015 TACC Summer Supercomputing Institute July 6 - July 10, 2015
- Posted April 3rd: Final Update on CHPC Downtime
- Posted April 3rd: SLURM Training: April 3rd, 1-2 p.m., INSCC Auditorium (Room 110)
- Posted March 25th: Matlab seminars
- Posted March 23rd: SLURM Training: March 24th, 1-2 p.m., INSCC Auditorium (Room 110)
- Posted March 23rd: Change in allocation metric and a date adjustment
- Posted March 23rd: **Reminder** Upcoming Downtime which includes SLURM Deployment on all clusters - APRIL 2nd, 2015
- Posted March 23rd: Intel Software Development Tools, March 24th from 1-2 p.m., INSCC Auditorium
- Posted March 17th: SLURM Training: March 17th, 1-2 p.m., INSCC Auditorium (Room 110)
- Posted March 12th: Message to CHPC Principal Investigators
- Posted March 11th: XSEDE15 Student Program CFP - Travel assistance available to accepted
- Posted March 10th: UPDATE on the mysql.chpc.utah.edu migration to a new server
- Posted March 10th: Lonepeak now using SLURM for Batch Scheduling
- Posted March 10th: Hands-on Introduction to Numpy & Scipy
- Posted March 9th: mysql.chpc.utah.edu migration March 10th
- Posted March 6th: IMPORTANT ANNOUNCEMENT on upcoming changes to batch scheduling system on ALL clusters
- Posted March 5th: Advanced Module Use
- Posted March 4th: Change in Upcoming CHPC DOWNTIME
- Posted March 3rd: XSEDE HPC Monthly Workshop: MPI
- Posted February 13th: Additional CHPC Presentation on the use of modules at CHPC
- Posted February 10th: Artificial Intelligence / Machine Learning / Natural language processing, Date: February 10th, 2015, Time: 2 p.m.
- Posted February 10th: Hands-on Intro to Linux (part 3 of 3), Date: Tuesday, February 10th, 2015, Time: 1-3 p.m.
- Posted February 4th: Allocation requests for Spring 2015 quarter due March 1, 2015
- Posted February 3rd: R update on Linux Machines
- Posted February 3rd: Hands-on Intro to Linux (part 2 of 3), Date: Tuesday, February 3rd, 2015, Time: 1-3 p.m.
- Posted January 30th: Urgent Rebooting of interactive nodes
- Posted January 29th: IMPORTANT security patch for all users with linux desktops
- Posted January 27th: UPCOMING DOWNTIME
- Posted January 27th: Hands-on Intro to Linux (part 1 of 3), Date: Tuesday, February 3rd, 2015, Time: 1-3 p.m.
- Posted January 23rd: IDL Update
- Posted January 22nd: FastX server upgrade has been completed
- Posted January 22nd: CHPC Presentation: Protected Environment, January 20th, 2015, Time 2-3 p.m., BMI Classroom (421 Wakara Way, room 1470)
- Posted January 20th: Supercomputing in Plain English: Tuesdays beginning January 20th, 2015, 12:30 p.m.
- Posted January 15th: Security Awareness - Please read
- Posted January 13th: Overview of CHPC, January 13th, 2015, 1:00 p.m., INSCC Auditorium (room 110)
- Posted January 12th: Matlab Update
- Posted January 10th: XSEDE HPC Monthly Workshop, February 6th, 2015 - OpenACC
- Posted January 9th: CSD system software has been updated
- Posted January 6th: Spring 2015 CHPC Presentation Schedule
February 13, 2015
Additional CHPC Presentations on the Use of Modules at CHPC
On Thursday, Feb 19, 2015, CHPC will begin offering users the option of managing their session environment with modules, in place of the current practice of sourcing setup scripts for individual applications.
Modules is an environment management tool that makes modifying a user's shell environment simple and dynamic. Its primary advantage is the ability to load and unload the environment needed for a given software package, allowing users to quickly start using programs or switch between development environments.
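The load/unload workflow described above looks roughly like this in practice. The package names are illustrative examples, not a list of CHPC's actual modules, and the snippet is guarded so it does nothing on a machine without the `module` command:

```shell
# Sketch of a typical modules session (package names are hypothetical examples)
if command -v module >/dev/null 2>&1; then
    module avail            # list the packages available to load
    module load gcc         # set up PATH, LD_LIBRARY_PATH, etc. for a package
    module list             # show what is currently loaded
    module unload gcc       # cleanly remove that package's settings again
fi
```

Because each modulefile records exactly what it changed, `module unload` restores the previous environment, which is the key advantage over one-way sourcing of setup scripts.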
For the foreseeable future, CHPC users will have the choice of adopting the module environment or remaining with the script-sourcing method used in the past. All new user accounts will start with the module option.
More details about this change will follow next week when we go live with this option.
At this time we want to announce that we will be offering the following sessions on modules:
- Thursday Feb 19 at 1pm in INSCC auditorium – Getting Started with Modules
- Thursday Feb 26 at 1pm in INSCC auditorium – Getting Started with Modules (repeat)
- Thursday Mar 5 at 1:30pm in INSCC 345 – Advanced Module Use
The Getting Started sessions will focus on how to use modules and the steps users need to take to begin using the module environment. Users are welcome to bring laptops, and we will assist with the transition.
The Advanced Module Use session will focus on how CHPC's modules are organized and on how users can create modules for applications they have installed in their own space.
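As a preview of what the Advanced session covers, a privately installed application can be made loadable through a personal modulefile. The directory layout, application name, and version below are hypothetical illustrations, not CHPC conventions:

```shell
# Create a personal modulefile tree (all names and paths here are made up)
mkdir -p "$HOME/MyModules/myapp"

# Write a minimal Tcl modulefile for version 1.0 of a hypothetical "myapp"
cat > "$HOME/MyModules/myapp/1.0" <<'EOF'
#%Module1.0
## myapp 1.0 -- privately installed under $HOME/myapp/1.0
prepend-path PATH $env(HOME)/myapp/1.0/bin
EOF

# On a system with modules, register the personal tree and load as usual
if command -v module >/dev/null 2>&1; then
    module use "$HOME/MyModules"
    module load myapp/1.0
fi
```

`module use` simply adds the personal directory to the modulefile search path, so `module avail` will then show `myapp/1.0` alongside the system-provided modules.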
February 4, 2015
Allocation Requests for Spring 2015 Quarter Due March 1, 2015
This is a reminder that proposals for allocations from the general allocation pool for the Spring 2015 quarter (Apr 1 - Jun 30, 2015) are due by Sunday, March 1, 2015.
If you are in the final quarter of your current allocation, you will already have received a notice that you need to submit an allocation request by this date.
Please also note that we have made several minor changes to the information requested in the resources required, sources of funding, and results of previous CHPC support sections of the request form.
1. Information on the allocation process and the relevant forms is available online here:
2. Your request may be for up to 4 quarters.
3. Please submit your request through our online system by going here:
Please let us know if you have any problems with the process or suggestions for ways we can improve it. Please submit all feedback by email to email@example.com.
February 3, 2015
R Update on Linux Machines
We have built a new distribution of R that takes advantage of multithreading when processing numerical arrays, and we have made it the default R on CHPC's Linux systems, including the clusters. The multithreaded build runs in parallel within a single compute node; a standard R linear algebra benchmark achieves a 5-10x speedup on kingspeak compared to the single-threaded build. To use multiple nodes, a distributed parallel package such as Rmpi or snow is still necessary.
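One practical consequence is that the number of threads the multithreaded build uses should be matched to the cores allocated to the job. Assuming the build links an OpenMP-threaded linear algebra library (which honors `OMP_NUM_THREADS`), a batch script can pin the count; the value 8 below is illustrative, not a CHPC default:

```shell
# Cap R's linear-algebra threads at the cores requested for the job
# (8 is an illustrative value; match it to your actual allocation)
export OMP_NUM_THREADS=8
echo "R will use $OMP_NUM_THREADS BLAS threads"
# then launch R as usual, e.g.:
#   Rscript analysis.R
```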
We have installed the external packages that were built for the older versions, but if there is an additional package you need, please let us know.
For details on how to use R at CHPC, see the documentation at
January 29, 2015
IMPORTANT Security Patch for All Users with Linux Desktops
If you have a CHPC-administered Linux desktop, we have applied the security patch for the recently announced GHOST glibc vulnerability. You MUST reboot your desktop as soon as possible for the patch to take effect.
If you are running a Linux desktop that is not CHPC administered, please make sure that your system has been updated and rebooted. If you have any questions, please contact firstname.lastname@example.org
Further information on this vulnerability can be found at http://www.openwall.com/lists/oss-security/2015/01/27/9