Research Highlights

Below is a list of research highlights that are currently featured on the home page or that have been featured in the past.


Air Quality Modeling at CHPC

A collaboration between the Utah Division of Air Quality (UDAQ) and the University of Utah's Center for High Performance Computing (CHPC) now gives the air quality modeling group at UDAQ access to advanced computing resources. The cooperative agreement began with a request from the Utah office of the Bureau of Land Management (BLM) for consultation on air quality modeling to support the environmental impact analysis needed to process natural gas drilling permits in the Uintah Basin. The collaboration is now a critical element in UDAQ's ability to conduct air quality modeling for a wide variety of applications throughout the state, from the urbanized Wasatch Front to the energy production areas of the Uintah Basin.


HIPAA-compliant Servers at CHPC

CHPC's security expert Wayne Bradford, along with John Hurdle, Bernie LaSalle, and Julio Facelli at BMI, has published a case study of the HIPAA-compliant environment at CHPC. The study shows "how an HPC can be re-engineered to accommodate clinical data while retaining its utility in computationally intensive tasks such as data mining, machine learning, and statistics." Access to CHPC's secured servers requires two layers of authentication: first, users must have access to the CHPC virtual private network; and second, users must be listed in the HPC network information service directory. Additional security is provided by housing the physical hardware in our data center, which has controlled room access. All CHPC employees and users of the secured servers are required to take the University's Health Insurance Portability and Accountability Act training. The HIPAA-compliant environment is put to good use: Wayne reports that "in the first 3 years, researcher count has increased from 6 to 58."

Read the full case study for more details.

Mapping the Universe with CHPC Resources

The Sloan Digital Sky Survey (SDSS) makes use of the University of Utah's Center for High Performance Computing (CHPC) parallel computing resources to help with its mission to map the Universe, from our Solar System through the Milky Way Galaxy, and beyond. Building on fifteen years of discovery, the fourth phase of SDSS (SDSS-IV) has recently made two public data releases, including DR14 earlier this year.

In SDSS-IV the survey expands its reach in three different ways:

  1. We observe a million stars in both the Northern and Southern skies by including a second telescope in Chile. SDSS now uses both the 2.5m Sloan telescope in New Mexico, and the 2.5m du Pont Telescope in Las Campanas, Chile.
  2. We observe millions of galaxies and quasars at previously unexplored distances to map the large-scale structure in the Universe 5 billion years ago, and to understand the nature of Dark Energy.
  3. We use new instrumentation to collect multiple high-resolution spectra within 10,000 nearby galaxies, to discover how galaxies grow and evolve over billions of years of cosmic history.

University of Utah astronomers are a core part of this international collaboration. Joel Brownstein, Professor of Physics and Astronomy, is the Principal Data Scientist, making sure that the SDSS data reduction pipelines run smoothly, and that the data products are easily accessible both within the team and publicly. Professor Kyle Dawson and postdoctoral fellows are also involved, working on instrumentation to map the distant Universe. Professor Gail Zasowski and her research group use SDSS observations of stars within our home Milky Way Galaxy to understand when and how they formed, and how our Galaxy is changing over time.


Autism Research within CHPC’s Protected Environment

The Utah Autism and Developmental Disabilities Monitoring Project (UT-ADDM), headed by Deborah Bilder, M.D. and William McMahon, M.D. in the Department of Psychiatry at the University of Utah's School of Medicine, uses CHPC's robust protected environment, which allows researchers working with protected health information (PHI) to gather, process, and store data while increasing productivity and compliance. In addition to access to high performance computing power, another tangible benefit for researchers using PHI is that CHPC handles systems management issues, such as rapid response to electrical power issues, provision of reliable cooling and heating, VPN support for a work-anywhere computing experience, and maintenance of a hardened, secure environment compared to office computers or departmental servers. For the institution, this resource enables much better compliance and reduces the vulnerability of PHI data to exposure.


A New Role for Proteins

DNA encodes RNAs and RNAs encode proteins. This flow of cellular information is commonly referred to as the Central Dogma of Molecular Biology. However, a team of researchers discovered a notable exception to this rule where a protein can direct the synthesis of another protein, without an RNA template. This unusual mode of protein synthesis only occurs after normal protein synthesis has failed and appears to send a distress signal to the cell that something has gone awry.

The researchers first detected template-free protein synthesis by visualizing it directly using a technique known as electron cryo-microscopy (cryo-EM). The image analysis, performed on the University of Utah Center for High Performance Computing cluster, required processing hundreds of thousands of 2D images to compute a 3D reconstruction of the cellular assembly. Once the researchers analyzed the structure and performed follow-up biochemical experiments, they knew they had stumbled upon an unexpected discovery. "In this case, we have a protein playing a role similar to that filled by messenger RNA," says Adam Frost, M.D., Ph.D., assistant professor at University of California, San Francisco (UCSF) and adjunct professor of biochemistry at the University of Utah, who led the research team. "I love this story because it blurs the lines of what we thought proteins could do." This work was featured in the January 2, 2015 issue of Science.


Uncovering the secrets of Darwin’s pigeons

Centuries of selective breeding have generated tremendous diversity among the 300+ breeds of domestic rock pigeon (Columba livia), an organism that Charles Darwin proposed as an exemplary model for evolution under selection. This variation gives us the opportunity to better understand the processes of vertebrate development and to investigate how those processes evolve over time. Until recently, however, the specific molecular mechanisms responsible for phenotypic differences among pigeon breeds were a complete mystery.

To identify the genetic changes responsible for novel traits among pigeon breeds, researchers from the Shapiro lab in the University of Utah Department of Biology utilized resources at CHPC to identify genetic differences in the genomes of over 100 diverse pigeons, which were then compared to pinpoint which genetic differences are most closely associated with specific traits. These analyses require high performance computing to process terabytes of genomic data. Without the large scale computing resources available at CHPC, identifying genetic differences among pigeons would take years, rather than days. Through these studies, surprising links between genes responsible for trait evolution in pigeons and genes responsible for genetic disorders in humans have been found. These results help researchers better understand mechanisms of evolution on a level that Charles Darwin could only have imagined.
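The comparison step can be pictured as a genome-wide scan for sites where allele frequencies differ sharply between birds that have a trait and birds that lack it. Below is a minimal sketch of that idea in Python on synthetic data; it is not the Shapiro lab's actual pipeline, and every name and number in it is hypothetical.

```python
# Illustrative sketch: scan a genotype matrix for sites whose allele
# frequencies differ most between two groups of pigeons (e.g., crested
# vs. uncrested breeds). All data here are synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: rows = birds, columns = variant sites,
# entries = alternate-allele counts (0, 1, or 2 per diploid genome).
genotypes = rng.integers(0, 3, size=(100, 50_000))
has_trait = rng.random(100) < 0.4          # stand-in trait labels

freq_with = genotypes[has_trait].mean(axis=0) / 2.0
freq_without = genotypes[~has_trait].mean(axis=0) / 2.0

# Sites with the largest frequency contrast are candidate trait loci.
contrast = np.abs(freq_with - freq_without)
top_sites = np.argsort(contrast)[::-1][:10]
print("candidate sites:", top_sites)
```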




Imaging the Magma Reservoir beneath Yellowstone Park

The supervolcano that lies beneath Yellowstone National Park is one of the world’s largest active volcanoes. University of Utah seismologists Fan-Chi Lin, Hsin-Hua Huang, Robert B. Smith and Jamie Farrell have used advanced seismic imaging techniques to develop a more complete view of the magma chamber beneath this supervolcano, extending the known range from 12 miles underground to 28 miles. For the study the researchers used new methods to combine the seismic information from two sources. Data from local quakes and shallower crust were provided by University of Utah Seismographic Stations surrounding Yellowstone. Information on the deeper structures was provided by the NSF-funded EarthScope array of seismometers across the US.

Their recent study, as reported in the May 15, 2015 issue of Science, reveals that along with the previously known upper magma chamber there is also a second, previously unknown reservoir that is deeper and nearly 5 times larger than the upper chamber, as depicted in the cross-section illustration, which cuts from the southwest to the northeast under Yellowstone. This study provides the first complete view of the plumbing system that supplies hot and partly molten rock from the Yellowstone hotspot to the Yellowstone supervolcano. Together these chambers have enough magma to fill the Grand Canyon nearly 14 times. Using resources at the Center for High Performance Computing, new 3D models are being developed to provide greater insight into the potential seismic and volcanic hazards presented by this supervolcano.


Computational Fluid Dynamics Simulation of a Novel Flash Ironmaking Technology

The U.S. steel industry needs a new technology that produces steel from iron ore with lower greenhouse gas emissions and energy consumption. At the University of Utah, Prof. Hong Yong Sohn and his team have conceived a radically new alternative method, the Flash Ironmaking Technology, to replace the century-old blast furnace process. This new technology eliminates the highly problematic cokemaking and pelletization/sintering steps from current ironmaking processes by directly utilizing iron ore concentrates, which are abundant in the United States.

Using CHPC resources, the Sohn group is developing high-resolution computational fluid dynamics (CFD) simulations to select the optimal operating conditions for testing and subsequently reduce the testing time and effort. Simulation results will be used to analyze the results from the flash reactors. Also of high importance, the results of the simulations will assist in the subsequent design of an industrial-scale pilot facility and eventual full-scale commercial plant.

food mapping

An Analysis of Tobacco and Food Purchases

Professor John Hurdle, Biomedical Informatics,  has developed QualMART, a tool for helping grocery stores promote healthy eating habits for their customers.  To validate the effectiveness of this tool, the group conducted a study that compared tobacco purchases and the quality of food purchases.  They classified household grocery transactions in the Atlanta region, based on whether shoppers had ever purchased tobacco, and then applied their novel food purchase quality scoring measure to evaluate the household food environment. The master database with 15 months’ shopping activity from over 100,000 households nationally is housed on a HIPAA-compliant cluster at CHPC (accessed via Swasey).

The graphic shows that the difference between 'ever' and 'never' tobacco purchasing groups is significant, with green areas indicating higher food quality scores and grey and red areas showing lower quality scores, aggregated by the zip code of the grocery shopping location.

This study validated the group's data-driven grocery food quality scoring design, as the findings reproduce results from other studies in the scientific literature showing that tobacco users have lower overall diet quality than people who do not use tobacco.


Prediction of Crystal Structures from First-Principles Calculations

Using CHPC resources, a team of researchers from the University of Utah and the University of Buenos Aires has demonstrated that it is possible to predict the crystal structures of a biomedical molecule using first-principles calculations alone. The results on glycine polymorphs shown in the figure were obtained using the genetic algorithm search implemented in Modified Genetic Algorithm for Crystals (MGAC), coupled with the local optimization and energy evaluation provided by Quantum Espresso. All three of the ambient-pressure stable glycine polymorphs were found, in the same energetic ordering as observed experimentally. The agreement between the experimental and predicted structures is of such accuracy that they are visually almost indistinguishable.
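The search pattern MGAC implements can be sketched in a few lines: keep a population of candidate structures, score them with an energy function, and breed the lowest-energy candidates. In the sketch below the energy function is a toy stand-in for the Quantum Espresso evaluation, so the code illustrates only the control flow, not the physics.

```python
# A minimal genetic-algorithm loop illustrating an MGAC-style search
# (illustrative only). In the real workflow, energy() would be a
# Quantum Espresso calculation on a candidate crystal structure.
import numpy as np

rng = np.random.default_rng(1)
N_GENES = 6      # e.g., cell parameters plus molecular orientation angles

def energy(x):
    # Stand-in for a DFT lattice-energy evaluation.
    return float(np.sum((x - 0.3) ** 2))

population = rng.random((20, N_GENES))        # random initial "structures"
for generation in range(100):
    order = np.argsort([energy(x) for x in population])
    parents = population[order[:10]]          # keep the lowest-energy half
    # Crossover: mix genes from random pairs of parents, then mutate.
    i, j = rng.integers(0, 10, size=(2, 10))
    mask = rng.random((10, N_GENES)) < 0.5
    children = np.where(mask, parents[i], parents[j])
    children += rng.normal(0, 0.05, children.shape)
    population = np.vstack([parents, children])

best = min(population, key=energy)
print("best energy found:", energy(best))
```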

The ability to accomplish this goal has far-reaching implications well beyond intellectual curiosity. Crystal structure prediction can be used to obtain an understanding of the principles that control crystal growth. More practically, the ability to successfully predict crystal structures and energetics based on computation alone will have a significant impact in many industries for which crystal structure and stability play a critical role in product formulation and manufacturing, including pharmaceuticals, agrochemicals, pigments, dyes, and explosives.

Lund AM, Pagola GI, Orendt AM, Ferraro MB, Facelli JC (2015). Crystal structure prediction from first principles: The crystal structure of glycine. Chemical Physics Letters, 626, 20–24.

Figure 1: Snapshots from simulations of two types of nanomaterials. (a) A highly porous metal-organic framework (ZIF-8), consisting of Zn ions (yellow spheres) and methylimidazolate linkers (nitrogen atoms are colored blue, carbon atoms are colored gray, hydrogen atoms are not shown). (b) A superstructure formed from octahedral silver nanocrystals. The pink frame indicates the boundaries of the simulated region. A few nanocrystals are colored yellow and blue to highlight features of the complex structure they form.

Watching Nanomaterials Assemble at CHPC

By Prof. Michael Grünwald, Grünwald Research Group, Department of Chemistry

My son and I like to build remote control cars. The path that leads from a disordered pile of plastic parts and metal screws to a new race car is straightforward and fun: step after step, we collect the pieces that need to be assembled and put them together according to the instructions. In fact, this assembly strategy is the blueprint for much human building activity and applies almost generally to the construction of houses, machines, furniture (in particular the Swedish kind), and many other objects of our daily lives.

Large objects, that is. Building small things, as it turns out, requires a strikingly different approach. Consider, for instance, the "objects" illustrated in Figure 1: A porous crystal structure made from intricately arranged metal ions and organic molecules (a "metal-organic framework"), and an ordered arrangement of nanoparticles (a "superstructure"), which themselves consist of many thousands of atoms. These structures are examples of "nanomaterials", objects that derive their unusual properties from their fascinating microscopic structure. Because of their large pores, metal-organic frameworks like the one in Figure 1a can be used to store hydrogen gas, filter CO2, or separate molecules by shape. Depending on the kinds of nanoparticles used, superstructures such as the one in Figure 1b can be used to alter the direction of light, or act as new kinds of solar cells.

Read the full article in the newsletter


Cataloging the Interactions of Insects and Plants

By Thomas Kursar, Department of Biology

For our NSF-funded project, "Dimensions: Coexistence, Herbivore Host Choice, and Plant-Herbivore Evolution in the Recently Radiated and Speciose Neotropical Tree Genus, Inga," we developed a data repository that is based at CHPC. Our NSF project addresses critical, long-standing questions in tropical ecology: the origin and maintenance of diversity. We focus on the herbivorous insects that consume leaves and the defenses of plants against these invertebrates. We hypothesize that the diversity of anti-herbivore defenses among plant species is exceedingly high. If each species has a distinct defensive niche, this could explain how tropical forests maintain an exceedingly high number of coexisting species. Additionally, we hypothesize that the arms race between plants and herbivores may drive divergence and speciation, thus explaining the origin of diversity in tropical rainforests. Our repository supports this project by storing data and images on plants, herbivores, and the toxic metabolites that plants make in order to defend themselves.

Genomic Insights Through Computation

By Karen Kapheim, Kapheim Lab, Utah State University

The primary focus of research in the Kapheim Lab is understanding how social behavior evolves in bees. We take an integrative approach to finding answers to this question, merging ecology, behavior, neuroscience, comparative genomics, and molecular biology. We conduct experiments in the field with live bees, process the samples in our molecular biology lab, and then analyze the sequence data using the CHPC. Examples of ongoing projects include using metabarcoding to characterize the role of the microbiome in the social behavior and health of bees. We have sequenced a portion of the bacterial 16S rRNA gene in DNA extracted from the guts of bees during various life stages, and we are processing these sequences on the CHPC. As a side project, we are also applying similar computational methods to metabarcodes sequenced from the guts of carrion flies to characterize the mammal community on a tropical island where we work. Other projects involve comparative genomics of bee genomes to look for signatures of evolutionary transitions between solitary and social lifestyles. We are also using the CHPC to analyze microRNA expression differences among bees that vary in social behavior, and in response to hormone treatments. In each of these projects, the CHPC staff and resources have been extremely valuable, as genomic datasets are particularly large and these analyses would not be possible on desktop computers.

Understanding the Carbon Cycle Through Climate Models

By Brett Raczka, Department of Biology

Land surface models are useful tools to quantify contemporary and future climate impact on terrestrial carbon cycle processes, provided they can be appropriately constrained and tested with observations. Stable carbon isotopes of CO2 offer the potential to improve model representation of the coupled carbon and water cycles because they are strongly influenced by stomatal function. Recently, a representation of stable carbon isotope discrimination was incorporated into the Community Land Model component of the Community Earth System Model. Here, we tested the model's capability to simulate whole-forest isotope discrimination in a subalpine conifer forest at Niwot Ridge, Colorado, USA.
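The link between stomatal function and discrimination can be seen in the simplified Farquhar relation for C3 plants, in which discrimination depends on the ratio of leaf-internal to ambient CO2 (ci/ca). The snippet below uses the commonly cited constants; it is a textbook sketch, not CLM's exact implementation.

```python
# Simplified Farquhar relation for 13C discrimination in C3 plants:
# delta = a + (b - a) * ci/ca, with a ~ 4.4 permil (diffusion through
# stomata) and b ~ 27 permil (Rubisco carboxylation). Textbook form,
# not the exact parameterization inside CLM.
def discrimination(ci_over_ca, a=4.4, b=27.0):
    return a + (b - a) * ci_over_ca

# Stomatal closure (lower ci/ca) reduces discrimination:
for ratio in (0.9, 0.7, 0.5):
    print(f"ci/ca = {ratio:.1f} -> discrimination = "
          f"{discrimination(ratio):.1f} permil")
```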

Read the paper in Biogeosciences

Tracking Pressure Features

By Alexander Jacques, MesoWest/SynopticLabs and Atmospheric Sciences

Center for High Performance Computing resources were used to model the progression of a mesoscale gravity wave generated by a large storm system on April 26–27, 2011.

A mesoscale gravity wave, generated by a large storm system in the southern United States, moved northward through the central United States causing short-term changes in surface wind speed and direction. This animation shows efforts to detect and evaluate the negative mesoscale surface pressure perturbation generated by this wave. Detected positive (red contours) and negative (blue contours) perturbations are determined from perturbation analysis grids, generated every 5 minutes, using USArray Transportable Array surface pressure observations (circle markers). Best-track paths for the perturbations are shown via the dotted trajectories. To identify physical phenomena associated with the perturbations, conventional radar imagery was also leveraged. It can be seen here that the detected feature migrates north away from the majority of the precipitation, which is often seen with mesoscale gravity wave features.
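The perturbations themselves are typically isolated by filtering each station's pressure series to mesoscale periods before gridding. The sketch below shows one plausible version of that step on a synthetic record; the filter band and parameters are illustrative, not the study's exact configuration.

```python
# Illustrative band-pass filter to isolate mesoscale pressure
# perturbations from a single station's 5-minute time series.
import numpy as np
from scipy import signal

fs = 1.0 / 300.0                      # one sample every 5 minutes (Hz)
t = np.arange(0, 2 * 86400, 300.0)    # two days of observations (s)

# Synthetic pressure: diurnal cycle plus a 2-hPa wave with a 2-hour period.
pressure = 1010 + 3 * np.sin(2 * np.pi * t / 86400) \
           + 2 * np.sin(2 * np.pi * t / 7200)

# Keep periods between roughly 30 minutes and 6 hours.
low, high = 1 / (6 * 3600), 1 / (30 * 60)
b, a = signal.butter(4, [low / (fs / 2), high / (fs / 2)], btype="bandpass")
perturbation = signal.filtfilt(b, a, pressure)
print("peak perturbation: %.2f hPa" % np.abs(perturbation).max())
```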

Modeling Ozone Concentration

By Brian Blaylock, Department of Atmospheric Sciences

A strong lake breeze with impact on air quality was observed on 18 June 2015 in the Salt Lake Valley. The progression of this lake breeze was simulated using the Weather Research and Forecast Model. The model was initialized using hourly analyses of the High Resolution Rapid Refresh model. Shown in the [above] videos are the concentrations of atmospheric tracers released near the surface at the north (red) and south (blue) end of the Salt Lake Valley. Tracers are released every time step from the source regions and then transported by the wind field. The development and passage of the simulated lake breeze is recognizable in the simulation on 18 June 2015 at 1830 UTC.

Modeling the Unexpected Formation of a Gyroid

By Carlos Chu-Jon, Grünwald Research Group, Department of Chemistry

You mix lemon, water, and sugar, and you expect lemonade, not cider. Here we show the unexpected formation of a gyroid from the components that make up the porous metal-organic framework ZIF-8. Although the formation of this structure was not our original intent, its geometric intricacy and simple beauty make it a worthwhile specimen.

Changes in Neuronal Membrane Properties Lead to Suppression of Hippocampal Ripples

By Eric D. Melonakos, John A. White, and Fernando R. Fernandez, Department of Bioengineering

Center for High Performance Computing resources were used to study the effects of cholinergic inputs to the hippocampus on patterns of brain activity.

Ripples (140–220 Hz) are patterns of brain activity, seen in the local field potential of the hippocampus, that are important for memory consolidation. Cholinergic inputs to the hippocampus from neurons in the medial septum-diagonal band of Broca cause a marked reduction in ripple incidence as rodents switch from memory consolidation to memory encoding behaviors. The mechanism for this disruption in ripple power is not fully understood. Among the major effects of acetylcholine (or carbachol, a cholinomimetic) on hippocampal neurons are 1) an increase in membrane potential, 2) a decrease in the size of the spike afterhyperpolarization (AHP), and 3) an increase in membrane resistance. Using an existing model of hippocampal ripples that includes 5000 interconnected neurons (Brunel and Wang, 2003), we manipulated these parameters and observed their effects on ripple power. Shown here, the network firing rate and ripple power of the original model (top row; pyramidal neuron data is shown in red, interneuron data is shown in black) undergo marked changes following a decrease in pyramidal neuron AHP size, as well as an increase in the membrane voltage of both types of neurons. These changes could be the means whereby cholinergic input suppresses hippocampal ripples.
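Ripple power in such a model is commonly quantified by integrating the power spectral density of the simulated field potential over the 140–220 Hz band. The snippet below shows that measurement on a synthetic trace; it is illustrative and independent of the Brunel-Wang network itself.

```python
# Sketch: quantify ripple-band (140-220 Hz) power in a synthetic local
# field potential using a Welch periodogram.
import numpy as np
from scipy import signal
from scipy.integrate import trapezoid

fs = 2000.0                              # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(2)

# Synthetic LFP: broadband noise plus a 180 Hz ripple-band oscillation.
lfp = rng.normal(0, 1, t.size) + 0.5 * np.sin(2 * np.pi * 180 * t)

freqs, psd = signal.welch(lfp, fs=fs, nperseg=1024)
band = (freqs >= 140) & (freqs <= 220)
ripple_power = trapezoid(psd[band], freqs[band])
print(f"ripple-band power: {ripple_power:.4f}")
```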

Multiscale Modeling of Anion-exchange Membrane for Fuel Cells

By Jibao Lu, Liam Jacobson, Justin Hooper, Hongchao Pan, Dmitry Bedrov, and Valeria Molinero, University of Utah; Kyle Grew and Joshua McClure, US Army Research Laboratory; and Wei Zhang and Adri Duin, Pennsylvania State University

To our knowledge, this is the first coarse-grained (CG) model that explicitly includes each water molecule and ion, and accounts for hydrophobic, ionic, and intramolecular interactions explicitly parameterized to reproduce multiple properties of interest for hydrated polyelectrolyte membranes. The CG model of polyphenylene oxide/trimethylamine is about 100 times faster than the reference atomistic GAFF model. The strategy implemented here can also be used in the parameterization of CG models for other substances, such as biomolecular systems and membranes for desalination, water purification, and redox flow batteries. We anticipate that the large spatial and temporal simulations made possible by the CG model will advance the quest for anion-exchange membranes with improved transport and mechanical properties.

Analyzing and Predicting Stream Properties

By Milada Majerova and Bethany Neilson, Utah Water Research Laboratory, Utah State University

The stream temperature regime is an important and very complex component of habitat quality. Introducing beaver dams into the system changes stream hydraulic properties, making these processes even more complicated and difficult to predict. Beaver dams increase spatial and temporal variability in temperature and flow, and they increase baseflow and groundwater gains during summer months. This variability could play an important role for fish and other aquatic organisms under changing conditions, when summers are predicted to be hotter and longer with less precipitation throughout the year. Stream temperature quantification and modeling then become essential tools for better understanding, predicting, and managing our stream systems. CHPC resources play an indispensable role in the effort to model and predict stream hydraulic properties and temperature variability.

Role of Stacking Disorder in Nucleation, Growth and Stability of Ice

By Laura Lupi, Arpa Hudait, Baron Peters, and Valeria Molinero, Department of Chemistry

Accurate forecasts of changes in weather and climate rely on accurate predictions of the properties of clouds. Rates of ice nucleation, in the temperature ranges relevant for the atmosphere, are usually based on extrapolations using classical nucleation theory (CNT), which assumes that the structure of nanometer-sized ice nuclei corresponds to that of bulk hexagonal ice. Here we use molecular dynamics simulations and free energy calculations to show that stacking-disordered ice is the stable phase for critical-sized ice nuclei. This finding implies nucleation rates more than three orders of magnitude higher than CNT predictions and should have a strong impact on climate models.
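The sensitivity comes from the exponential in the CNT rate expression, J = A exp(-ΔG*/kT): a nucleus phase that lowers the barrier by even a modest amount changes the predicted rate enormously. The arithmetic below is illustrative; a barrier reduction of about 7 kT already corresponds to roughly three orders of magnitude.

```python
# CNT gives J = A * exp(-dG*/kT), so a modest stabilization of the
# critical nucleus changes the rate by a large factor. Illustrative
# numbers only: ~7 kT corresponds to roughly three orders of magnitude.
import math

def rate_ratio(delta_barrier_kT):
    # Ratio of nucleation rates when the barrier is lowered by delta_barrier_kT.
    return math.exp(delta_barrier_kT)

for dG in (3.0, 7.0, 10.0):
    print(f"barrier lowered by {dG:.0f} kT -> rate x {rate_ratio(dG):.1e}")
```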

Quantifying Contributions from Natural and Non-local Sources to Uintah Basin Ozone

By Huy Tran, Seth Lyman, Trang Tran, and Marc Mansfield, Bingham Entrepreneurship & Energy Research Center

Ozone in the lowest layer of the atmosphere (the troposphere) results in large part from human activity: Pollutants already present in the atmosphere are converted to ozone by the action of sunlight. However, there are also natural sources of ozone, such as wildfires and a phenomenon known as a "stratospheric intrusion," when strong vertical mixing pulls ozone from the stratospheric ozone layer down to the surface. Using the GEOS-Chem global chemical model, we have successfully demonstrated that a stratospheric ozone intrusion event occurred on June 8–9, 2015, which caused surface ozone in the Uintah Basin to exceed the 70-ppb national standard. We have also identified many other cases in which natural or non-local sources contributed a large portion of the surface ozone in the Basin, especially during spring and summer, although at levels not exceeding the national standard. The ability to distinguish human-caused local, human-caused non-local, and natural ozone events is important for planning and evaluating ozone mitigation strategies.

Data Assimilation for Improving WRF Performance in Simulating Wintertime Thermal Inversions in the Uintah Basin

By Trang Tran, Huy Tran, and Erik Crosman, Utah State University and University of Utah

Meteorological models for simulating atmospheric properties (e.g., temperature and wind) during thermal inversions are important for simulating winter ozone pollution in the Uintah Basin. The Weather Research and Forecasting (WRF) meteorological model supports "observational nudging," i.e., a technique in which the model is biased to conform to available observational data. We recently performed two WRF simulations, one nudged with temperature, wind field, and humidity data, and one without nudging, for the period of Jan 16 to Feb 9, 2013. Contrary to expectations, the nudged model produced an unrealistic inversion structure that was too intense and shallow, confining most pollutants to a shallow area at the bottom of the Basin. The non-nudged WRF model, on the other hand, tended to produce a weaker, deeper inversion layer with too much vertical mixing.

Understanding Wind Energy

By Gerard Cortina and Marc Calaf, Wind Energy & Turbulence, Department of Mechanical Engineering

The Wind Energy and Turbulence laboratory was designed to improve the current understanding of wind energy harvesting. To achieve this goal, we dedicate much of our effort to developing new knowledge of the turbulent atmospheric boundary layer. Our focus is on running high-resolution numerical simulations with the help of the Center for High Performance Computing at the University of Utah, which we ultimately complement with the analysis of experimental data.

Currently we mainly use Large Eddy Simulations, which are capable of resolving most of the atmospheric turbulent scales as well as the wind turbines, providing very good results when compared to experimental data. We are highly interested in improving the current conception of land-atmosphere energy exchanges, and our work strives to fill the gaps in our current understanding. Only by properly capturing the land-atmosphere coupling that forces the atmospheric flow aloft will we be able to reproduce the atmospheric flow with high accuracy.

Tracking Pressure Perturbations Resulting From Thunderstorm Complexes

By Alexander Jacques, MesoWest/SynopticLabs and Atmospheric Sciences

Two strong thunderstorm complexes moved across the north-central plains of the United States late on August 11 into August 12, 2011. This animation shows research efforts to detect and evaluate large mesoscale surface pressure perturbation features generated by these complexes. The detected positive (red contours) and negative (blue contours) perturbations are determined from perturbation analysis grids, generated every 5 minutes, using USArray Transportable Array surface pressure observations (circle markers). Best-track paths for perturbations are shown via the dotted trajectories. To identify physical phenomena associated with the perturbations, conventional radar imagery was also leveraged to identify regions of thunderstorm and precipitation activity. It can be seen here that the two distinct thunderstorm complexes are co-located with several of the detected pressure perturbation features.

Clean Coal: Powered by Exascale

By Philip J. Smith and Michal Hradisky, CCMSC

The mission of the Carbon-Capture Multidisciplinary Simulation Center (CCMSC) at the University of Utah is to demonstrate the use of exascale uncertainty quantification (UQ) predictive simulation science to accelerate deployment of low-cost, low-emission electric power generation to meet the growing energy needs in the United States and throughout the world. The two main objectives, advancing simulation science to exascale with UQ-predictivity in real engineering systems and use of high-performance computing (HPC) and predictive science to achieve a societal impact, are linked together through an overarching problem: simulation of an existing 1,000 MW coal-fired ultra-supercritical (USC) boiler and simulation of a 500 MW oxy-coal advanced ultra-supercritical (AUSC) boiler design.

Read the full article in the newsletter

Tackling Large Medical Genomics Datasets

By Barry Moore, USTAR Center for Genetic Discovery

The University of Utah has a long and rich history of genetic research that spans decades and has led to the discovery of over 30 genes linked to genetic disease. These Utah discoveries range from relatively common and well-known heritable diseases, such as breast cancer linked to the BRCA1/BRCA2 genes, to the truly obscure Ogden syndrome, which in 2010 became the first new genetic disease to be described based on genome sequencing. The Utah Genome Project (UGP), together with clinical investigators across the University of Utah, is continuing this tradition of cutting-edge genetic research in Utah by launching several large medical genomics projects over the last year. The USTAR Center for Genetic Discovery (UCGD), the computational engine for the UGP, has partnered with the University's Center for High Performance Computing (CHPC) to tackle the massive datasets and the large-scale computing requirements associated with these projects.

Read the full article in the newsletter

Linking Frost Timing to Circulation Patterns

Atmospheric sciences professor Courtenay Strong and Gregory McCabe of the United States Geological Survey studied how frost timing (specifically, the lengthening of the frost-free season) is influenced by global warming and local atmospheric circulation by utilizing objective-clustering algorithms and optimization techniques. By discovering the circulations responsible for frost timing in different climatic regions of the conterminous United States, they found that atmospheric circulation patterns account for between 25 and 48 percent of variation in frost timing.
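One way to picture the approach: cluster daily circulation fields into recurring patterns, then ask how much of the year-to-year variance in frost dates the pattern frequencies explain. The sketch below does this on synthetic data with k-means and linear regression; it illustrates the idea, not the authors' exact algorithm.

```python
# Sketch of the general approach (synthetic data, not the published
# method): cluster circulation fields with k-means, then see how much
# frost-timing variance the cluster frequencies explain.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n_years, n_days, n_grid = 40, 60, 200

# Hypothetical autumn sea-level-pressure anomaly fields.
fields = rng.normal(size=(n_years * n_days, n_grid))
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(fields)

# Per-year frequency of each circulation pattern.
yearly = labels.reshape(n_years, n_days)
freq = np.array([np.bincount(yearly[y], minlength=5) for y in range(n_years)])
frost_day = rng.normal(280, 10, n_years)   # synthetic first-frost dates

r2 = LinearRegression().fit(freq, frost_day).score(freq, frost_day)
print(f"variance in frost timing explained: {r2:.0%}")
```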

Read the paper in Nature Communications or read the article in UNews

Figure: Sea level pressure analysis from the operational High Resolution Rapid Refresh at 1 PM March 14, 2017, with unusually low pressure associated with a major New England snowstorm.

Efficient Storage and Data Mining of Atmospheric Model Output

By Brian Blaylock and John Horel, Department of Atmospheric Sciences

Our group … purchased 30TB in CHPC’s pando [archive storage] system to test its suitability for several research projects. We have relied extensively over the years on other CHPC storage media such as the tape archive system and currently have over 100TB of network file system disk storage. However, the pando system is beginning to meet several of our interwoven needs that are less practical using other data archival approaches: (1) efficient expandable storage for thousands of large data files; (2) data analysis using fast retrieval of user selectable byte-ranges within those data files; and (3) the ability to have the data accessible to the atmospheric science research community.

The CHPC pando storage archive has made it possible for us to efficiently archive, access, and analyze a large volume of atmospheric model output. Several researchers outside the University of Utah have already discovered its utility in the short time that the archive has been available.
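Because Pando exposes an S3-style HTTP interface, the selective byte-range reads described above amount to an HTTP GET with a Range header. The sketch below shows the pattern; the URL follows the archive's general layout but should be treated as illustrative, and the byte offsets are arbitrary.

```python
# Byte-range retrieval over HTTP: fetch only the slice of a large
# model-output file that holds the field of interest. URL and offsets
# are illustrative.
import requests

url = ("https://pando-rgw01.chpc.utah.edu/hrrr/sfc/20180101/"
       "hrrr.t00z.wrfsfcf00.grib2")
headers = {"Range": "bytes=0-4095"}        # request the first 4 KB only

response = requests.get(url, headers=headers, timeout=30)
if response.status_code == 206:            # 206 = Partial Content
    print(f"received {len(response.content)} bytes")
```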

Read the full article in the newsletter

Modeling Pollution in Utah's Valleys

By Christopher Pennell, Utah Division of Air Quality

The Utah Division of Air Quality simulated a high pollution episode that occurred during the first eleven days of January, 2011. Using CHPC resources, we produced a high resolution, hourly animation showing when levels of fine particulate matter (PM2.5) far exceeded federal standards in Northern Utah.

Air pollution builds up during the day with the onset of sunlight and human activity. Pollution levels greatly decrease in the late evening except when a persistent temperature inversion gets established in Utah’s valleys. As inversion conditions persist, air pollution steadily accumulates across several days triggering public health concerns. We are left waiting for a strong winter storm that can destroy surface air stability and bring in fresh clean air.

Our pollution modeling not only accounts for human activity, but also for the mechanisms that make particulate pollution from emitted gases. The computational power provided by CHPC allows the State of Utah to model the complex relationship between meteorology, human activity, and air chemistry with impressive precision.

Cryo-EM at the University of Utah

By Peter Shen, Department of Biochemistry

In recent years, the University of Utah has established an outstanding core base of cryo-electron microscopy (cryo-EM) expertise and compiled a strong track record of performing impactful cryo-EM research. These efforts have resulted in the University of Utah being awarded one of five $2.5 million grants from the Arnold and Mabel Beckman Foundation to establish a world-class cryo-EM facility.

Most of the cryo-EM data analysis procedures at the University of Utah are performed using CHPC resources. CHPC supports many software packages used in cryo-EM data processing, including RELION, EMAN2, SPIDER, FREALIGN, BSoft, and cryoSPARC.

One major focus in the field [of cryo-electron microscopy] is to fully automate the entire pipeline of recording cryo-EM movies, de-blurring the images, identifying the particles, and reconstructing them in 3D. Perhaps the time is not far off when high-quality 3D reconstructions will be attainable within hours after the cryo-EM imaging session. Our ongoing collaborations with CHPC will certainly play an important role in making this dream a reality here at the University of Utah.

Read the full article in the newsletter

The Effects of Wind Angle on the Effectiveness of Erosion Control Structures

By Eden Furtak-Cole, Department of Mathematics and Statistics, Utah State University

Roughness element experiments have been conducted at the Owens Lake playa to control harmful PM10 emissions. These maps of shear stress magnitude result from a 3D simulation of flow over box-shaped roughness elements used for erosion control. Flow is from left to right. The rotated element is shown to be less effective in reducing shear, even though it has a greater frontal area exposed to the wind direction. This underscores the importance of 3D simulation in predicting atmospheric boundary layer flows. Simulations were conducted by solving the incompressible Navier-Stokes equations with OpenFOAM.

Figure: Cluster dendrogram for Inga species with AU/BP values (%) using Ward D.

Using CHPC resources to calculate chemical similarity of species of tropical trees

By Gordon Younkin, Department of Biology

We have developed a metric to quantify the similarity of defensive compounds (secondary metabolites) among different species of plants. The goal is to address fundamental questions in the ecology of tropical forests: What is the origin of the extremely high diversity? How is the exceedingly high local diversity maintained? Our hypothesis is that the answers have to do with the interactions of plants with their herbivores, with particular importance ascribed to the chemical defenses of plants. Here, we report on how we used CHPC resources to quantify the chemical similarity among species of plants.

Using ultra performance liquid chromatography-mass spectrometry (UPLC-MS), we examined the chemical profiles of 166 species of Inga, a genus of tropical trees. Among these species, we have recorded nearly 5000 distinct compounds, most of which are of unknown structure. Based on the abundance of these compounds in each species, we can calculate the overall chemical similarity of each species pair. While each individual calculation is not all that resource-intensive, we have multiple individuals for each species, for a total of 795 individuals. Pairwise comparisons between all individuals require 316,410 separate similarity calculations, a task much too large for a desktop computer. We have parallelized these calculations on a CHPC cluster, where they finish in a matter of hours.
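The parallelization pattern is straightforward: enumerate the pairs and farm the similarity calculations out to worker processes. The sketch below uses cosine similarity on random compound-abundance profiles as a stand-in for the group's actual metric; including self-comparisons, 795 individuals yield the 316,410 calculations cited above.

```python
# Sketch of the parallelization pattern (the lab's actual similarity
# metric differs): all-pairs similarity over compound-abundance
# profiles, distributed across worker processes.
from itertools import combinations_with_replacement
from multiprocessing import Pool

import numpy as np

rng = np.random.default_rng(4)
profiles = rng.random((795, 5000))         # individuals x compounds

def similarity(pair):
    i, j = pair
    a, b = profiles[i], profiles[j]
    return i, j, float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

if __name__ == "__main__":
    # 795 * 796 / 2 = 316,410 comparisons, including self-pairs.
    pairs = list(combinations_with_replacement(range(795), 2))
    with Pool() as pool:
        results = pool.map(similarity, pairs, chunksize=2048)
    print(f"computed {len(results)} similarities")
```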

Uncertainty Quantification of RNA-Seq Co-expression Networks

By Lance Pflieger and Julio Facelli, Department of Biomedical Informatics

Systems biology utilizes the complex and copious data originating from the "omics" fields to increase understanding of biology by studying interactions among biological entities. Gene co-expression network analysis is a systems biology technique, derived from graph theory, that uses RNA expression data to infer functionally similar genes or regulatory pathways. It is a computationally intensive process that requires matrix operations on tens of thousands of genes/transcripts. The technique has been useful in drug discovery, functional annotation of genes, and insight into disease pathology.

To assess the effect of the uncertainty inherent in gene expression data, our group utilized CHPC resources to characterize variation in gene expression estimates and simulate a large number of co-expression networks based on this variation. The figure shown is a representation of a network generated using WGCNA and expression data from spinocerebellar ataxia type 2 (SCA2). The colors represent highly connected subnetworks of genes, which are used to correlate similar gene clusters with a phenotypic trait. Our results show that uncertainty has a large effect on downstream results, including subnetwork structure, hub gene identification, and enrichment analysis. For instance, we find that the number of subnetworks correlating with the SCA2 phenotype varies from 1 to 6. While a small gene co-expression network analysis can be performed using only modest computational resources, the scale of resources required to perform uncertainty quantification (UQ) using Monte Carlo ensemble methods is several orders of magnitude larger and is only available at CHPC.
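A minimal Monte Carlo version of this analysis can be sketched as: perturb the expression matrix with the estimated per-gene noise, rebuild a WGCNA-style soft-threshold adjacency for each draw, and track how often each strong edge recurs. The code below illustrates that loop on synthetic data; the thresholds and sizes are arbitrary, and the group's actual pipeline differs.

```python
# Illustrative Monte Carlo uncertainty loop for a co-expression network
# (synthetic data; not the group's pipeline).
import numpy as np

rng = np.random.default_rng(5)
n_genes, n_samples, beta = 200, 50, 6

expression = rng.normal(size=(n_genes, n_samples))
noise_sd = 0.2 * expression.std(axis=1, keepdims=True)  # per-gene uncertainty

edge_counts = np.zeros((n_genes, n_genes))
n_draws = 100
for _ in range(n_draws):
    perturbed = expression + rng.normal(0, noise_sd, expression.shape)
    corr = np.corrcoef(perturbed)
    adjacency = np.abs(corr) ** beta       # WGCNA-style soft threshold
    edge_counts += adjacency > 0.1         # count "strong" edges per draw

stability = edge_counts / n_draws
np.fill_diagonal(stability, 0)             # ignore self-edges
print(f"gene pairs linked in >90% of draws: {(stability > 0.9).sum() // 2}")
```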

The Music of Fault Zones

By Amir Allam, Hongrui Qiu, Fan-Chi Lin, and Yehuda Ben-Zion, Department of Geology & Geophysics

We deployed 108 seismometers in a dense line across the most active fault in Southern California (the San Jacinto fault) and recorded 50 small earthquakes. This animation shows how the fault zone itself resonates as waves pass through it. The earthquakes excite normal mode oscillations, just like on a guitar string, directly underneath the seismometers. This is due to a zone of highly damaged rocks near the active fault that acts to trap passing seismic energy. The resonance decreases in amplitude with increasing distance from the fault zone.

Lanthanide Ion Thermochemistry and Reactivity with Small Neutrals: Benchmarking Theory

By Maria Demireva and P. B. Armentrout, Armentrout Research Group, Department of Chemistry

Heavy elements, such as the lanthanides, are difficult to describe theoretically because of spin-orbit and relativistic effects and the many electronic configurations that arise from the valence electrons occupying the 4f shell. Testing different theoretical models requires benchmarks from experiment. Thermochemistry measured in gas phase experiments, where small systems can be probed in isolation from solvent or substrate molecules, can serve as a useful standard. Additionally, results from such experiments can be used together with theory to learn about the properties and behavior of these heavy elements, which remain relatively unexplored.

For example, we have studied the exothermic activation of CO2 by the lanthanide gadolinium cation to form the metal oxide cation (GdO+) and CO. Because the ground state reactant and product ions differ in their spin states while the neutrals have singlet states, the reaction between ground state reactants and products is formally spin-forbidden. Yet experiment indicates that the reaction occurs efficiently and without any barriers. This can be explained by theoretical calculations, which reveal that the surface correlating with the ground state reactants likely mixes in the entrance channel with the surface correlating with the ground state products. Because there are no barriers along these potential energy surfaces that exceed the reactant asymptote, the reaction can proceed with relatively high efficiency at thermal collision energies. An increase in reaction efficiency is observed in the experiments at higher collision energies. From theoretical calculations, this increase can be attributed to the reactants having enough energy to surmount a barrier found along the potential energy surface of the ground state reactants, such that an electronically excited GdO+ product can be formed directly via a single diabatic surface.

Although the theoretical calculations can qualitatively explain the experimental results, it is also important that they agree quantitatively. Comparison with high level calculations indicates that there is room for improvement. Combining precise and accurate experiments with state-of-the-art computational resources provides detailed energetics and a mechanistic understanding of lanthanide reactivity that would be difficult to gain by experiment or theory alone.

Figure: 95th percentile of 10 meter wind speed for every hour in May, June, and July 2015–2017. Strong winds often occur during evening hours; over mountain ridges, oceans, and the Great Lakes; and in the mountain and central states. (The video has been truncated to better fit this format.)

Weather Statistics with Open Science Grid

By Brian Blaylock and John Horel, Atmospheric Sciences

CHPC's Pando archive hosts 40+ TB of weather model analyses and forecasts from the High Resolution Rapid Refresh model, beginning April 2015. Resources from the Open Science Grid were used to quickly retrieve data from the Pando archive and calculate percentile statistics for several weather variables. Percentiles from three years of data were calculated for every hour of the year using a 30-day window centered on each hour. These statistics are being used to perform data quality checks of in situ weather observations and to provide meteorologists insight into model performance at specific locations.
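The core computation is simple once the data are in hand: for each hour of the year, pool the values from a 30-day window centered on that hour across all three years and take the percentile. A sketch on synthetic wind speeds, with hypothetical array shapes:

```python
# Windowed hourly percentiles (synthetic data): pool a 30-day window
# centered on a given day across all years, for one hour of the day.
import numpy as np

rng = np.random.default_rng(6)
wind = rng.weibull(2.0, size=(3, 365, 24)) * 5.0   # years x days x hours

def hourly_percentile(day, hour, q=95, half_window=15):
    days = np.arange(day - half_window, day + half_window + 1) % 365
    sample = wind[:, days, hour]                   # 3 years x 31 days
    return np.percentile(sample, q)

print(f"95th percentile, day 180, 18 UTC: {hourly_percentile(180, 18):.1f} m/s")
```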

Oriented Attachment of ZIF-8 Nanoparticles

By the Grünwald Group, Department of Chemistry

Nanocrystal growth can occur through a variety of different mechanisms. Our group uses molecular dynamics simulations to visualize these various processes. In this case, two ZIF-8 nanocrystals, once in close enough proximity to each other, coalesce through oriented attachment to form a larger nanocrystal.

Formation of COF-5 in an Implicit Solvent Model

By Grünwald Group, Department of Chemistry

These three movies show the formation of covalent organic framework No. 5 (commonly known as COF-5) in an implicit solvent model:

  1. An extreme case in which the stacking interactions among molecules are turned off; no COF structure forms even after hundreds of nanoseconds.
  2. At experimental conditions, formation occurs through an event called "spinodal decomposition," which results in the spontaneous creation of defective COF motifs in solution.
  3. A case in which the stacking interaction is scaled down so that the crystallization of COF-5 proceeds through the growth of a single, defect-free crystal, an outcome much desired in experiments.

The Utah PRISMS Ecosystem: An Infrastructure for Global Exposomic Research

By Ramkiran Gouripeddi (1,2), Mollie Cummins (1,2,3), Julio Facelli (1,2), and Katherine Sward (1,2,3) for the Utah PRISMS Team

(1) Department of Biomedical Informatics, (2) Center for Clinical and Translational Science, (3) College of Nursing

The Utah PRISMS (Pediatric Research Using Integrated Sensor Monitoring Systems) Team uses a systematic approach to aggregate data on environmental exposures and socio-economic factors to explore potential effects of the modern environment on health. The project uses sensor measurements and information from individuals in the community to support research at both the population and personal level. It supports a standards-based, open source informatics platform to meaningfully integrate sensor and biomedical data and consists of

  1. Data Acquisition Pipeline: Hardware and software, wireless networking, and protocols to support easy system deployment for robust sensor data collection in homes, and monitoring of sensor deployments.
  2. Participant Facing Tools: Annotate participant generated data, display sensor data, and inform participants of their clinical and environmental status.
  3. Computational Modeling: Generate high resolution spatio-temporal data in the absence of measurements as well as for recognition of activity signatures from sensor measurements.
  4. Central Big Data Integration Platform (OpenFurther): Standards-based, open-access infrastructure that integrates study-specific and open sensor and computationally modeled data with biomedical information along with characterizing uncertainties associated with these data.
  5. Researcher Facing Platforms: Tools and processes for researchers performing exposomic studies of a variety of experimental designs.

An Agent-Based Model for Estimating Human Activity Patterns on the Wasatch Front

By Albert M. Lund (1,2), Nicole B. Burnett (2,3), Ramkiran Gouripeddi (1,2), and Julio C. Facelli (1,2)

(1) Department of Biomedical Informatics, (2) Center for Clinical and Translational Science, (3) Department of Chemistry

It is difficult to measure the impact of air quality on human health because populations are mobile. Additionally, air quality data is reported at low geographic resolutions (> 1 km²), which makes it difficult to characterize acute local variations in air quality. There are few examples of combining human movement and activity data with high-resolution air quality data to capture trajectory-based exposure profiles in a comprehensive way. An agent-based model helps simulate human activities and locations throughout an arbitrary day. Simulation is used to overcome the limitations of existing datasets: households are simulated based on aggregate data for the state of Utah, and activity profiles are generated from the American Time Use Survey of the U.S. Bureau of Labor Statistics. The activity profiles are combined with the simulated households to build individual trajectories of activity and location over the desired region of study.
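A toy version of the agent loop, with entirely hypothetical activity probabilities standing in for the survey-derived profiles, looks like this:

```python
# Toy agent-based day simulation (all probabilities hypothetical):
# each simulated person draws an hourly activity, yielding a daily
# trajectory per agent.
import numpy as np

rng = np.random.default_rng(7)
ACTIVITIES = ["home", "work", "commute", "errands"]

def hourly_probabilities(hour):
    # Stand-in for distributions estimated from the American Time Use Survey.
    if 8 <= hour < 17:
        return [0.15, 0.65, 0.10, 0.10]
    if hour in (7, 17):
        return [0.20, 0.10, 0.60, 0.10]
    return [0.85, 0.05, 0.05, 0.05]

def simulate_day(n_agents=1000):
    # Returns an (agents x 24) array of activity indices.
    day = np.empty((n_agents, 24), dtype=int)
    for hour in range(24):
        day[:, hour] = rng.choice(4, size=n_agents,
                                  p=hourly_probabilities(hour))
    return day

trajectories = simulate_day()
share_at_work = (trajectories[:, 12] == 1).mean()
print(f"agents at work at noon: {share_at_work:.0%}")
```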

How will new technology change deep brain stimulation programming?

By G. Duffley (1), J. Krueger (2), A. Szabo (3), B. Lutz (4), M.S. Okun (5), and C.R. Butson (1)

(1) University of Utah, (2) University of Duisburg-Essen, (3) Medical College of Wisconsin, (4) University of North Carolina Wilmington, (5) University of Florida

For movement disorders, the typical programming process consists of a nurse or physician systematically moving through a subset of the over 10,000 possible stimulation settings, looking for benefit as well as known side effects by visually examining the patient. Once this information is found, the nurse searches for the best stimulation setting within the range of those that do not induce an apparent side effect. Once what is assumed to be the best setting is found, the patient is sent home, only to return on average a month later to tweak the stimulation settings based on long-term side effects or residual motor disability. The burden of travel to attend regular DBS [deep brain stimulation] programming sessions is high for patients and their primary caregivers. The objective of our study is to test a clinical decision support system that we believe will enable nurses to more effectively achieve these goals [of adequate symptom relief with minimal side effects]. We are prospectively assessing changes in DBS programming time, patient outcomes, quality of life, and family caregiver burden using an iPad-based mobile app to program DBS devices for PD [Parkinson's disease] patients. Our computational models show there is some variability between the location and spatial extent of the best stimulation settings at six months across patients, but it is unknown if the same level of variability exists within individual patients. So far, programming time has not been significantly reduced, but the challenge of changing clinician behavior is non-trivial. Determining how our technology fits within the context of DBS programming algorithms is an open question. Developing an easy-to-follow but effective workflow for novice programmers will be essential for phase two of the trial to succeed.

Structural Imaging Evaluation of Subcallosal Cingulate Deep Brain Stimulation for Treatment-resistant Depression

By Kara A. Johnson (1,2); Darren L. Clark, PhD (3); Gordon Duffley (1,2); Rajamannar Ramasubbu, MD (3); Zelma H.T. Kiss, MD (3); and Christopher R. Butson, PhD (1,2,4)

(1) Department of Bioengineering; (2) Scientific Computing & Imaging (SCI) Institute; (3) Departments of Clinical Neurosciences and Psychiatry, University of Calgary; (4) Departments of Neurology and Neurosurgery

Deep brain stimulation (DBS) of the subcallosal cingulate cortex (SCC) is an investigational therapy for treatment-resistant depression (TRD). There is a wide range of response rates for SCC DBS for TRD. The ideal location and extent of stimulation within the SCC to produce substantial therapeutic effects are currently unknown and may vary between patients. We used T1-weighted structural MRI to make between- and within-subject comparisons of volumes of tissue activated (VTAs) relative to structural anatomy to make observations about the effects of stimulation location and settings on clinical response. Our preliminary results suggest that stimulation location and volume relative to T1 structural anatomy alone may not predict clinical response in SCC DBS for TRD. Therapeutic response to SCC DBS may depend on a combination of several factors, such as patient-specific stimulation parameters, duration of stimulation, or other factors that play a role in specific fiber activation. Further analysis is warranted to elucidate whether stimulation locations, parameters, and durations predict therapeutic response to SCC DBS.

Figure: Realistic five-compartment (skin, skull, CSF, gray matter, white matter) finite element head model.

Influence of Uncertainties in the Head Tissue Conductivities on the EEG Forward Problem

By James Vorwerk (1), Carsten H. Wolters (2), and Christopher R. Butson (1)

(1) Scientific Computing and Imaging (SCI) Institute, (2) Institute for Biomagnetism and Biosignalanalysis, University of Münster

For accurate EEG [electroencephalography] source analysis, it is necessary to solve the EEG forward problem as exactly as possible. We investigate the influence of uncertainty in the conductivity values of the different conductive compartments of the human head on the EEG forward and inverse problems. The goal is to identify the compartments for which varying conductivity values have the strongest influence, so that these conductivity values can be individually calibrated in future studies. For the investigated source in the somatosensory cortex, the skull conductivity clearly has the strongest influence, while the white and gray matter conductivities have a very low influence. If possible, an individual calibration of the skull conductivity should therefore be performed. The feasibility of calibrating further conductivity values based on SEPs [somatosensory evoked potentials] is questionable given the dominance of the skull conductivity. This study shows that, besides the geometric modeling of the conductive compartments of the human head, the conductivity values assumed for these compartments also have a strong influence on EEG source localization.