Collaborative Research: Elements: Software: NSCI: HDR: Building An HPC/HTC Infrastructure For The Synthesis And Analysis Of Current And Future Cosmic Microwave Background Datasets

Project Details
The photons created in the Big Bang have experienced the entire history of the Universe, and every step in the evolution of the Universe has left its mark on their statistical properties. Observations of these photons have the potential to unlock the secrets of fundamental physics and cosmology, and to provide key insights into the formation and evolution of cosmic structures such as galaxies and galaxy clusters. Since the traces of these processes are so faint, one must gather enormous datasets to be able to detect them above the unavoidable instrumental and environmental noise. This in turn means that one must be able to use the most powerful computing resources available to process the volume of data. These computing resources include both highly localized supercomputers and widely distributed grid and cloud systems. The PI and Co-Is will develop a common computing infrastructure able to take advantage of both types of resources, and demonstrate its suitability for ongoing and planned experiments by adapting the analysis pipelines of four leading Big Bang observatories to run within it. In addition to enabling the full scientific exploitation of these extraordinarily rich datasets, the investigators will mentor students engaged in this research and run summer schools in applied supercomputing.

This project seeks to enable the detection of the faintest signals in the Cosmic Microwave Background radiation, and in particular the pattern of peaks and troughs in the angular power spectra of its polarization field. In order to obtain these spectra one must first reduce the raw observations to maps of the sky in a way that preserves the correlations in the signal and characterizes the correlations in the noise. While the algorithms to perform this reduction are well understood, applying them to datasets with quadrillions to quintillions of observations is a very serious computational challenge. The computational resources available to the project to address this include both high performance and high throughput computing systems, and one will need to take advantage of both of them. This project will develop a joint high performance/high throughput computational framework, and deploy within it analysis pipelines currently being fielded by the ongoing Atacama Cosmology Telescope, BICEP/Keck Array, POLARBEAR, and South Pole Telescope experiments. By doing so one will also demonstrate the framework's efficacy for the planned Simons Observatory and CMB-S4 experiments.
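The map-making reduction described above can be illustrated with a minimal sketch. The following is not the project's actual pipeline code; it is a toy example, under the simplifying assumption of uncorrelated (white) noise and a pointing matrix that assigns each time-ordered sample to a single sky pixel, in which case the maximum-likelihood map m = (PᵀN⁻¹P)⁻¹PᵀN⁻¹d reduces to a noise-weighted average of the samples falling in each pixel. The function name `bin_map` and its arguments are hypothetical.

```python
import numpy as np

def bin_map(tod, pix, weights, npix):
    """Bin time-ordered data (TOD) into a sky map.

    Assumes white noise, so the maximum-likelihood solution
    m = (P^T N^-1 P)^-1 P^T N^-1 d collapses to a per-pixel
    noise-weighted average of the samples.  (Illustrative toy
    code, not the project's production map-maker.)
    """
    # Sum of noise weights hitting each pixel (the diagonal of P^T N^-1 P)
    whits = np.bincount(pix, weights=weights, minlength=npix)
    # Noise-weighted sum of the data in each pixel (P^T N^-1 d)
    wsum = np.bincount(pix, weights=weights * tod, minlength=npix)
    m = np.zeros(npix)
    seen = whits > 0          # leave unobserved pixels at zero
    m[seen] = wsum[seen] / whits[seen]
    return m

# Toy example: four samples observed in two pixels with different weights
tod = np.array([1.0, 3.0, 2.0, 2.0])
pix = np.array([0, 0, 1, 1])
w = np.array([1.0, 1.0, 2.0, 2.0])
print(bin_map(tod, pix, w, npix=2))  # -> [2. 2.]
```

Real CMB experiments must instead handle strongly correlated (1/f) noise, which turns this closed-form average into a large sparse linear system solved iteratively; that is what makes the reduction computationally demanding at the scale of quadrillions of samples.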

This project is supported by the Office of Advanced Cyberinfrastructure in the Directorate for Computer & Information Science & Engineering and the Division of Astronomical Sciences in the Directorate for Mathematical and Physical Sciences.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Effective start/end date: 9/1/18 to 8/31/21


  • National Science Foundation: $40,000.00
