A year ago, FRIB and the Department of Statistics and Probability (STT) at Michigan State University (MSU) formed a new collaboration between nuclear physics and the statistical sciences.
The team is using data science to help predict which rare isotopes can exist. Rare isotopes are forms of chemical elements not normally found in nature. Studying them helps researchers understand the universe, and they also have very real applications in the modern world, such as in precision medicine.
Their work is now described in a joint paper in Physical Review C, which the journal highlighted as an Editors' Suggestion.
The discipline of statistics is at the core of modern data science, providing critical data-analysis services to many areas of inquiry. However, nuclear physics has not traditionally partnered with statistics. The new collaboration is working to change that with the joint hire of statistics researcher Dr. Léo Neufcourt (PhD Columbia University, 2016).
Nuclear physicists study which isotopes can exist for a given element. This fundamental question relates to several factors. One is nucleonic matter stability (how radioactive are some of these isotopes?). A second is the boundaries of nuclear existence (do some of these isotopes even exist?). Another is the chemical elements’ cosmic origin (how did the matter we observe today on Earth and in the universe come to be?).
The joint FRIB/STT team, led by Dr. Neufcourt, built predictive models to address these questions. The models answer questions like, “If we wanted to remove a neutron from this rare isotope (creating a new rare isotope), how hard would we need to hit it, and how much energy would it take?”
The group chose computational Bayesian analysis as its statistical framework. This centuries-old idea was developed by the English clergyman Thomas Bayes in the 1700s. It calculates probabilities of unobserved events based on their relations with observations and on one's beliefs about them. The modern use of this idea took off about 20 years ago, when computers became powerful enough to handle the massive computations involved.
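To make the idea concrete, here is a minimal sketch of Bayes' rule in Python. The numbers are invented for illustration and are not drawn from the paper: a prior belief that a rare isotope is produced in an event is updated after a hypothetical detector reports a signal.

```python
# Minimal illustration of Bayes' rule with made-up numbers.
# Question: given a detector signal, how likely is it that a rare
# isotope was actually produced in this event?

p_isotope = 0.01                   # prior belief: isotope produced
p_signal_given_isotope = 0.90      # detector fires when isotope is present
p_signal_given_no_isotope = 0.05   # false-positive rate

# Total probability of seeing the signal at all.
p_signal = (p_signal_given_isotope * p_isotope
            + p_signal_given_no_isotope * (1 - p_isotope))

# Bayes' rule: updated (posterior) belief after observing the signal.
p_isotope_given_signal = p_signal_given_isotope * p_isotope / p_signal
print(f"posterior = {p_isotope_given_signal:.3f}")  # about 0.154
```

Even a fairly reliable signal only raises the belief to about 15 percent here, because the prior was so small; this interplay between prior beliefs and observations is the heart of the method.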
Based on what is currently known about existing nuclei, the researchers used nuclear theory models to predict which new nuclei might exist and, with Bayesian extrapolation methods, to calculate the probability that they do.
This statistical approach is close in spirit to Bayes' original idea: relate observed and calculated properties of interest. The team employs powerful computational methods to estimate the parameters of statistical models, then uses those parameters, together with the uncertainty in the estimates, to make predictions about quantities that have not yet been observed.
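The sketch below illustrates this extrapolation pattern under strong simplifying assumptions: a toy straight-line model with a Gaussian prior on its two parameters, fit to made-up data and then used to predict at an unobserved point, with the parameter uncertainty propagated into the prediction. It stands in for the far more sophisticated nuclear models used in the actual analysis.

```python
# Toy Bayesian extrapolation: y = a + b*x with known noise and a
# Gaussian prior on (a, b). All data and values are invented.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])         # "observed" inputs
y = np.array([2.1, 2.9, 4.2, 4.8])         # "observed" property values
sigma = 0.3                                 # assumed measurement noise
X = np.column_stack([np.ones_like(x), x])   # design matrix: intercept + slope

# With a N(0, tau^2 I) prior on the parameters, the posterior is
# Gaussian and has a closed form.
tau = 10.0
post_cov = np.linalg.inv(X.T @ X / sigma**2 + np.eye(2) / tau**2)
post_mean = post_cov @ (X.T @ y) / sigma**2

# Extrapolate to an unobserved point, carrying the uncertainty along.
x_new = np.array([1.0, 6.0])                # predict at x = 6
pred_mean = x_new @ post_mean
pred_var = x_new @ post_cov @ x_new + sigma**2  # parameter + noise uncertainty
print(f"prediction: {pred_mean:.2f} +/- {np.sqrt(pred_var):.2f}")
```

The key output is not just the predicted value but the error bar attached to it, which grows the further the extrapolation strays from the observed data.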
This computer-based analysis is a form of what is sometimes called supervised machine learning. The computer explores myriads of possibilities and evaluates them according to the underlying probability models (that is the supervision part), then concentrates on those most relevant in view of the observed data. The methodology allows researchers to quantify the uncertainties in their predictions very precisely. These tailor-made capabilities in uncertainty quantification (UQ) are the beauty and power of a Bayesian methodology.
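A common engine behind this kind of exploration is Markov chain Monte Carlo sampling. The toy Metropolis sampler below (illustrative only, not the team's code) wanders through parameter values, accepting moves in proportion to how well they explain synthetic data; the retained samples pile up where the posterior probability is high, and their spread is the UQ.

```python
# Toy Metropolis sampler: explore values of a single parameter mu and
# keep samples in proportion to their posterior probability.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=1.5, scale=1.0, size=50)   # synthetic observations

def log_posterior(mu):
    # Gaussian likelihood (unit variance) plus a standard normal prior on mu.
    return -0.5 * np.sum((data - mu) ** 2) - 0.5 * mu**2

mu, samples = 0.0, []
for _ in range(20000):
    proposal = mu + rng.normal(scale=0.5)        # try a nearby value
    # Accept with probability min(1, posterior ratio).
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(mu):
        mu = proposal
    samples.append(mu)

samples = np.array(samples[5000:])               # discard burn-in
print(f"mu = {samples.mean():.2f} +/- {samples.std():.2f}")
```

The mean of the retained samples is the estimate, and their standard deviation is the uncertainty that travels with it into any downstream prediction.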
By using statistical computer-based analyses, the team can update the UQ when new data become available. This happened recently: a new exotic calcium isotope (60Ca) was discovered. This discovery allowed the FRIB/STT team to increase their confidence that heavier calcium isotopes, up to 70Ca, could exist.
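As a toy illustration of this updating step (with invented numbers, unrelated to the calcium analysis), a conjugate Beta prior over a success probability tightens as soon as new observations arrive; each new data point simply shifts the pseudo-counts and shrinks the error bar.

```python
# Toy Bayesian update: a Beta prior over a "success probability",
# revised after new data arrive. Numbers are purely illustrative.
alpha, beta = 2.0, 2.0        # prior pseudo-counts: mean 0.5, wide uncertainty

# A hypothetical new experiment reports 9 successes in 10 trials.
alpha += 9
beta += 1

mean = alpha / (alpha + beta)
std = (alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))) ** 0.5
print(f"updated belief: {mean:.2f} +/- {std:.2f}")  # tighter than the prior
```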
The team is working on several other uses of Bayesian machine learning with applications to nuclear physics, including a project to calibrate the particle beam in the FRIB accelerator.
Members of the collaboration also presented their work in a “Statistics Applications in Nuclear Physics” session at the recent MSU Symposium on Mathematical Statistics and Applications.