New results from MicroBooNE provide clues to the mystery of particle physics

MicroBooNE detector being lowered into the Fermilab experimental facility. Credit: Fermilab

New results from a more-than-decade-long physics experiment offer insight into unexplained electron-like events seen in previous experiments. The results of the MicroBooNE experiment, while not confirming the existence of a proposed new particle, the sterile neutrino, open the way to exploring physics beyond the Standard Model, the reigning theory of nature’s fundamental particles and forces.

“The results so far from MicroBooNE make it more likely that the explanation for the anomalous electron-like events in the MiniBooNE experiment is physics beyond the Standard Model,” said William Louis, a physicist at Los Alamos National Laboratory and a member of the MicroBooNE collaboration. “What exactly the new physics is remains to be seen.”

The MicroBooNE experiment at the US Department of Energy’s Fermi National Accelerator Laboratory explores a striking anomaly in particle-beam experiments first discovered by researchers at Los Alamos National Laboratory. In the 1990s, the Liquid Scintillator Neutrino Detector (LSND) experiment at the Laboratory saw more electron-like events than expected from calculations based on the Standard Model.

In 2002, the MiniBooNE follow-up experiment at Fermilab began collecting data to further investigate the LSND result. MiniBooNE scientists also saw more electron-like events than Standard Model calculations predicted. But the MiniBooNE detector had a particular limitation: it was unable to tell the difference between electrons and photons (particles of light) near where the neutrino interacted.

The MicroBooNE experiment set out to explore the source of those excess events. The MicroBooNE detector is built on state-of-the-art techniques and technology, using special light sensors and more than 8,000 painstakingly attached wires to capture particle tracks. It is housed in a 40-foot-long cylindrical container filled with 170 tons of pure liquid argon. Neutrinos strike the dense, transparent liquid, releasing additional particles that the electronics can record. The resulting images show detailed particle trajectories and, importantly, distinguish electrons from photons.

“Liquid argon technology is relatively new in neutrino physics, and MicroBooNE has been a pioneer for this technology, demonstrating what amazing physics can be done with it,” said Sowjanya Gollapinni, a Los Alamos physicist and an analysis co-lead. “We had to develop all the tools and techniques from scratch, including how to process the signal, how to reconstruct it, and how to do the calibration, among other things.”

MicroBooNE included a series of measurements: one measurement of photons and three measurements of electrons. In early October, the results of the photon measurement, which specifically looked for Delta radiative decay, provided the first direct evidence disfavoring anomalous single-photon production as the explanation for the MiniBooNE excess. Delta radiative decay was the only background that the MiniBooNE experiment could not directly constrain.

The three new electron analyses address the question of whether the excess is due to electron neutrinos scattering off argon nuclei and producing outgoing electrons. The new results disfavor this process as an explanation for the MiniBooNE excess, leaving the question of what causes the MiniBooNE anomaly unanswered.

“In my mind, the fact that neither photon nor electron production explains the excess makes understanding the MiniBooNE results more interesting and more likely to venture into some very interesting physics beyond the Standard Model,” Louis said.


Interior of the MicroBooNE Time Projection Chamber detector. Credit: Fermilab

With only half of the MicroBooNE data evaluated so far, possible explanations yet to be considered (or tested in future experiments) include the possibility that as-yet-unconfirmed sterile neutrinos decay into gamma rays. The decay of the axion, another hypothetical elementary particle, into a photon or an electron-positron pair could also be responsible. Sterile neutrinos and axions could be linked to the dark sector, a hypothetical realm of as-yet-unobserved physics and particles.

“The possibilities are endless,” Gollapinni said, “and MicroBooNE will be on a mission to explore each one with the full data set. The results pave the way for further physics experiments, but a full understanding of the results will also depend on our colleagues in theoretical physics, who are very intrigued by these results.”

MicroBooNE is part of a suite of neutrino experiments looking for answers. The ICARUS detector is starting to collect physics data, and the Short-Baseline Near Detector (SBND) will come online in 2023; both detectors use liquid-argon technology. Together with MicroBooNE, the three experiments form Fermilab’s Short-Baseline Neutrino Program and will yield a wealth of neutrino data. For example, in one month SBND will record more data than MicroBooNE collected in two years. Today’s results from MicroBooNE will help guide some of the research in the trio’s extensive portfolio.

Building further on MicroBooNE’s techniques and technology, liquid argon will also be used in the Deep Underground Neutrino Experiment (DUNE), a flagship international experiment hosted by Fermilab that already has more than 1,000 researchers from over 30 countries. DUNE will study neutrino oscillations by sending neutrinos 800 miles (1,300 km) through the earth to detectors at the Sanford Underground Research Facility in South Dakota. Combining short- and long-baseline neutrino experiments will give researchers insight into how these fundamental particles work.

Whether at Fermilab or underground in South Dakota, Laboratory researchers bring the technology and analytical understanding needed to probe the mysteries of particle physics. What awaits is unknown, but exciting.

“What we have found and continue to find with MicroBooNE will have important implications for future experiments,” Gollapinni said. “These results point us in a new direction and tell us to think outside the box. MicroBooNE’s journey to explore the exciting physics that awaits us has just begun, and there is much more that MicroBooNE will reveal in the years to come.”




Provided by Los Alamos National Laboratory

Citation: New results from MicroBooNE provide clues to the mystery of particle physics (2021, October 28) retrieved February 15, 2022 from https://phys.org/news/2021-10-results-microboone-clues-particle-physics.html

This document is subject to copyright. Except for fair use for purposes of private study or research, no part may be reproduced without written permission. The content is provided for information only.

New MicroBooNE Results Provide Clues to Particle Physics Mystery – Los Alamos Reporter

MicroBooNE detector being lowered into the Fermilab experimental facility. Photo courtesy Fermilab

LANL PRESS RELEASE

New results from a more-than-decade-long physics experiment offer insight into unexplained electron-like events seen in previous experiments. The results of the MicroBooNE experiment, while not confirming the existence of a proposed new particle, the sterile neutrino, open the way to exploring physics beyond the Standard Model, the reigning theory of nature’s fundamental particles and forces.

“The results so far from MicroBooNE make it more likely that the explanation for the anomalous electron-like events in the MiniBooNE experiment is physics beyond the Standard Model,” said William Louis, a physicist at Los Alamos National Laboratory and a member of the MicroBooNE collaboration. “What exactly the new physics is – that remains to be seen.”

The MicroBooNE experiment at the US Department of Energy’s Fermi National Accelerator Laboratory explores a striking anomaly in particle-beam experiments first discovered by researchers at Los Alamos National Laboratory. In the 1990s, the Liquid Scintillator Neutrino Detector (LSND) experiment at the Laboratory saw more electron-like events than expected from calculations based on the Standard Model.

In 2002, the MiniBooNE follow-up experiment at Fermilab began collecting data to further investigate the LSND result. MiniBooNE scientists also saw more electron-like events than Standard Model calculations predicted. But the MiniBooNE detector had a particular limitation: it was unable to tell the difference between electrons and photons (particles of light) near where the neutrino interacted.

The MicroBooNE experiment set out to explore the source of the excess-event anomaly. The MicroBooNE detector is built on state-of-the-art techniques and technology, using special light sensors and more than 8,000 painstakingly attached wires to capture particle tracks. It is housed in a 40-foot-long cylindrical container filled with 170 tons of pure liquid argon. Neutrinos strike the dense, transparent liquid, releasing additional particles that the electronics can record. The resulting images show detailed particle trajectories and, importantly, distinguish electrons from photons.

“Liquid argon technology is relatively new in neutrino physics, and MicroBooNE has been a pioneer for this technology, demonstrating what amazing physics can be done with it,” said Sowjanya Gollapinni, a Los Alamos physicist and an analysis co-lead. “We had to develop all the tools and techniques from scratch, including how to process the signal, how to reconstruct it, and how to do the calibration, among other things.”

MicroBooNE included a series of measurements: one measurement of photons and three measurements of electrons. In early October, the results of the photon measurement, which specifically looked for Delta radiative decay, provided the first direct evidence disfavoring anomalous single-photon production as the explanation for the MiniBooNE excess. Delta radiative decay was the only background that the MiniBooNE experiment could not directly constrain.

The three new electron analyses address the question of whether the excess is due to electron neutrinos scattering off argon nuclei and producing outgoing electrons. The new results disfavor this process as an explanation for the MiniBooNE excess, leaving the question of what causes the MiniBooNE anomaly unanswered.

“In my mind, the fact that neither photon nor electron production explains the excess makes understanding the MiniBooNE results more interesting and more likely to venture into some very interesting physics beyond the Standard Model,” Louis said.

With only half of the MicroBooNE data evaluated so far, possible explanations yet to be considered (or tested in future experiments) include the possibility that as-yet-unconfirmed sterile neutrinos decay into gamma rays. The decay of the axion, another hypothetical elementary particle, into a photon or an electron-positron pair could also be responsible. Sterile neutrinos and axions could be linked to the dark sector, a hypothetical realm of as-yet-unobserved physics and particles.

“The possibilities are endless,” said Gollapinni, “and MicroBooNE will be on a mission to explore each one with the full data set. The results pave the way for further physics experiments, but a full understanding of the results will also depend on our colleagues in theoretical physics, who are very intrigued by these results.”

MicroBooNE is part of a suite of neutrino experiments looking for answers. The ICARUS detector is starting to collect physics data, and the Short-Baseline Near Detector (SBND) will come online in 2023; both detectors use liquid-argon technology. Together with MicroBooNE, the three experiments form Fermilab’s Short-Baseline Neutrino Program and will yield a wealth of neutrino data. For example, in one month SBND will record more data than MicroBooNE collected in two years. Today’s results from MicroBooNE will help guide some of the research in the trio’s extensive portfolio.

Building further on MicroBooNE’s techniques and technology, liquid argon will also be used in the Deep Underground Neutrino Experiment (DUNE), a flagship international experiment hosted by Fermilab that already has more than 1,000 researchers from over 30 countries. DUNE will study neutrino oscillations by sending neutrinos 800 miles (1,300 km) through the earth to detectors at the Sanford Underground Research Facility in South Dakota. Combining short- and long-baseline neutrino experiments will give researchers insight into how these fundamental particles work.

Whether at Fermilab or underground in South Dakota, Laboratory researchers bring the technology and analytical understanding needed to probe the mysteries of particle physics. What awaits is unknown, but exciting.

“What we have found and continue to find with MicroBooNE will have important implications for future experiments,” Gollapinni said. “These results point us in a new direction and tell us to think outside the box. MicroBooNE’s journey to explore the exciting physics that awaits us has just begun, and there is much more that MicroBooNE will reveal in the years to come.”

Inside the MicroBooNE Time Projection Chamber detector. Photo courtesy of Fermilab

MicroBooNE is supported by the US Department of Energy, US National Science Foundation, Swiss National Science Foundation, UK Science and Technology Facilities Council, UK Royal Society and European Union Horizon 2020.

About Los Alamos National Laboratory
Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is operated by Triad, a public service-oriented national security science organization equally owned by its three founding members: the Battelle Memorial Institute (Battelle), the Texas A&M University System (TAMUS), and the Regents of the University of California (UC), for the Department of Energy’s National Nuclear Security Administration.

Los Alamos strengthens national security by ensuring the safety and reliability of America’s nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and addressing issues related to energy, the environment, infrastructure, health, and global security.

Argonne’s Aurora will accelerate particle physics discoveries at CERN

July 22, 2021 – The U.S. Department of Energy’s (DOE) Argonne National Laboratory will house one of the nation’s first exascale supercomputers when Aurora arrives in 2022. To prepare codes for the architecture and scale of the system, 15 research teams are taking part in the Aurora Early Science Program through the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science user facility. With access to pre-production time on the supercomputer, these researchers will be among the first in the world to use an exascale machine for science.

Early philosophers first formulated the idea of the atom around the fifth century BCE. And just when we thought we had understood its basic structure – protons, neutrons and electrons – theories and technologies emerged to prove us wrong. It turns out that there are still more fundamental particles, like quarks, bound together by aptly named gluons.

Physicists discovered many of these and other particles in the huge beasts of machinery we call colliders, helping to develop what we know today as the Standard Model of physics. But there are questions that keep nagging: Is there something even more fundamental? Is the Standard Model all there is?

Determined to find out, the high-energy physics community is working to integrate ever larger colliders and more sophisticated detectors with exascale computing systems. Among them is Walter Hopkins, assistant physicist at Argonne National Laboratory and collaborator on the ATLAS experiment at CERN’s Large Hadron Collider (LHC), near Geneva, Switzerland.

In collaboration with researchers from Argonne and Lawrence Berkeley National Laboratory, Hopkins is leading an Aurora Early Science Program project through the ALCF to prepare software used in LHC simulations for exascale computing architectures, including Argonne’s upcoming exascale machine, Aurora. Capable of a quintillion (a billion billion) calculations per second, Aurora is at the frontier of supercomputing and a match for the gargantuan scale of the next particle physics challenge.

The project was started several years ago by Argonne emeritus physicist James Proudfoot, who saw the distinct advantages of exascale computing in enhancing the impact of such complex science.

Adapting the codes to the new architecture

Collisions produced in the LHC occur within one of several detectors. The one the team is focusing on, ATLAS, witnesses billions of particle interactions every second and the new-particle signatures these collisions create in their wake.

One type of code the team is focusing on, called event generators, simulates the underlying physical processes that occur at interaction points in the 17-mile-circumference collider ring. Aligning the physics produced by the software with that of the Standard Model helps researchers accurately simulate collisions and predict the types, trajectories and energies of residual particles.

Detecting physics in this way creates a mountain of data and requires equally significant computational time. And now CERN is upping the ante as it prepares to upgrade the LHC’s luminosity, enabling more particle interactions and a 20-fold increase in data output.

As the team looks to Aurora to handle this increase in its simulation needs, the machine is not without some challenges.

Workers inside ATLAS, one of several primary detectors at CERN’s Large Hadron Collider. ATLAS witnesses a billion particle interactions every second and the signatures of new particles created in proton-proton collisions at near light speed. (Picture: CERN)

Until recently, event generators ran on CPUs (central processing units). Although CPUs work fast, they can usually perform only a few operations at a time.

Aurora will feature both CPUs and GPUs (graphics processing units), the choice of gamers around the world. GPUs can handle many operations by breaking them up into thousands of smaller tasks spread across many cores, the engines that drive both types of units.

But it takes a lot of effort to move CPU-based simulations to GPUs efficiently, Hopkins notes. So making this move to prepare for both Aurora and the onslaught of new LHC data presents several challenges, which have become a central part of the team’s goal.

“We want to be able to use Aurora to help us meet these challenges,” says Hopkins, “but that forces us to study computing architectures that are new to us and our code base. For example, we focus on a generator used in ATLAS, called MadGraph, which runs on GPUs, which are more parallel and have different memory management requirements.”

MadGraph, a particle-interaction simulation code written by an international team of high-energy physics theorists, meets the simulation needs of the LHC.

Simulation and AI support experimental work

The LHC has played an important role in bringing the Standard Model’s predictions to fruition. Most famously, the Standard Model predicted the existence of the Higgs boson, which imparts mass to fundamental particles; ATLAS and its counterpart detector, CMS, confirmed the existence of the Higgs in 2012.

But, as is often the case in science, big discoveries can lead to bigger questions, many of which the Standard Model cannot answer. Why does the Higgs have the mass it does? What is dark matter?

“The reason for this very large LHC upgrade is that we hope to find that needle in the haystack, that we will find an anomaly in the dataset that offers a hint of physics beyond the Standard Model,” says Hopkins.

A combination of computing power, simulation, experiment, and artificial intelligence (AI) will greatly aid this research by providing accuracy in both prediction and identification.

When the ATLAS detector witnesses these particle collisions, it records them as electronic signals. These are reconstructed as pixel-like bursts of energy that can correspond, for example, to the passage of an electron.

“But just like in AI, where the canonical example is identifying cats and dogs in images, we have algorithms that identify and reconstruct those electronic signals into electrons, protons, and other particles,” says Taylor Childers, a computer scientist at the ALCF and a member of the team.

Data reconstructed from real collision events is then compared to simulated data to look for differences in patterns. This is where the accuracy of the physics models comes in. If the models perform well and the real and simulated data still do not match, you keep measuring and eliminating anomalies until it is likely that you have found that needle, that something that doesn’t fit the Standard Model.

The team is also using AI to quantify uncertainty, to determine the likelihood that they have correctly identified a particle.

Humans can identify particles to a limited extent: a few parameters such as momentum and position might tell us that a certain particle is an electron. But base that characterization on 10 interrelated parameters, and it’s another story altogether.

“This is where artificial intelligence really shines, especially if these input parameters are correlated, like the momentum of the particles around an electron and the momentum of the electron itself,” says Hopkins. “These correlations are hard to deal with analytically, but because we have so much simulation data, we can teach artificial intelligence, and it can tell us: it’s an electron with this probability, because I have all this input information.”
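
To make the idea concrete, the sketch below shows, in Python, the kind of probabilistic classification Hopkins describes: a model trained on many simulated examples with 10 correlated input features that reports a probability rather than a hard yes-or-no answer. It is a toy illustration on synthetic data, not ATLAS software; the feature construction, labels and choice of classifier are all assumptions made for the example.

    # Toy illustration only: synthetic data, not ATLAS code or data.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 20_000

    # Ten correlated detector-level features (for example, the momentum of the
    # candidate and of nearby particles); a shared latent term imposes correlations.
    latent = rng.normal(size=n)
    X = 0.7 * latent[:, None] + 0.3 * rng.normal(size=(n, 10))

    # Toy truth label: "electron" (1) vs. "not an electron" (0), depending
    # nonlinearly on the correlated features plus noise.
    score = X[:, 0] + 0.5 * X[:, 1] * X[:, 2] + 0.3 * rng.normal(size=n)
    y = (score > 0.5).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = GradientBoostingClassifier().fit(X_train, y_train)

    # predict_proba returns "it's an electron with this probability"
    probabilities = clf.predict_proba(X_test)[:, 1]
    print("mean predicted electron probability:", probabilities.mean())
    print("test accuracy:", clf.score(X_test, y_test))

In the real experiment, the training examples come from detailed detector simulation rather than random numbers, but the workflow is the same: learn from simulated events, then assign each reconstructed signal a probability of being a given particle.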

Exascale computing and the way forward

In preparation for Aurora, the team continues to work on programming languages for new architectures and code to run on the Intel hardware that will be used on Aurora, as well as hardware from other vendors.

“Part of the R&D we do with our partner Intel is to make sure the hardware does what we expect it to do and does it efficiently,” says Childers. “Having a machine like Aurora will give us lots of computing power and lots of nodes to effectively reduce solution time, especially when we move to the upgraded LHC.”

The solution is an answer to a fundamental question: Is there more beyond the Standard Model? It is one that could have repercussions unimaginable a hundred years from now, Hopkins notes.

“Basic research can give us knowledge that can lead to societal transformation, but if we don’t do research, it won’t lead anywhere,” he says.

The ALCF is a DOE Office of Science user facility.

Funding for this project was provided by the DOE Office of Science: Offices of High Energy Physics and Advanced Scientific Computing Research. ATLAS is an international collaboration supported by the DOE.

About the ALCF

The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding across a wide range of disciplines. Supported by the Advanced Scientific Computing Research (ASCR) program of the U.S. Department of Energy’s (DOE) Office of Science, the ALCF is one of two DOE advanced computing facilities dedicated to open science.

Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts cutting-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state, and municipal agencies to help them solve their specific problems, advance American scientific leadership, and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the US Department of Energy’s Office of Science.

The U.S. Department of Energy’s Office of Science is the largest supporter of basic physical science research in the United States and works to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.



Source: JOHN SPIZZIRRI, ALCF

Muons don’t fit the Standard Model of particle physics

You are free to share this article under the Attribution 4.0 International License.

Fundamental particles called muons behave in ways that scientists’ best theory to date, the Standard Model of particle physics, does not predict, the researchers report.

The discovery comes from early results from the Muon g-2 experiment at the US Department of Energy’s Fermi National Accelerator Laboratory.

“This experiment is a bit like a detective novel.”

This historic result confirms a discrepancy that has plagued researchers for decades.

The strong evidence that muons deviate from the Standard Model calculation could hint at some exciting new physics. The muons in this experiment act as a window into the subatomic world and could interact with as yet unknown particles or forces.

“This experiment is a bit like a detective novel,” says team member David Hertzog, a University of Washington physics professor and founding spokesperson for the experiment. “We analyzed data from the inaugural run of Muon g-2 at Fermilab and discovered that the Standard Model alone cannot explain what we found. Something else, perhaps beyond the Standard Model, may be required.”

Muons are about 200 times more massive than their cousin, the electron. They occur naturally when cosmic rays hit the Earth’s atmosphere, and Fermilab’s particle accelerators can produce them in large numbers. Like electrons, muons act as if they have a small internal magnet. In a strong magnetic field, the direction of the muon’s magnet precesses, or “wobbles,” much like the axis of a spinning top. The strength of the internal magnet determines the rate at which the muon precesses in an external magnetic field and is described by a number known as the g-factor. This number can be calculated with ultra-high precision.

As muons circulate in the Muon g-2 magnet, they also interact with a “quantum foam” of subatomic particles popping in and out of existence. Interactions with these short-lived particles affect the value of the g-factor, causing the muons’ precession to speed up or slow down very slightly. The Standard Model predicts with great accuracy what the value of this “anomalous magnetic moment” should be. But if the quantum foam contains additional forces or particles not accounted for by the Standard Model, that would further alter the muon’s g-factor.

Hertzog, then at the University of Illinois, was a lead scientist on the previous experiment at Brookhaven National Laboratory. That experiment ended in 2001 and offered clues that the muon’s behavior did not conform to the Standard Model. The new measurement from the Muon g-2 experiment at Fermilab strongly agrees with the value found at Brookhaven and deviates from theory; it is the most precise measurement to date.

The accepted theoretical values for the muon are:

  • g-factor: 2.00233183620(86)
  • anomalous magnetic moment: 0.00116591810(43)

The new experimental world-average results announced today by the Muon g-2 collaboration are:

  • g-factor: 2.00233184122(82)
  • anomalous magnetic moment: 0.00116592061(41)

The combined results from Fermilab and Brookhaven differ from the theoretical prediction with a significance of 4.2 sigma, a little short of the 5 sigma (5 standard deviations) that scientists require to claim a discovery. But it is still compelling evidence of new physics. The probability that the results are a statistical fluctuation is approximately 1 in 40,000.
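
For readers who want to see how those numbers fit together: the anomalous magnetic moment is defined as a = (g - 2)/2, and the quoted significance can be turned into an approximate probability. The short Python sketch below simply rechecks the figures quoted in this article using SciPy; it is an illustration, not collaboration code, and the probability conversion assumes a simple two-sided Gaussian fluctuation.

    # Back-of-the-envelope check of the values quoted above (illustrative only).
    from scipy.stats import norm

    g_theory = 2.00233183620
    g_exp = 2.00233184122

    # Anomalous magnetic moment: a = (g - 2) / 2
    a_theory = (g_theory - 2) / 2   # ~0.00116591810
    a_exp = (g_exp - 2) / 2         # ~0.00116592061
    print(f"a (theory)     = {a_theory:.11f}")
    print(f"a (experiment) = {a_exp:.11f}")
    print(f"difference     = {a_exp - a_theory:.2e}")  # ~2.5e-9

    # Chance of a fluctuation of at least 4.2 sigma (two-sided): roughly 1 in 40,000
    p = 2 * norm.sf(4.2)
    print(f"p-value ~ {p:.1e}  (about 1 in {1 / p:,.0f})")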

“This result from the first run of the Fermilab Muon g-2 experiment is arguably the most anticipated result in particle physics in recent years,” says Martin Hoferichter, assistant professor at the University of Bern and a member of the theory collaboration that predicts the Standard Model value. “After almost a decade, it’s great to see this massive effort finally come to fruition.”

The Fermilab experiment, which is ongoing, reuses the main component of the Brookhaven experiment, a 50-foot-diameter superconducting magnetic storage ring. In 2013, it was transported 3,200 miles by land and sea from Long Island to suburban Chicago, where scientists could take advantage of Fermilab’s particle accelerators to produce the most intense muon beam in the United States. Over the next four years, researchers assembled the experiment; tuned and calibrated an incredibly uniform magnetic field; developed new techniques, instruments and simulations; and thoroughly tested the entire system.

The Muon g-2 experiment sends a beam of muons into the storage ring, where they circulate thousands of times at near the speed of light. Detectors lining the ring allow scientists to determine how fast muons “wobble”.

Many of the experiment’s sensors and detectors were built at the University of Washington, including instruments that measure the muon beam as it enters the storage ring and detect the telltale particles that appear when muons decay. Dozens of scientists – including professors, postdoctoral researchers, technicians, graduate students and undergraduates – worked to assemble these sensitive instruments, then set them up and monitored them at Fermilab.

“The prospect of the new result triggered a coordinated theoretical effort to provide our experimental colleagues with a robust, consensus Standard Model prediction,” says Hoferichter. “Future runs will motivate further refinements, to enable a conclusive statement on whether physics beyond the Standard Model lurks in the muon’s anomalous magnetic moment.”

In its first year of operation, 2018, the Fermilab experiment collected more data than all previous muon g-factor experiments combined. The Muon g-2 collaboration has now completed the analysis of the motion of more than 8 billion muons from that first run.

Analysis of data from the experiment’s second and third runs is underway; the fourth run is in progress, and a fifth is planned. Combining the results from all five runs will give scientists an even more precise measurement of the muon’s “wobble,” revealing with greater certainty whether new physics is hiding in the quantum foam.

“So far, we’ve analyzed less than 6% of the data the experiment will eventually collect,” says Fermilab scientist Chris Polly, co-spokesperson for the current experiment, who was a graduate student at the University of Illinois under Hertzog during the Brookhaven experiment. “While these early results tell us there is an intriguing difference with the Standard Model, we will learn much more over the next two years.”

“With these exciting results, our team, especially our students, are excited to push hard on analyzing the remaining data and taking future data to achieve our ultimate goal of accuracy,” says Peter Kammel, research professor of physics at the University of Washington.

An article on the research appears in Physical Review Letters. Hertzog will present the findings at a University of Washington Physics Department symposium on April 12.

The Muon g-2 experiment is an international collaboration between Fermilab in Illinois and more than 200 scientists from 35 institutions in seven countries.

Source: University of Washington

Brookhaven Lab Appoints New Director of Nuclear and Particle Physics Branch

Haiyan Gao, nuclear physicist and professor, will join the lab as associate lab director for nuclear and particle physics

UPTON, NY – Haiyan Gao, currently the Henry W. Newson Distinguished Professor of Physics at Duke University, will join the U.S. Department of Energy’s Brookhaven National Laboratory as Associate Laboratory Director (ALD) for Nuclear and Particle Physics (NPP) on or around June 1, 2021.

Gao, who has a long background in nuclear physics, will help develop Brookhaven’s collective long-term vision for the next 10 years. She will also work across the Lab and beyond to develop its emerging expertise for the future Electron-Ion Collider (EIC), a one-of-a-kind nuclear physics research facility to be built at the Lab over the next decade, after Brookhaven’s flagship nuclear physics facility, the Relativistic Heavy Ion Collider, completes its research mission.

“The Nuclear and Particle Physics Branch is internationally well-known in the fields of accelerator science, high-energy physics and nuclear physics,” Gao said. “I am very excited about the opportunity and the impact that I will be able to have in collaboration with many people at the Lab.”

Gao will take over from Dmitri Denisov, Deputy ALD for High Energy Physics, who became interim NPP ALD after Berndt Mueller stepped down last year to return to teaching and research full time at Duke.

“We are delighted to welcome Haiyan to Brookhaven at such an exciting time for nuclear and particle physics,” said Brookhaven Laboratory Director Doon Gibbs. “Her perspective and experience will be instrumental in advancing science here in the lab and beyond.”

Gao joins Brookhaven Lab as it develops the EIC in collaboration with scientists at the DOE’s Thomas Jefferson National Accelerator Facility. The EIC will offer scientists a deeper look at the building blocks of visible matter and the most powerful force in nature.

“What’s important in the end is that we really deliver the science,” she said.

The facility is one the nuclear physics community has campaigned for for many years, in order to work toward a more complete understanding of nucleons and atomic nuclei within quantum chromodynamics, the theory that describes the strong interaction, Gao noted. It will also allow scientists to discover new physics beyond the Standard Model of particle physics, Gao said.

“This facility also gives us a wonderful opportunity to train a highly motivated scientific and technical workforce in this country,” she added.

In addition to her expertise in nuclear physics, Gao has a keen interest in promoting diversity, equity and inclusion in science.

Gao earned her doctorate in physics from the California Institute of Technology in 1994. Since then, she has held several positions in the field, including assistant physicist at Argonne National Laboratory and assistant and then associate professor of physics at the Massachusetts Institute of Technology.

While at Duke, Gao also served as the Founding Professor of Physics and Vice Chancellor for Academic Affairs at Duke Kunshan University in Kunshan, China, where she spent some of her childhood years.

Gao’s research interests at Duke have included the structure of the nucleon, the search for exotic states of quantum chromodynamics, fundamental symmetry studies at low energy to search for new physics beyond the Standard Model of electroweak interactions, and the development of polarized targets.

She was elected a Fellow of the American Physical Society in 2007 and won the U.S. Department of Energy’s Outstanding Junior Investigator Award in 2000.

Brookhaven National Laboratory is supported by the US Department of Energy’s Office of Science. The Office of Science is the largest supporter of basic physical science research in the United States and works to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.

Follow @BrookhavenLab on Twitter or find us on Facebook.

Coffea accelerates the analysis of particle physics data

Analyzing the mountains of data generated by the Large Hadron Collider at the European CERN laboratory takes so long that even computers need coffee. Or rather, Coffea — Columnar Object Framework for Effective Analysis.

A package in the Python programming language, Coffea (pronounced like the stimulating drink) speeds up the analysis of massive datasets in high-energy physics research. Although Coffea streamlines the calculations, the main objective of the software is to optimize the scientists’ time.

“A human’s efficiency in producing scientific results is of course affected by the tools you have,” said Matteo Cremonesi, postdoctoral fellow at the US Department of Energy’s Fermi National Accelerator Laboratory. “If it takes me more than a day to get a single number out of a calculation – which often happens in high-energy physics – it’s going to hurt my effectiveness as a scientist.”

Frustrated by the tedious manual labor they faced when writing computer code to analyze LHC data, Cremonesi and Fermilab scientist Lindsey Gray assembled a team of Fermilab researchers in 2018 to adapt big-data techniques to the most difficult questions in high-energy physics. Since then, a dozen research groups on the CMS experiment, one of the two large general-purpose detectors at the LHC, have adopted Coffea for their work.

A dozen research groups on the CMS experiment at the Large Hadron Collider have adopted the Coffea data-analysis tool for their work. Using information about particles generated in collisions, Coffea enables broad statistical analyses that improve researchers’ understanding of the underlying physics, enabling faster run times and more efficient use of computing resources. Photo: CERN

Using information about particles generated in collisions, Coffea enables broad statistical analyses that refine researchers’ understanding of the underlying physics. (The LHC data-processing facilities perform the initial conversion of the raw data into a format that particle physicists can use for analysis.) A typical analysis of the current LHC dataset involves processing approximately 10 billion particle events, which can total over 50 terabytes of data. That’s the data equivalent of about 25,000 hours of streaming video on Netflix.

At the heart of Fermilab’s analysis tool is the move from a method known as event-loop analysis to one called columnar analysis.

“You have a choice if you want to iterate over each row and do an operation in the columns or if you want to iterate over the operations you do and attack all the rows at once,” explained Fermilab postdoctoral researcher Nick Smith, the main developer of Coffea. “It’s kind of an order of operations.”

For example, imagine that for each row you wanted to sum the numbers in three columns. In event-loop analysis, you would start by adding the three numbers in the first row, then add the three numbers in the second row, then move on to the third row, and so on. With a columnar approach, on the other hand, you would start by adding the first and second columns for all rows, then add that result to the third column for all rows.

“Either way, the end result would be the same,” Smith said. “But there are trade-offs you make under the hood, in the machine, that have a big impact on efficiency.”

In datasets with many rows, columnar analysis performs about 100 times faster than event loop analysis in Python. Yet before Coffea, particle physicists primarily used event loop analysis in their work, even for datasets with millions or billions of collisions.
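
The difference between the two styles can be seen in a few lines of Python with NumPy. The snippet below is a toy version of the sum-three-columns example above, not Coffea code: the event-loop version walks through the rows one at a time, while the columnar version operates on whole columns at once.

    # Toy illustration of event-loop vs. columnar analysis (not Coffea itself).
    import numpy as np

    rng = np.random.default_rng(42)
    table = rng.random((1_000_000, 3))  # one row per "event", three columns

    # Event-loop style: visit each row and sum its three numbers, one row at a time.
    loop_sums = [row[0] + row[1] + row[2] for row in table]

    # Columnar style: add the first and second columns for all rows,
    # then add that result to the third column for all rows.
    columnar_sums = table[:, 0] + table[:, 1] + table[:, 2]

    assert np.allclose(loop_sums, columnar_sums)  # same answer, very different speed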

The Fermilab researchers decided to pursue a columnar approach, but they faced a daunting challenge: high-energy physics data cannot easily be represented in tabular form with rows and columns. One particle collision might produce a multitude of muons and few electrons, while the next might produce no muons and many electrons. Using a library of Python code called Awkward Array, the team devised a way to convert the jagged, nested structure of LHC data into arrays compatible with columnar analysis. Generally, each row corresponds to a collision, and each column corresponds to a property of a particle created during the collision.
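
A minimal sketch of that idea, using the Awkward Array library mentioned above, might look like the following; the field names and values are invented for illustration. Each entry is one collision, each field holds a variable-length list, and counting or summing can still run column by column.

    # Minimal Awkward Array sketch; field names and values are invented examples.
    import awkward as ak

    # One entry per collision; each collision has a different number of muons.
    events = ak.Array([
        {"muon_pt": [41.2, 17.5, 6.3], "electron_pt": []},
        {"muon_pt": [], "electron_pt": [25.1, 11.0]},
        {"muon_pt": [63.7], "electron_pt": [8.4]},
    ])

    print(ak.num(events["muon_pt"]))               # muons per event: [3, 0, 1]
    print(ak.sum(events["electron_pt"], axis=1))   # summed electron pT per event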

The benefits of Coffea extend beyond faster execution times (minutes instead of hours or days when it comes to interpreted Python code) and more efficient use of computing resources. The software takes mundane coding decisions out of the hands of scientists, allowing them to work at a more abstract level with less chance of making mistakes.

“Researchers aren’t here to be programmers,” Smith said. “They are there to be data scientists.”

Cremonesi, who researches dark matter at CMS, was among the first researchers to use Coffea without a backup system. At first, he and the rest of the Fermilab team actively sought to persuade other groups to try the tool. Now researchers frequently approach them asking how to apply Coffea to their own work.

Soon, the use of Coffea will extend beyond CMS. Researchers at the Institute for Research and Innovation in Software for High Energy Physics (IRIS-HEP), supported by the US National Science Foundation, plan to integrate Coffea into future analysis systems for CMS and for ATLAS, the LHC’s other large general-purpose detector. An LHC upgrade known as the High-Luminosity LHC, due for completion in the mid-2020s, will record around 100 times more data, making the efficient data analysis offered by Coffea even more valuable to international collaborators on the LHC experiments.

Going forward, the Fermilab team also plans to break Coffea into multiple Python packages, allowing researchers to use only the pieces relevant to their work. For example, some scientists use Coffea primarily for its histogramming functionality, Gray said.

For the Fermilab researchers, the success of Coffea reflects a necessary shift in the mindset of particle physicists.

“Historically, the way we do science has focused a lot on the material component of creating an experiment,” Cremonesi said. “But we have reached an era in physics research where managing the software component of our scientific process is just as important.”

Coffea promises to synchronize high-energy physics with recent advances in big data in other scientific fields. This cross-pollination may prove to be Coffea’s most important benefit.

“I think it’s important for us as a high-energy physics community to think about what kind of skills we’re imparting to the people we’re training,” Gray said. “Ensuring that we, as a field, are relevant to the rest of the world when it comes to data science is a good thing to do.”

US participation in CMS is supported by the Department of Energy Office of Science.

Fermilab is supported by the US Department of Energy’s Office of Science. The Office of Science is the largest supporter of basic physical science research in the United States and works to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.

Experimental and theoretical physicists can now apply for the Rosen Scholar Fellowship to work at the Los Alamos Neutron Science Center

Newswise – LOS ALAMOS, New Mexico, January 11, 2021 – Experimental and theoretical scientists seeking an opportunity to pursue research in neutron scattering, dynamic materials, isotope production, and applied and basic nuclear physics at the Los Alamos Neutron Science Center (LANSCE) can apply for the Rosen Scholar Fellowship.

Applications are due March 1, 2021.

“As the flagship experimental facility of Los Alamos National Laboratory, LANSCE hosts research that touches nearly every aspect of the Laboratory’s mission, and we are always looking for opportunities to advance this work in innovative ways,” said Mike Furlanetto, LANSCE User Facility manager. “The Rosen Scholar Fellowship offers the opportunity to combine the unique tools of LANSCE with some of the most creative ideas from academia to answer cutting-edge scientific questions. It’s also a perfect way to commemorate the creativity of Louis Rosen, the visionary behind LANSCE.”

Research at LANSCE currently includes materials science using neutron scattering at the Lujan Center, dynamic materials science at the Proton Radiography Facility, isotope production at the Isotope Production Facility, and applied and basic nuclear physics research at the Ultracold Neutron facility, the Weapons Neutron Research facility and the Lujan Center.

The Rosen Scholar Fellowship is reserved for individuals recognized as scientific leaders in a field of research currently carried out at LANSCE and who exemplify the innovative and visionary qualities of Louis Rosen. The fellowship was created to honor Rosen’s memory, accomplishments, hard work, and affection for the wide range of science practiced at LANSCE. Louis Rosen’s outstanding leadership and scientific career at Los Alamos spanned six and a half decades and included both the initial concept of the Los Alamos Meson Physics Facility in the 1960s and its commissioning in 1972.

The Rosen Fellow is expected to be in residence at LANSCE and to contribute scientific expertise to both LANSCE and the wider Los Alamos scientific community. The position will support the Rosen Fellow at his or her current salary, including relocation costs, for up to one year. The start date, end date and duration (maximum of 12 months) of the fellowship are flexible, but must fall between October 1, 2021 and September 30, 2022.

Past Rosen Scholars can attest to the value of the fellowship. “I was extremely excited and honored to be named a 2020 Rosen Fellow, which gave me the opportunity to dedicate a full semester to working in the lab and with kindred spirits in the subatomic physics group,” said Tim Chupp, professor of physics, applied physics and biomedical engineering at the University of Michigan.

“We developed the lab’s neutron electric dipole moment experiment,” Chupp said. “Los Alamos has the best source of ultracold neutrons in the world. The dipole moment would arise from as-yet-unknown elementary particle forces that may also have produced the dominance of matter over antimatter in the early universe. I especially enjoyed working with and learning from the physicists, engineers and staff at Los Alamos, and hopefully bringing some of my experience to this awesome project.”

“Being the 2019 Rosen Scholar has been an incredible experience from a technical, professional and human point of view,” said Paolo Rech, associate professor at the Institute of Informatics at the Federal University of Rio Grande do Sul in Brazil. Rech called Los Alamos National Laboratory “a unique place” where exceptional researchers from the most varied fields meet.

“Whenever you have a doubt or a question, you’ll be sure to find someone with an answer, or better yet, with more questions,” Rech said. “It stimulates research. Los Alamos is the right place to have new ideas and implement them. In addition, the staff is very helpful, which makes you enthusiastic and productive from day one. Finally, Los Alamos is a wonderful place, where it is easy to be inspired, to discover impressive landscapes and peaceful corners. I couldn’t be more grateful and proud of what we’ve accomplished during my year at the Lab.”

Further information on LANSCE is available at http://lansce.lanl.gov

About Los Alamos National Laboratory
Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is operated by Triad, a public service-oriented national security science organization equally owned by its three founding members: the Battelle Memorial Institute (Battelle), the Texas A&M University System (TAMUS), and the Regents of the University of California (UC), for the Department of Energy’s National Nuclear Security Administration.

Los Alamos strengthens national security by ensuring the safety and reliability of America’s nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and addressing issues related to energy, the environment, infrastructure, health, and global security.
LA-UR-21-20085

Secondary school teachers, meet particle physics

Imagine this: a stationary object such as a vase suddenly explodes, sending fragments flying. Given the final energies and momenta of the fragments, can you determine the mass of the object before it shattered?

Dave Fish introduces his students to this common momentum-conservation problem, with a twist. Instead of describing the explosion of a macroscopic object like a vase, he describes the transformation of a top quark and a top antiquark into other fundamental particles.
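
The physics behind the twist is the relativistic invariant mass: in units where c = 1, the mass of the parent system satisfies M^2 = (total energy)^2 - (total momentum)^2, so adding up the fragments' energies and momentum components recovers the original mass. The short Python sketch below shows that calculation; the fragment values are made up purely for illustration.

    # Illustrative calculation only; the fragment energies and momenta are made up.
    import math

    # Each fragment: (energy, px, py, pz) in GeV, in natural units where c = 1.
    fragments = [
        (180.0, 95.0, 40.0, 110.0),
        (160.0, -70.0, 55.0, -120.0),
        (90.0, -20.0, -80.0, 15.0),
    ]

    E = sum(f[0] for f in fragments)
    px = sum(f[1] for f in fragments)
    py = sum(f[2] for f in fragments)
    pz = sum(f[3] for f in fragments)

    # Invariant mass of the parent system: M^2 = E^2 - |p|^2  (with c = 1)
    M = math.sqrt(E * E - (px * px + py * py + pz * pz))
    print(f"invariant mass of the parent system: {M:.1f} GeV")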

Fish teaches high school physics and is a teacher in residence at the Perimeter Institute for Theoretical Physics in Ontario. In Canada, particle physics is “one of those things that teachers tend to leave until the end of the course, and then they run out of time,” says Fish. “Most of us as secondary school teachers feel overwhelmed by the content.”

Particle physics makes its appearance in the curriculum of the International Baccalaureate, a program recognized as an entrance qualification to higher education by many universities around the world. The subject also appears in some state curricula, such as that of North Rhine-Westphalia in Germany. But in general, “there aren’t many programs that deal explicitly with particle physics,” says Jeff Wiener, head of teacher programs at CERN. “Those who do usually focus on rather boring stuff like, ‘Name two leptons.'”

Putting particles in the curriculum

Many high school science teachers who would like to teach particle physics say they feel insufficiently informed about the subject or don’t know how to include it without sacrificing required curriculum topics.

Fish and Wiener are two of many people hoping to change that. They see many opportunities to incorporate particle physics into standard curricula focused on general physics concepts. To teach conservation of momentum, try using real data from the discovery of the top quark (an activity developed by educators at the US Department of Energy’s Fermi National Accelerator Laboratory). To demonstrate the movement of charged particles in magnetic fields, show photographs of particle detectors called bubble chambers. To give an example of circular motion, discuss the mystery of dark matter.
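
For the magnetic-field suggestion, the relevant classroom relation is the radius of circular motion of a charged particle moving perpendicular to a uniform field, r = p/(qB). The short sketch below uses illustrative, assumed numbers to show the kind of calculation students can attach to a bubble-chamber photograph.

    # Radius of curvature of a charged-particle track (illustrative numbers only).
    # For charge q and momentum p perpendicular to a field B: r = p / (q * B)
    e = 1.602e-19            # elementary charge, in coulombs
    gev_over_c = 5.344e-19   # 1 GeV/c of momentum, in kg*m/s

    p = 1.0 * gev_over_c     # a 1 GeV/c track (assumed example value)
    B = 1.5                  # magnetic field in tesla (assumed example value)

    r = p / (e * B)
    print(f"radius of curvature: {r:.2f} m")   # about 2.2 m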

One of Fish’s former students, Nikolina Ilic, considers a dark matter project she undertook in her class a turning point in her education. “I realized that we don’t know what 95% of the universe is made of, and that blew my mind,” she says. “That’s when I decided to pursue particle physics.”

Ilic continued her doctoral research at CERN, where she contributed to the statistical analysis for the discovery of the Higgs boson.

In years when he is not teaching high school students, Fish leads workshops at the Perimeter Institute to help other teachers bring particle physics into their classrooms. Each year, approximately 40 or 50 teachers from Canada and other countries attend a week-long EinsteinPlus workshop, participating in a variety of collaborative activities designed to teach them about modern physics. One of the most popular is a card-sorting game that teaches Standard Model patterns and symmetries. In each activity, “we ask the teachers to be the students and ask the questions that the students would ask,” says Fish.

Fermilab organizes similar teacher workshops covering various physics topics for primary to secondary school teachers.

As the COVID-19 pandemic has forced many programs to move online, Fermilab has focused on finding ways to interact with teachers and students virtually. “We have career talks with lab staff, classroom presentations that we create with teachers and host virtually, Virtual Ask-a-Scientist, and Saturday Morning Physics,” says Amanda Early, an education program manager at Fermilab who runs K-12 physical science programs.

Each year, Fermilab organizes programs for educators and students, engaging them with the science of Fermilab. “The more you expose students to particle physics — the size and scale of it and its benefits — the more opportunities children will see to engage in science,” says Early.

In 2020, one of the Education Group’s summer science institutes focused specifically on helping high school teachers adapt modern physics lessons to the Next Generation Science Standards used in many US states. Approximately 80 teachers from the Chicago area and across the country participated in the five-day interactive workshop, which in 2020 was offered online.

Next Generation Science Standards do not explicitly mention particle physics. But the cross-cutting concepts and scientific and engineering practices that frame them dovetail nicely with the subject, says David Torpe, an Illinois high school science teacher who has led professional development workshops at Fermilab for six years.

“Let’s talk about process, let’s talk about how particle physicists analyze data, let’s talk about how they solve problems,” says Torpe. “The ideas of energy and cause and effect naturally fit in too. I think a good strategy is to find a bit of particle physics that you find interesting and insert it here or there.”

Bringing teachers to CERN, and CERN to teachers

Across the Atlantic, in Europe, CERN’s teacher programs attract more than 1,000 secondary school teachers from around the world to Geneva each year. Between physics lessons, the teachers visit the laboratories and hold question-and-answer sessions with CERN scientists.

“The idea was that when we returned to Mexico, we would be ambassadors and encourage students to see that it is possible to go and do research at CERN,” explains Eduardo Morales Gamboa, who attended the Spanish-language teacher program in 2019.

Since visiting the massive CMS detector and seeing particle tracks in a homemade cloud chamber, he has incorporated particle physics – and the many useful applications that have come from it – into his class discussions of the intersections of science, technology and society. Eventually, he says, he hopes to build a cloud chamber with his students.

According to Wiener, Morales Gamboa’s experience is common. Many alumni of the teacher programs even return to CERN, this time bringing their students along, to ignite the next generation’s enthusiasm for particle physics.

The success of CERN’s outreach efforts stems in part from integration with physics education research. Indeed, CERN teacher programs are designed to equip participants with knowledge not only of particle physics, but also of the best pedagogical practices for science education.

One such practice is to have students move through “predict-observe-explain” cycles. “You encourage students to make a prediction of what will happen before doing the experiment. This way you make sure that they first activate their previous knowledge and become curious about the result,” says Julia Woithe, who coordinates the hands-on learning labs at CERN. “Then, if they’re surprised by the observed result, they have to work out as a team how to explain the differences between their predictions and their observations. This usually leads to a powerful ‘eureka!’ moment.”

In addition to organizing events at CERN, Wiener traveled to India last year to collaborate with educators from the International School of Geneva on the first such science education program in South Asia. Eighty teachers from the region participated in the week-long program at the Shiv Nadar School in Noida, near New Delhi.

Vinita Sharat, the school’s STEAM coordinator, has taught particle physics for a decade but remembers initially facing resistance at organizations where she previously worked. “The first challenge is to change the mindset of those in authority,” she says. “They asked why I was teaching it, since it’s not part of the curriculum.”

Her students, on the other hand, had no such reservations. Some found particle physics so fascinating that they stayed online until midnight to discuss quarks and leptons with Sharat. “Students will always be ready to learn something related to nature,” she says.

Sharat fosters her students’ creative side in her particle physics classes by encouraging them to write poems, make videos or choreograph dances to explain the concepts they are studying. Like Fish, Sharat has stayed in touch with several former students whom she inspired to pursue careers in physics.

“The basis of everything”

After the CERN program at her school, Sharat hopes more teachers across South Asia will incorporate particle physics into their classrooms. And Wiener plans to lead more teaching workshops around the world in the future.

For now, COVID-19 has interrupted in-person professional development workshops. But teachers can still access resources online: CERN’s hands-on learning lab S’Cool LAB (until recently run by Woithe), the Perimeter Institute, Fermilab and QuarkNet all offer free downloads of their interactive teaching materials.

For Morales Gamboa, the benefits of teaching particle physics in high school go beyond encouraging a few students to pursue careers in this field. Talking about connections to engineering shows how abstract scientific ideas are linked to everyday life, while describing massive international projects conveys the key collaborative spirit of modern science.

Stacy Gates, an Illinois high school science teacher who taught at Fermilab’s Summer High School Physics Institute alongside Torpe in 2020, points out that teaching particle physics fosters critical thinking. “I encourage my students to question me when they don’t believe that particles can behave in a certain way,” she says. “It’s such an important skill because that’s what scientists do. They question everything and try to prove and disprove.”

Sharat agrees that particle physics holds valuable lessons. No matter where her students go in life, she wants them to understand that “particle physics is the foundation of everything,” she says.

“We should know the reason for our existence. We should know what we are made of.”

What is the Standard Model of Particle Physics? https://polkinghorne.org/what-is-the-standard-model-of-particle-physics/ Tue, 17 Nov 2020 08:00:00 +0000

By the US Department of Energy, November 17, 2020

The standard model includes particles of matter (quarks and leptons), force-carrying particles (bosons) and the Higgs boson. Credit: Illustration courtesy of Sandbox Studio, Chicago for Symmetry

The Standard Model of particle physics is scientists’ current best theory for describing the most basic building blocks of the universe. It explains how particles called quarks (which make up protons and neutrons) and leptons (which include electrons) make up all known matter. It also explains how force-carrying particles, which belong to a larger group of bosons, influence quarks and leptons.

The Standard Model explains three of the four fundamental forces that govern the universe: electromagnetism, the strong force and the weak force. Electromagnetism is carried by photons and involves the interaction of electric and magnetic fields. The strong force, carried by gluons, binds atomic nuclei together and makes them stable. The weak force, carried by the W and Z bosons, causes the nuclear reactions that have powered our Sun and other stars for billions of years. The fourth fundamental force is gravity, which the Standard Model does not adequately explain.
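For reference, the force–carrier pairings described above can be collected into a simple data structure; this is just an illustrative sketch, and the one-line “acts on” descriptions are simplifications rather than wording taken from this article.

```python
# Compact summary of the three forces the Standard Model describes (plus a note on gravity).
FORCES = {
    "electromagnetism": {"carrier": "photon",         "acts_on": "electrically charged particles"},
    "strong force":     {"carrier": "gluons",         "acts_on": "quarks and gluons; binds atomic nuclei"},
    "weak force":       {"carrier": "W and Z bosons", "acts_on": "quarks and leptons; drives nuclear decay"},
}

# Gravity is the fourth fundamental force; it is not described by the Standard Model.
for force, info in FORCES.items():
    print(f"{force}: carried by {info['carrier']}; acts on {info['acts_on']}")
```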

Despite its success in explaining the universe, the Standard Model has limitations. For example, the Higgs boson gives mass to quarks, charged leptons (like electrons), and the W and Z bosons. However, we don’t yet know whether the Higgs boson also gives mass to neutrinos – ghostly particles that only very rarely interact with other matter in the universe. Additionally, physicists understand that approximately 95% of the universe is not made of ordinary matter as we know it. Instead, much of the universe consists of dark matter and dark energy, which do not fit the Standard Model.

DOE Office of Science: Contributions to the Standard Model of Particle Physics

The DOE has a long history of supporting fundamental particle research. Five of the six types of quarks, one type of lepton, and all three neutrinos were discovered at what are now DOE National Laboratories. Researchers supported by the DOE Office of Science, often in collaboration with scientists around the world, contributed to the Nobel Prize-winning discoveries and measurements that refined the Standard Model. These efforts continue today, with experiments performing precision tests of the Standard Model and further improving measurements of particle properties and their interactions. Theorists work with experimental scientists to develop new ways to explore the Standard Model. This research may also provide insight into the kinds of particles and unknown forces that could explain dark matter and dark energy, as well as what happened to antimatter after the big bang.

Standard Model of Particle Physics Facts

  • All ordinary matter, including every atom on the periodic table of elements, consists of only three types of matter particles: up and down quarks, which make up the protons and neutrons in the nucleus, and the electrons that surround the nucleus (see the quick charge check after this list).
  • The complete Standard Model took a long time to build. Physicist J.J. Thomson discovered the electron in 1897, and Large Hadron Collider scientists found the final piece of the puzzle, the Higgs boson, in 2012.
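As the quick charge check promised above (the quark charges, +2/3 for up and −1/3 for down, are standard textbook values rather than figures quoted in this article):

$$
Q_{\text{proton}} = Q_u + Q_u + Q_d = \tfrac{2}{3} + \tfrac{2}{3} - \tfrac{1}{3} = +1,
\qquad
Q_{\text{neutron}} = Q_u + Q_d + Q_d = \tfrac{2}{3} - \tfrac{1}{3} - \tfrac{1}{3} = 0,
$$

which matches the familiar charges of the proton and the neutron built from those quarks.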
Scientists work to shed light on the Standard Model of particle physics https://polkinghorne.org/scientists-work-to-shed-light-on-the-standard-model-of-particle-physics/ Thu, 05 Nov 2020 08:00:00 +0000

Typical magnetic field variations as mapped by the trolley at different positions in the storage ring of the Muon g-2 experiment, shown at the parts-per-million level. Credit: Argonne National Laboratory.

As scientists await the highly anticipated first results of the Muon g-2 experiment at the US Department of Energy’s (DOE) Fermi National Accelerator Laboratory, collaborating scientists at the DOE’s Argonne National Laboratory continue to operate and maintain the unique system that maps the experiment’s magnetic field with unprecedented precision.

Argonne scientists have improved the measurement system, which uses an advanced communications system along with new magnetic field probes and electronics to map the field in the 45-meter-circumference ring in which the experiment takes place.

The experiment, which started in 2017 and continues today, could have big implications for the field of particle physics. As a follow-up to a past experiment at the DOE’s Brookhaven National Laboratory, it has the power to affirm or refute the previous findings, which could shed light on the validity of parts of the reigning Standard Model of particle physics.

High precision measurements of large quantities in the experiment are crucial to produce meaningful results. The main quantity of interest is the muon’s g-factor, a property that characterizes the magnetic and quantum mechanical attributes of the particle.

The Standard Model predicts the value of the muon’s g-factor very precisely. “Because the theory predicts this number so clearly, testing the g-factor with an experiment is an effective way to test the theory,” said Simon Corrodi, a postdoctoral appointee in Argonne’s High Energy Physics (HEP) division. “There was a large discrepancy between the Brookhaven measurement and the theoretical prediction, and if we confirm this discrepancy, it will signal the existence of undiscovered particles.”

Just as the Earth’s axis of rotation precesses – meaning the poles gradually trace out circles – the spin of the muon, a quantum version of angular momentum, precesses in the presence of a magnetic field. The strength of the magnetic field surrounding a muon sets the rate at which its spin precesses. Scientists can determine the muon’s g-factor by measuring the spin precession rate and the strength of the magnetic field.
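Schematically – setting aside the beam-dynamics corrections the real analysis applies – the relation the experiment exploits is

$$
\omega_a = a_\mu \, \frac{e B}{m_\mu}, \qquad a_\mu = \frac{g - 2}{2},
$$

so measuring the anomalous precession frequency $\omega_a$ together with the field strength $B$ pins down the muon’s g-factor.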

The more precise these input measurements are, the more conclusive the final result will be. Scientists are on track to measure the field to an accuracy of 70 parts per billion. That level of precision would make the final determination of the g-factor four times more precise than the result of the Brookhaven experiment. If the experimentally measured value differs significantly from the Standard Model expectation, it may indicate the existence of unknown particles whose presence disturbs the local magnetic field around the muon.

Trolley ride

During data collection, a magnetic field causes a beam of muons to travel around a large hollow ring. To map the magnetic field strength throughout the ring with high resolution and precision, scientists designed a cart system to drive measurement probes around the ring and collect data.

Heidelberg University developed the cart system for the Brookhaven experiment, and Argonne scientists refurbished the equipment and replaced the electronics. In addition to the 378 probes mounted in the ring to continuously monitor field drifts, the cart contains 17 probes that periodically measure the field with higher resolution.

“Every three days, the cart circles the ring in both directions, taking about 9,000 measurements per probe per direction,” Corrodi said. “Then we take the measurements to build slices of the magnetic field and then a full 3D map of the ring.”
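A back-of-the-envelope check, using only the figures quoted above (45-meter circumference, roughly 9,000 measurements per probe per direction), shows how finely the trolley samples the field:

```python
# Rough spacing estimate from the numbers quoted in the article.
circumference_m = 45.0          # storage-ring circumference
samples_per_pass = 9000         # measurements per probe in one direction
spacing_mm = circumference_m / samples_per_pass * 1000
print(f"about {spacing_mm:.0f} mm between field samples along the ring")  # ~5 mm
```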

Scientists know the cart’s exact location within the ring thanks to a new barcode reader that registers marks at the bottom of the ring as it moves.

The ring is kept under vacuum to facilitate controlled muon decay. To preserve the vacuum inside the ring, a garage connected to the ring and sharing its vacuum stores the trolley between measurements. Automating the process of loading and unloading the cart into the ring reduces the risk of scientists compromising the vacuum and magnetic field by interacting with the system. They also minimized the power consumption of the cart’s electronics to limit the heat introduced into the system, which would otherwise disrupt the accuracy of the field measurements.


Fully assembled cart system with wheels for rolling on rails and the new external barcode reader for exact position measurement. The 50-cm-long cylindrical shell houses the 17 NMR probes and the custom-made readout and control electronics. Credit: Argonne National Laboratory.

The scientists designed the trolley and the garage to operate in the strong magnetic field of the ring without influencing it. “We used a motor that operates in a strong magnetic field and with a minimal magnetic signature, and the motor moves the cart mechanically, using ropes,” Corrodi said. “This reduces noise in the field measurements introduced by the equipment.”

The system uses as little magnetic material as possible, and scientists tested the magnetic footprint of each component using test magnets at the University of Washington and Argonne to characterize the overall magnetic signature of the cart system.

The power to communicate

Of the two cables pulling the trolley around the ring, one also serves as the power supply and communication line between the control station and the measurement probes.

To measure the field, scientists send a radio-frequency signal through the cable to the cart’s 17 probes. The signal rotates the spins of the water molecules inside each probe in the magnetic field. The signal is then cut off at the right moment, leaving the spins to precess freely. This approach is called nuclear magnetic resonance (NMR).

The frequency at which the probes’ spins precess depends on the magnetic field in the ring, and a digitizer on board the cart converts the analog radio-frequency signal into digital values that are sent over the cable to the control station. There, scientists analyze the digital data to extract the spin precession frequency and, from it, a complete map of the magnetic field.
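A minimal sketch of how a precession frequency might be pulled out of one digitized probe waveform – using a generic FFT peak with parabolic refinement, not the collaboration’s actual analysis code:

```python
# Estimate the dominant frequency of a digitized free-induction-decay (FID) signal.
# Illustrative sketch only; not the Muon g-2 analysis software.
import numpy as np

def precession_frequency(samples: np.ndarray, sample_rate_hz: float) -> float:
    """Return the dominant frequency (Hz) of a digitized FID waveform."""
    windowed = samples * np.hanning(len(samples))   # taper to reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    k = int(np.argmax(spectrum[1:]) + 1)            # strongest bin, skipping DC
    # Parabolic interpolation around the peak for sub-bin resolution
    if 0 < k < len(spectrum) - 1:
        a, b, c = spectrum[k - 1], spectrum[k], spectrum[k + 1]
        k = k + 0.5 * (a - c) / (a - 2 * b + c)
    return k * sample_rate_hz / len(samples)

# Toy usage: a synthetic 50 kHz FID with exponential decay, sampled at 1 MS/s
t = np.arange(0, 5e-3, 1e-6)
fid = np.exp(-t / 2e-3) * np.sin(2 * np.pi * 50e3 * t)
print(f"estimated frequency: {precession_frequency(fid, 1e6):.1f} Hz")
```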

During the Brookhaven experiment, all signals were sent simultaneously over the cable. However, due to the conversion from analog to digital signal in the new experiment, a lot more data has to travel over the wire, and this increased speed could disrupt the very precise radio frequency needed for the probe measurement. To avoid this disruption, the scientists separated the signals in time, switching between the radio frequency signal and the data communication in the cable.

“We feed the probes a radio frequency via an analog signal,” Corrodi said, “and we use a digital signal to communicate the data. The cable switches between these two modes every 35 milliseconds.”

The tactic of switching between signals traveling through the same cable is called “time-division multiplexing,” and it helps the scientists meet specifications not only for accuracy but also for noise levels. An upgrade over the Brookhaven experiment, time-division multiplexing allows higher-resolution mapping and new capabilities for analyzing the magnetic field data.
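A minimal sketch of the 35-millisecond switching cycle described above; the mode-control functions are hypothetical placeholders standing in for whatever hardware interface the experiment actually uses:

```python
# Illustrative time-division-multiplexing loop, not the real trolley control code.
import time

SLOT_S = 0.035  # 35 milliseconds per mode, as quoted in the article

def enable_rf_excitation():
    """Placeholder: let the cable carry only the clean analog RF signal."""
    pass

def enable_data_readout():
    """Placeholder: let the cable carry the digitized probe measurements."""
    pass

def run_tdm_cycle(n_cycles: int) -> None:
    """Alternate the shared cable between RF excitation and digital readout."""
    for _ in range(n_cycles):
        enable_rf_excitation()
        time.sleep(SLOT_S)      # RF slot
        enable_data_readout()
        time.sleep(SLOT_S)      # data slot

run_tdm_cycle(n_cycles=3)
```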

Upcoming results

The field-mapping NMR system and its motion control were successfully commissioned at Fermilab and operated reliably for the first three data-taking periods of the experiment.

Scientists achieved unprecedented precision for field measurements, as well as record uniformity of the ring’s magnetic field, in this Muon g-2 experiment. Scientists are currently analyzing the first set of data from 2018 and plan to publish the results by the end of 2020.

The scientists detailed the complex setup in a paper titled “Design and Performance of a Vacuum Magnetic Field Mapping System for the Muon g-2 Experiment,” published in the Journal of Instrumentation.




More information:
S. Corrodi et al., Design and performance of a vacuum magnetic field mapping system for the Muon g-2 experiment, Journal of Instrumentation (2020). DOI: 10.1088/1748-0221/15/11/P11008

Provided by Argonne National Laboratory

Citation: Scientists work to shed light on the Standard Model of Particle Physics (2020, November 5). Retrieved February 10, 2022 from https://phys.org/news/2020-11-scientists-standard-particle-physics.html

This document is subject to copyright. Except for fair use for purposes of private study or research, no part may be reproduced without written permission. The content is provided for information only.
