Experimental physicists design new technology for the CERN Collider (26 June 2020)

The Large Hadron Collider is the largest machine on Earth and one of the most complex scientific instruments ever built. It uses powerful electromagnets to propel beams of charged particles at nearly the speed of light and manipulates these beams into controlled collisions that create showers of billions of tiny particles. Most of these particles are not particularly remarkable, but some can reveal the underlying physical properties of our universe.

Operated by the European Organization for Nuclear Research (CERN), the Large Hadron Collider consists of a 27-kilometre ring of two parallel beam tubes buried deep underground along the border between Switzerland and France. Powerful vacuum pumps remove the air from these tubes, and beams of particles are propelled through them in opposite directions. The tubes are lined with more than 1,200 large superconducting magnets that keep the particles centered inside, so they don’t collide with the machine itself.

Along the ring there are 16 radio-frequency cavities: metal chambers that resonate to create a powerful electromagnetic field. This field oscillates 400 million times per second, which sorts the beam particles into many tightly spaced bunches. As the particles pass through each radio-frequency cavity, its electromagnetic field accelerates them to ever greater speeds, until they reach their maximum: 99.999999% of the speed of light.
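
For a sense of what that speed means, here is a back-of-the-envelope calculation (mine, not the article’s) of the corresponding Lorentz factor and beam energy, taking the proton rest energy as roughly 938 MeV:

$$
\gamma \;=\; \frac{1}{\sqrt{1-\beta^{2}}} \;\approx\; \frac{1}{\sqrt{2\times10^{-8}}} \;\approx\; 7\,000
\qquad\text{for } \beta = 0.99999999,
$$

$$
E \;\approx\; \gamma\, m_p c^{2} \;\approx\; 7\,000 \times 938\ \text{MeV} \;\approx\; 6.6\ \text{TeV},
$$

which is in the same ballpark as the several tera-electronvolts per beam at which the LHC actually operates.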

Welding and assembly of the superconducting crab cavities of the HL-LHC

Finally, another set of magnets focuses these bunches of particles, directing them to collide inside one of CERN’s four main detectors. The result is a shower of particles and a great deal of radiation. Sensors in CERN’s detectors must be sensitive enough to detect subatomic particles, and the chips that process this data must be able to record more than a billion particle interactions per second. And all of this has to happen in an environment where radiation levels approach those in the core of a nuclear reactor.

Experimental physicists at Carleton are validating new sensors and readout chips that will be used in the Inner Tracker of CERN’s largest detector, ATLAS. The Higgs boson was first observed in ATLAS in 2012, and the facility is being upgraded as part of the High-Luminosity Large Hadron Collider project. Scheduled to be completed in 2027, the upgrade will significantly improve the performance of the Large Hadron Collider and enable experiments that search for dark matter and extra dimensions.

In particle accelerators, luminosity is a measure of how many potential collisions the machine can deliver in a given area per unit of time. More particles crossing means more collisions to observe and study. The High-Luminosity Large Hadron Collider will increase the luminosity of the accelerator by an order of magnitude, and with it the number of particle collisions it can generate.
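
For readers who want the standard textbook expression (not given in the article), the instantaneous luminosity of a collider with head-on Gaussian bunches can be estimated as

$$
\mathcal{L} \;=\; \frac{N_1 N_2\, n_b\, f_{\mathrm{rev}}}{4\pi\,\sigma_x \sigma_y},
$$

where $N_1, N_2$ are the protons per bunch, $n_b$ the number of bunches, $f_{\mathrm{rev}}$ the revolution frequency, and $\sigma_x, \sigma_y$ the transverse beam sizes at the collision point. Plugging in nominal LHC design values (about $1.15\times10^{11}$ protons per bunch, 2808 bunches, an 11.2 kHz revolution frequency and beam sizes near 17 μm) gives on the order of $10^{34}\ \mathrm{cm^{-2}\,s^{-1}}$; the upgrade described here pushes that figure roughly an order of magnitude higher.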

High-luminosity upgrade kicks off with installation of two HL-LHC connecting cryostats

To accomplish all of this, significant hardware updates will be required.

“We need to be able to detect individual elementary particles, such as a single electron,” says Thomas Koffas, associate professor of experimental particle physics at Carleton University.

“The new sensors are so sensitive that if you breathe on them, they will most likely be damaged.

“But in the ATLAS Inner Tracker, they’ll be exposed to the full blast of radiation. There’s nothing in front of them, and thousands of particles will hit each sensor with every collision. We want to be able to catch them all, see what they are, and decide whether we care about a particular collision or let it go and wait for the next one.”

The Inner Tracker has an area of about 200 square meters, and about three-quarters of that will be covered with sensors measuring roughly 10 centimeters by 10 centimeters. That’s extremely large for a sensor; most are only a few millimeters across.
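
As a rough back-of-the-envelope check (my arithmetic, not a figure from the article), those numbers imply a sensor count on the order of

$$
N_{\text{sensors}} \;\approx\; \frac{0.75 \times 200\ \mathrm{m^2}}{0.10\ \mathrm{m}\times 0.10\ \mathrm{m}} \;\approx\; 15\,000,
$$

that is, tens of thousands of individual large-area silicon sensors to be produced and tested.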

“Maintaining electrical performance over such a large area was one of the main challenges,” Koffas explains.

“The sensors must be able to withstand at least half a kilovolt without failing. The larger the surface area of a semiconductor, the more difficult that is to achieve.”

From left to right: Robert Hunter, Master’s student at Carleton, Professor Dag Gillberg and Professor Thomas Koffas (Photo: Justin Tang)

The project’s R&D was led by the optoelectronics and microelectronics team at CERN. Carleton joined the initiative in 2014 and contributed to the stereo-ring geometry of the sensor silicon wafer design. Because of the irregular shape of the ATLAS Inner Tracker, eight different sensor shapes are required, and the researchers had to incorporate rotation angles into the designs to account for the irregularities. Final prototypes were approved in 2019, and the first sensors were shipped this spring to Hamamatsu Photonics in Japan.

Inside ATLAS, each sensor will transmit data to application-specific integrated circuits (ASICs) that record what the sensor has detected. These chips were custom-designed for this application by CERN’s microelectronics department, in collaboration with Carleton and the Rutherford Appleton Laboratory in Oxfordshire, UK. The ASICs are manufactured in Vermont by GlobalFoundries, and over 300,000 will be installed during the upgrade. Each of them must be able to handle around 640 megabytes of data in the brief moment a particle shower occurs. The stakes are high: if a sensor or chip fails during an experiment, data will be lost, and that could prevent a major discovery.

To ensure that all chips and sensors meet rigorous performance standards, each sensor and chip will be individually tested. Carleton is the lead ASIC chip beta tester and will test about a quarter of the sensors. To meet the requirements of the project, Carleton physicists are teaming up with the Department of Electronics and DA-Integrated, a local microelectronics testing company and the only company to date to have demonstrated the ability to test the chips. DA-Integrated was awarded a start-up contract and invited to participate in a tendering process – the first time a Canadian company has been invited to do so.

To avoid damaging the sensors, testing must take place in purified air, free of dust and moisture. The electrical performance of the sensors will be tested in a clean room at the Carleton University Microfabrication Facility in the Mackenzie Building, while mechanical performance tests and a visual inspection will take place at the FANSSI Nanofabrication Facility in the Minto Centre for Advanced Studies in Engineering.

A silicon tracker being worked on in the ATLAS SR1 clean room

Chip testing will take place at the DA-Integrated facility in Stittsville, just outside Ottawa. There, the processing power of each chip will be validated using a suite of tests developed by experimental physicists from Carleton and the University of Oxford to check prototypes during the R&D process.

“There’s a wafer with over 400 chips on it, and a machine tests each chip in sequence. Within seconds it runs several hundred tests to make sure it’s fully operational,” says Dag Gillberg, associate professor of physics working on the project.

“If it fails a test, the chip is removed and will not be sent to CERN.”
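
To make the screening workflow Gillberg describes a little more concrete, here is a minimal sketch in Python of a sequential wafer-test loop. It is a hypothetical illustration only: the chip count, test names and pass criteria are invented, and the real system drives probe-station hardware rather than placeholder functions.

```python
# Hypothetical sketch of sequential wafer-level chip screening, loosely inspired
# by the procedure described above. Chip count, test names and pass criteria are
# invented for illustration; this is not CERN's or DA-Integrated's software.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class TestResult:
    name: str
    passed: bool


def run_chip_tests(chip_id: int, tests: List[Callable[[int], bool]]) -> List[TestResult]:
    """Run every registered test on one chip and collect pass/fail results."""
    return [TestResult(test.__name__, test(chip_id)) for test in tests]


def screen_wafer(n_chips: int, tests: List[Callable[[int], bool]]) -> Dict[str, List[int]]:
    """Test each chip on the wafer in sequence and bin it as good or rejected."""
    bins: Dict[str, List[int]] = {"good": [], "rejected": []}
    for chip_id in range(n_chips):
        results = run_chip_tests(chip_id, tests)
        # A chip that fails any single test is rejected and not sent to CERN.
        key = "good" if all(r.passed for r in results) else "rejected"
        bins[key].append(chip_id)
    return bins


# Placeholder tests; real ones would drive the probe-station hardware.
def power_on(chip_id: int) -> bool:
    return True


def digital_readout(chip_id: int) -> bool:
    return chip_id % 97 != 0  # pretend a few chips fail this check


if __name__ == "__main__":
    result = screen_wafer(n_chips=400, tests=[power_on, digital_readout])
    print(f"good: {len(result['good'])}, rejected: {len(result['rejected'])}")
```

The design choice worth noting is the pass/fail binning: any single failed test removes the chip from the “good” bin, mirroring the article’s point that failed chips are never shipped.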

It is essential that each component is up to the task.

“We only have one shot. Once we get started, they’ll stay in the detector for 12 years,” says Gillberg.

“We won’t be able to repair it after a year if it’s damaged. That’s why we have to be so careful. We have to make sure everything works perfectly.”

Experimental physicists are redefining superfast and coherent magnetism (26 June 2019)

The electronic properties of materials can be directly influenced by the absorption of light in less than a femtosecond (10⁻¹⁵ seconds), which is considered the limit on the maximum achievable speed of electronic circuits. Until now, by contrast, the magnetic moment of matter could only be influenced through coupled light-and-magnetism processes or, more indirectly, by means of magnetic fields, which is why magnetic switching takes much longer, at least several hundred femtoseconds. A consortium of researchers from the Max Planck Institutes for Quantum Optics and for Microstructure Physics, the Max Born Institute, the University of Greifswald and Graz University of Technology has now been able to manipulate the magnetic properties of a ferromagnetic material with laser pulses on the timescale of the electric-field oscillations of visible light, and therefore in sync with its electronic properties. The process is about 200 times faster than before and was measured and visualized using time-resolved attosecond spectroscopy. The researchers describe their experiment in the journal Nature.
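
To put those timescales side by side (a back-of-the-envelope estimate of my own; the 750 nm wavelength is just a representative value for visible light), one oscillation period of the optical field is

$$
T \;=\; \frac{\lambda}{c} \;\approx\; \frac{750\times10^{-9}\ \mathrm{m}}{3\times10^{8}\ \mathrm{m/s}} \;\approx\; 2.5\ \mathrm{fs},
$$

so steering magnetism within a single optical cycle is indeed on the order of a hundred times faster than conventional switching at several hundred femtoseconds, consistent with the factor of roughly 200 quoted above.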

Material composition as a crucial criterion

In attosecond spectroscopy, magnetic materials are bombarded with ultra-short laser pulses and thereby excited electronically. “The flashes of light trigger an intrinsic and usually delayed process in the material. The electronic excitation results in a modification of the magnetic properties,” explains Martin Schultze, who until recently worked at the Max Planck Institute for Quantum Optics in Munich but is now a professor at the Institute of Experimental Physics at TU Graz. Because the experiment combines a ferromagnetic metal with a non-magnetic one, however, the magnetic response is as fast as the electronic one. “Thanks to this special constellation, we were able to optically induce a spatial redistribution of the charge carriers, which led to a directly related change in the magnetic properties,” explains Markus Münzenberg, who developed and produced the special material systems together with his team in Greifswald.

Schultze is enthusiastic about the magnitude of the result: “Never before has such a fast magnetic phenomenon been observed. Thanks to this, ultrafast magnetism will take on a whole new meaning.” Sangeeta Sharma, a researcher at the Max Born Institute in Berlin who predicted the underlying process with computer models, is equally impressed: “We expect a significant development boost for all applications in which magnetism and electron spin play a role.”

First step towards coherent magnetism

Moreover, the researchers show in their measurements that the observed process is coherent: the quantum-mechanical wave nature of the moving charge carriers is preserved. These conditions would allow scientists to use individual atoms as carriers of information instead of larger units of material, or to steer the changing magnetic properties with a second, specifically delayed laser pulse, thus advancing technological miniaturization. “As far as new perspectives are concerned, this could lead to fantastic developments in magnetism, similar to those enabled by electronic coherence in quantum computing,” hopes Schultze, who now leads a working group on attosecond physics at the Institute of Experimental Physics.

Source of the story:

Material provided by Graz University of Technology. Note: Content may be edited for style and length.

Can statisticians become experimental physicists? (19 October 2018)

Before you brush off this post with a “sure” response, let me qualify the title. Of course, anyone can become a particle physicist, although the learning curve can be steep and hard to climb. But what I mean here is: can a student who has been trained as a statistician (through their bachelor’s and master’s degrees) become a successful experimental particle physicist without investing further years of their life in studying quantum mechanics and many other arcane physics topics?

The question, in my opinion, is important because of the way we do science with these huge particle detectors these days. Living and working within the large scientific collaborations that ran the CDF experiment and later the CMS experiment, I was fortunate enough to observe a statistically significant sample of Ph.D. students (myself included), almost exclusively trained in fundamental physics, being hit by a truck – in the form of the set of skills necessary to complete a thesis in experimental physics at a collider experiment.

The work of a Ph.D. student in a HEP experiment relies little on their painstakingly acquired physics skills. On the contrary, most of their time (excluding the large fraction gobbled up by attending mostly useless meetings, plus a few other smaller “service tasks” that usually don’t require the skills of a qualified physicist) is spent writing computer programs and scripts, submitting batch jobs to computing queues, and performing advanced statistical analyses on the resulting datasets.

At this point I’m kicking myself, as this post would have been much more informative and objective if I had taken the time to run a simple Twitter poll before I started writing it. I would have phrased it this way: what percentage of a typical LHC-experiment Ph.D. student’s time goes to

A) activities that mostly require high-level knowledge of particle physics;
B) activities that mostly require high-level knowledge of statistics;
C) activities that mainly require other skills?

Still, I think we can continue the discussion here even without the data, if we take my personal estimate: A – 20%; B – 50%; C – 30%. Yes, that’s my point: a Ph.D. in experimental physics at a particle collider requires more training in statistics than in physics. Of course, Ph.D. students are smart: if they are confronted with a problem, they go to the library and read a book, or they google a solution. So the fact that significant training in statistics is required to carry out the tasks relevant to completing their Ph.D. thesis is not an obstacle for a physicist.

[Incidentally, note that I could have framed the question differently, for instance by also asking for an assessment of computing-skill requirements; but here I do not need that nuisance parameter to make my point.]

If you accept for a second my estimates above of the type of education required by a Ph.D. student at the LHC, or even if you modify them by non-revolutionary amounts, you will conclude, as I do, that it would be good to open the door to a Ph.D. in particle physics for young statisticians.

Of course, a student who has earned a master’s degree in statistics is unlikely to be attracted by the low salaries and high job volatility of an academic career as a HEP researcher. But if they are young and dumb, some may fall for the siren song of physics. So what Bruno Scarpa, professor of statistics at the University of Padua, suggested to me is to offer a course on “Particle physics: foundations, instruments and methods of analysis” to the Master’s in Statistics students there.

Being the lazy bum that I am, I dodged his enticing offer for a while, but eventually gave in, under the pressure of my own beliefs mentioned above. So this semester, I’m teaching this course, and surprisingly enough, there are a dozen kind souls who decided to take it. So far, so good.

My outlook is as follows: out of these 12 students, I would be delighted if one or two asked me for a thesis based on the analysis of CMS data. If one of them then applies for a Ph.D. in physics next year, I’ll definitely be back here to say the plan worked!

But let’s slow down. Between a statistician and a physicist there is a language barrier, made up of the different concepts to which they have been exposed during their studies. So my job these days, more than teaching particle physics, is to bridge that gap. I have discussed the old quantum theory, special relativity, wave-particle duality, quantum mechanics, symmetries and conservation laws, the quark model, deep inelastic scattering, and so on. In the second part of the course, however, I will try to stimulate their inner statistician by touching on real topics in physics data analysis. We’ll see how it goes!

Tommaso Dorigo is an experimental particle physicist who works for the INFN at the University of Padua and collaborates with the CMS experiment at CERN’s LHC. He coordinates the European network AMVA4NewPhysics as well as accelerator-based physics research for INFN-Padova, and is an editor of the journal Reviews in Physics. In 2016 Dorigo published the book “Anomaly! Collider Physics and the Quest for New Phenomena at Fermilab”. You can get a copy of the book on Amazon.

Experimental physicists are a lot like little children (13 April 2018)

On Twitter-land, Rhett Allain made an observation about his class that struck a chord with me:

“Sometimes students treat labs like a new gimmick. Take it out of the box, put the batteries in and use it. If that doesn’t work, go back and read the instructions.” (Rhett Allain, @rjallain, April 11, 2018)

It instantly reminded me of a time when I was a graduate student working in a lab at NIST, and we received a box from a tech company containing new oscilloscopes. These were mostly replacements for our stock of lab oscilloscopes that had been built into one experiment or another, but the box also included a new handheld oscilloscope (its face was roughly the size of my Chromebook’s, though it was about four times thicker), which someone had ordered just for fun, to see if it would be useful.

We opened it up in the break room to play with, and I distinctly remember someone casually tossing the shrink-wrapped manual aside in order to get at a probe cable. Said probe was immediately inserted into a wall outlet, to provide a convenient source of oscillating voltage that could be used to check the capabilities of the oscilloscope. Which, thankfully, didn’t include “being damaged by running 110 V wall current directly into the inputs,” not that anyone would have known that beforehand.

Now, to be fair, it was a room full of professional physicists for whom oscilloscopes were everyday tools, so we had a pretty good idea of the basic functionality we expected to find in the oscilloscope; the manual was not really essential. But that happened with just about every new toy we brought into the lab. In another memorable case, one of the experiments needed to massively amplify a signal at audio frequencies, so they bought an amplifier designed for small rock concerts. On opening the box, they almost immediately unscrewed and removed the lid of the device to take a look at the power transistors inside, blowing right past a sticker warning that opening it would void the warranty or something of the kind. Lifting the top plate revealed a second lid held closed by screws, with a huge warning in bright red letters stating that under no circumstances should it be opened by anyone with long hair or dangling jewelry, or anyone who drinks alcohol or uses drugs. (They knew their target audience, and they weren’t physicists…)

This habit has always seemed to me to illustrate one of the characteristic attitudes of physicists, especially experimental physicists: the feeling that almost everything is fair game to play with. Presented with new equipment, we tend to start fiddling with it immediately and only turn to the actual instructions and documentation after hitting some sort of roadblock. This extends to just about anything – lab equipment, computer software, household appliances – and sometimes leads to funny problems. These mostly happen when there is a slightly non-standard way to start things up. I have a Ph.D. and a full professorship, and every once in a while I’ll get stuck on a minor point in a software package, like a senator trying to figure out the inner workings of Facebook, that happens to be explained in the first paragraph of the documentation I didn’t read before I started poking at things.

(There are limits to this, of course. Anything really expensive – the threshold for this varies, but a price tag in the thousands will usually do – or dangerous – high voltage, toxic chemicals, etc. – is treated with care and according to established procedures.)

Chad Orzel

There is something essentially childlike about this approach, as a recent scene at Chateau Steelypips reminded me. One of the kids was sent home with a fever about a week ago, at which point we found we no longer had a reliable thermometer (where these things go, I have no idea…). I picked up one of those “no-contact” thermometers that use infrared light to measure the temperature of a forehead, and brought it home last weekend. This launched a long run of experiments in taking the temperature of everything. The children ran around measuring their body temperature on their foreheads, cheeks, necks, hands and other parts, but also tried to get the temperature of furniture, pieces of food and, especially, a very cute and very tolerant puppy, Charlie, who endured an incredible amount of poking and prodding. It was a lot of fun, at least for the humans, and Charlie got a few treats, so he was okay with it in the end.

(Critical findings from all of these experiments: the thermometer reads over a fairly narrow range and does not register temperatures at all for most household objects, or through dog fur. It will, however, read from bare skin, from the inside of Charlie’s ear, and, oddly enough, from the plastic cover on the recessed lights in our kitchen. Last but not least, our study conclusively shows that Charlie is a Very Good Boy.)

Of course, this “just start playing” attitude isn’t a one-size-fits-all approach. In recent years I’ve changed the way I teach labs a bit, trying to move away from elaborate handouts with step-by-step instructions on exactly what to do, toward giving students a somewhat more active role in determining what to do. I ask them to play around with the equipment a bit and judge for themselves the best way to make the measurement and how to process the data. A colleague of mine takes this approach much further than I have ventured, providing students with essentially no guidance, just a stack of materials that may or may not be helpful in measuring what they are looking for.

Some of them really take to this and do a great job of figuring things out. Others are really uncomfortable without step-by-step instructions, and spend a lot of time trying to get me to tell them exactly what to do. I don’t know exactly what the source of that is. Part of it seems to be learned – a kind of loss aversion applied to grades, where they fear losing points for not doing exactly the right thing – and some students will come around once they realize that there really is no perfect secret procedure they will be penalized for not discovering. Others never really let go, so I suspect there’s an innate personality component as well. Some people just aren’t well suited to the “poke it with a stick!” approach that is characteristic of many experimental physicists.
