Applying high-energy physics methods to quantum computing

A wheel-shaped muon detector is part of an ATLAS particle detector upgrade at CERN. A new study applies “unfolding,” an error correction technique used for particle detectors, to noise problems in quantum computing. Credit: Julien Marius Ordan/CERN

‘Unfolding’ techniques used to improve the accuracy of particle detector data can also improve the readout of quantum states from a quantum computer.

Borrowing a page from high-energy physics and astronomy textbooks, a team of physicists and computer scientists at the US Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has successfully adapted and applied a common error reduction technique to the field of quantum computing.

In the world of subatomic particles and giant particle detectors, and distant galaxies and giant telescopes, scientists have learned to live and work with uncertainty. They are often trying to tease out ultra-rare particle interactions from a massive tangle of other particle interactions and background “noise” that can complicate their hunt, or trying to filter out the effects of atmospheric distortions and interstellar dust to improve the resolution of astronomical images.

Likewise, inherent problems with detectors, such as their ability to record all particle interactions or to measure particle energies exactly, can result in data being misread by the electronics they are connected to, so scientists need to design complex filters, in the form of computer algorithms, to reduce the margin of error and return the most accurate results.

Problems of noise and physical defects, and the need for error correction and error mitigation algorithms that reduce the frequency and severity of errors, are also common in the fledgling field of quantum computing, and a study published in the journal npj Quantum Information found that there appear to be some common solutions, too.

Ben Nachman, a Berkeley Lab physicist who is involved with particle physics experiments at CERN as a member of Berkeley Lab’s ATLAS group, saw the quantum computing connection while working on a particle physics calculation with Christian Bauer, a Berkeley Lab theoretical physicist who is a co-author of the study. ATLAS is one of the four giant particle detectors at CERN’s Large Hadron Collider, the largest and most powerful particle collider in the world.

“At ATLAS, we often have to ‘unfold,’ or correct for detector effects,” said Nachman, the study’s lead author. “People have been developing this technique for years.”

In experiments at the LHC, particles called protons collide at a rate of about 1 billion times per second. To cope with this busy, “noisy” environment, and with intrinsic problems related to energy resolution and other detector-related factors, physicists use error-correcting “unfolding” techniques and other filters to distill this deluge of particle data into the most useful, accurate information.

“We realized that even current quantum computers are very noisy,” Nachman said, so finding a way to reduce that noise and minimize errors, known as error mitigation, is key to advancing quantum computing. “One kind of error is related to the actual operations you do, and one is related to reading out the state of the quantum computer,” he noted; the first type is known as a gate error, and the latter is called a readout error.

High-energy physics and quantum computing

These graphs show the relationship between sorted high-energy physics measurements involving particle scattering, called differential cross section measurements (left), and repeated measurements of outputs from quantum computers (right). These similarities create an opportunity to apply similar error correction techniques to data from both fields. Credit: Berkeley Lab; npj Quantum Inf 6, 84 (2020), DOI: 10.1038/s41534-020-00309-7

The latest study focuses on a technique for correcting readout errors, called “iterative Bayesian unfolding” (IBU), which is well known in the high-energy physics community. The study compares the effectiveness of this approach with other error correction and mitigation techniques. The IBU method is based on Bayes’ theorem, which provides a mathematical way to find the probability of an event occurring when there are other conditions related to that event that are already known.
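For a sense of the mechanics, here is a minimal numpy sketch of an iterative Bayesian unfolding update applied to readout probabilities. It is not the authors’ implementation; the response matrix (the probability of measuring one bitstring given another true bitstring), the flat prior, and the iteration count are all illustrative choices.

```python
import numpy as np

def iterative_bayesian_unfolding(measured, response, n_iter=10):
    """Estimate true-state probabilities from noisy readout probabilities.

    measured -- observed probability vector over bitstrings
    response -- response[i, j] = P(measure bitstring i | true bitstring j)
    n_iter   -- number of Bayesian update iterations (illustrative default)
    """
    n_states = len(measured)
    truth = np.full(n_states, 1.0 / n_states)  # flat prior over true states
    for _ in range(n_iter):
        # Bayes' theorem with the current estimate as the prior:
        # posterior[i, j] = P(true state j | measured state i)
        joint = response * truth
        posterior = joint / joint.sum(axis=1, keepdims=True)
        # Fold the posterior with the data to refine the truth estimate
        truth = posterior.T @ measured
    return truth

# Toy single-qubit example: 5% of 0s read out as 1, 10% of 1s read out as 0
R = np.array([[0.95, 0.10],
              [0.05, 0.90]])
noisy = R @ np.array([0.7, 0.3])               # smeared readout distribution
print(iterative_bayesian_unfolding(noisy, R))  # recovers roughly [0.7, 0.3]
```

Because each update multiplies the previous estimate by non-negative factors, the unfolded probabilities never go negative, a property that matters in the comparison below.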

Nachman noted that this technique can be applied to the quantum analogue of classical computers, known as universal gate-based quantum computers.

In quantum computing, which relies on quantum bits, or qubits, to carry information, the fragile state known as quantum superposition is difficult to maintain and can decay over time, causing a qubit to read out a zero instead of a one, for example; this is one common kind of readout error.

Superposition means that a quantum bit can represent a zero, a one, or both values at the same time. This enables unique computing capabilities that are not possible in conventional computing, which relies on bits that represent either a one or a zero, but not both at once. Another source of readout error in quantum computers is simply a faulty measurement of a qubit’s state owing to the architecture of the computer.
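One simple way to model such readout errors, assuming each qubit flips independently with fixed rates, is to build the full response matrix as a tensor product of single-qubit confusion matrices. The sketch below uses made-up flip rates; real devices have qubit-dependent and partly correlated rates that must be measured with calibration circuits.

```python
import numpy as np

def readout_response_matrix(p01, p10, n_qubits):
    """Response matrix for n independent qubits with identical flip rates.

    p01 -- probability of reading 1 when the true value is 0
    p10 -- probability of reading 0 when the true value is 1
    """
    # Single-qubit confusion matrix: rows = measured value, cols = true value
    r1 = np.array([[1.0 - p01, p10],
                   [p01, 1.0 - p10]])
    response = np.array([[1.0]])
    for _ in range(n_qubits):
        response = np.kron(response, r1)  # independence -> tensor product
    return response

# Two qubits: a 4x4 matrix over the bitstrings 00, 01, 10, 11
print(readout_response_matrix(p01=0.05, p10=0.10, n_qubits=2))
```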

In the study, the researchers simulated a quantum computer to compare the performance of three different error correction (or error mitigation or unfolding) techniques. They found that the IBU method is more robust in a very noisy, error-prone environment, and it slightly outperformed the other two in the presence of more common noise patterns. Its performance was compared with that of an error correction method called Ignis, part of a collection of open-source software development tools built for IBM’s quantum computers, and with a very basic form of unfolding known as the matrix inversion method.
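To see why the comparison matters, the toy snippet below contrasts plain matrix inversion with the IBU function sketched earlier on a finite-statistics measurement. This illustrates the general failure mode rather than reproducing the paper’s benchmarks: with a limited number of shots, inversion can return negative “probabilities,” while IBU’s multiplicative updates stay non-negative.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
R = np.array([[0.95, 0.10],      # illustrative single-qubit response matrix
              [0.05, 0.90]])
true_p = np.array([0.9, 0.1])    # assumed true state probabilities

# Simulate 100 shots drawn from the noise-smeared distribution
counts = rng.multinomial(100, R @ true_p)
measured = counts / counts.sum()

print(np.linalg.solve(R, measured))               # inversion: can go negative
print(iterative_bayesian_unfolding(measured, R))  # IBU: stays non-negative
```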

The researchers used the simulated quantum computing environment to produce more than 1,000 pseudo-experiments, and they found that the results for the IBU method were the closest to predictions. The noise models used for this analysis were measured on a 20-qubit quantum computer called IBM Q Johannesburg.
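A toy analogue of that pseudo-experiment study, again reusing the illustrative functions above rather than the paper’s 20-qubit noise model, would repeat the smear-sample-unfold loop many times and compare how close each method lands to the truth on average:

```python
errors_inv, errors_ibu = [], []
for _ in range(1000):  # 1,000 pseudo-experiments, echoing the study's setup
    counts = rng.multinomial(100, R @ true_p)
    measured = counts / counts.sum()
    unfolded = iterative_bayesian_unfolding(measured, R)
    errors_inv.append(np.abs(np.linalg.solve(R, measured) - true_p).sum())
    errors_ibu.append(np.abs(unfolded - true_p).sum())

print("matrix inversion mean abs. error:", np.mean(errors_inv))
print("IBU mean abs. error:", np.mean(errors_ibu))
```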

“We took a very common technique from high-energy physics and applied it to quantum computing, and it worked really well, as it should,” Nachman said. There was a steep learning curve. “I had to learn all sorts of things about quantum computing to be sure I knew how to translate this and to implement it on a quantum computer.”

He said he was also very fortunate to find collaborators with expertise in quantum computing at Berkeley Lab, including Bert de Jong, who leads a DOE Office of Advanced Scientific Computing Research Quantum Algorithms Team and an Accelerated Research for Quantum Computing project in Berkeley Lab’s Computational Research Division.

“It is exciting to see how the wealth of knowledge the high-energy physics community has developed to get the most out of noisy experiments can be used to get the most out of noisy quantum computers,” de Jong said.

The simulated and real quantum computers used in the study ranged from five qubits to 20 qubits, and the technique should be scalable to larger systems, Nachman said. But the error correction and mitigation techniques that the researchers tested will require more computing resources as the size of quantum computers grows, so Nachman said the team is focused on how to make the methods more manageable for quantum computers with larger qubit arrays.

Nachman, Bauer, and de Jong also participated in an earlier study that proposed a way to reduce gate errors, the other major source of quantum computing errors. They believe that error correction and error mitigation in quantum computing may ultimately require a mix-and-match approach, using a combination of several techniques.

“It’s an exciting time,” Nachman said, as the field of quantum computing is still young and there is plenty of room for innovation. “People have at least gotten the message about these types of approaches, and there is still room for progress.” He noted that quantum computing provided a “push to think about problems in a new way,” adding, “It has opened up new science potential.”

Reference: “Unfolding quantum computer readout noise” by Benjamin Nachman, Miroslav Urbanek, Wibe A. de Jong and Christian W. Bauer, 25 September 2020, npj Quantum Information.
DOI: 10.1038/s41534-020-00309-7

The Oak Ridge Leadership Computing Facility, a DOE Office of Science user facility at Oak Ridge National Laboratory, provided researchers with access to IBM quantum computing resources, including the IBM Quantum Experience and the Q Hub Network.

Miroslav Urbanek of Berkeley Lab’s Computational Research Division also participated in the study, which was supported by the US DOE Office of Science and the Aspen Center for Physics.
