Coevolution of particle physics & computing + Female pioneer of computer simulation

#1
C C
The coevolution of particle physics and computing
https://www.symmetrymagazine.org/article...-computing

EXCERPTS: In the mid-twentieth century, particle physicists were peering deeper into the history and makeup of the universe than ever before. Over time, their calculations became too complex to fit on a blackboard—or to farm out to armies of human “computers” doing calculations by hand. To deal with this, they developed some of the world’s earliest electronic computers.

Physics has played an important role in the history of computing. The transistor—the switch that controls the flow of electrical signal within a computer—was invented by a group of physicists at Bell Labs. The incredible computational demands of particle physics and astrophysics experiments have consistently pushed the boundaries of what is possible. They have encouraged the development of new technologies to handle tasks from dealing with avalanches of data to simulating interactions on the scales of both the cosmos and the quantum realm.

But this influence doesn’t just go one way. Computing plays an essential role in particle physics and astrophysics as well. As computing has grown increasingly sophisticated, its progress has enabled new scientific discoveries and breakthroughs.

[...] These computational systems worked well for particle physicists for a long time, says Berkeley Lab astrophysicist Peter Nugent. That is, until Moore’s Law started grinding to a halt.

Moore’s Law is the idea that the number of transistors in a circuit will double every two years, making computers faster and cheaper. The term was first coined in the mid-1970s, and the trend held reliably for decades. But now, computer manufacturers are starting to hit the physical limit of how many tiny transistors they can cram onto a single microchip.
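A quick back-of-the-envelope, not from the article: doubling every two years compounds to roughly a 32-fold increase per decade. A few lines of Python make the arithmetic concrete (the 1971 starting value is the commonly cited transistor count of the Intel 4004; the years chosen are just illustrative):

```python
# Back-of-the-envelope arithmetic: transistor count after t years if it
# doubles every two years, i.e. count(t) = count(0) * 2**(t / 2).
start = 2_300  # roughly the transistor count of the Intel 4004 (1971)
for years in (2, 10, 20, 40):
    print(f"{years:>2} years: ~{int(start * 2 ** (years / 2)):,} transistors")
```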

Because of this, says Nugent, particle physicists have been looking to take advantage of high-performance computing instead. Nugent says high-performance computing is “something more than a cluster, or a cloud-computing environment that you could get from Google or AWS, or at your local university.”

What it typically means, he says, is that you have high-speed networking between computational nodes, allowing them to share information with each other very, very quickly. When you are computing on up to hundreds of thousands of nodes simultaneously, it massively speeds up the process... (MORE - missing details)
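To make the pattern Nugent describes concrete, here is a minimal sketch, not from the article: each process works on its own slice of a problem, and the partial results are combined over the fast interconnect. It assumes the mpi4py package, and the histogramming task is purely illustrative:

```python
# Minimal sketch of message-passing parallelism on an HPC cluster.
# Requires the mpi4py package; the "events" here are just random numbers.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()      # this process's id (0 .. size-1)
size = comm.Get_size()      # total number of processes across all nodes

# Each process generates and histograms its own chunk of events independently...
rng = np.random.default_rng(seed=rank)
local_events = rng.exponential(scale=1.0, size=1_000_000)
local_hist, edges = np.histogram(local_events, bins=50, range=(0.0, 10.0))

# ...then the partial histograms are summed across processes over the network.
total_hist = np.zeros_like(local_hist)
comm.Reduce(local_hist, total_hist, op=MPI.SUM, root=0)

if rank == 0:
    print("combined histogram from", size, "processes:", total_hist[:5], "...")
```

Run it under an MPI launcher, e.g. `mpirun -np 4 python script.py`; on a real cluster the batch scheduler would spread the same script across many thousands of nodes.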


An unsung female pioneer of computer simulation
https://www.scientificamerican.com/artic...imulation/

EXCERPTS: In 1952, at Los Alamos Scientific Laboratory, theoretical physicists Enrico Fermi, John Pasta and Stanislaw Ulam brainstormed ways to use the MANIAC, one of the world’s first supercomputers, to solve scientific problems. At the time, problems were solved by performing either laboratory experiments or mathematical calculations by hand. Fermi, Pasta and Ulam wanted to use their new problem-solving tool—computer simulation—to virtually zoom in on a system and observe atomistic interactions at the molecular level, with a realism that was not possible before.

They chose to simulate a chain of point masses connected by springs, designed to represent atoms connected by chemical bonds, then observe what happened to energy as it moved around on the chain. The system, which resembled objects on a vibrating string, was important because it was nonlinear—unable to be solved by being broken into smaller pieces. Interactions between atoms are universally nonlinear, but they couldn’t be observed with a microscope. This experiment on the MANIAC would allow scientists to virtually observe, for the first time, interactions between individual atoms.
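For the curious, here is a minimal modern sketch of that system, assuming the quadratic ("alpha") form of the nonlinear spring force the 1955 report is best known for; the chain length, coupling strength, and time step below are illustrative choices, not the original parameters:

```python
# Minimal sketch of the FPUT chain: point masses joined by weakly nonlinear
# springs, with both ends held fixed, integrated with velocity Verlet.
import numpy as np

N = 32                 # number of interior masses
alpha = 0.25           # strength of the quadratic nonlinearity
dt = 0.05              # integration time step

def acceleration(x, alpha):
    """Force per unit mass on each interior particle of the chain."""
    # Pad with the fixed endpoints x_0 = x_{N+1} = 0.
    xp = np.concatenate(([0.0], x, [0.0]))
    right = xp[2:] - xp[1:-1]          # stretch of the spring to the right
    left = xp[1:-1] - xp[:-2]          # stretch of the spring to the left
    return (right - left) + alpha * (right**2 - left**2)

# Start with all the energy in the lowest vibrational mode of the chain.
i = np.arange(1, N + 1)
x = np.sin(np.pi * i / (N + 1))
v = np.zeros(N)

a = acceleration(x, alpha)
for step in range(200_000):
    v += 0.5 * dt * a
    x += dt * v
    a = acceleration(x, alpha)
    v += 0.5 * dt * a

print("displacement of the middle mass after the run:", x[N // 2])
```

Tracking how the energy spreads among the chain's vibrational modes over long times is what revealed the surprising near-recurrences, rather than the expected thermalization, that made the experiment famous.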

[...] The experiment has historically been named the Fermi-Pasta-Ulam problem, or FPU, for the three physicists who authored the 1955 report, but many scientists now refer to it as the Fermi-Pasta-Ulam-Tsingou problem, or FPUT. In the original Los Alamos report, a column lists “work by” the three authors plus Mary Tsingou, and the first page includes a footnote reading, “We thank Miss Mary Tsingou for efficient coding of the problems and for running the computations on the Los Alamos MANIAC machine.”

Mary Tsingou Menzel is a very humble scientific game changer. Still living in Los Alamos with her husband, Joe Menzel, she expresses surprise at the significance of the experiment she programmed almost 70 years ago. She also continually asserts that she has never felt slighted by not being included in the naming of the problem. “It never bothered me,” Tsingou says. “They did acknowledge that I did the programming.”

The impact of the experiment on modern science is difficult to overstate. “Nonlinear science destroyed the clockwork view of the classical universe by showing how chaos places limits on predictability,” says David Campbell, a professor of physics at Boston University. “Nonlinear studies are now a part of the canon of modern science.”

Most systems are, in fact, nonlinear. “Quantum gravity, cancer, the immune system, the economy, the resilience of ecosystems, the origin of life, climate change—all of these problems are characterized by thickets of feedback loops and interactions among the various parts of the systems that make the whole more or less than the sum of its parts,” says mathematician Steven Strogatz. These types of systems could not be studied before computer simulation, and computer simulations could not take place without programmers.

Tsingou originally came to work at Los Alamos as a mathematician, but when the opportunity arose, she became one of only a few people at the time who learned to program the MANIAC. It was then that she began working with Fermi, Pasta and Ulam in a theoretical group that had been given use of the MANIAC, and she became instrumental in a groundbreaking experiment. “We were all sitting there together,” Tsingou remembers, “and they [say], ‘We’ve got this machine; we’ve got to come up with some problems’ that couldn’t be solved before theoretically.” They went through several options but decided to try the vibrating string.

Once she knew what the physicists wanted to test, Tsingou handwrote an algorithm that would be her pathway for obtaining the results. “We made flowcharts,” she says, “because when you’re debugging a problem, you want to know where you are so you can stop at different places and look at things. Like any project, you have some idea, but as you go along, you have to make adjustments and corrections, or you have to back up and try a different approach.” (MORE - missing details)