
Meet the new math + Dimensions problem solved + 80-year mystery lifted

#1
C C
Meet the New Math, Unlike the Old Math
https://www.quantamagazine.org/20161005-...education/

INTRO SUMMARY: The latest effort to overhaul math and science education offers a fundamental rethinking of the basic structure of knowledge. But will it be given time to work?



Researchers solve the problem of the dimensions of space-time in theories relating to the Large Hadron Collider
https://www.sciencedaily.com/releases/20...084521.htm

RELEASE: Researchers at the universities of Valencia and Florence propose an approach to the experimental data generated by the Large Hadron Collider that solves the infinity problem without breaching the four dimensions of space-time.

The theories currently used to interpret the data emerging from CERN's Large Hadron Collider (LHC), which have so far most notably led to the discovery of the Higgs boson, are poorly defined within the four dimensions of space-time established by Einstein in his Theory of Special Relativity. To avoid the infinities that arise in the calculations these theories prescribe, extra dimensions are added as a mathematical trick which, although effective, does not reflect what we now know about our Universe.

Now though, a group of researchers at the Institute of Corpuscular Physics (IFIC, CSIC-UV) in Valencia has devised a way to side-step the infinity issue and keep the theory within the bounds of the four standard dimensions of space-time.

The crux of the issue is that particles with zero energy can, in theory, be produced in LHC collisions (not to be confused with the separate problematic case of no particle being emitted at all). A similar issue arises when two particles are produced in exactly the same direction: experimentally they are indistinguishable from a single particle. A further problem with the existing theories comes from the quantum corrections that must be applied to their calculations, which requires extrapolating the theories' validity to infinite energies never reached in any particle accelerator. These situations are hard to reconcile with the theory, and the reconciliation has a price: infinities in the four dimensions of space-time, and infinities are incompatible with meaningful theoretical predictions.
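To see where such infinities come from, consider a standard textbook example (not specific to this work): the propagator factor that appears when a massless particle of momentum p emits another massless particle of momentum k,

[code]
\frac{1}{(p+k)^2} \;=\; \frac{1}{2\,p\cdot k} \;=\; \frac{1}{2\,E_p E_k\,(1-\cos\theta)}
[/code]

This blows up both when E_k → 0 (the zero-energy emission mentioned above) and when θ → 0 (two particles travelling in exactly the same direction), which is precisely where the infinities of the four-dimensional calculation originate.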

As mentioned above, the solution, found in 1972 by Nobel laureates Gerardus 't Hooft and Martinus J. G. Veltman, was to alter the dimensions of space-time. Known as Dimensional Regularization, it consists of defining the theory in a space-time with more than four dimensions. That way, the infinities that emerge in four dimensions become contributions that depend on the difference between the chosen number of dimensions and four. It is a mathematical trick that keeps these infinities under control in the intermediate stages of the calculations, allowing predictions to be made that would otherwise be impossible.
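As a concrete illustration of the trick (a standard quantum field theory textbook result, not taken from the release): working in d = 4 − 2ε dimensions instead of four, a typical divergent one-loop integral evaluates to

[code]
\int \frac{d^d\ell}{(2\pi)^d}\,\frac{1}{(\ell^2-\Delta+i0)^2}
  \;=\; \frac{i}{(4\pi)^{d/2}}\,\Gamma\!\Big(2-\frac{d}{2}\Big)\,\Delta^{\,d/2-2}
  \;=\; \frac{i}{16\pi^2}\Big(\frac{1}{\epsilon}-\gamma_E+\ln\frac{4\pi}{\Delta}\Big)+\mathcal{O}(\epsilon)
[/code]

The four-dimensional infinity now sits in the 1/ε pole, which can be tracked through the intermediate stages of a calculation and cancelled before taking ε → 0.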

This new approach, devised by the University of Valencia group led by Germán Rodrigo, entails a fundamental change in the way the predictions used to interpret LHC experimental data are obtained, simplifying the underlying calculations and solving one of the main problems particle physicists face when moving from theory to experiment.

Their approach is based on establishing a direct correspondence between different Feynman diagrams that generate infinities. These diagrams, introduced by Richard P. Feynman, winner of the 1965 Nobel Prize in Physics, are used by physicists to pictorially represent the collisions between subatomic particles at very high energies in large particle accelerators like the LHC.

Known as 'loop-tree duality', this new correspondence, developed by the IFIC researchers in collaboration with a University of Florence group led by Stefano Catani, unifies quantum states that are distinct for theoretical purposes but experimentally indistinguishable, such as the cases described above.
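Schematically (following the form given in the loop-tree duality literature by Catani, Rodrigo and collaborators, with conventions simplified and overall factors of ±i suppressed here), the duality rewrites a one-loop integral over the four-dimensional loop momentum as a sum of tree-level-like integrals over ordinary three-dimensional phase space, one term for each internal line set on shell:

[code]
\int \frac{d^4\ell}{(2\pi)^4}\,\prod_{i}\frac{1}{q_i^2-m_i^2+i0}
  \;\longrightarrow\;
  -\sum_{i}\int \frac{d^3\boldsymbol{\ell}}{(2\pi)^3}\,\frac{1}{2E_i}
   \prod_{j\neq i}\frac{1}{q_j^2-m_j^2-i0\,\eta\cdot(q_j-q_i)}
[/code]

The right-hand side follows from applying Cauchy's residue theorem to the energy component of the loop momentum; the modified i0 prescription in the 'dual' propagators (with η a fixed future-directed reference vector) is what encodes the correspondence between the different diagrams.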

The new algorithm was presented by IFIC researcher Germán Sborlini at the top particle physics conference, ICHEP 2016, held in early August in Chicago. It has also been published in the Journal of High Energy Physics.



When quantum scale affects the way atoms emit and absorb particles of light: Exact simulation lifts the 80-year-old mystery of the degree to which atoms can be dressed with photons
https://www.sciencedaily.com/releases/20...122646.htm

RELEASE: In 1937, US physicist Isidor Rabi introduced a simple model to describe how atoms emit and absorb particles of light. Until now, this model had still not been completely explained. In a recent paper, physicists have for the first time used an exact numerical technique, quantum Monte Carlo, to explain the photon absorption and emission phenomenon. These findings were recently published in EPJ D by Dr Flottat of the Nice-Sophia Antipolis Non-Linear Institute (INLN) in France and colleagues. They confirm previous results obtained with approximate simulation methods.

According to the Rabi model, when an atom interacts with light in a cavity, and they reach a state of equilibrium, the atom becomes "dressed" with photons. Because this takes place at the quantum scale, the system is, in fact, a superposition of different states -- the excited and unexcited atom -- with different numbers of photons.
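In standard notation (the release does not write it out), the Rabi Hamiltonian for a two-level atom coupled to a single cavity mode reads

[code]
H \;=\; \hbar\omega\, a^\dagger a \;+\; \frac{\hbar\omega_0}{2}\,\sigma_z \;+\; \hbar g\,\sigma_x\,(a+a^\dagger)
[/code]

The coupling term σ_x(a + a†) contains the so-called counter-rotating pieces, which create or destroy an atomic excitation and a photon together; it is these terms that break conservation of the excitation number and make the equilibrium state a superposition over different photon numbers.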

In the study, the team adapted a quantum Monte Carlo algorithm to address this special case. They created a novel version of the existing algorithm, one which accounts for the fluctuating number of photons. This made it possible to study atoms dressed with up to 20 photons each. No other existing exact simulation method -- including the exact diagonalisation and density matrix renormalisation group approaches -- can factor in these effects.
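The release does not spell out the algorithm itself. As a self-contained toy illustration of "dressing" and photon-number fluctuation, the sketch below diagonalises the single-site Rabi Hamiltonian in a truncated photon basis and reads off the mean photon number of the ground state. Note this is plain exact diagonalisation, not the paper's quantum Monte Carlo; it is tractable here only because a single atom-cavity site is tiny, whereas the methods compared above target much larger systems. All parameter values are hypothetical.

[code]
import numpy as np

# Toy sketch: ground state of the single-site quantum Rabi model
#   H = w * a^dag a + (w0/2) * sigma_z + g * sigma_x * (a + a^dag)
# in a photon basis truncated at n_max photons (hbar = 1).

n_max = 40                   # photon-number cutoff; must exceed expected <n>
w, w0, g = 1.0, 1.0, 1.5     # cavity frequency, atomic splitting, coupling (hypothetical)

# Bosonic operators on the truncated Fock space
a = np.diag(np.sqrt(np.arange(1, n_max + 1)), k=1)   # annihilation: a|n> = sqrt(n)|n-1>
ad = a.T                                             # creation
n_op = ad @ a                                        # photon-number operator

# Pauli matrices for the two-level atom
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.array([[1.0, 0.0], [0.0, -1.0]])
id2 = np.eye(2)
idf = np.eye(n_max + 1)

# The sigma_x (a + a^dag) term does NOT conserve the excitation number,
# unlike the rotating-wave (Jaynes-Cummings) approximation.
H = (w * np.kron(id2, n_op)
     + 0.5 * w0 * np.kron(sz, idf)
     + g * np.kron(sx, a + ad))

evals, evecs = np.linalg.eigh(H)
gs = evecs[:, 0]             # ground state

# Mean photon number in the ground state: how "dressed" the atom is.
n_mean = gs @ np.kron(id2, n_op) @ gs
print(f"ground-state energy = {evals[0]:.4f}, <n> = {n_mean:.3f}")
[/code]

Raising g in this sketch makes the ground-state photon number grow rapidly, mirroring the strong-coupling behaviour described below.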

The authors found that there are dramatic consequences at the quantum scale for strongly coupled light-atom systems. They showed that it is essential to take into account the effects of the excitation number not being conserved, because the atom-photon coupling is substantial enough for these effects to matter. For example, in a conventional light-atom coupling experiment in a macroscopic cavity, the coupling is so small that an atom is, on average, dressed with far less than one photon. With a coupling increased by a factor of, say, ten thousand, physicists have observed dressed states with tens of photons per atom.
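A standard back-of-the-envelope estimate (not from the release) shows why the photon number scales so strongly with the coupling: in the deep-strong-coupling limit the cavity field is displaced by roughly g/ω, so the ground state carries

[code]
\langle n \rangle \;\approx\; \Big(\frac{g}{\omega}\Big)^{2}
[/code]

photons, and multiplying the coupling g by a large factor multiplies the number of dressing photons by that factor squared.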

