process of assuring meaningful mathematical results in quantum field theory and related disciplines

In quantum field theory, the statistical mechanics of fields, and the theory of self-similar geometric structures, renormalization is any of a collection of techniques used to treat infinities arising in calculated quantities.

Quotes

  • Feynman comments that the renormalization theory is simply a way to sweep difficulties under the rug. All the players, Tomonaga, Schwinger and Feynman, feel that the theory they have developed is intellectually not satisfactory. What they have provided is only a conservative solution, but what is needed is a radical innovation and a revolutionary departure similar to what was made in the nineteen thirties by Bohr, Heisenberg, Schrödinger and Dirac.
  • Hence most physicists are very satisfied with the situation. They say: "Quantum electrodynamics is a good theory, and we do not have to worry about it any more." I must say that I am very dissatisfied with the situation, because this so-called "good theory" does involve neglecting infinities which appear in its equations, neglecting them in an arbitrary way. This is just not sensible mathematics. Sensible mathematics involves neglecting a quantity when it turns out to be small—not neglecting it just because it is infinitely great and you do not want it!
    • P. A. M. Dirac, Directions in Physics (1978), 2. Quantum Electrodynamics
  • A new technique has been developed for carrying out the renormalization of mass and charge in quantum electrodynamics, which is completely general in that it results not merely in divergence-free solutions for particular problems but in divergence-free equations of motion which are applicable to any problem. Instead of using a power-series expansion in the whole radiation interaction, the new method uses expansions in powers of the high-frequency part of the interaction. The convergence of the perturbation theory is thereby much improved. The method promises to be especially useful in applications to meson theory.
    • Freeman John Dyson (1951). "The renormalization method in quantum electrodynamics". Proceedings of the Royal Society of London. Series A, Mathematical and Physical Sciences 207 (1090): 395–401. DOI:10.1098/rspa.1951.0127.
  • So it appears that the only things that depend on the small distances between coupling points are the values for n and j, theoretical numbers that are not directly observable anyway; everything else, which can be observed, seems not to be affected. The shell game that we play to find n and j is technically called "renormalization." But no matter how clever the word, it is what I would call a dippy process! Having to resort to such hocus-pocus has prevented us from proving that the theory of quantum electrodynamics is mathematically self-consistent. It's surprising that the theory still hasn't been proved self-consistent one way or the other by now; I suspect that renormalization is not mathematically legitimate. What is certain is that we do not have a good mathematical way to describe the theory of quantum electrodynamics: such a bunch of words to describe the connection between n and j and m and e is not good mathematics.
    • Richard Feynman, QED: The Strange Theory of Light and Matter (1985), Chap. 4. Loose Ends
  • During the Symposium on the Past Decade in Particle Theory at the University of Texas at Austin in April 1970, I had occasion to bring Dirac and Feynman together for a discussion at dinner. Dirac told Feynman that the relativistic quantum electrodynamics in its present form was an ugly theory, and before tackling the more difficult problems of elementary particle physics 'one must try to solve the problems of quantum electrodynamics. Electrodynamics is something we know most about, and we must find a consistent theory of it rather than get rid of the infinities in an arbitrary manner.' Feynman agreed with Dirac.
    • Jagdish Mehra, The Beat of a Different Drum (1994), Introduction
  • After such successes, it is not surprising that quantum electrodynamics in its simple renormalizable version has become generally accepted as the correct theory of photons and electrons. Nevertheless, despite the experimental success of the theory, and even though the infinities in this theory all cancel when one handles them correctly, the fact that the infinities occur at all continues to produce grumbling about quantum electrodynamics and similar theories. Dirac in particular always referred to renormalization as sweeping the infinities under the rug. I disagreed with Dirac and argued the point with him at conferences at Coral Gables and Lake Constance. Taking account of the difference between the bare charge and mass of the electron and their measured values is not merely a trick that is invented to get rid of infinities; it is something we would have to do even if everything was finite. There is nothing arbitrary or ad hoc about the procedure; it is simply a matter of correctly identifying what we are actually measuring in laboratory measurements of the electron’s mass and charge. I did not see what was so terrible about an infinity in the bare mass and charge as long as the final answers for physical quantities turn out to be finite and unambiguous and in agreement with experiment. It seemed to me that a theory that is as spectacularly successful as quantum electrodynamics has to be more or less correct, although we may not be formulating it in just the right way. But Dirac was unmoved by these arguments. I do not agree with his attitude toward quantum electrodynamics, but I do not think that he was just being stubborn; the demand for a completely finite theory is similar to a host of other aesthetic judgments that theoretical physicists always need to make.
    • Steven Weinberg, Dreams of a Final Theory (1992), Chap. 5 : Tales of Theory and Experiment
  • The quantum field theory of electrons and photons in the late 1940s had scored a tremendous success. Theorists – Feynman, Schwinger, Tomonaga, Dyson – had figured out after decades of effort how to do calculations preserving not only Lorentz invariance but also the appearance of Lorentz invariance at every stage of the calculation. This allowed them to sort out the infinities in the theory that had been noticed in the early 1930s by Oppenheimer and Waller, and that had been the bête noire of theoretical physics throughout the 1930s. They were able to show in the late 1940s that these infinities could all be absorbed into a redefinition, called a renormalization, of the electron mass and charge and the scales of the various fields. And they were able to do calculations of unprecedented precision, which turned out to be verified by experiment: calculations of the Lamb shift and the anomalous magnetic moment of the electron.
  • Little things affect big things, but they rarely affect very big things. Instead, little things affect slightly bigger things. And these, in turn, affect slightly bigger things too. But as you go up the chain, you lose the information about what came long before... In the 1970s a mathematical formalism was developed that makes these ideas concrete. This formalism is called the renormalisation group and provides a framework to describe physics at different scales. The renormalisation group gets little coverage in popular science articles, yet is arguably the single most important advance in theoretical physics in the past 50 years. While zoologists may have little need to talk to particle physicists, the right way to understand both the Higgs boson and the flocking of starlings is through the language of the renormalisation group.
  • [W]hen we have ideas and pictures that are extremely useful, they acquire elements of reality in and of themselves. But, philosophically, it is instructive to look at the degree to which such objects are purely instrumental—merely useful tools—and the extent to which physicists seriously suppose they embody an essence of reality... it is possible to view the renormalization group as merely an instrument or a computational device. On the other hand, at one extreme, one might say: ‘‘Well, the partition function itself is really just a combinatorial device.’’ But most practitioners tend to think of it (and especially its logarithm, the free energy) as rather more basic!
  • To assert that there exists an order parameter in essence says: ‘‘I may not understand the microscopic phenomena at all’’ (as was historically the case for superfluid helium), ‘‘but I recognize that there is a microscopic level and I believe it should have certain general, overall properties especially as regards locality and symmetry: those then serve to govern the most characteristic behavior on scales greater than atomic.’’ ... Know the nature of the order parameter—suppose, for example, it is a complex number and like a wave function—then one knows much about the macroscopic nature of a physical system! ... Landau's introduction of the order parameter exposed a novel and unexpected foliation or level in our understanding of the physical world. Traditionally, one characterizes statistical mechanics as directly linking the microscopic world of nuclei and atoms (on length scales of 10^−13 to 10^−8 cm) to the macroscopic world of, say, millimeters to meters. But the order parameter, as a dynamic, fluctuating object, in many cases intervenes on an intermediate or mesoscopic level characterized by scales of tens or hundreds of angstroms up to microns (say, 10^−6.5 to 10^−3.5 cm).

External links

Wikipedia has an article about Renormalization.