This entry is about the notion of naturalness in (particle-)physics. For the notion in mathematics see at natural transformation.
In (particle-)physics the term “naturalness” refers to the vague idea that a model of physics is expected to work without requiring unlikely-looking ad-hoc coincidences or “fine-tuning” of its parameters, and that instead all such “conspiracies” among parameter values are to have some systematic cause, such as some symmetry of the model.
An archetypical example, which has at times been referred to as the naturalness problem of the standard model of particle physics, is that the mass of the Higgs boson is many orders of magnitude below the Planck scale, while at the same time receiving potentially very large quantum corrections from the presence of (possibly undetected) heavy particles, which therefore must coincidentally cancel out. This is also called the hierarchy problem.
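Schematically (as a standard back-of-the-envelope estimate, not tied to any particular model): a heavy particle of mass $M$ coupling to the Higgs field with strength $\lambda$ contributes a one-loop correction to the Higgs mass squared of order

```latex
\delta m_H^2
\;\sim\;
\frac{\lambda^2}{16 \pi^2} \, M^2
\,,
```

so that for $M$ anywhere near the Planck scale $\sim 10^{19}\,\mathrm{GeV}$ such corrections exceed the observed value $m_H \simeq 125\,\mathrm{GeV}$ by many orders of magnitude, unless they cancel against each other or against the bare parameter to extremely high precision.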
Accordingly, a popular suggestion has been that there should be a symmetry in nature which naturally explains this otherwise coincidental cancellation, namely low-energy supersymmetry. The failure of such a symmetry to be observed at the LHC experiment has led the high energy physics community to re-examine the naturalness paradigm and/or to focus on alternative mechanisms, such as a composite Higgs boson.
The next main example of (lack of) naturalness often considered is the cosmological constant (dark energy), which seems tiny when compared to the quantum corrections which it naively receives from the vacuum energy of all fields in the observable universe.
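For rough orientation (this is the standard naive estimate, with all the usual caveats about its interpretation): summing the zero-point energies of field modes up to a cutoff scale $\Lambda_{UV}$ yields a vacuum energy density of order

```latex
\rho_{vac}
\;\sim\;
\Lambda_{UV}^4
\,,
```

which for $\Lambda_{UV}$ at the Planck scale exceeds the observed dark-energy density $\rho_{obs} \sim (10^{-3}\,\mathrm{eV})^4$ by roughly 120 orders of magnitude.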
A key problem is arguably that the principle of naturalness has been used in a very vague and very ambiguous sense. Even apart from the problem of which large or small numbers are to be regarded as “unlikely” (see the quote below from Wilson 04), there is technical fine print.
Both the mass of the Higgs particle as well as the cosmological constant are subject to renormalization in perturbative quantum field theory, whence “quantum corrections”.
(For more on the renormalization-freedom in the cosmological constant see there.)
A key fact, that seems to be mostly overlooked in discussion of naturalness/fine-tuning, is that renormalization parameters a priori take values in an affine space (see this theorem).
This means that without choosing an arbitrary origin and then a set of coordinates, there is no sense in which a quantum correction is “large” or “small”. An origin and a set of coordinates is however provided by a choice of renormalization scheme, and discussions in the literature typically implicitly take such a choice for granted.
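In symbols, as a schematic illustration of this point: if renormalization parameters take values in an affine space $A$ modeled on a vector space $V$, then only differences of parameter values are canonically defined:

```latex
z_{S'} - z_{S}
\;\in\;
V
\,,
```

where $z_S, z_{S'} \in A$ denote the values singled out by two renormalization schemes $S$ and $S'$. A statement such as "the quantum correction is small" therefore implicitly refers both to a norm on $V$ and to the origin $z_S \in A$ fixed by some chosen scheme.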
Even if one assumes that the parameters under consideration are invariantly defined numbers, independent of choices such as renormalization schemes (see above), it has been questioned whether it is fruitful to consider the “naturalness” of the values they take.
The following is from Kane 17 (there in a discussion of G2-MSSM model building):
Until recently there were no theories predicting the values of superpartner masses. The arguments based on ‘naturalness’ are basically like saying the weather tomorrow should be the same as today. The opposite of naturalness is having a theory. $[$…$]$ Claims $[$superpartners$]$ should have been seen would be valid given so called naturalness arguments, but are wrong in actual theories. Many of us think that is a misuse of the idea of naturalness, but it is the fashionable use.
(p. 33 (3-2))
Some arguments (‘naturalness’) can be used to estimate what values $[$MSSM parameters$]$ might have. If those arguments were correct some superpartners would already have been discovered at the CERN LHC. It would have been nice if the naturalness arguments had worked, but they did not. Since they were not predictions from a theory it is not clear how to interpret that.
(p. 39 (4-3))
The failure of naïve naturalness to describe the world tells us we should look harder for a theory that does, an ‘ultraviolet completion’. Compactified string/ M-theories appear to be strong candidates for such a theory.
The alternative to naturalness, often neglected as an alternative, is having a theory.
(p. 57 (6-1))
Similarly Wilson 04, p. 10:
$[ \cdots ]$ The claim was that it would be unnatural for such particles to have masses small enough to be detectable soon. But this claim makes no sense when one becomes familiar with the history of physics. There have been a number of cases where numbers arose that were unexpectedly small or large. An early example was the very large distance to the nearest star as compared to the distance to the Sun, as needed by Copernicus, because otherwise the nearest stars would have exhibited measurable parallax as the Earth moved around the Sun. Within elementary particle physics, one has unexpectedly large ratios of masses, such as the large ratio of the muon mass to the electron mass. There is also the very small value of the weak coupling constant. In the time since my paper was written, another set of unexpectedly small masses was discovered: the neutrino masses. There is also the riddle of dark energy in cosmology, with its implication of possibly an extremely small value for the cosmological constant in Einstein’s theory of general relativity. This blunder was potentially more serious, if it caused any subsequent researchers to dismiss possibilities for very large or very small values for parameters that now must be taken seriously.
Early discussion:
Murray Gell-Mann, introductory talk at Shelter Island II, 1983 (pdf)
in: Shelter Island II: Proceedings of the 1983 Shelter Island Conference on Quantum Field Theory and the Fundamental Problems of Physics. MIT Press. pp. 301–343. ISBN 0-262-10031-2.
Surveys and discussion include:
Gian Francesco Giudice, Naturally Speaking: The Naturalness Criterion and Physics at the LHC (arXiv:0801.2562)
Porter Williams, Naturalness, the autonomy of scales, and the 125 GeV Higgs, Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 51, August 2015, Pages 82–96 (publisher, PhilSciArchive)
Simon Friederich, Fine-Tuning, Stanford Encyclopedia of Philosophy, 2017
Gian Francesco Giudice, The Dawn of the Post-Naturalness Era (arXiv:1710.07663)
Arthur Hebecker, Lectures on Naturalness, String Landscape and Multiverse (arXiv:2008.10625)
Attempts to make the notion precise include:
Greg Anderson, Diego Castano, Measures of fine tuning, Phys. Lett.B 347:300-308, 1995 (arXiv:hep-ph/9409419)
James Wells, Evaluation and Utility of Wilsonian Naturalness (arXiv:2107.06082)
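For orientation, the kind of quantitative measure studied in these references (going back to Barbieri–Giudice, and refined by Anderson–Castano) is, schematically, the logarithmic sensitivity of an observable $\mathcal{O}$ (e.g. $m_Z^2$) to a model parameter $p$:

```latex
\Delta(p)
\;\coloneqq\;
\left\vert
  \frac{\partial \, \ln \mathcal{O}}{\partial \, \ln p}
\right\vert
\;=\;
\left\vert
  \frac{p}{\mathcal{O}} \, \frac{\partial \mathcal{O}}{\partial p}
\right\vert
\,,
```

with large $\Delta$ taken as an indication of fine-tuning; Anderson–Castano propose normalizing such sensitivities by their average over parameter space, to correct for parameters to which the observable is generically sensitive.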
Critical comments are in
Kenneth Wilson, The Origins of Lattice Gauge Theory (arXiv:hep-lat/0412043)
Petr Horava, Surprises with Nonrelativistic Naturalness, Int. J. Mod. Phys. D25 (2016) 1645007 (arXiv:1608.06287)
Gordon Kane, String theory and the real world, Morgan & Claypool, 2017 (doi:10.1088/978-1-6817-4489-6)
James Wells, Naturalness, Extra-Empirical Theory Assessments, and the Implications of Skepticism (arXiv:1806.07289)
James Wells, Finetuned Cancellations and Improbable Theories (arXiv:1809.03374)
Matěj Hudec, Michal Malinský, Hierarchy and decoupling (arXiv:1902.04470)
Goran Senjanovic, Natural Philosophy versus Philosophy of Naturalness (arXiv:2001.10988)
Last revised on July 14, 2021 at 04:06:05. See the history of this page for a list of all contributions to it.