Archive for the ‘fine tuning’ Category

Last time, we looked at historian Richard Carrier’s article, “Neither Life nor the Universe Appear Intelligently Designed”. We found someone who preaches Bayes’ theorem but thinks that probabilities are frequencies, says that likelihoods are irrelevant to posteriors, and jettisons his probability principles at his leisure. In this post, we’ll look at his comments on the fine-tuning of the universe for intelligent life. Don’t get your hopes up.

Simulating universes

Here’s Carrier.

Suppose in a thousand years we develop computers capable of simulating the outcome of every possible universe, with every possible arrangement of physical constants, and these simulations tell us which of those universes will produce arrangements that make conscious observers (as an inevitable undesigned by-product). It follows that in none of those universes are the conscious observers intelligently designed (they are merely inevitable by-products), and none of those universes are intelligently designed (they are all of them constructed merely at random). Suppose we then see that conscious observers arise only in one out of every 10^{1,000,000} universes. … Would any of those conscious observers be right in concluding that their universe was intelligently designed to produce them? No. Not even one of them would be.

To see why this argument fails, replace “universe” with “arrangement of metal and plastic” and “conscious observers” with “driveable cars”. Suppose we could simulate the outcome of every possible arrangement of metal and plastic, and these simulations tell us which arrangements produce driveable cars. Does it follow that none of those arrangements could have been designed? Obviously not. This simulation tells us nothing about how actual cars are produced. The fact that we can imagine every possible arrangement of metal and plastic does not mean that every actual car is constructed merely at random. This wouldn’t even follow if cars were in fact constructed by a machine that produced every possible arrangement of metal and plastic, since the machine itself would need to be designed. The driveable cars it inevitably made would be the product of design, albeit via an unusual method.

Note a few leaps that Carrier makes. He leaps from bits in a computer to actual universes that contain conscious observers. He leaps from simulating every possible universe to producing universes “merely at random”. As someone who runs cosmological simulations myself, I can safely say that a computer program able to simulate every possible universe would require an awful lot of intelligent design. Carrier also seems to assume that a random process is undesigned. Tell that to these guys. Random number generators are a common feature of intelligently designed computer programs. This argument is an abysmal failure.

How to Fail Logic 101

Carrier goes on … (more…)


Read Full Post »

After a brief back and forth in a comments section, I was encouraged by Dr Carrier to read his essay “Neither Life nor the Universe Appear Intelligently Designed”. I am assured that the title of this essay will be proven “with such logical certainty” that all opposing views should be wiped off the face of Earth.

Dr Richard Carrier is a “world-renowned author and speaker”. That quote comes from none other than the world-renowned author and speaker, Dr Richard Carrier. Fellow atheist Massimo Pigliucci says,

The guy writes too much, is too long winded, far too obnoxious for me to be able to withstand reading him for more than a few minutes at a time.

I know the feeling. When Carrier’s essay comes to address evolution, he recommends that we “consider only actual scholars with PhD’s in some relevant field”. One wonders why, when we come to consider the particular intersection of physics, cosmology and philosophy wherein we find fine-tuning, we should consider the musings of someone with a PhD in ancient history. (A couple of articles on philosophy does not a philosopher make). Especially when Carrier has stated that there are six fundamental constants of nature, but can’t say what they are, can’t cite any physicist who believes that laughable claim, and refers to the constants of the standard model of particle physics (which every physicist counts as fundamental constants of nature) as “trivia”.

In this post, we will consider Carrier’s account of probability theory. In the next post, we will consider Carrier’s discussion of fine-tuning. The mathematical background and notation of probability theory were given in a previous post, and follow the discussion of Jaynes. (Note: probabilities can be either p or P, and both an overbar \bar{A} and tilde \sim A denote negation.)

Probability theory, a la Carrier

I’ll quote Carrier at length.

Bayes’ theorem is an argument in formal logic that derives the probability that a claim is true from certain other probabilities about that theory and the evidence. It’s been formally proven, so no one who accepts its premises can rationally deny its conclusion. It has four premises … [namely P(h|b), P(~h|b), P(e|h.b), P(e|~h.b)]. … Once we have [those numbers], the conclusion necessarily follows according to a fixed formula. That conclusion is then by definition the probability that our claim h is true given all our evidence e and our background knowledge b.

We’re off to a dubious start. Bayes’ theorem, as the name suggests, is a theorem, not an argument, and certainly not a definition. Also, Carrier seems to be saying that P(h|b), P(~h|b), P(e|h.b), and P(e|~h.b) are the premises from which one formally proves Bayes’ theorem. This confuses the terms in an equation with the premises of a derivation. Bayes’ theorem is derived from the axioms of probability theory – Kolmogorov’s axioms or Cox’s theorem are popular starting points. Any necessity in Bayes’ theorem comes from those axioms, not from the four numbers P(h|b), P(~h|b), P(e|h.b), and P(e|~h.b). (more…)
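To spell out where the necessity actually comes from, here is the short derivation in the notation above: apply the product rule both ways to P(h.e|b), then expand the denominator with the sum rule.

```latex
P(h|e.b)\,P(e|b) = P(h.e|b) = P(e|h.b)\,P(h|b)
\quad\Rightarrow\quad
P(h|e.b) = \frac{P(e|h.b)\,P(h|b)}{P(e|h.b)\,P(h|b) + P(e|\sim h.b)\,P(\sim h|b)}
```

The four quantities Carrier lists appear only as terms on the right-hand side.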

Read Full Post »

I recently read philosopher of science Tim Maudlin’s book Philosophy of Physics: Space and Time and thought it was marvellous, so I was expecting good things when I came to read Maudlin’s article for Aeon Magazine titled “The calibrated cosmos: Is our universe fine-tuned for the existence of life – or does it just look that way from where we’re sitting?“. I’ve got a few comments. Indented quotes below are from Maudlin’s article unless otherwise noted.

In a weekend?

Theories now suggest that the most general structural elements of the universe — the stars and planets, and the galaxies that contain them — are the products of finely calibrated laws and conditions that seem too good to be true. … The details of these sorts of calculations should be taken with a grain of salt. No one could sit down and rigorously work out an entirely new physics in a weekend.

Two quick things. “Theories” has a ring of “some tentative, fringe ideas” to the lay reader, I suspect. The theories on which one bases fine-tuning calculations are precisely the reigning theories of modern physics. These are not “entirely new physics” but the same equations (general relativity, the standard model of particle physics, stellar structure equations etc.) that have time and again predicted the results of observations, now applied to different scenarios. I think Maudlin has underestimated both the power of order-of-magnitude calculations in physics, and the effort that theoretical physicists have put into fine-tuning calculations. For example, Epelbaum and his collaborators, having developed the theory and tools to use supercomputer lattice simulations to investigate the structure of the C12 nucleus, write a few papers (2011, 2012) to describe their methods and show how their cutting-edge model successfully reproduces observations. They then use the same methods to investigate fine-tuning (2013). My review article cites upwards of a hundred papers like this. This is not a back-of-the-envelope operation, not starting from scratch, not entirely new physics, not a weekend hobby. This is theoretical physics.

Telling your likelihood from your posterior

It can be unsettling to contemplate the unlikely nature of your own existence … Even if your parents made a deliberate decision to have a child, the odds of your particular sperm finding your particular egg are one in several billion. … after just two generations, we are up to one chance in 10^27. Carrying on in this way, your chance of existing, given the general state of the universe even a few centuries ago, was almost infinitesimally small. You and I and every other human being are the products of chance, and came into existence against very long odds.

The slogan I want to invoke here is “don’t treat a likelihood as if it were a posterior”. That’s a bit too jargon-y. The likelihood is the probability of what we know, assuming that some theory is true. The posterior is the reverse – the probability of the theory, given what we know. It is the posterior that we really want, since it reflects our situation: the theory is uncertain, the data is known. The likelihood can help us calculate the posterior (using Bayes’ theorem), but in and of itself, a small likelihood doesn’t mean anything. The calculation Maudlin alludes to above is a likelihood: what is the probability that I would exist, given that the events that led to my existence came about by chance? This small likelihood doesn’t imply that the posterior – the probability that my existence came about by chance, given that I exist – is small, because the chance hypothesis has no comparable rivals: every available hypothesis makes my particular existence fantastically improbable. Brendon has explained this point elsewhere. (more…)
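To put numbers on this, here is a minimal sketch (all the probabilities are invented for illustration): when every available hypothesis assigns the same tiny likelihood to the evidence, the posterior doesn’t budge.

```python
# A minimal numerical sketch (probabilities invented for illustration):
# a tiny likelihood does not, by itself, make for a tiny posterior.

def posterior(prior_h, like_h, like_not_h):
    """P(h|e.b) via Bayes' theorem:
    P(e|h.b)P(h|b) / [P(e|h.b)P(h|b) + P(e|~h.b)P(~h|b)]."""
    return like_h * prior_h / (like_h * prior_h + like_not_h * (1 - prior_h))

# h = "the events leading to my existence came about by chance".
# The evidence (this exact person existing) is fantastically improbable
# on h -- but it is just as improbable on every rival hypothesis:
p = posterior(prior_h=0.5, like_h=1e-27, like_not_h=1e-27)
print(p)  # 0.5 -- the tiny likelihoods cancel; the posterior is unmoved
```

Only if a rival hypothesis made the evidence much more probable would the posterior for chance collapse.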

Read Full Post »

I’ve spent a lot of time critiquing articles on the fine-tuning of the universe for intelligent life. I should really give the other side of the story. Below are some of the good ones, ranging from popular level books to technical articles. I’ve given my recommendations for popular cosmology books here.

Books – Popular-level

  • Just Six Numbers, Martin Rees – Highly recommended, with a strong focus on cosmology and astrophysics, as you’d expect from the Astronomer Royal. Rees gives a clear exposition of modern cosmology, including inflation, and ends up giving a cogent defence of the multiverse.
  • The Goldilocks Enigma, Paul Davies – Davies is an excellent writer and has long been an important contributor to this field. His discussion of the physics is very good, and includes a description of the Higgs mechanism. When he strays into metaphysics, he is thorough and thoughtful, even when he is defending conclusions that I don’t agree with.
  • The Cosmic Landscape: String Theory and the Illusion of Intelligent Design, Leonard Susskind – I’ve reviewed this book in detail in a previous blog post. Highly recommended. I can also recommend his many lectures on YouTube.
  • The Constants of Nature, John Barrow – A discussion of the physics behind the constants of nature. An excellent presentation of modern physics, cosmology and their relationship to mathematics, which includes a chapter on the anthropic principle and a discussion of the multiverse.
  • Cosmology: The Science of the Universe, Edward Harrison – My favourite cosmology introduction. The entire book is worth reading, not least the sections on life in the universe and the multiverse.
  • At Home in the Universe, John Wheeler – A thoughtful and wonderfully written collection of essays, some of which touch on matters anthropic.

I haven’t read Brian Greene’s book on the multiverse but I’ve read his other books and they’re excellent. Stephen Hawking discusses fine-tuning in A Brief History of Time and The Grand Design. As usual, read anything by Sean Carroll, Frank Wilczek, and Alex Vilenkin.

Books – Advanced

  • The Anthropic Cosmological Principle, Barrow and Tipler – still the standard in the field. Even if you can’t follow the equations in the middle chapters, it’s still worth a read as the discussion is quite clear. Gets a bit speculative in the final chapters, but it’s fairly obvious where to apply your grain of salt.
  • Universe or Multiverse? (edited by Bernard Carr) – the new standard. A great collection of papers by most of the experts in the field. Special mention goes to the papers by Weinberg, Wilczek, Aguirre, and Hogan.

Scientific Review Articles

The field of fine-tuning grew out of the so-called “Large numbers hypothesis” of Paul Dirac, which owes a lot to Weyl and is further discussed by Eddington, Gamow and others. These discussions evolve into fine-tuning when Dicke explains them using the anthropic principle. Dicke’s method is examined and expanded in these classic papers of the field: (more…)

Read Full Post »

It’s always a nervous moment when, as a scientist, you discover that a documentary has been made on one of your favourite topics. Science journalism is rather hit and miss. So it was when the Australian Broadcasting Corporation (ABC), our public TV network, aired a documentary about the fine-tuning of the universe for intelligent life as part of their Catalyst science series. (I’ve mentioned my fine-tuning review paper enough, haven’t I?).

The program can be watched on ABC iView. (International readers – does this work for you?). It was hosted by Dr Graham Phillips, who has a PhD in Astrophysics. The preview I saw last week was promising. All the right people’s heads were appearing – Sean Carroll, Brian Greene, Paul Davies, Leonard Susskind, Lawrence Krauss, Charley Lineweaver. John Wheeler even got a mention.

Overall – surprisingly OK. They got the basic science of fine-tuning correct. Phillips summarises fine-tuning as:

When scientists look far into the heavens or deeply down into the forces of nature, they see something deeply mysterious. If some of the laws that govern our cosmos were only slightly different, intelligent life simply couldn’t exist. It appears that the universe has been fine-tuned so that intelligent beings like you and me could be here.

Not bad, though I’m not sure why it needed to be accompanied by such ominous music. There is a possibility for misunderstanding, however. Fine-tuning is a technical term in physics that roughly means extreme sensitivity of some “output” to the “input”. For example, if some theory requires an unexplained coincidence between two free parameters, then the “fine-tuning” of the theory required to explain the data counts against that theory. “Fine-tuned” does not mean “chosen by an intelligent being” or “designed”. It’s a metaphor.
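To illustrate the technical sense of the term with a toy example (nothing here is real physics): suppose a theory only produces some “output” when two of its free parameters coincide to within 1%. The fine-tuning can then be quantified as the fraction of the parameter space that delivers the output.

```python
import random

# Toy illustration of "fine-tuned" as extreme sensitivity of output to input
# (made-up model, not real physics): the "output" occurs only if two free
# parameters, each anywhere in [0, 1], agree to within 1%. Estimate the
# fraction of parameter space that works by Monte Carlo sampling.
random.seed(0)
trials = 100_000
hits = sum(1 for _ in range(trials)
           if abs(random.uniform(0, 1) - random.uniform(0, 1)) < 0.01)
print(hits / trials)  # roughly 0.02: the unexplained coincidence is costly
```

The smaller that fraction, the more finely tuned the theory – no designer required for the definition to apply.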

Ten minutes in, the only actual case of fine-tuning that had been mentioned was the existence of inhomogeneities in the early universe. Sean Carroll:

If the big bang had been completely smooth, it would just stay completely smooth and the history of the universe would be very, very boring. It would just get more and more dilute but you would never make stars, you would never make galaxies or clusters of galaxies. So the potential for interesting complex creatures like you and me would be there, but it would never actually come to pass. So we’re very glad that there was at least some fluctuation in the early universe.

Paul Davies then discussed the fact that there not only need to be such fluctuations, but they need to be not-too-big and not-too-small. Here’s the scientific paper, if you’re interested.

The documentary also has a cogent discussion of the cosmological constant problem – the “mother of all fine-tunings” – and the fine-tuning of the Higgs field, which is related to the hierarchy problem. Unfortunately, Phillips calls it “The God Particle” because “it gives substance to all nature’s other particles”. Groan.

Once we move beyond the science of fine-tuning, however, things get a bit more sketchy.

The Multiverse

Leonard Susskind opens the section on the multiverse by stating that the multiverse is, in his opinion, the only explanation available for the fine-tuning of the universe for intelligent life. At this point, both the defence and the prosecution could have done more.

Possibilities are cheap. Sean Carroll appears on screen to say “Aliens could have created our universe” and then is cut off. We are told that if we just suppose there is a multiverse, the problems of fine-tuning are solved. This isn’t the full story on two counts – the multiverse isn’t a mere possibility, and it doesn’t automatically solve the fine-tuning problem. (more…)

Read Full Post »

A commenter over at my post “Got a cosmology question?” asks:

Someone told me “there is not a single paper which finds fine tuning that has allowed multivariation”. Can you please refute this?

Incidentally, cosmology questions are still very welcome over there.

“Multivariation” is not a word, but in this context presumably means varying more than one variable at a time. There is an objection to fine-tuning that goes like this: all the fine-tuning cases involve varying one variable only, keeping all other variables fixed at their value in our universe, and then calculating the life-permitting range on that one variable. But, if you let more than one variable vary at a time, there turns out to be a range of life-permitting universes. So the universe is not fine-tuned for life.

This is a myth. The claim quoted by our questioner is totally wrong. The vast majority of fine-tuning/anthropic papers, from the very earliest papers in the ’70s until today, vary many parameters¹. I’ve addressed these issues at length in my review paper. I’ll summarise some of that article here.

The very thing that started this whole field was physicists noting coincidences between the values of a number of different constants and the requirements for life. Carter’s classic 1974 paper “Large number coincidences and the anthropic principle in cosmology” notes that in order for the universe to have both radiative and convective stars we must have (his equation 15, in more modern notation),

\alpha_G^{1/2} \approx \alpha^6 \beta^2

where, in Planck units, \alpha_G = m_{proton}^2, \alpha = e^2, \beta = m_{electron}/m_{proton}, and e is the charge on the electron. (Interestingly, Barrow and Tipler show that the same condition must hold for stars to emit photons with the right energy to power chemical reactions, e.g. photosynthesis.) Similarly for cosmological cases: for the universe to live long enough for stars to live and die, we must have,

|\kappa| \lesssim \left( \frac{\eta^2}{m_{proton}} \right)^{1/3} m_{proton}^3

where \kappa is related to the curvature of space and \eta is roughly the baryon to photon ratio.
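As a sanity check on the first of these relations, one can plug in measured values (rounded; the comparison is only meant to hold to an order of magnitude):

```python
import math

# Numerically check Carter's coincidence: alpha_G^(1/2) ~ alpha^6 * beta^2.
# Rounded standard values:
alpha   = 7.297e-3                # fine structure constant (e^2 in Planck units)
beta    = 9.109e-31 / 1.673e-27   # m_electron / m_proton (SI masses, ratio)
alpha_G = 5.906e-39               # (m_proton / m_Planck)^2

lhs = math.sqrt(alpha_G)          # ~7.7e-20
rhs = alpha**6 * beta**2          # ~4.5e-20
print(lhs / rhs)  # ~1.7: the sides agree to well within an order of magnitude
```

Note that the relation ties together three different constants at once, which is the point of this section.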

This continues in the classic anthropic papers. Carr and Rees (1979) show that, in order to have hydrogen left over from big bang nucleosynthesis to power stars, and to have supernovae distribute heavy elements, we must have (in Planck units, rearranging their equation 61),

m_{electron}^{-3/2} \sim g_w

where g_w is the weak coupling constant.

Barrow and Tipler’s “The Anthropic Cosmological Principle” shows that, for carbon and larger elements to be stable, we must have:

\alpha_s \lesssim 0.3 \alpha ^{1/2}

where \alpha_s is the strong force coupling constant (evaluated at m_Z, if you’re interested).

The whole point of these relations and more like them, which the early anthropic literature is entirely concerned with, is that they relate a number of different physical parameters. There are approximations in these calculations – they are order-of-magnitude – but this usually involves assuming that a dimensionless mathematical constant is approximately one. At most, a parameter may be assumed to be in a certain regime. For example, one may assume that \alpha and \beta are small (much less than one) in order to make an approximation (e.g. that the nucleus is much heavier than the electron, and the electron orbits non-relativistically). These approximations are entirely justified in an anthropic calculation, because we have other anthropic limits that are known to (not merely assumed to) involve one variable – e.g. if \beta is large, all solids are unstable to melting, and if \alpha is large then all atoms are unstable. See section 4.8 of my paper for more information and references.

More modern papers almost always vary many variables. Examples abound. Below is figure 2 from my paper, which shows Figures from Barr and Khan and Tegmark, Aguirre, Rees and Wilczek. (Seriously, people … Wilczek is a Nobel prize winning particle physicist and Martin Rees is the Astronomer Royal and former president of the Royal Society. These people know what they are doing.)

Figure 2 from my paper. The top two panels show the anthropic limits on the up-quark mass (x axis) and down-quark mass (y axis); nine anthropic limits are shown, and the life-permitting region is the green triangle in the top right plot. The lower two panels show cosmological limits on the cosmological constant (energy density) \rho_\Lambda, primordial inhomogeneity Q, and the matter density per CMB photon. Tegmark et al. derive from cosmology eight anthropic constraints on the seven-dimensional parameter space (\alpha, \beta, m_{proton}, \rho_\Lambda, Q, \xi,\xi_{baryon}). Tegmark and Rees (1997) derive the following anthropic constraint on the primordial inhomogeneity Q:

[Equation (1): Tegmark and Rees’s anthropic constraint on Q – equation image not reproduced]

Needless to say, there is more than one variable being investigated here. For more examples, see Figures 6, 7 (from Hogan), 8 (from Jaffe et al.) and 9 (from Tegmark) of my paper. The reason that the plots above only show two parameters at a time is because your screen is two dimensional. The equations and calculations from which these plots are constructed take into account many more variables than can be plotted on two axes.
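For what it’s worth, the structure of such multi-parameter plots is easy to sketch in code. The constraints below are made-up stand-ins (NOT the actual limits from Barr and Khan or Tegmark et al.); the point is only that every constraint is applied simultaneously at every point of a multi-dimensional grid:

```python
# Schematic of how multi-parameter anthropic plots are built. The three
# constraints are invented placeholders, not real physics: scan a grid of
# two toy parameters, apply every constraint at each point, and keep the
# points where all constraints hold at once.
def life_permitting(m_up, m_down):
    constraints = [
        m_up + m_down > 0.1,   # toy stand-in: "pions not too light"
        m_down - m_up > 0.05,  # toy stand-in: "proton lighter than neutron"
        m_up + m_down < 1.0,   # toy stand-in: "nuclei bind at all"
    ]
    return all(constraints)

grid = [(u / 100, d / 100) for u in range(100) for d in range(100)]
fraction = sum(life_permitting(u, d) for u, d in grid) / len(grid)
print(f"{fraction:.3f}")  # the "life-permitting" fraction of the toy plane
```

Real calculations do exactly this, with physically derived constraints and more dimensions than can be shown on a two-dimensional page.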

This myth may have started because, when fine-tuning is presented to lay audiences, it is often illustrated using one-parameter limits. Martin Rees, for example, does this in his excellent book “Just Six Numbers“. Rees knows that the limits involve more than one parameter – he derived many of those limits. But equation (1) above would be far too intimidating in a popular level book.

My paper lists about 200 publications relevant to the field. I can only think of a handful that only vary one parameter. The scientific literature does not simply vary one parameter at a time when investigating life-permitting universes. This is a myth, born of (at best) complete ignorance.


Postscript: The questioner’s discussion revolves around the article of Harnik, Kribs & Perez (2006) on a universe without weak interactions. It’s a very clever article. Their weakless universe requires “judicious parameter adjustment” and so is also fine-tuned. Remember that fine-tuning doesn’t claim that our universe is uniquely life-permitting, but rather that life-permitting universes are rare in the set of possible universes. Thus, the weakless universe is not a counterexample to fine-tuning. There are also concerns about galaxy formation and oxygen production. See the end of Section 4.8 of my paper for a discussion.


1. Even if fine-tuning calculations varied only one parameter, it wouldn’t follow that fine-tuning is false. Opening up more parameter space in which life can form will also open up more parameter space in which life cannot form. As Richard Dawkins (1986) rightly said: “however many ways there may be of being alive, it is certain that there are vastly more ways of being dead, or rather not alive.” For more, see section 4.2.2 of my paper.

More of my posts on fine-tuning are here.

Read Full Post »

Beginning with Hugh Ross, I undertook to critique various articles on the fine-tuning of the universe for intelligent life that I deemed to be woeful, or at least in need of correction. A list of previous critiques can be found here. I generally looked for published work, as correcting every blog post, forum or YouTube comment is a sure road to insanity. I was looking to maximise prestige of publication, “magic bullet” aspirations and wrongness about fine-tuning. I may have a new record holder.

It’s an article published in the prestigious British Journal for the Philosophy of Science by a professor of philosophy who has written books like “Introduction to the Philosophy of Science”. It claims to expose the “philosophical naivete and mathematical sloppiness on the part of the astrophysicists who are smitten with [fine-tuning]”. The numbers, we are told, have been “doctored” by a practice that is “shrewdly self-advantageous to the point of being seriously misleading” in support of a “slickly-packaged argument” with an “ulterior theological agenda”. The situation is serious, as [cue dramatic music] … “the fudging is insidious”. (Take a moment to imagine the Emperor from Star Wars saying that phrase. I’ll wait.)

It will be my task this post to demonstrate that the article “The Revenge of Pythagoras: How a Mathematical Sharp Practice Undermines the Contemporary Design Argument in Astrophysical Cosmology” (hereafter TROP, available here) by Robert Klee does not understand the first thing about the fine-tuning of the universe for intelligent life – its definition. Once a simple distinction is made regarding the role that Order of Magnitude (OoM) calculations  play in fine-tuning arguments, the article will be seen to be utterly irrelevant to the topic it claims to address.

Note well: Klee’s ultimate target is the design argument for the existence of God. In critiquing Klee, I am not attempting to defend that argument. I’m interested in the science, and Klee gets the science wrong.

Warning Signs

Klee, a philosopher with one refereed publication related to physics (the one in question), is about to accuse the following physicists of a rather basic mathematical error: Arthur Eddington, Paul Dirac, Hermann Weyl, Robert Dicke, Brandon Carter, Hermann Bondi, Bernard Carr, Martin Rees, Paul Davies, John Barrow, Frank Tipler1, Alan Lightman, William H. Press and Fred Hoyle. Even John Wheeler doesn’t escape Klee’s critical eye. That is quite a roll call. Eddington, Dirac, Weyl, Bondi, Rees, Hoyle and Wheeler are amongst the greatest scientists of the 20th century. The rest have had distinguished careers in their respective fields. They are not all astrophysicists, incidentally.

That fact should put us on edge when reading Klee’s article. He may, of course, be correct. But he is a philosopher up against something of a physicist dream team.

Klee’s Claim

The main claim of TROP is that fine-tuning is “infected with a mathematically sharp practice: the concepts of two numbers being of the same order of magnitude, and of being within an order of each other, have been stretched from their proper meanings so as to doctor the numbers”. The centrepiece of TROP is an examination of the calculations of Carr and Rees (1979, hereafter CR79) – “[this] is a foundational document in the area, and if the sharp practice infests this paper, then we have uncovered it right where it could be expected to have the most harmful influence”.

CR79 derives OoM equations for the levels of physical structure in the universe, from the Planck scale to nuclei to atoms to humans to planets to stars to galaxies to the whole universe. They claim that just a few physical constants determine all of these scales, to within an order of magnitude. Table 1 of TROP shows a comparison of CR79’s calculations to the “Actual Value”.

Klee notes that only 8 of the 14 cases fall within a factor of 10. Hence “42.8%” of these cases are “more than 1 order magnitude off from exact precision”. The mean of all the accuracies is “19.23328, over 1 order of magnitude to the high side”. Klee concludes that “[t]hese statistical facts reveal the exaggerated nature of the claim that the formulae Carr and Rees devise determine ‘to an order of magnitude’ the mass and length scales of every kind of stable material system in the universe”. Further examples are gleaned from Paul Davies’ 1982 book “The Accidental Universe”, and his “rudimentary” attempt to justify “the sharp practice” as useful approximations is dismissed as ignoring the fact that these numbers are still “off from exact precision – exact fine tuning”.

And there it is …

I’ll catalogue some of Klee’s mathematical, physical and astrophysical blunders in a later section, but first let me make good on my promise from the introduction – to demonstrate that this paper doesn’t understand the definition of fine-tuning. The misunderstanding is found throughout the paper, but is most clearly seen in the passage I quoted above:

[Davies’] attempted justification [of an order of magnitude calculation] fails. 10^2 is still a factor of 100 off from exact precision – exact fine-tuning – no matter how small a fraction of some other number it may be [emphasis added].

Klee thinks that fine-tuning refers to the precision of these OoM calculations: “exact precision” = “exact fine-tuning”. Klee thinks that, by pointing out that these OoM approximations are not exact and are sometimes off by more than a factor of 10, he has shown that the universe is not as fine-tuned as those “astrophysicists” claim.

Wrong. Totally wrong. (more…)

Read Full Post »
