
## Units First: A Fine-Tuned Critique of William Lane Craig (Part 3)

An emailer asked for my comments on this video, so I thought I’d post them here. It’s a video by William Lane Craig, with help from some nifty graphics and a narrator. Craig here defends the fine-tuning argument for the existence of God, as he has been doing for some time.

While Craig has done his homework on fine-tuning, the video has problems. I’ll be commenting here on the physics of fine-tuning, not the fine-tuning argument for God. I’ll leave the metaphysics to the philosophers, for now. (The previous two sentences will be copied and pasted into the comments section as many times as necessary.)

Before addressing the video, I’ve heard Craig say a few times that “there are about fifty constants and physical quantities simply given in the Big Bang themselves that if they were altered even to one part in a hundred million million million the universe would not have permitted the existence of life.” There can’t be 50 fine-tuned constants. There aren’t even 50 fundamental constants of nature, including cosmic initial conditions. There are, in the usual count, 31. (I have a sneaking suspicion that Craig is thinking of the large numbers of fine-tuning criteria compiled by Hugh Ross, which are of varying quality.)

Let’s look at the video; all quotes are from the transcript.

From galaxies and stars, down to atoms and subatomic particles, the very structure of our universe is determined by these numbers.

So far, so good.

Speed of Light: $c = 299,792,458 ~ m ~ s^{-1}$
Gravitational Constant: $G = 6.673 \times 10^{-11} ~ m^3~ kg^{-1} ~ s^{-2}$
Planck’s Constant: $1.05457148 \times 10^{-34} ~ m^2 ~ kg ~ s^{-2}$

The final value is actually the reduced Planck constant $(h / 2 \pi )$, and the units are wrong; it should be $m^2 ~ kg ~ s^{-1}$. But there’s a bigger problem here.

[Edit (14/4/2016). I don’t know why I didn’t realise this sooner, but the list of constants and their values is from my review paper. Including the mistake in the units of the reduced Planck constant. So the culprit, it turns out, is me! Oops.]

There is no fact about what number we should attach to a given length, mass or interval of time. There is only a fact about the ratio of two such quantities. When I report that I am 1.78 m tall, I am really reporting the ratio of my height to a standard unit of measurement. I can use any units I like, and so the number 1.78 is not a property of me alone. I am also 5 feet 10 inches tall, and 0.97 fathoms. The number only has meaning within a certain system of units, and the choice of units is arbitrary; it is for convenience only. (I thoroughly recommend the metric system.)
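To make the unit-relativity point concrete, here is a quick sketch (the conversion factors are the standard definitions; the height is the one quoted above) showing the same physical height expressed as three different numbers:

```python
# The same physical height, reported as a ratio to three different units.
height_m = 1.78                # metres
metres_per_foot = 0.3048       # exact, by definition
metres_per_fathom = 1.8288     # one fathom = 6 feet, exact

height_ft = height_m / metres_per_foot          # ~5.84 feet
height_fathoms = height_m / metres_per_fathom   # ~0.97 fathoms

feet = int(height_ft)              # whole feet: 5
inches = (height_ft - feet) * 12   # leftover inches: ~10.1
```

Same person, three numbers; only the ratios are physical.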

Thus, we cannot simply talk about the fine-tuning of any constant with units. Until we have specified the system of units, changing the number is meaningless. In fact, in the metric system, $c = 299,792,458 ~ m ~ s^{-1}$ is true by definition. Given the definition of the second, this equation doesn’t really state the speed of light; it defines the metre.

In fine-tuning, we want to explore different universes, that is, other ways that the universe could have been. Asking ‘what if the universe had $c = 299,792,459 ~ m ~ s^{-1}$?’ makes no sense – it would merely redefine our unit of length. More generally, specifying the numerical values of G, c or h doesn’t fully specify the physics of the universe in question. We might merely have the same universe as ours, described in different units.

To avoid these problems, we should first fix a system of units in a given possible universe. The best way is to choose three constants whose dimensions involve length, mass and time, and set their values. I think that the most useful system is Planck units, which set the three constants above $(c, G, \hbar)$ to one. These constants, then, cannot be changed, but any change to the other constants of nature really does change the physics of the universe.

Planck Mass-Energy: $1.2209 \times 10^{22}$ MeV

The Planck mass is $m_{Pl} = \sqrt{\hbar c / G}$. Thus, it is not an independent constant, and it depends only on the unit-fixing constants $(c, G, \hbar)$. Once we have specified that we are using Planck units, $m_{Pl} = 1$. It is doubly not fine-tuned: it is not fundamental, and it is not a free parameter.
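As a back-of-envelope check that the video's number is just a combination of $(c, G, \hbar)$, here is a sketch using standard SI values (not taken from the video):

```python
import math

# Standard SI values (slightly rounded)
hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 299792458.0          # speed of light, m/s (exact, by definition)
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2

m_planck = math.sqrt(hbar * c / G)            # Planck mass, ~2.18e-8 kg
J_per_MeV = 1.602176634e-13
E_planck_MeV = m_planck * c**2 / J_per_MeV    # Planck energy, ~1.22e22 MeV
```

The result reproduces the quoted $1.2209 \times 10^{22}$ MeV, with no new physical input at all.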

Mass of Electron, Proton, Neutron: 0.511; 938.3; 939.6 MeV

The mass of the electron is a fundamental constant, and can be expressed in Planck units as $m_e = 4.1849 \times 10^{-23}$. However, the masses of the proton and neutron are not fundamental, but derived. They have contributions from the strong force binding energy (about 780 MeV), the masses of the constituent (~10 MeV) and virtual (~150 MeV) quarks, and, for the proton, electromagnetism (~ 1 MeV).
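The dimensionless electron mass quoted above is just the SI electron mass divided by the Planck mass; a quick sketch with standard values:

```python
import math

hbar = 1.054571817e-34     # J s
c = 299792458.0            # m/s
G = 6.674e-11              # m^3 kg^-1 s^-2
m_electron = 9.1093837e-31 # electron mass, kg

m_planck = math.sqrt(hbar * c / G)         # ~2.18e-8 kg
m_e_planck_units = m_electron / m_planck   # dimensionless, ~4.18e-23
```

This recovers the $m_e \approx 4.18 \times 10^{-23}$ quoted in the text.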

Some very important fine-tuning cases consider the masses of the proton and the neutron, and particularly their mass difference. Most recently, see Hall, Pinner, and Ruderman. But these should be translated into limits on the relevant fundamental constants: up, down and strange quark mass, Higgs vev, QCD scale and strength of electromagnetism.

Mass of Up, Down, Strange Quark: 2.4; 4.8; 104 MeV (Approx.)

Now we’re talking. See, for example, Figure 2 of my review paper.

Ratio of Electron to Proton Mass: $(1836.15)^{-1}$

Clearly not a fundamental parameter. Again, we can consider the effects of changing this number, but we should translate our findings to be in terms of the fundamental constants. Otherwise, we will overstate the degree of fine-tuning by artificially inflating the number of required tunings.

Gravitational Coupling Constant: $5.9 \times 10^{-39}$

This number, known as $\alpha_G$, is in fact the square of the ratio of the proton mass to the Planck mass. So, having decided to work in Planck units, it is simply the proton mass squared, and is a derived rather than fundamental parameter.
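A quick check that $\alpha_G = (m_p / m_{Pl})^2$ reproduces the quoted value, using standard SI constants:

```python
import math

hbar = 1.054571817e-34     # J s
c = 299792458.0            # m/s
G = 6.674e-11              # m^3 kg^-1 s^-2
m_proton = 1.67262192e-27  # proton mass, kg

m_planck = math.sqrt(hbar * c / G)
alpha_G = (m_proton / m_planck) ** 2   # ~5.9e-39, dimensionless
```

So $\alpha_G$ carries no information beyond the proton mass, once units are fixed.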

Cosmological Constant: $2.3 \times 10^{-3} eV$

Good. A fundamental constant and perhaps the best example of a fine-tuned parameter.

Hubble Constant: 71 km/s/Mpc (today)

Nope. Given a certain set of cosmological parameters, we can compute the entire expansion history of the universe. That is, we can calculate its relative size at all times. But there is one parameter that remains: when is now? At what point in cosmic history are we observing?

This is the age of the universe, and can equivalently be specified by the Hubble constant, since the age is (roughly) $1/H_0$. So this is not a fundamental parameter. To consider a universe with $1/H_0$ increased by 86,400 seconds is simply to consider this universe tomorrow.
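To see that $1/H_0$ really is an age-of-universe-scale quantity, here is a sketch converting the video's 71 km/s/Mpc into a Hubble time (conversion factors are standard):

```python
# Convert the Hubble constant from conventional units into a Hubble time 1/H0.
H0_km_s_Mpc = 71.0
m_per_Mpc = 3.0857e22                      # metres in a megaparsec
H0 = H0_km_s_Mpc * 1000.0 / m_per_Mpc      # s^-1, ~2.3e-18

seconds_per_year = 3.156e7
hubble_time_Gyr = 1.0 / H0 / seconds_per_year / 1e9   # ~13.8 Gyr
```

The result is of order the age of the universe, as claimed; adding a day to it changes nothing of physical interest.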

I think that the video means to refer to the fine-tuning of the universe’s expansion rate. But this fine-tuning case is about the expansion rate at very early times, not today. (Victor Stenger made exactly this mistake in his book on fine-tuning. See Appendix A.1 of my review paper.)

Higgs Vacuum Expectation Value: 246.2 GeV

An excellent example of a fine-tuned constant; see, for example, Agrawal et al. 1998.

Scientists have come to the shocking realization that each of these numbers have been carefully dialed to an astonishingly precise value – a value that falls within an exceedingly narrow, life-permitting range. If any one of these numbers were altered by even a hair’s breadth, no physical, interactive life of any kind could exist anywhere.

Actually, some of the fundamental constants don’t seem to have a significant effect on life – e.g. the various mixing-matrix angles and the QCD vacuum phase.

Don’t get me wrong: fine-tuning, as a phenomenon in physics, is real. But not every fundamental constant is fine-tuned.

Consider gravity, for example. The force of gravity is determined by the gravitational constant. If this constant varied by just one in $10^{60}$ parts, none of us would exist. … the universe would either have expanded and thinned out so rapidly that no stars could form and life couldn’t exist, or it would have collapsed back on itself with the same result: no stars, no planets, no life.

This is referring to the flatness problem, though expressing it in terms of the gravitational constant is a little unusual. The full case is as follows: for the universe to live long enough and create structure, we require that at early times,

$\left| \frac{8 \pi G \rho_i}{3 H^2_i} - 1 \right| \lesssim \epsilon_i ,$

where $H_i$ is the expansion rate, $\rho_i$ is the mass-energy density, and $\epsilon_i$ is a small number. If we take the “early time” to be one second (as BBN starts), then $\epsilon_i \sim 10^{-16}$; if the Planck time, then $\epsilon_i \sim 10^{-60}$. So, if (not taking my advice above) we do not fix G with our choice of units, and we consider the density and expansion rate to be fixed at a certain time, then we do indeed have a tight constraint on the gravitational constant. However, this is an unusual way of presenting the case, and in particular, it obscures the dependence on the initial conditions of the universe.
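The bracketed quantity in the condition above is just $|\Omega - 1|$, the fractional deviation of the density from the critical density $\rho_c = 3H^2 / 8\pi G$. A minimal sketch, using a present-day $H_0$ of roughly 71 km/s/Mpc (the function and variable names are mine, for illustration):

```python
import math

G = 6.674e-11    # m^3 kg^-1 s^-2
H0 = 2.30e-18    # s^-1, roughly 71 km/s/Mpc

# Critical density: the density at which the bracketed term vanishes exactly.
rho_crit = 3.0 * H0**2 / (8.0 * math.pi * G)   # ~9.5e-27 kg/m^3

def flatness_deviation(rho, H):
    """|8*pi*G*rho / (3*H^2) - 1|, i.e. |Omega - 1|."""
    return abs(8.0 * math.pi * G * rho / (3.0 * H**2) - 1.0)
```

At exactly the critical density the deviation is zero; the fine-tuning claim is that at early times it had to be within $\epsilon_i$ of zero, with $\epsilon_i$ as small as $10^{-60}$ at the Planck time.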

However, this case of fine-tuning also implies that our cosmological model only explains our observations of the universe (not just its being life-permitting) for a seemingly narrow range of its parameters. It has thus motivated the development of inflationary theory, according to which the universe undergoes a rapid burst of accelerating expansion in its earliest stages. If inflation happened (and lasted long enough), then the condition above is not particularly surprising. Also, there are further theoretical reasons to view the flatness problem with suspicion. This is not a typical fine-tuning case: a priori arguments and dynamical mechanisms might explain it without simply shifting the problem elsewhere. Note, however, that many inflationary models do exactly this, exchanging one fine-tuned parameter for another.

Or consider the expansion rate of the universe. This is driven by the cosmological constant. A change in its value by a mere 1 part in $10^{60}$ would cause the universe to expand too rapidly or too slowly. In either case, the universe would, again, be life-prohibiting.

Two minor quibbles. Firstly, the cosmological constant does not necessarily drive the expansion of the universe, so this case should not be referred to as fine-tuning “the expansion rate of the universe”. It is the fine-tuning of the cosmological constant. Secondly, the problem is not that the universe might expand too slowly. The problem is that a negative value for the cosmological constant will, regardless of the expansion rate at any particular time, eventually cause the universe to transition from expansion to contraction and recollapse. If the cosmological constant is too large and negative, the universe will not live long enough to create life (e.g. Peacock).

Or, another example of fine-tuning: If the mass and energy of the early universe were not evenly distributed to an incomprehensible precision of 1 part in $10^{10^{123}}$, the universe would be hostile to life of any kind.

Again, a choice has been made here between technical precision and explaining at a popular level. This case is really about the initial entropy of the universe, which is known to be very low because the early universe is very nearly perfectly homogeneous. I think I can forgive this one.

For future reference, the fundamental parameters upon which interesting fine-tuning constraints can be placed are as follows. From particle physics: the Higgs vev, the masses (or, equivalently, Yukawa parameters) of the electron, the up, down and strange quarks, and the neutrinos, the strong force coupling constant, and the fine-structure constant. From cosmology: the cosmological constant, the scalar fluctuation amplitude (“lumpiness”, Q), the number of spacetime dimensions, the baryonic and dark matter mass-to-photon ratios, and the initial entropy of the universe.

After a few quotes, the rest of the video is philosophy. While the video needs a minor overhaul, the science upon which Craig wants to make his case is sound, in my opinion, and in the opinion of many other scientists, believing and non-believing alike. The pressing question is: what, if anything, should we conclude from fine-tuning?

### 10 Responses

1. Martin Rodriguez, on May 6, 2016 at 2:05 am

Hey Luke,

Another author says that our universe is not fine-tuned at all, contrary to what you say. Is he right?

http://arxiv.org/abs/1505.05359

2. I am certainly no expert on this, but to me the abstract indicates not that fine-tuning doesn’t exist, but that it isn’t a sign of a Designer or a Multiverse.

Further, he turns the problem on its head, saying that life is fine-tuned to the universe (perhaps then taking the universe as a brute fact?), not the other way around.

People more knowledgeable than me will have to expand 🙂

3. Without reading it in detail, he seems to be making a similar argument to that of Halvorson (http://philsci-archive.pitt.edu/11004/). I don’t find it convincing. I see no real argument for the conclusion that “Our Universe has not been fine-tuned for life: life has been fine-tuned to our Universe”. I have a half-written draft about such issues. Stay tuned.

• Other critique ideas include Colyvan and Priest, Hugh Ross, Jimmy Licon, and that Iron Chariots page on fine-tuning.

• I’m also addressing Colyvan. I hadn’t heard of Licon – I’ll have a look.

4. Martin Rodriguez, on May 9, 2016 at 9:08 am

I too have a doctorate in astrophysics (I am from South America, sorry about English! My dissertation was on FLRW cosmology) that I earned a few years back and have read up some of the sources that ‘Michael’ gave.

I looked up Jimmy Licon. He seems to be a philosophy graduate student, who simply argues that the universe is of poor design; nothing new.

I wish to digress a little: another philosopher, John Roberts, seems to have managed to dent some of the attacks against fine-tuning by some like Bradley and Jonathan Weisberg. It is impressive, but somewhat speculative; please see: http://www.apologeticsinthechurch.com/uploads/7/4/5/6/7456646/infraredbullseye-philstud-final-wtp.pdf – I understand the Friedmann equations, but not Bayesian probability 😦

In regards to Iron Chariots, that page is horrendously flawed, and irrelevant.

Although I am not a proponent of ID per se, I am Roman Catholic, although I tend to stay to the likes of Luke Barnes on fine tuning, rather than those like Hugh Ross (don’t even get me started on people like Jason Lisle or Danny Faulkner, making us Christians all look like a bunch of idiots).

Luke Barnes is impartial; unlike Stenger on one end, and Hugh Ross on the other. This is why I love this blog so much. No nonsense or lies. 🙂

Thanks Luke! Please get back to me if you can.

5. Yeah; the Klaas Landsman paper really flies against Luke Barnes.

6. Martin Rodriguez, on May 16, 2016 at 11:37 pm

Hi Luke, when is your next post coming?

-Martin

7. It seems that Dr. Craig took note of his mistakes and corrected them. See: https://www.youtube.com/watch?v=EE76nwimuT0