Today I’ll be looking at a paper on the fine-tuning of the universe by Professor Fred Adams. He is professor of physics at the University of Michigan, where his main field of research is astrophysical theory focusing on star formation, background radiation fields, and the early universe.
Fred Adams published a paper in 2008 entitled “Stars In Other Universes: Stellar structure with different fundamental constants”. The paper garnered some interest from the science blogosphere and popular science magazines. Here are the relevant parts of the abstract:
Motivated by the possible existence of other universes, with possible variations in the laws of physics, this paper explores the parameter space of fundamental constants that allows for the existence of stars. To make this problem tractable, we develop a semi-analytical stellar structure model. [We vary] the gravitational constant G, the fine structure constant α, and a composite parameter C that determines nuclear reaction rates. Our main finding is that a sizable fraction of the parameter space (roughly one fourth) provides the values necessary for stellar objects to operate through sustained nuclear fusion. As a result, the set of parameters necessary to support stars are not particularly rare.
The first thing to admit is that I am a galaxies-and-larger astronomer. Stars aren’t my speciality, so I can’t provide an in-depth assessment of the model.
That said, this seems to be a very nice piece of work, and the sort of careful, detailed research that the fine-tuning field needs. Section 4 of the paper, titled “Unconventional Stars”, is an interesting discussion of the possibility that objects like radiating black holes could fill the role of stars in other universes. Adams is also aware that he has considered only one life-permitting criterion, and so doesn’t overstate his conclusions by passing judgement on all claims of fine-tuning.
I have one minor quibble, and one major quibble. The minor quibble is the use of the parameter C. This parameter is related to the nuclear reaction rates. We are told that C “depends in a complicated manner on the strong and weak forces, as well as the particle masses”. Actually, the expression for C (equation 18) also contains the fine structure constant, which doesn’t help. Later in the paper, Adams varies C by a factor of 100 in either direction. But we have no way of knowing what a variation in C means for the strong force, or the weak force, or the particle masses. It would have been much more instructive to vary a fundamental constant, even if it was just the strong force (keeping the weak force and particle masses constant), because of the centrality of the strong force to the properties of stars.
This is a minor quibble, because it wouldn’t surprise me if Adams’ range of C corresponds to a reasonable range of the strong force. I’m fairly sure Adams knows what he’s doing.
My major quibble regards the claim that “a sizable fraction of the parameter space (roughly one fourth) provides the values necessary for stellar objects to operate through sustained nuclear fusion”. I’m always sceptical of these sorts of probability estimates, and in this case it is with good reason. The basis of this claim is figure 5:
G is the gravitational constant, and α is the fine structure (electromagnetism) constant. The triangle is our universe, and the area under the line is where stars are permitted. The dashed (dotted) line is for C increased (decreased) by a factor of 100.
We see that, indeed, roughly one fourth of the plot is under the solid line. However, Adams has made two unstated assumptions that are crucial to the probability estimate. The first is the limits of the plot. Ten orders of magnitude seems like a large range, but remember that the strong force is stronger than gravity by nearly 40 orders of magnitude. If we extended the parameter space accordingly, the probability drops to about 9%. We could make this probability as small as we like by enlarging the parameter space, and Adams has given us no reason not to do so.
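The arithmetic behind this objection is simple enough to sketch. The numbers below are purely illustrative (a hypothetical star-permitting region of fixed size, not Adams’ actual stellar-structure calculation): hold the permitted area in the log(G)–log(α) plane fixed, widen the plot limits, and watch the “probability” shrink.

```python
# Toy illustration (hypothetical region, NOT Adams' actual result):
# a fixed star-permitting area in log-parameter space occupies an
# ever smaller fraction as the assumed plot limits grow.
permitted_area = 25.0  # square decades, held fixed

for half_width in (5, 10, 20):  # axes span 10^-half_width .. 10^+half_width
    total_area = (2 * half_width) ** 2  # square decades in the log-log plane
    fraction = permitted_area / total_area
    print(f"limits 10^±{half_width}: fraction = {fraction:.1%}")
```

With limits of 10^±5 the fraction is 25%; at 10^±20 it has already fallen below 2%, and nothing in the model tells us where to stop widening.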
More important is Adams’ use of a logarithmic scale. This is equivalent to assuming that the probability density function (PDF) for a model parameter is uniform in the logarithm of that parameter over the allowed range.
Adams has not justified this assumption, so what happens if we change it? If, instead, we were to plot in normal (not log) space – i.e. take the PDF to be uniform in the parameter itself – then the probability would drop from 25% to a tiny fraction of a percent. And if we extend the possible range of the gravitational force to where it is as strong as the strong force, then the probability drops to something like 1 part in 10^42.
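To see why the choice of prior matters so much, here is a toy calculation (the cutoff is hypothetical, not Adams’ actual stability boundary): suppose stars form only when G is less than 10 times our universe’s value, with G allowed to range over ten orders of magnitude. The same region gets wildly different probabilities under the two priors.

```python
import math

# Toy model (hypothetical cutoff, NOT Adams' actual boundary): stars
# form only when G < 10 in units of our universe's G, and G is allowed
# to range over 10^-5 .. 10^5 in those units.
g_min, g_max, g_cut = 1e-5, 1e5, 10.0

# Prior uniform in log G: fraction of decades below the cutoff.
p_log = (math.log10(g_cut) - math.log10(g_min)) / (
    math.log10(g_max) - math.log10(g_min)
)

# Prior uniform in G itself: almost all probability mass sits near g_max.
p_lin = (g_cut - g_min) / (g_max - g_min)

print(f"log-uniform prior:    {p_log:.0%}")   # a healthy fraction
print(f"linear-uniform prior: {p_lin:.2e}")   # ~1e-4
```

Same life-permitting region, same limits: 60% under the log-uniform prior, about 1 in 10,000 under the linear one. Widen the allowed range of G and the linear-prior figure collapses further still.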
The word “robust” in statistics has a very specific meaning. A statistical estimate is robust if it remains relatively unchanged by reasonable changes in the model’s assumptions. Any probability estimate that we are free to alter by 42 orders of magnitude is worse than non-robust. It is meaningless. Adams should at least acknowledge this problem – if he can’t justify his prior probability distribution for his parameters, then he can’t make a probability estimate at all. His claim that “a sizable fraction of the parameter space (roughly one fourth) provides the values necessary for [stars]” is thus utterly baseless.
My biggest concern is that when Adams’ work is cited in the context of the fine-tuning of the universe for life, the figure of 25% is simply accepted as scientifically proven. Here are some quotes (mostly from internet articles and blogs):
Serious scientists such as Fred Adams have clearly negated the fine-tuning claim. About a QUARTER of the Adams’ universes turned out to be populated by energy-generating stars.
Adams reckons his results suggest that the “specialness” of our universe could well be an illusion. (From New Scientist)
Fred Adams has investigated the problem of the life times of stars … His conclusion was that stable, long-lived stars existed in vast regions of parameter space. (Steve Zara and Øystein Elgarøy)
Fred Adams has recently shown that all this talk of the universe being fine-tuned is based on a false, simplistic premise.
Recent studies by Fred Adams indicate that the existence of life may not even require fine tuning at all.
Adams’ work cannot support these claims. Even if the figure of 25% were robust, it still wouldn’t follow that fine-tuning has been “negated” – there are plenty more fine-tuning claims that Adams hasn’t addressed (even with regard to stars), and hasn’t claimed to address. But most importantly, he could have just as easily concluded that only 1 part in 10^42 of parameter space allows for stars.
The question of the prior PDF for the fundamental constants of nature is a very difficult one – where do we start? A multiverse proposal (see Ellis), like the string landscape, should be able to calculate it – in theory. In the absence of such a model, we would need to reason very carefully, considering the full consequences of any assumptions we make. That being the case, the best approach is to carefully separate these two questions:
1. What range of the fundamental constants is life-permitting?
2. What is the probability that a universe, chosen at random from the range of possible universes, will fall in the life-permitting range?
We are on much safer ground answering the first question.
It’s the same as the difference between asking, “where on the dartboard do I need to hit to score a bullseye?”, and asking, “what is the probability that a bullseye will be hit in a single throw?” Answering the first question is necessary but not sufficient to answer the second. The answer to the second question depends crucially on how the darts are thrown.
More of my posts on fine-tuning are here.