Intelligent Design, the best explanation of Origins

This is my personal virtual library, where I collect information that, in my view, points to Intelligent Design as the best explanation for the origin of the physical Universe, life, and biodiversity.



Fine-tuning of the Big Bang

Posted by Admin on Fri Aug 22, 2014, 2:05 pm

http://reasonandscience.catsboard.com/t1866-fine-tuning-of-the-big-bang

The Big Bang was the most precisely planned event in all of history

Professor Stephen Hawking:
'If the rate of expansion one second after the Big Bang had been smaller by even one part in a hundred thousand million million, the Universe would have recollapsed before it ever reached its present state.'

Steven Weinberg, Department of Physics, University of Texas:
There are now two cosmological constant problems. The old cosmological constant problem is to understand in a natural way why the vacuum energy density ρV is not very much larger. We can reliably calculate some contributions to ρV , like the energy density in fluctuations in the gravitational field at graviton energies nearly up to the Planck scale, which is larger than is observationally allowed by some 120 orders of magnitude. Such terms in ρV can be cancelled by other contributions that we can’t calculate, but the cancellation then has to be accurate to 120 decimal places.

When one calculates, based on known principles of quantum mechanics, the "vacuum energy density" of the universe, focusing on the electromagnetic force, one obtains the incredible result that empty space "weighs" about 10^93 g per cubic centimetre (cc). The actual average mass density of the universe, about 10^-28 g per cc, differs by some 120 orders of magnitude from theory. 5 Physicists, who have fretted over the cosmological constant paradox for years, have noted that calculations such as the above involve only the electromagnetic force, and so perhaps when the contributions of the other known forces are included, all terms will cancel out to exactly zero, as a consequence of some unknown fundamental principle of physics. But these hopes were shattered with the 1998 discovery that the expansion of the universe is accelerating, which implied that the cosmological constant must be slightly positive. This meant that physicists were left to explain the startling fact that the positive and negative contributions to the cosmological constant cancel to 120-digit accuracy, yet fail to cancel beginning at the 121st digit.
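To see where the famous "120 orders of magnitude" figure comes from, here is a rough back-of-the-envelope sketch in Python using the two densities quoted above. The numbers are order-of-magnitude values only, not a real quantum field theory calculation.

```python
import math

# Rough back-of-the-envelope sketch of the discrepancy described above.
# Both numbers are the order-of-magnitude values quoted in the text,
# not the result of a real quantum field theory calculation.

naive_vacuum_density_g_per_cc = 1e93    # naive QFT estimate, ~10^93 g per cc
observed_density_g_per_cc     = 1e-28   # measured mean density, ~10^-28 g per cc

ratio = naive_vacuum_density_g_per_cc / observed_density_g_per_cc
print(f"theory / observation ~ 10^{math.log10(ratio):.0f}")
# -> roughly 10^121, i.e. a discrepancy of about 120 orders of magnitude
```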

Curiously, this observation is in accord with a prediction made by Nobel laureate and physicist Steven Weinberg in 1987, who argued from basic principles that the cosmological constant must be zero to within one part in roughly 10^123 (and yet be nonzero), or else the universe either would have dispersed too fast for stars and galaxies to have formed, or else would have recollapsed upon itself long ago. In short, numerous features of our universe seem fantastically fine-tuned for the existence of intelligent life. While some physicists still hold out for a "natural" explanation, many others are now coming to grips with the notion that our universe is profoundly unnatural, with no good explanation.

Lee Smolin wrote in his 2006 book The Trouble with Physics:
We physicists need to confront the crisis facing us. A scientific theory [the multiverse/ Anthropic Principle/ string theory paradigm] that makes no predictions and therefore is not subject to experiment can never fail, but such a theory can never succeed either, as long as science stands for knowledge gained from rational argument borne out by evidence.

Max Tegmark:
“How far could you rotate the dark-energy knob before the “Oops!” moment? If rotating it…by a full turn would vary the density across the full range, then the actual knob setting for our Universe is about 10^-123 of a turn away from the halfway point. That means that if you want to tune the knob to allow galaxies to form, you have to get the angle by which you rotate it right to 123 decimal places!”

That means that the probability that our universe contains galaxies is akin to 1 possibility in 10^123, a 1 followed by 123 zeros. Unlikely doesn’t even begin to describe these odds. There are “only” 10^81 atoms in the observable universe, after all. 4
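Tegmark's knob picture is just arithmetic, and a tiny sketch makes the point. The assumption here (mine, for illustration) is that a full turn of the knob sweeps the dark-energy density across its natural Planck-scale range, while the observed value sits only about 10^-123 of that range away from the halfway point.

```python
from decimal import Decimal

# Sketch of the knob arithmetic in the quote above. Assumption (for
# illustration only): a full turn sweeps the dark-energy density across its
# natural Planck-scale range, and the observed value sits about 10^-123 of
# that range away from the halfway point (zero).

offset_as_fraction_of_turn = Decimal("1e-123")        # how far from halfway the knob must sit
offset_in_degrees = offset_as_fraction_of_turn * 360  # the same offset, as an angle

print(f"knob offset: {offset_as_fraction_of_turn} of a turn ({offset_in_degrees} degrees)")
# -> knob offset: 1E-123 of a turn (3.60E-121 degrees)
```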

The cosmological constant
The smallness of the cosmological constant is widely regarded as the single greatest problem confronting current physics and cosmology. The cosmological constant is a term in Einstein’s equation that, when positive, acts as a repulsive force, causing space to expand and, when negative, acts as an attractive force, causing space to contract. Apart from some sort of extraordinarily precise fine-tuning or new physical principle, today’s theories of fundamental physics and cosmology lead one to expect that the vacuum (that is, the state of space-time free of ordinary matter and fields) has an extraordinarily large energy density. This energy density, in turn, acts as an effective cosmological constant, thus leading one to expect an extraordinarily large effective cosmological constant, one so large that it would, if positive, cause space to expand at such an enormous rate that almost every object in the Universe would fly apart, and would, if negative, cause the Universe to collapse almost instantaneously back in on itself. This would clearly make the evolution of intelligent life impossible.

What makes it so difficult to avoid postulating some sort of highly precise fine-tuning of the cosmological constant is that almost every type of field in current physics (the electromagnetic field, the Higgs fields associated with the weak force, the inflaton field hypothesized by inflationary cosmology, the dilaton field hypothesized by superstring theory, and the fields associated with elementary particles such as electrons) contributes to the vacuum energy. Although no one knows how to calculate the energy density of the vacuum, when physicists estimate the contributions to the vacuum energy from these fields, they get values of the energy density anywhere from 10^53 to 10^120 times higher than its maximum life-permitting value, ρ_max. (Here ρ_max is expressed in terms of the energy density of empty space.) 3
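To get a feel for what "cancellation to roughly 120 decimal places" means, here is a toy illustration using Python's arbitrary-precision Decimal type. The digits are made up; the point is only that the negative contributions would have to match the positive ones digit for digit through about 120 decimal places to leave the tiny observed remainder.

```python
from decimal import Decimal, getcontext

# Toy illustration (made-up digits) of what "cancellation to ~120 decimal
# places" means: an order-one positive vacuum-energy contribution (in Planck
# units) whose sum with the negative contributions must land near 10^-122 of
# either term. Not a physical calculation.

getcontext().prec = 130  # need well over 120 significant digits to see the effect

positive_contributions = Decimal("0." + "4" * 125)  # order-one positive term (arbitrary digits)
required_total         = Decimal("1e-122")          # tiny observed effective constant
negative_contributions = required_total - positive_contributions

print(negative_contributions)
# The negative term must agree with the positive one digit for digit through
# roughly the first 120 decimal places; change any one of those digits and the
# total shifts by many orders of magnitude relative to the observed value.
```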

The atheist scientists Dyson, Kleban, and Susskind, from the Department of Physics at Stanford University, write in their paper
Disturbing Implications of a Cosmological Constant:
The low entropy starting point is the ultimate reason that the universe has an arrow of time, without which the second law would not make sense. However, there is no universally accepted explanation of how the universe got into such a special state. Far from providing a solution to the problem, we will be led to a disturbing crisis. Present cosmological evidence points to an inflationary beginning and an accelerated de Sitter end. Most cosmologists accept these assumptions, but there are still major unresolved debates concerning them. For example, there is no consensus about initial conditions. Neither string theory nor quantum gravity provide a consistent starting point for a discussion of the initial singularity or why the entropy of the initial state is so low. High scale inflation postulates an initial de Sitter starting point with Hubble constant roughly 10^-5 times the Planck mass. This implies an initial holographic entropy of about 10^10, which is extremely small by comparison with today’s visible entropy. Some unknown agent initially started the inflaton high up on its potential, and the rest is history.

We are forced to conclude that in a recurrent world like de Sitter space our universe would be extraordinarily unlikely. A possibility is that an unknown agent intervened in the evolution, and for reasons of its own restarted the universe in the state of low entropy characterizing inflation. 4
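The "holographic entropy of about 10^10" figure can be checked on the back of an envelope, assuming the authors have in mind the standard de Sitter horizon entropy S = π/H² in Planck units (one quarter of the horizon area), with the quoted H ≈ 10^-5 in Planck units.

```python
import math

# Back-of-the-envelope check of the "holographic entropy of about 10^10"
# figure in the quote above, assuming the standard de Sitter horizon entropy
# S = pi / H^2 in Planck units (one quarter of the horizon area, with
# horizon radius 1/H).

H = 1e-5                     # inflationary Hubble rate in Planck units, as quoted
entropy = math.pi / H**2     # de Sitter horizon entropy, in units of Boltzmann's constant

print(f"S ~ {entropy:.1e}")  # -> ~3.1e+10, i.e. "about 10^10"
```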

Big Bang models attribute an energy density (the amount of energy per cubic centimetre) to the initial state of the cosmos, as well as an initial rate of expansion of space itself. The subsequent evolution of the universe depends sensitively on the relation between this energy density and the rate of expansion. Pack the energy too densely and the universe will eventually recontract into a big crunch; spread it out too thin and the universe will expand forever, with the matter diluting so rapidly that stars and galaxies cannot form. Between these two extremes lies a highly specialised history in which the universe never recontracts and the rate of expansion eventually slows to zero. In the argot of cosmology, this special situation is called Ω = 1. Cosmological observation reveals that the value of Ω for the visible universe at present is quite near to 1. This is, by itself, a surprising finding, but what’s more, the original Big Bang models tell us that Ω = 1 is an unstable equilibrium point, like a marble perfectly balanced on an overturned bowl. If the marble happens to be exactly at the top it will stay there, but if it is displaced even slightly from the very top it will rapidly roll faster and faster away from that special state.

This is an example of cosmological fine-tuning. In order for the standard Big Bang model to yield a universe even vaguely like ours now, this particular initial condition had to be just right at the beginning.
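The marble-on-an-overturned-bowl behaviour can be seen in a minimal toy model. For a matter-only Friedmann universe (my simplification, not the full Big Bang calculation), the density parameter at a later scale factor a follows analytically from its initial value Omega_0 set at a = 1: Omega(a) = Omega_0 / (Omega_0 + (1 - Omega_0) * a). A universe balanced exactly at Omega_0 = 1 stays there, while even a tiny offset keeps growing.

```python
# Minimal toy model of the instability described above: a matter-only
# Friedmann universe (a simplification, not the full Big Bang calculation).
# With the density parameter Omega_0 defined at an initial moment (a = 1),
# its value at a later scale factor a follows from the Friedmann equation:
#     Omega(a) = Omega_0 / (Omega_0 + (1 - Omega_0) * a)

def omega(a, omega_0):
    """Density parameter at scale factor a, given its value omega_0 at a = 1."""
    return omega_0 / (omega_0 + (1.0 - omega_0) * a)

for omega_0 in (1.0, 1.0 + 1e-6, 1.0 - 1e-6):   # balanced, slightly over, slightly under
    history = [f"{omega(a, omega_0):.6f}" for a in (1, 1e3, 1e5)]
    print(omega_0, history)

# The omega_0 = 1 universe stays at exactly 1 forever, while the perturbed ones
# drift ever further away: toward recollapse (Omega > 1) or runaway dilution
# (Omega < 1) -- the marble rolling off the overturned bowl.
```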

1. https://evo2.org/big-bang-precisely-planned/
2. http://aeon.co/magazine/science/why-does-the-universe-appear-fine-tuned-for-life/
3. God and Design: The Teleological Argument and Modern Science, page 180
4. https://arxiv.org/pdf/hep-th/0208013v3.pdf
5. https://phys.org/news/2014-04-science-philosophy-collide-fine-tuned-universe.html
