All living things on earth ultimately obtain their energy from the sun, as do the wind and water cycles. Nuclear fusion reactions power the sun. In theory, as its nuclear fuel ‘burns up’, the sun’s core should shrink and heat up, and this would make the reactions occur more readily. Therefore, the sun should shine more brightly as it ages (see panel below).
But this means that, if the billions of years were real, the sun would have been much fainter in the past. However, there is no evidence that the sun was fainter at any time in the earth’s history. Astronomers call this the ‘faint young sun paradox’, but it is no paradox at all if the sun is only as old as the Bible says—about 6,000 years.
Evolutionists and long-agers believe that life appeared on the earth about 3.8 billion years ago. But if that timescale were true, the sun would be 25% brighter today than it was back then. This implies that the early earth would have been frozen, with an average temperature of about –3°C. However, most paleontologists believe that, if anything, the earth was warmer in the past.1 The only way around this is to make arbitrary and unrealistic assumptions of a far greater greenhouse effect at that time than exists today,2 requiring about 1,000 times more CO2 in the atmosphere than there is today.3
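The arithmetic behind figures like these can be sketched as follows. This is a rough illustration only, using assumptions not stated in the article: Gough’s (1981) standard approximation for solar luminosity over time, a present solar age of about 4.57 billion years, and the crude simplification that the earth’s surface temperature scales with the fourth root of solar luminosity (albedo and greenhouse warming held fixed at today’s values).

```python
# Rough sketch of the conventional 'faint young sun' arithmetic.
# Assumed (not from the article): Gough (1981) luminosity approximation,
#   L(t) = L_now / (1 + (2/5) * (1 - t / t_now)),
# with t the sun's age and t_now ~ 4.57 Gyr.

T_NOW = 4.57           # assumed present solar age, Gyr
T_SURFACE_NOW = 288.0  # present mean surface temperature, K (~15 degC)

def relative_luminosity(gyr_ago):
    """Solar luminosity 'gyr_ago' billion years ago, as a fraction of today's."""
    t = T_NOW - gyr_ago
    return 1.0 / (1.0 + 0.4 * (1.0 - t / T_NOW))

def surface_temp(gyr_ago):
    """Scale today's surface temperature by L^(1/4) -- a crude assumption
    that holds albedo and greenhouse warming fixed at present values."""
    return T_SURFACE_NOW * relative_luminosity(gyr_ago) ** 0.25

print(f"Luminosity 3.8 Gyr ago: {relative_luminosity(3.8):.2f} of today's")
# -> 0.75, i.e. the early sun ~25% fainter than today
print(f"Implied mean surface temperature: {surface_temp(3.8) - 273.15:.0f} °C")
# -> about -5 °C under these crude assumptions, near the article's -3 °C figure
```

The point of the sketch is only to show where a sub-freezing early-earth temperature comes from in the long-age calculation; more careful climate models vary in the exact numbers.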
The scientific evidence is thus consistent with the sun having the age we would expect from a straightforward reading of the Bible. In 6,000 years or so, there would have been no significant increase in the sun’s energy output. The ‘faint young sun’ is a problem only for old-age ideas.