Many scientists believe the earth is several billion years old, but creationists believe it is only thousands. Which view is true, and can either be supported through scientific calculation and observable phenomena? I believe the earth can be fairly accurately shown to be quite young, but let's first take a look at the calculations made by many brilliant men and women, expert scientists in their respective fields, who believe it is actually several billion years old.

Radiometric dating is used by naturalist scientists today to show the earth is quite old, but how does it work, and is this method accurate? Carbon-14 is a radioactive isotope of carbon. It is unstable and decays into nitrogen-14, but the process takes thousands of years: half of the carbon-14 found in a bone, for example, would take about 5,730 years to decay. Only things that are, or at one time were, alive contain carbon-14. Neither rock nor the fossils found in rock can be dated this way, because rocks were never alive and the fossils found in them have turned to rock. This is why fossils found in layers of rock are dated according to the perceived age of the rock layer, not the fossil itself. Some other method must therefore be used to date the various rock formations.
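The half-life idea above follows a simple exponential rule: after each half-life, half of whatever remains is gone. A minimal sketch, using the commonly cited 5,730-year figure (the function name is my own):

```python
HALF_LIFE_C14 = 5730  # years, the commonly cited half-life of carbon-14

def fraction_remaining(t_years, half_life=HALF_LIFE_C14):
    # After each half-life, half of the remaining carbon-14 is gone.
    return 0.5 ** (t_years / half_life)

print(fraction_remaining(5730))   # one half-life  -> 0.5
print(fraction_remaining(11460))  # two half-lives -> 0.25
```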
Uranium dating is often used to measure the age of rocks in which uranium is found. Uranium decays into lead, and the idea is that one can measure the age of a rock by observing how much lead it contains. It takes about 700 million years for half of a rock's uranium-235 to turn into lead, but to use this method of dating rocks, one must make three major assumptions. First, we must assume that all of the lead in the rock came from uranium, yet this isn't always true. Lead already present in the rock strata when they formed would give a falsely ancient age for the rock layer. Secondly, scientists must assume no uranium has leaked out of the rock formation. This is unrealistic, however, if we know the rocks were under water, as would be the case in a flood: uranium salts would leach out and dissolve in the water, giving the rock strata a greater apparent age when measured by this method of dating. Similar problems affect potassium-argon dating. For example, the Hawaiian volcano Kilauea erupted in 1823, and some of its lava spilled into the ocean. When that lava was subsequently dated with potassium-argon, it measured 22 million years old when it should not have been more than about 200 years old. Evidently the highly soluble potassium salts had leached out, giving the molten rock a false age. Likewise, the 1801 eruption of Hualalai in Hawaii was dated at 160 million to 3 billion years old in those deposits that were submerged in the ocean.
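The first assumption can be illustrated with a toy calculation (my own sketch, not a published model): standard decay algebra infers age from the lead-to-uranium ratio, so any lead present at formation inflates the result.

```python
import math

HALF_LIFE_U235 = 700e6  # years, the approximate figure used in the text

def apparent_age(lead_to_uranium_ratio, half_life=HALF_LIFE_U235):
    # Standard decay algebra: daughter/parent = 2**(t/T) - 1,
    # so the inferred age is t = T * log2(1 + daughter/parent).
    return half_life * math.log2(1 + lead_to_uranium_ratio)

# With no lead, the inferred age is zero...
print(apparent_age(0.0))
# ...but lead equal to just 10% of the uranium, if present at
# formation, would be read as roughly 96 million years of decay.
print(round(apparent_age(0.10) / 1e6))  # ~96 (million years)
```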
Finally, scientists must assume the rate of decay has remained constant since the rocks were formed. For example, the flux of neutrinos from cosmic radiation could have been enhanced in the past, or reversals of Earth's magnetic field, or even the explosion of a nearby star, could have affected the rate of decay. So the decay rates of these radioactive isotopes are not as settled as might at first be thought; they must simply be assumed to have always matched the rates we observe today.
Therefore, radiometric dating is not so much an accurate method of measuring the age of the earth as it is a method of thinking. There are other natural means of dating the earth, but these have been rejected by naturalist scientists because their worldview requires a universe that is billions of years old. Nevertheless, the earth's magnetic field has been measured for over 100 years, and scientists have kept accurate records of its strength. Dr. Thomas G. Barnes, a physicist, calculated that the earth's magnetic field has a half-life of only about 1,400 years. If so, there are limits to how strong the magnetic field could once have been while still permitting life on our planet. Even at only 10,000 years ago, Earth's magnetic field would have been comparable to that of a magnetic star, under which conditions life would be impossible. Therefore, according to this theory, our planet must be younger than 10,000 years.
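Taking the 1,400-year half-life figure at face value, the backward extrapolation in this argument can be sketched as follows (a decaying quantity doubles every half-life as you go back in time):

```python
HALF_LIFE_FIELD = 1400  # years, Barnes's figure from the text

def field_multiplier(years_ago, half_life=HALF_LIFE_FIELD):
    # Extrapolating backward, a quantity with half-life T
    # doubles every T years into the past.
    return 2 ** (years_ago / half_life)

# At 10,000 years ago, the field would have been about
# 2**(10000/1400), i.e. roughly 141 times today's strength.
print(round(field_multiplier(10_000)))
```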
Another theory concerns cosmic dust. The earth and the moon were calculated to be about the same age, but when the astronauts landed on the moon they found only a few inches of dust, where several feet were expected if the moon were indeed billions of years old. There is no water or wind on the moon to erode the dust layer on its surface, so many scientists were astonished to find such a thin layer of cosmic dust there.
The naturalist **must** have something like a Big Bang theory to account for the existence of the universe. His worldview is wrapped up in the theory of evolution, which requires great ages to have any place in the realm of believability. If there were no theory of evolution, there would be no need for a Big Bang!
 That is, it takes about 700 million years for half of the existing uranium-235 in a rock to decay into lead-207, but about 4.5 billion years for half of the uranium-238 in a rock formation to decay into lead-206. So the rate of decay depends upon the type of uranium in question.
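 The difference between the two series can be seen by comparing how much of each isotope decays over the same span. A small sketch of that comparison, using the round figures above:

```python
# Approximate half-lives of the two uranium decay series from the text.
HALF_LIVES = {
    "U-235 -> Pb-207": 0.7e9,  # about 700 million years
    "U-238 -> Pb-206": 4.5e9,  # about 4.5 billion years
}

def fraction_decayed(t_years, half_life):
    # Complement of the surviving fraction 0.5**(t/T).
    return 1 - 0.5 ** (t_years / half_life)

# Over the same 1 billion years, the two isotopes decay
# by very different amounts.
for name, half_life in HALF_LIVES.items():
    print(f"{name}: {fraction_decayed(1e9, half_life):.0%} decayed")
```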
 If you liked this article, you may want to read Creation.com's article "How Old Is the Earth?" It contains much more information than I could offer in a short blog post here. Moreover, most if not all of their articles are written by Christian scientists who are authorities on the subjects that folks like me are only able to read about.