Let’s continue the story from the last blog. Scientists have a real problem when it comes to dating old things – our methods are very unreliable and give a wide range of dates (as seen last time). That means that artifacts are dated by theory rather than by hard science.
Science has every right to observe the present state of a system (or thing), including any changes it is undergoing at the moment, such as slowly changing from one element to another – uranium to lead, rubidium to strontium, etc. Scientists also have every right to measure the rate of the processes they observe. However, when they try to extrapolate their observations into the very distant past or future, they have to put assumptions into their raw data, and that makes all the difference. I'll use an illustration originally given by John Morris (I think).
Let’s say you are listening to a boring lecturer – me, perhaps. You don’t want to listen any more so you look around the meeting hall and see a man sitting in a chair peeling potatoes. There is a basket of unpeeled potatoes to his left and a basket of peeled potatoes to his right. As you watch him, he picks up an unpeeled potato about the time the second hand of the hall’s clock gets to 12. He finishes peeling the potato and tosses it into the basket with other peeled potatoes in time to pick up another unpeeled potato as the second hand once again gets to 12. You watch this for ten or fifteen minutes and see that the pattern remains unchanged. You have now observed and measured the available data but you have a question: how long has this been going on? You go look at the basket of peeled potatoes and see 20 in there. You take the data you have and determine that he has been peeling potatoes for twenty minutes and that seems good – it sounds like science… but it isn’t.
You have made an assumption about the rate: it has always been constant, it has never sped up or slowed down throughout its history, and there has never been a single interruption or interference in this process.
You assumed the rate of potato peeling had always been constant because it was constant during the short period of time you observed it. But what if the man used to take longer to peel a potato but now was much better at it and had found his rhythm? What if he had been much faster in the past but was now tired and slowing down?
You also assumed that the basket of peeled potatoes consisted only of those peeled by that man at that rate. You weren’t there when the basket was placed and so you don’t know if it already contained some potatoes or if it once contained many more and, somehow, some of them were removed.
These three assumptions kill your “scientific” findings. You assumed something about the process rate which cannot be verified. You assumed the process was isolated from other systems (no one added potatoes) and you assumed you knew the conditions that existed at the beginning of the process (the basket now containing peeled potatoes was empty).
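The potato illustration can be put into numbers. Here is a small sketch (the counts and rates are the hypothetical ones from the story) showing how the naive age calculation works, and how each hidden assumption moves the answer:

```python
# Naive "age" of the peeling process: count divided by observed rate.
# This assumes the rate was always constant and the basket started empty.
def naive_age(peeled_count, observed_rate_per_min):
    return peeled_count / observed_rate_per_min

# What you observed: 1 potato per minute, 20 peeled potatoes in the basket.
print(naive_age(20, 1.0))  # -> 20.0 minutes

# But if 10 potatoes were already in the basket when it was set down,
# the man has only been peeling for 10 minutes, not 20.
print(naive_age(20 - 10, 1.0))  # -> 10.0 minutes

# And if he peeled at half speed before he found his rhythm,
# the same 20 potatoes took longer than the naive answer says.
slow_phase = 10 / 0.5   # 10 potatoes at 0.5 per minute = 20 minutes
fast_phase = 10 / 1.0   # 10 potatoes at 1 per minute   = 10 minutes
print(slow_phase + fast_phase)  # -> 30.0 minutes
```

Same basket, same observed rate, three different "ages" – the answer depends entirely on which assumptions you feed in.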
And those are the same assumptions made by those who use radiometric dating systems and Carbon 14 to date artifacts.
But it gets worse – some of the evidence is shamefully mishandled in order to make it seem as if the earth were older than it is. A case in point is Niagara Falls. I used to live just a four-hour drive from the falls and went there often. They are now artificially stabilized (at great cost and great effort – the engineering that went into that is really quite impressive), but it had been observed for a long time that the falls eroded at the rate of 4 to 5 feet each year. They erode the Niagara escarpment, bringing the falls closer to Lake Erie as they wear away the boundary between Lake Erie and Lake Ontario. They are now seven miles from Lake Ontario, so scientists who measured their rate of erosion back in the 1700s and 1800s extrapolated the rate backward in time and said that they began their erosion (and were thus "formed") 9,000 years ago. If the rate has always been constant, they were right.
But what if there were more water in the past? Why is it necessary to think that the entire seven miles were eroded out by these falls; in other words, what did it look like when this geology was laid down by God, ice, or time?
When Charles Lyell – the father of modern geology and the inventor of the concept of vast ages for rocks and the earth – arrived at the falls in 1841, he completely disregarded all the earlier measurements of how quickly the escarpment eroded. The measured rate didn't give him the age he needed the falls to be, so he junked the 4-5 feet per year and claimed they eroded only 1 foot a year, giving the falls an age of 35,000 years. It was deceptive, based not on observation but on the needs of his theory – the same way rocks are dated today, as we saw in the last blog.
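The arithmetic behind both figures is simple division. A quick check, using the rates quoted above and 5,280 feet to the mile:

```python
# Distance the falls have retreated: seven miles, in feet.
retreat_ft = 7 * 5280  # 36,960 feet

# At the measured 4-5 feet per year, extrapolating backward:
print(retreat_ft / 5)  # -> 7392.0 years
print(retreat_ft / 4)  # -> 9240.0 years (hence the ~9,000-year figure)

# At Lyell's claimed 1 foot per year:
print(retreat_ft / 1)  # -> 36960.0 years (roughly his 35,000-year figure)
```

Notice that nothing in the division tells you which rate is right – the age you get out is entirely controlled by the rate you assume going in.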
And Lyell's dates have since been discarded because 35,000 years isn't old enough for the evolutionary processes (and geologic time) required by Darwin's theory. So the numbers are now multiplied several times over without even pretending to measure the rate of erosion.
When we measure something radiometrically, such as the rate of uranium turning into lead, we make an assumption: that the rate of decay never changed from the day the rocks were laid down (by what or Whom is another question) to the present day. There is no reason to assume this, but it is assumed anyway. Remember that the half-life of uranium (we are speaking here of uranium-238, which changes to lead-206) is 4.51 billion years. We are assuming – though we have only been measuring for much less than 100 years – that the rate has not changed during that time.
It is also assumed that neither the parent (in this case, uranium) nor the daughter (in this case, lead) element had its concentration changed in the bit of rock we are measuring… and since the earth is made of rocks, we are always only measuring a trillionth of one percent of the available rock. Even picking our rock sample is adjusting the ratio of uranium to lead, but that is never mentioned in textbooks. Most scientists are good and decent people, and they try very hard to find a specimen that looks clean and hasn't been interfered with by the cosmos, ground water, etc. Still, it is a tiny specimen, and we have to assume what the ratio of elements was at its beginning before we can measure the rate and guess at an age.
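For readers who want to see exactly where these assumptions enter the math: the standard age formula comes from the decay law, and it only works if you assume a constant rate and a known starting ratio. Here is a minimal sketch using the 4.51-billion-year half-life quoted above; the daughter/parent ratios are hypothetical numbers chosen only to show how sensitive the result is to the assumed starting conditions:

```python
import math

HALF_LIFE_U238 = 4.51e9  # years, as quoted above

def radiometric_age(daughter_to_parent_ratio, half_life):
    """Standard age formula: t = (half_life / ln 2) * ln(1 + D/P).

    Built-in assumptions: the decay rate was always constant, and ALL
    of the measured daughter element came from in-place decay (none was
    present at the start, none was added or removed since).
    """
    return (half_life / math.log(2)) * math.log(1 + daughter_to_parent_ratio)

# Suppose we measure a lead-206 to uranium-238 ratio of 0.10 (hypothetical):
age_all_radiogenic = radiometric_age(0.10, HALF_LIFE_U238)
print(f"{age_all_radiogenic:.3g} years")    # ~620 million years

# Now suppose 20% of that lead was in the rock from the beginning,
# so only a ratio of 0.08 is actually from decay:
age_with_initial_lead = radiometric_age(0.08, HALF_LIFE_U238)
print(f"{age_with_initial_lead:.3g} years")  # ~501 million years
```

The same physical measurement yields ages roughly 120 million years apart, depending purely on what you assume the rock started with – and the starting ratio is exactly the thing no one was there to measure.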
And when the guesses don’t match fossils found in the area, the ages are changed, NOT the presumed ages of the fossils, but the ages of the rocks. The theory dates the rocks every single time.
We KNOW that leaching and contamination occur in every rock system. That should be enough to junk all our dates, but, instead, this is just ignored.
But there's more… and it's even worse. We can actually see new rock being laid down. How exciting! Here's a chance to take new rock, examine the ratio of elements in it, measure it, and come up with an age of zero or nearly zero. And yet, every time we do this, we come up with dates radically different from what we know to be true.
For example, Sunset Crater in Arizona is a relatively young volcano that blew in the recent past. Native American artifacts are found in its lava, including the remains of villages and agricultural tools. The Native Americans say this happened just over 900 years ago, and tree ring data (which records events such as local eruptions) agrees, dating the eruption to AD 1065. The lava flows were dated by every known radiometric method, and the results were embarrassing. Rather than telling us that the rocks were 900-1000 years old, as we KNOW they are, they gave us dates of around 230,000 years old. When questioned about this, defenders of the methods will normally tell you that excess argon threw off the dates… but they will not discuss the possibility of excess elements throwing off other dates they've touted for over a century now.
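The "excess argon" explanation can itself be illustrated with the standard age formula t = (half-life / ln 2) × ln(1 + D/P). This is a hedged, simplified sketch – it uses potassium-40's roughly 1.25-billion-year half-life and ignores the branching of K-40 decay into calcium-40, so the numbers are only illustrative – but it shows how even a trace of argon that never came from in-place decay reads as an enormous age:

```python
import math

HALF_LIFE_K40 = 1.25e9  # years (simplified; K-Ar branching ratio ignored)

def apparent_age(daughter_to_parent_ratio, half_life):
    # Standard age formula: assumes ALL the daughter element is from decay.
    return (half_life / math.log(2)) * math.log(1 + daughter_to_parent_ratio)

# A rock only ~1,000 years old has produced essentially no argon itself.
# But suppose it trapped excess argon amounting to just one part in ten
# thousand of its potassium-40 (a hypothetical trace amount):
print(apparent_age(1e-4, HALF_LIFE_K40))  # ~180,000 "years"
```

A contamination far too small to notice turns a 1,000-year-old rock into one "dated" at hundreds of thousands of years – the same ballpark as the Sunset Crater result.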
This happens every single time we know the age of the rocks being measured. In New Zealand, Mount Rangitoto blew up 300 years ago (proven by tree damage, carbon 14, and native accounts). Radiometric dating puts the 300-year-old rocks at more than 485,000 years old.
You will be told by Grand Canyon guides that volcanoes in that region are very, very young and blew only 10,000 years ago but Native Americans, who have only lived in that region for a few thousand years, talk about the eruptions as happening in their history. When some of the rocks were dated, they came back with an age of 117 million years. Somebody is a wee bit off here.
Kaupelehu in Hawaii erupted in 1800 and 1801. When those rocks were dated, they showed dates from 140 million to 3 billion years old, even though the scientists dating them knew them to be just over 200 years old. Salt Lake Crater on Oahu has the same problem. So does Kilauea. Rocks from Kilauea known to have been formed about two hundred years ago routinely give dates of 12 to 21 million years (+/- 8 million).
So we see dishonest handling of evidence here, yet again. When the rate is known (Niagara), it is changed or ignored to fit the theory. When the age is known and the dating methods we currently possess are shown to be farcical, we confidently assert the dates anyway.
This sort of thing happens around Darwinists far too often. It made me uncomfortable when I walked with them and tried to be one of them. And then I saw how they handled other evidence and I cringed even more. We’ll look at that next time.
I know some of you are wondering if Christians handle evidence faithfully at all times. No, they don’t. But they are much better at dealing with evidence fairly than the folk referenced in this blog. I’ll prove that eventually. Hang in there. We have a fair distance to go before we are done with this subject.