Posted by:
Homeless
Date: January 25, 2013 04:20AM
Michaelm,
I read the other two links. Now I see where those on the board are getting these ideas. You are regurgitating the literature like a "science Ensign" on the living room table. For example, one of the links has this cute bit of writing:
"Imagine you have two clocks. One thinks there are 86,400 seconds in a day, the other thinks that there are 86,401, so the second clock runs a tad bit slower than the first [i.e., but at a constant rate]. Every day, it's one second behind, clicking over to midnight one second after the first clock does [at a constant rate]. Mind you, it keeps accurate time according to its own gears: every day has 86,401 seconds, so it's not slowing down [constant rate].
"However, to keep it synchronized with the other clock, we'd either have to subtract a second from the second clock (yikes, terminology is a bit confusing there!) or add one to the first clock every day. So we'd need a leap second every day, but not because the clock is slowing. It's only because it runs at a different [but constant] rate."
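To make the quoted scenario concrete, here is a little Python sketch of my own (not from the article) of the constant-rate case it describes: two clocks, 86,400 vs. 86,401 seconds to a day, drifting apart by about one second every day.

```python
# The quoted two-clock scenario: both rates are CONSTANT, yet a
# one-second correction is needed every single day.
SECONDS_PER_DAY_A = 86_400  # clock A's idea of a day
SECONDS_PER_DAY_B = 86_401  # clock B's idea of a day (runs a tad slower)

def offset_after(days):
    """Accumulated disagreement, in A-seconds, after `days` of A-time."""
    # In one A-day, B completes only 86400/86401 of its own day, so it
    # falls behind by roughly one second per day, at a constant rate.
    return days * SECONDS_PER_DAY_A * (1 - SECONDS_PER_DAY_A / SECONDS_PER_DAY_B)

print(round(offset_after(1), 2))    # about 1 second behind after one day
print(round(offset_after(365), 0))  # about 365 seconds after a year: purely linear
```

Notice the drift here grows in a straight line, which is exactly the "constant rate" assumption I take issue with below.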
My comments: The logic in this explanation breaks down when applied to the "real world" because, yes, we have two clocks, but the definition of a "second" is the same for both of them. It is the "second" defined by the atomic clock, which was settled in 1967. As such, the logic in the article shatters like glass. The reason the seconds need to be added is that the earth's rotation is *NOT* constant. It slows down by 1 millisecond EACH MONTH, according to the US Navy. Thus, the "second" you would get from the earth's rotation is different each month, because the rate of spin is not constant.
The quote above, however, assumes that each clock defines a second at its own separate "CONSTANT RATE," and as such it confuses the readers (like everyone here who believes the logic) into thinking it's simply two clocks out of sync. The fact is, however, that there is only ONE definition of a second, which is the atomic second.
Look at it this way; I will make it dramatic. Imagine having a clock on the wall that did not keep consistent time. Each day the hands went slower and slower and slower, until they stopped. That's the earth's rotation. If the earth stopped spinning, that would be the end of time, because the earth would no longer be the earth as we know it. No one would be around to change the atomic clocks. Everyone would be dead, let's presume.
However, before the clock stopped, you would need to keep adding seconds to the good clock, the atomic clock, which didn't keep slowing down. You would need to measure, in "atomic seconds," how long the "slowing" clock takes to have its hands go around twice for one day. Eventually, measuring the slowing clock would no longer be possible, since the hands would stop and the atomic clock would have to add infinitely many seconds to the count, making it meaningless.
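To put some numbers on my wall-clock illustration, here is a toy Python model (the 2 ms figure is made up purely for illustration) where each day of the slowing clock is a little longer than the last when measured in atomic seconds. Unlike the constant one-second-a-day case above, the correction itself keeps growing:

```python
# Toy model (hypothetical numbers): a "slowing" clock whose day, measured
# in atomic seconds, grows a little every day. The correction needed to
# stay synchronized grows along with it, and if the hands ever stopped,
# the required correction would be unbounded.
ATOMIC_DAY = 86_400.0     # SI (atomic) seconds per reference day
SLOWDOWN_PER_DAY = 0.002  # hypothetical: each day is 2 ms longer than day 0

def correction_on_day(n):
    """Leap correction (SI seconds) needed on day n of the toy model."""
    day_length = ATOMIC_DAY + n * SLOWDOWN_PER_DAY  # the day keeps lengthening
    return day_length - ATOMIC_DAY                  # grows without bound

print(correction_on_day(1))     # about 0.002 s on day 1
print(correction_on_day(1000))  # about 2 s on day 1000: not a constant rate
```

The point of the sketch: the needed correction on day 1000 is a thousand times the correction on day 1, which is exactly what a constant-rate two-clock picture cannot produce.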
Why is that? Because the "second" is defined by the atomic clock, *not* by the rotation of the hands of the clock that is slowing down.
And therefore, it is not, as the article implies, a matter of getting two "clocks" with two constant rates in sync; the "atomic second" is used for *both* clocks, and one runs at a constant rate while the other does not.
Now back to reality. The earth has not stopped spinning, of course, but the dramatic illustration makes the point clearer; it's too hard to see when talking about a few seconds of time. The reason the atomic clock needs to be adjusted with a "leap second" is that the earth is not spinning at a constant rate; it is always going slower and slower. The atomic clock, which defines the second, is not slowing down, and therefore it needs to be adjusted for the earth slowing down.
My question is: what amount of "time" is added each year because the earth is slowing down? Until I see the calculation behind the .005 seconds per year, the "accepted rate" among scientists, whatever that means, the Navy-calculated rate of .365 seconds per year sounds more credible to me. However, that means that if we take the .365 rate back 4.5 billion years, it adds roughly 52 years of time, which is unreasonable. The 6000-year assumption adds only about 36 minutes.
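To check that multiplication myself, here is the straight product in Python, treating the .365 seconds-per-year figure as constant over the whole span (the simplest reading of it; a rate that itself changed over time would give a different total):

```python
# Straight extrapolation of the .365 s/yr figure, held constant.
RATE = 0.365             # leap time added per year, in seconds
YEAR = 365 * 86_400      # seconds in a (365-day) year

# Accumulated leap time over each assumed age of the earth.
old_earth_years = RATE * 4.5e9 / YEAR   # 4.5 billion years, result in years
young_earth_minutes = RATE * 6_000 / 60 # 6,000 years, result in minutes

print(round(old_earth_years, 1))    # roughly 52 years of accumulated time
print(round(young_earth_minutes, 1))  # 36.5 minutes
```

That is just arithmetic on the quoted rate; it says nothing about whether holding the rate constant over 4.5 billion years is itself reasonable.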
I appreciate these four links. After reading them, I can see where the idea proliferated that the "leap second" is just a recalibration of two different clocks: it comes from a **bad assumption**, two clocks with constant rates, which leads to the wrong conclusion.
Now that I've identified that bad assumption, that should clear up the misunderstanding for you. It is also a great example of why I look for the core "hidden" assumptions: so I can make sure I'm choosing the right ones.