JoeSixPack
Why is it that the multitude of electronic clocks that now inhabit nearly
every device and appliance in most households are so often fast or slow,
frequently by 15 minutes or more? Oddly enough, the most accurate clock I
own is a mechanical wall clock that I haven't touched for over 6 months,
and it still doesn't need setting.
The answer is that mechanical clocks had achieved a high degree of accuracy
before they were largely replaced by digital ones. One way they achieved
this was by capturing the information gained each time the clock was reset
to the correct time: if such a clock had to be advanced 3 minutes after
running for 24 hours, a regulating mechanism sped it up by 3 minutes per
24-hour period. That's a very significant feat to accomplish mechanically,
but it would be much easier in the circuitry of a digital clock. For some
reason this wasn't done, at least not in any devices I'm familiar with.
I know that the clock in Windows is now reset from a central time server,
but that's a crude solution to what should have been a simple design
feature. It would have been a simple thing to calculate the time elapsed
between clock settings and adjust the speed of the clock accordingly, by
correcting the factors that are applied to the quartz crystal frequency.
Large jumps, such as moving to a different time zone or a daylight saving
adjustment, could easily be filtered out of the correction factor.
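To make the idea concrete, here's a minimal sketch of the kind of drift compensation described above. The function name, the data format (a history of "clock read X, was set to Y" pairs), and the 30-minute jump threshold are all my own assumptions for illustration, not anything an actual clock vendor implements this way:

```python
from datetime import datetime, timedelta

def correction_factor(settings, jump_threshold=timedelta(minutes=30)):
    """Estimate a clock-rate correction factor from a history of
    manual time settings.

    `settings` is a list of (clock_reading, corrected_time) pairs,
    oldest first: what the clock displayed when the owner reset it,
    and what it was reset to. Adjustments at or beyond `jump_threshold`
    (say, a time-zone move or a DST change) are ignored, since they
    say nothing about crystal drift.

    Returns a multiplier for the crystal-derived tick rate:
    > 1.0 means the clock has been running slow and should speed up.
    """
    total_drift = timedelta(0)
    total_elapsed = timedelta(0)
    for (_, prev_true), (read, true) in zip(settings, settings[1:]):
        drift = true - read        # how far off the clock had become
        elapsed = true - prev_true # real time since the last setting
        if abs(drift) >= jump_threshold or elapsed <= timedelta(0):
            continue               # filter zone/DST jumps and bad data
        total_drift += drift
        total_elapsed += elapsed
    if total_elapsed == timedelta(0):
        return 1.0                 # no usable history; leave rate alone
    return 1.0 + total_drift / total_elapsed
```

So for the example in the post, a clock that had fallen 3 minutes behind over 24 hours would get a factor of 1 + 180/86400, i.e. it would be sped up by 3 minutes per day, while a one-hour DST correction in the history would simply be skipped.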
I guess it's the old American way of building something. As soon as the
thing works, the job is done and it's Miller time. It's nice when they can
budget a little extra time for enhancements. That's the difference between
"good enough" and "excellence."