Maker Pro

Digital Clocks

JoeSixPack

Why is it that the multitude of electronic clocks that now inhabit nearly
every device and appliance in most households are so often fast or slow,
frequently by 15 minutes or more? Oddly enough, the most accurate clock I
own is a mechanical wall clock that I haven't touched for over 6 months,
and it still doesn't need setting.

The answer is that mechanical clocks had achieved a high degree of accuracy
before they were mostly replaced by digital clocks. One way they achieved
this was to capture the information gained each time the clock was reset to
the correct time. When such a mechanical clock had to be advanced 3 minutes
after running for a 24 hour period, it sped up its mechanism by 3 minutes
per 24 hours. That's a very significant feat when accomplished mechanically,
but it would be much easier to accomplish in the circuitry of a digital
clock. For some reason this wasn't done, at least not in any devices I'm
familiar with.

I know that the clock in Windows is now reset from a central time server,
but that's a crude solution to what should have been a simple design
feature. It would have been a simple thing to calculate the time between
clock settings and adjust the speed of the clock accordingly, by correcting
the factors that are applied to the quartz crystal frequency. Large
corrections, such as those caused by moving to a different time zone or by
daylight saving adjustments, could easily be filtered out of the correction
factor.
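
To make that concrete, here is a rough sketch (my own, purely illustrative;
the names, the one-day baseline and the 10-minute filter threshold are
assumptions, not taken from any real product) of what such a self-trimming
scheme could look like in a digital clock's firmware:

/* Rough sketch (illustrative only): learn a rate trim from manual
 * time settings. Large jumps that look like time zone or DST changes
 * are ignored; the remaining drift is folded into a ppm correction
 * applied to the assumed crystal frequency. */
#include <stdint.h>

#define NOMINAL_HZ     32768.0   /* ideal 32.768kHz crystal            */
#define MAX_DRIFT_SEC  600       /* assumed filter: >10 min = TZ/DST   */
#define MIN_BASELINE_S (24*3600) /* need at least a day between sets   */

static double  trim_ppm      = 0.0; /* learned rate correction, in ppm */
static int64_t last_set_time = 0;   /* clock reading at the last set   */

/* Call whenever the user manually sets the clock.
 * old_time: what the clock showed just before the set (seconds)
 * new_time: what the user set it to (seconds)                         */
void clock_was_set(int64_t old_time, int64_t new_time)
{
    int64_t elapsed = old_time - last_set_time; /* run time since last set */
    int64_t gained  = old_time - new_time;      /* >0 means clock ran fast */
    int64_t mag     = gained < 0 ? -gained : gained;

    if (last_set_time != 0 && elapsed >= MIN_BASELINE_S &&
        mag < MAX_DRIFT_SEC)
    {
        /* gained/elapsed is the fractional rate error; accumulate it   */
        trim_ppm += 1e6 * (double)gained / (double)elapsed;
    }
    last_set_time = new_time;
}

/* Frequency the tick-counting code should divide by instead of the
 * nominal value, so the displayed time tracks real time.              */
double effective_hz(void)
{
    return NOMINAL_HZ * (1.0 + trim_ppm * 1e-6);
}

Every manual setting would then make the next stretch of free running a
little more accurate, which is exactly what the mechanical regulators
described above were doing.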

I guess it's the old American way of building something. As soon as the
thing works, the job is done and it's Miller time. It's nice when they can
budget a little extra time for enhancements. That's the difference between
"good enough" and "excellence."
 
PeteS

Part of the answer << That's the difference between "good enough" and
"excellence." >>
is cost.

Consumer devices are very cost sensitive, and how the clock is
generated is part of that. A typical 32.768kHz resonator or crystal
will have a nominal error of up to 200ppm (parts per million), which
gets worse with temperature and ageing. That's by far the most common
frequency used for this function, although many processors simply use
their main clock and divide it down to the closest approximation
- that, in particular, will give larger errors. (32.768kHz divided
through a 15 bit binary counter yields exactly 1Hz, since 2^15 = 32768.)
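
To put that 200ppm figure in perspective, here is a quick back-of-the-
envelope conversion (my own illustration, not part of the post above):

/* What does 200ppm mean in everyday terms? (illustration only) */
#include <stdio.h>

int main(void)
{
    double ppm           = 200.0;                     /* worst-case error */
    double sec_per_day   = ppm * 1e-6 * 86400.0;      /* ~17.3 s/day      */
    double min_per_month = sec_per_day * 30.0 / 60.0; /* ~8.6 min/month   */

    printf("%.0f ppm -> %.1f s/day, %.1f min/month\n",
           ppm, sec_per_day, min_per_month);
    return 0;
}

That's roughly 17 seconds a day, or close to nine minutes a month from the
frequency error alone - very much in the ballpark of the errors complained
about at the top of the thread.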

The circuitry needed to auto-adjust would add unnecessary costs (at
least in the view of the manufacturers) for a non-essential feature
(absolute accuracy over time). Typically we would just buy a simple RTC
that may even have the crystal internally - see the list at Maxim for a
typical selection:
http://www.maxim-ic.com/products/timers/real_time_clocks.cfm
(Most of the clocks are actually Dallas Semi parts)

In higher end and truly embedded equipment (where there are no user
adjustable controls), much depends on whether there is a method of
externally synchronising the clock, so that the error accumulated between
external syncs is negligible. If so, there is no point in using
(relatively) expensive techniques.

Can we auto adjust? Sure. Do we? Only if we must.

Cheers

PeteS
 
Alexander

PeteS said:
<< A typical 32.768kHz resonator or crystal will have a nominal error of up
to 200ppm (parts per million), which gets worse with temperature and
ageing. >>
One remark about the small metal tubes that 32.768kHz crystals are usually
packaged in: they are extremely fragile. One bump and the frequency is no
longer accurate; the crystal may start making noise or stop working
altogether. Equipment with these crystals should be handled carefully. If I
buy a new product and know it has such a crystal, I always check the
accuracy. Far too often it's out of spec because of bumps.

If you build a system yourself, always fasten these crystals down, either
with solder or with a wire. Don't just rely on the leads; they're not
enough!

Alexander
 
Aaron Mavrinac

A lot of digital clocks use ICs that rely on the 60Hz or 50Hz signal
from the mains to keep time. Crack open any cheap alarm clock. These
mains-driven clocks are extremely unreliable and susceptible to EM
interference from other devices, but they're also very cheap to build.
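
The timekeeping side of those mains-driven chips is conceptually trivial;
a hypothetical firmware equivalent (the interrupt hookup and names here are
invented purely for illustration) would simply count line cycles:

/* Sketch of a mains-synchronised clock: count line cycles and roll a
 * second over every MAINS_HZ counts. Interrupt wiring is hypothetical. */
#include <stdint.h>

#define MAINS_HZ 60u   /* use 50u on a 50Hz supply */

static volatile uint8_t  cycles  = 0;
static volatile uint32_t seconds = 0;   /* seconds since midnight */

/* Assume this runs once per mains cycle, e.g. from a comparator
 * watching the stepped-down AC waveform.                              */
void mains_cycle_isr(void)
{
    if (++cycles >= MAINS_HZ) {
        cycles = 0;
        if (++seconds >= 24u * 3600u)
            seconds = 0;                /* wrap at the end of the day */
    }
}

Long-term accuracy then depends on the utility keeping the average line
frequency on target, and any interference spike that gets counted as an
extra cycle makes the clock run fast - which is the susceptibility
mentioned above.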
 