Maker Pro

Why aren't computer clocks as accurate as cheap quartz watches?


Chris Jones

Why do the battery powered clocks in personal computers tend to keep
worse time than quartz watches, even the $1 ones?

The computers' backup batteries measure fine, at least 3.15 V.

I thought that the problem was temperature swings in the computers
(25-38 °C), but a couple of cheapo watches taped inside the computers
kept better time.

Others have already supplied you with plenty of possible reasons why the
existing crystal oscillator might have poor stability.

If you have some time to work on this, you could improve the PC's clock by
overdriving the internal crystal oscillator circuit with your own stable
oscillator. Most PCs use a 32768 Hz crystal for the real-time clock. The
oscillator circuit has typically been built around an unbuffered CMOS
logic inverter, for which you should be able to find a datasheet online.
Some PCs probably have the oscillator built into the chipset instead, but
the circuit is likely the same, so you should still be able to overdrive it.

I would expect that one pin of the crystal will be connected to the output
of a logic gate and the other pin of the crystal will be connected to the
input of a logic gate. There may be various resistors or capacitors in
series too. If you can create a stable 32768Hz square wave and feed it to
the input of the logic gate which is connected to one pin of the crystal
then this should override the oscillator with your externally supplied
signal. You will have to keep the signal running even when the computer is
off or otherwise the clock will stop.

You might be able to build yourself a stable crystal oscillator, or you
might prefer to buy one.

A temperature-compensated crystal oscillator (TCXO) or an ovenised
oscillator (OCXO) should be accurate enough. There are many commercial
suppliers of both.
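To put those options in perspective, oscillator accuracy is usually quoted in parts per million (ppm), which converts directly into drift per day. A quick sketch (the ppm figures below are illustrative of typical grades, not any specific part):

```python
SECONDS_PER_DAY = 86400

def drift_seconds_per_day(ppm):
    """Worst-case drift, in seconds per day, of an oscillator with the
    given frequency error in parts per million."""
    return ppm * 1e-6 * SECONDS_PER_DAY

# Illustrative ppm figures for common oscillator grades.
for name, ppm in [("bare 32 kHz crystal", 20),
                  ("trimmed watch crystal", 5),
                  ("TCXO", 1),
                  ("OCXO", 0.01)]:
    print(f"{name:>22}: ~{drift_seconds_per_day(ppm):g} s/day")
```

A 20 ppm crystal works out to roughly 1.7 seconds a day, or close to a minute a month, which is about what untrimmed PC clocks show.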

Chris
 

w_tom

Another way of accomplishing the same thing is to phase-lock the
oscillator to the 60 Hz AC mains and create the 32768 Hz signal from
that phase-locked loop. There are other problems that must be
addressed with this solution, and it requires some technical
knowledge, but it does solve the accuracy problem.
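For what it's worth, the arithmetic works out exactly: 32768 Hz and 60 Hz share a common factor, so a PLL that multiplies the mains reference by 8192 and divides by 15 lands precisely on the watch-crystal frequency. A quick check (illustrative only; the loop filter and lock detection are the hard part):

```python
from fractions import Fraction

MAINS_HZ = 60
TARGET_HZ = 32768

# Exact multiply/divide ratio the PLL must realize.
ratio = Fraction(TARGET_HZ, MAINS_HZ)
print(ratio)  # 8192/15: multiply the 60 Hz reference by 8192, divide by 15

# Sanity check: 60 Hz * 8192 / 15 == 32768 Hz exactly.
assert MAINS_HZ * ratio == TARGET_HZ
```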
 

David Maynard

Mxsmanic said:
Michael A. Terrell writes:




IBM didn't kill off Honeywell, either. Honeywell bought GE's computer
division, then Bull SA (the French computer company) bought Honeywell's
computer division. Today it survives as Bull SA (the unfortunate name
of the company comes from Fredrik Bull, the Norwegian founder of a
company that ultimately evolved to Bull SA today).

Well, that's a lot of Bull ;)
 

Mxsmanic

David said:
Well, that's a lot of Bull ;)

The company has tried to make the best of its name in English ads,
often with slogans along the lines of what you give above, but it
hasn't been very successful. Bull doesn't mean anything in French, so
it's not a problem in France, but it's a problem in English-speaking
countries. It was just bad luck that one of the original founders had
a Norwegian name that by some weird coincidence happened to look just
like an English word (Bull doesn't look very Norwegian to me, but
maybe it is [?]).
 

David Maynard

Mxsmanic said:
David Maynard writes:

Well, that's a lot of Bull ;)


The company has tried to make the best of its name in English ads,
often with slogans along the lines of what you give above, but it
hasn't been very successful. Bull doesn't mean anything in French, so
it's not a problem in France, but it's a problem in English-speaking
countries. It was just bad luck that one of the original founders had
a Norwegian name that by some weird coincidence happened to look just
like an English word (Bull doesn't look very Norwegian to me, but
maybe it is [?]).

Interesting. Frankly, I wouldn't have thought it all *that* much of a
problem, because 'Bull' can mean more, or something different, than the
implication we've been using. It's certainly not as bad as some of the
other name/language faux pas I've heard about, like the Ford Pinto. Turned
out Pinto meant something akin to 'small male genitalia' in Brazil. I mean,
that's not even a seller for females. Not to be outdone, Chevrolet
introduced the "Nova" to South America only to discover it translated to
"it won't go." Just what the world needs, a car that won't go.

This one is one of my special favorites. One drug company decided to avoid
all possible language mistakes in marketing to the United Arab Emirates by
using just pictures. First one shows a person ill. Next one taking the
medication. Next one all well and cured.

Unfortunately, people in the Arab world read right to left.

Gerber solved the 'which way' problem, just in case, when marketing to
Africa by using only one picture: the famous Gerber Baby on the label.
Except, in Africa companies generally put a picture of what's inside
because most people there can't read. Gives "baby food" a whole new
meaning, don't it?

Gawd, it's going to take me hours to stop laughing.
 

Mxsmanic

David said:
Gerber solved the 'which way' problem, just in case, when marketing to
Africa by using only one picture: the famous Gerber Baby on the label.
Except, in Africa companies generally put a picture of what's inside
because most people there can't read. Gives "baby food" a whole new
meaning, don't it?

Are Africans so dense that they can't figure out that the picture
represents something _for_ a baby, rather than baby flesh? What
picture would they understand?
 

David Maynard

Mxsmanic said:
David Maynard writes:




Are Africans so dense that they can't figure out that the picture
represents something _for_ a baby, rather than baby flesh?

Saying they're 'dense' is a bit harsh and you're basing it on the culture
you're used to. I mean, if everything you saw was a picture of what's
inside then you'd probably expect the picture to be a picture of what's
inside too.

My *guess* would be it was just confusing. That perhaps they assumed it
couldn't possibly be 'baby' inside but... what could it be then? Salve?

On the other hand, who knows what they think of barbarian anglos?

Things that seem 'obvious' in one culture can be anything but to someone
not familiar with it. I learned that one in the Middle East when I went for
a public toilet and found myself looking at two identically shaped figures
labeling which was for males and females. The only difference was one was
white and the other was black but to a westerner used to the skirt/pants
distinction it was a bit of a mystery, especially when not thinking real
clear due to the urgency ;)
What picture would they understand?

Don't know for sure as I'm not used to that particular culture but maybe an
apple for mashed apple? On the other hand, I don't know if apples are
visually common there. Or whether that's what was inside since it only had
a picture of a baby on it;)
 

Mxsmanic

David said:
Saying they're 'dense' is a bit harsh and you're basing it on the culture
you're used to. I mean, if everything you saw was a picture of what's
inside then you'd probably expect the picture to be a picture of what's
inside too.

But I would still realize that a picture of, say, a mountain on the
label would not mean that a mountain was contained inside the jar.
While I can understand that they might be accustomed to having a
picture on the jar that shows what's inside, I also credit them with
enough reasoning ability to realize that an actual baby isn't going to
be crammed into the jar just because a picture of one is on the label.

It reminds me, though, of the famous story of the illiterate woman (in
the U.S.) who bought a gallon can of Crisco because she thought it had
a roast chicken inside (there was a picture of a roast chicken on the
label).
 

David Maynard

Mxsmanic said:
David Maynard writes:




But I would still realize that a picture of, say, a mountain on the
label would not mean that a mountain was contained inside the jar.

Maybe mountain dirt. You hear people say "he could sell snowballs to an
eskimo" so I guess someone could sell mountain dirt too ;)
While I can understand that they might be accustomed to having a
picture on the jar that shows what's inside, I also credit them with
enough reasoning ability to realize that an actual baby isn't going to
be crammed into the jar just because a picture of one is on the label.

I don't know why. We got lunatics in this supposedly 'educated' country
claiming we rammed planes into our own trade towers, or blew them up.
It reminds me, though, of the famous story of the illiterate woman (in
the U.S.) who bought a gallon can of Crisco because she thought it had
a roast chicken inside (there was a picture of a roast chicken on the
label).

Seems you just disproved your own point ;)

Actually, the point isn't whether everyone thought it had a baby inside but
that any confusion at all isn't conducive to selling the product and
neither is dismissing it as them being dense. Even if you're right it don't
get the jars sold ;)
 

Woody Brison

After reading much of this thread, and a lot of it has been
quite insightful... I'd like to add 2 more cents.

w_tom said:
There are two ways to do as suggested. The first is to make
'Benjamins' part of the technical facts during design....
... the technical reason for high versus low accuracy
timers was provided. Computer motherboards don't have the
trimming capacitor and the oscillator is subject to wider
voltage variations. Why this technical decision was made was
not asked and would only be speculation.

So, two sides of the coin... then, there be the THIRD side of the coin.

Why do you have a clock on your computer? Can't afford a watch
or a desk clock or a wall clock?

The answer is that a clock on the computer is useful to record
creation/change time on files.

It doesn't really matter if the file was modified at 6:00.00 000000
or 6:00.00 000035

What matters is if one file was created before another. You're
compiling, but the source hasn't changed, or has; the params file
has been changed since X,Y, or Z... that kind of thing.

On a computer, Approximate Time is almost always all that's really
needed; a clock that ***always runs forward***, and keeps time within
a few minutes a day.
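That ordering-only requirement is exactly what make-style build tools rely on. A minimal sketch in Python (the file names are made up for illustration, and `os.utime` is used just to set the timestamps deterministically):

```python
import os
import tempfile

def needs_rebuild(source, target):
    """Make-style staleness check: rebuild when the target is missing
    or older than the source. Only timestamp *ordering* matters here,
    not absolute accuracy."""
    if not os.path.exists(target):
        return True
    return os.path.getmtime(source) > os.path.getmtime(target)

with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "main.c")
    obj = os.path.join(d, "main.o")
    open(src, "w").close()
    print(needs_rebuild(src, obj))    # True: no object file yet
    open(obj, "w").close()
    os.utime(src, (100, 100))         # pretend the source is old
    os.utime(obj, (200, 200))         # object built afterwards
    print(needs_rebuild(src, obj))    # False: the object is up to date
```

The clock could be minutes off and the build would still be correct, as long as it never runs backwards.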

Even if Perry Mason drags you into the witness stand and confronts
you with file dates and times, approximate is probably good enough
to acquit you or convict you. If in the rare case it's not, bring in
your expert to explain that computer clocks are often not accurate.

Wood
 

David Maynard

Woody said:
After reading much of this thread, and a lot of it has been
quite insightful... I'd like to add 2 more cents.




So, two sides of the coin... then, there be the THIRD side of the coin.

Why do you have a clock on your computer? Can't afford a watch
or a desk clock or a wall clock?

The answer is that a clock on the computer is useful to record
creation/change time on files.

It doesn't really matter if the file was modified at 6:00.00 000000
or 6:00.00 000035

What matters is if one file was created before another. You're
compiling, but the source hasn't changed, or has; the params file
has been changed since X,Y, or Z... that kind of thing.

On a computer, Approximate Time is almost always all that's really
needed; a clock that ***always runs forward***, and keeps time within
a few minutes a day.

Even if Perry Mason drags you into the witness stand and confronts
you with file dates and times, approximate is probably good enough
to acquit you or convict you. If in the rare case it's not, bring in
your expert to explain that computer clocks are often not accurate.

Wood

Or keep your clock set to 1935 and even Perry Mason won't know when they
were actually made.
 

Mxsmanic

Woody said:
The answer is that a clock on the computer is useful to record
creation/change time on files.

It doesn't really matter if the file was modified at 6:00.00 000000
or 6:00.00 000035

What matters is if one file was created before another. You're
compiling, but the source hasn't changed, or has; the params file
has been changed since X,Y, or Z... that kind of thing.

On a computer, Approximate Time is almost always all that's really
needed; a clock that ***always runs forward***, and keeps time within
a few minutes a day.

That is true if all activity is confined to a single computer. When
multiple computers are involved, however, they must be synchronized.
And if the computers interact with other computers outside local
control (as by communication over the Internet), then they must not
only be synchronized, but they must be synchronized to a universal
standard, such as UTC.

This is why clock accuracy is important.

In the old days when every PC was completely isolated, time hardly
mattered at all, and often people would use PCs without bothering to
ever set the correct date or time. Nowadays, almost all PCs have to
be at least approximately synchronized to the correct time of day, and
often very precise synchronization is required.
 

David Maynard

Mxsmanic said:
Woody Brison writes:




That is true if all activity is confined to a single computer. When
multiple computers are involved, however, they must be synchronized.
And if the computers interact with other computers outside local
control (as by communication over the Internet), then they must not
only be synchronized, but they must be synchronized to a universal
standard, such as UTC.

This is why clock accuracy is important.

In the old days when every PC was completely isolated, time hardly
mattered at all, and often people would use PCs without bothering to
ever set the correct date or time. Nowadays, almost all PCs have to
be at least approximately synchronized to the correct time of day, and
often very precise synchronization is required.

Except for spammers that routinely post years, even decades, into the past.
 

Mxsmanic

David said:
Except for spammers that routinely post years, even decades, into the past.

What's the advantage of doing so? And why can't servers simply reject
anything that is obviously far in the past?
 

Woody Brison

Mxsmanic said:
That is true if all activity is confined to a single computer. When
multiple computers are involved, however, they must be synchronized.
And if the computers interact with other computers outside local
control (as by communication over the Internet), then they must not
only be synchronized, but they must be synchronized to a universal
standard, such as UTC.

OK, thanks, but why is it so important? A message is received
on one computer at a certain time per that computer's clock. It
was sent from another computer at some time, recorded in the
message, per that computer's clock. Are we going to calculate
transit time or something? Check to make sure it didn't arrive
before it was sent?
 

Mxsmanic

Woody said:
OK, thanks, but why is it so important?

Sometimes you have to be able to correlate or synchronize events over
long distances with great accuracy (fractions of a second).
A message is received on one computer at a certain time per that
computer's clock. It was sent from another computer at some time,
recorded in the message, per that computer's clock. Are we going
to calculate transit time or something?
Yes.

Check to make sure it didn't arrive before it was sent?

Yes.

There are many applications for accurate time. In fact, the more
accurate time one can obtain, the more useful applications become
practical and available.
 

David Maynard

Mxsmanic said:
David Maynard writes:




What's the advantage of doing so? And why can't servers simply reject
anything that is obviously far in the past?

Hell if I know but I just got one dated 1969.
 

DevilsPGD

In message <[email protected]>, David Maynard wrote:
Hell if I know but I just got one dated 1969.

That's usually because the date header was missing completely, and some
system along the way assigned a header of Jan 1 1970 -0000, then
corrected for timezone shift.
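That explanation is easy to reproduce: render Unix time zero in any timezone west of Greenwich and the local calendar date falls back to 31 December 1969. A small demonstration (UTC-5 chosen arbitrarily as a typical US offset):

```python
from datetime import datetime, timezone, timedelta

# A message with no Date: header is often stamped with Unix time zero,
# i.e. 1 Jan 1970 00:00 UTC.
epoch = datetime.fromtimestamp(0, tz=timezone.utc)
print(epoch.isoformat())    # 1970-01-01T00:00:00+00:00

# Shifted into a negative UTC offset, that same instant is still 1969.
local = epoch.astimezone(timezone(timedelta(hours=-5)))
print(local.isoformat())    # 1969-12-31T19:00:00-05:00
```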
 

Isaac Wingfield

Mxsmanic said:
That is true if all activity is confined to a single computer. When
multiple computers are involved, however, they must be synchronized.
And if the computers interact with other computers outside local
control (as by communication over the Internet), then they must not
only be synchronized, but they must be synchronized to a universal
standard, such as UTC.

This is why clock accuracy is important.

In the old days when every PC was completely isolated, time hardly
mattered at all, and often people would use PCs without bothering to
ever set the correct date or time. Nowadays, almost all PCs have to
be at least approximately synchronized to the correct time of day, and
often very precise synchronization is required.

In almost any instance where a high degree of precision and
synchronization is needed, the computer will be running a version of NTP
software which can provide precision time from even very poor CPU
timebases.

"Ordinary" computers don't need that degree of precision, and a
once-or-twice a day comparison to an NTP server somewhere on the 'net is
all that's required.
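For the curious, the once-or-twice-a-day comparison described above can be done with a minimal SNTP exchange: one 48-byte UDP packet each way. A rough sketch of an RFC 4330 mode-3 client (`pool.ntp.org` is a public server pool; a real NTP daemon also filters, averages and slews the clock rather than reading one raw sample):

```python
import socket
import struct
import time

# Seconds between the NTP epoch (1 Jan 1900) and the Unix epoch (1 Jan 1970).
NTP_DELTA = 2208988800

def ntp_to_unix(ntp_seconds):
    """Convert the seconds field of an NTP timestamp to Unix time."""
    return ntp_seconds - NTP_DELTA

def sntp_query(server="pool.ntp.org", timeout=5.0):
    """One-shot SNTP query: send a 48-byte mode-3 request, then read
    the server's transmit timestamp from the reply."""
    packet = b"\x1b" + 47 * b"\x00"   # LI=0, version 3, mode 3 (client)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(timeout)
        s.sendto(packet, (server, 123))
        data, _ = s.recvfrom(48)
    seconds = struct.unpack("!I", data[40:44])[0]   # transmit timestamp
    return ntp_to_unix(seconds)

if __name__ == "__main__":
    offset = sntp_query() - time.time()
    print("server clock minus local clock: %+.1f s" % offset)
```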

Isaac
 