Have you used an 8275 CRT controller in the last 30 years? (Think:
that was the part used with the 8085 8-bit processor.)
'Modern' monitors are not NTSC. They are digital, and the current
types are HDMI, so all timing is generated by the onboard display
controller. I still repair monitors, including LCD types. Not many,
and not often, but I do know how they work.
It's in an old German-made printer he's trying to resurrect. Why
would it have a modern monitor? You need your 'chops' busted, in real
life. He is working on an old, retired system. You keep tossing out
crap that has nothing to do with his problem. He has been pointed to
multiple sources of the proper crystal.
Yes, an old piece of equipment that is in tatters and may or may not use
the original monitor. That is my point. Until the OP responds, you
simply don't know what monitor is being used. If he wants to test the
equipment with a $1 crystal rather than a $50 crystal that will take two
months to get, that is his choice.
Damn, you're stupid. CGA video used the 14.318180 MHz system clock in
the PC. It was divided by 4 to provide the required burst. Early PCs
had a trimmer capacitor to set the exact frequency.
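As a sanity check on that division (a quick sketch; the divide-by-3 CPU-clock figure is not from this thread, just the standard IBM PC number):

```python
# The PC's 14.318180 MHz master crystal, as stated above.
master_clock = 14_318_180  # Hz

# Divided by 4 it yields the NTSC color subcarrier / burst frequency.
burst = master_clock / 4

# (Divided by 3 it gives the original PC's 4.77 MHz CPU clock --
# not mentioned above, but the standard figure.)
cpu_clock = master_clock / 3

print(f"burst     = {burst:,.0f} Hz")      # 3,579,545 Hz
print(f"CPU clock = {cpu_clock:,.0f} Hz")  # 4,772,727 Hz
```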
Have you ever seen a CGA only video card with a crystal? Once again:
COLOR BURST. SEVEN CYCLES. PHASE LOCKED TO BURST ONCE PER SCAN LINE.
Colorburst crystals were made by the millions, and for decades. They
only had to be pullable to the burst frequency and remain stable for
one scan line. The two only have to be close enough to be pulled into
phase. I have looked at a lot of burst crystals in free-running mode,
and they were within 5 Hz. Is that good enough for you? Really, you
need to stop making a fool of yourself and look at the real circuits.
I'm not going to argue with you about this. The crystals used in PCs
and nearly all consumer electronics are only accurate to a few tens of
ppm, and are typically only stable over temperature to roughly the same
range. You can tune the crystal to an exact frequency with a trimmer
and it will be off by tens of ppm by the end of the day, or before the
equipment has warmed up (or conversely, after). Did PCs warm up the
oscillator before turning on the monitor?
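To put those tolerances in Hz at the burst frequency (a quick sketch using the 3.579545 MHz figure from the earlier posts):

```python
burst = 3_579_545  # Hz, NTSC color subcarrier (14.318180 MHz / 4)

def ppm_to_hz(ppm: float, f: float = burst) -> float:
    """Frequency error in Hz corresponding to a tolerance in ppm."""
    return f * ppm / 1e6

print(ppm_to_hz(30))    # ~107 Hz: a garden-variety 30 ppm crystal
print(ppm_to_hz(3))     # ~10.7 Hz: the disputed +/-3 ppm spec
print(5 / burst * 1e6)  # ~1.4 ppm: the "within 5 Hz" observation above
```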
The point is that talking about depending on crystals being ±3 ppm
accurate is not valid unless it is a more expensive piece of equipment
that can justify the cost of a temperature-compensated oscillator. I
don't care how many crystals you have looked at; holding a ±3 ppm
spec on an oscillator is not something you do with a $0.50 crystal
oscillator. CGA never had to rely on the timing being anywhere near
that accurate, and neither did any other display I have ever worked with.
Heck, that is outside the spec of many frequency counters from HP and
elsewhere! The HP 5383A timebase is specified for 3 ppm/month aging,
2.5 ppm between 0°C and 40°C, and 0.5 ppm for a 10% line voltage
variation. This is a piece of equipment whose accuracy is limited by
the crystal oscillator, so I think they would put a lot more effort
into that feature than PC makers would in their reference clocks.
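Summing those quoted HP 5383A timebase contributions worst-case (a back-of-envelope sketch; real errors rarely all add in the same direction):

```python
aging = 3.0  # ppm per month of aging
temp  = 2.5  # ppm over 0 C to 40 C
line  = 0.5  # ppm for a 10% line-voltage change

worst_case = aging + temp + line
print(worst_case)  # 6.0 ppm after one month -- twice the disputed +/-3 ppm
```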
Your claim that CGA monitors (or any others) need to be driven to a
timing accuracy of ±3 ppm is not supported by the facts.
Go be rude with someone else!