Maker Pro

NTSC versus PAL


William Sommerwerck

"Die, die, my darling!"

As both PAL and NTSC are basically dead systems (NTSC in the US, at least),
there is little point in discussing their differences. But as Mr. Allison
insists on displaying his ignorance in public, I'm going to, anyhow.

The first color TV system approved by the FCC was a field-sequential system
proposed by CBS. It was developed
by Peter Goldmark, the same man given credit for the modern LP phonograph
record. (I say "given credit for", because there have been questions as to
whether he was the principal designer.)

The CBS system is a classic example of a design botched from the get-go. At
that time (not long after WWII), there was no practical way to display three
color images simultaneously with a single CRT. So Goldmark went with a
spinning color wheel, a system that had been tried 25 years earlier for
color motion pictures, and found wanting.

The problems with such a system are obvious, but I'll describe them. One
problem is that it requires three times as much film (or in the case of TV,
three times the bandwidth). Another is that moving objects show color
fringing.

Then there was the problem of the spinning color-filter disk. A 10" TV would
require one at least 2' in diameter. Imagine the disk needed for a 21" set!
(Not to mention the noise, and the possibility, however remote, of
decapitating the cat.)

These obvious (and lethal) deficiencies didn't deter Goldmark or CBS,
because they were in competition with RCA/NBC. The CBS argument was... Why
limit TV to B&W? Why not /start/ with a color system, and be done with it?
CBS pressed the FCC (as one writer pointed out, every sale of an RCA B&W TV
would be another nail in the coffin of the CBS color system), and in 1950
the CBS system was approved, despite the fact it was wholly incompatible
with the 480i system already in use. *

David Sarnoff ("the most-nasty name in electronics") was naturally upset.
RCA had to make CBS look bad, while completing development of their own
color system. Sarnoff gleefully pointed out that the CBS system was
"mechanical", and subject to all the limitations accruing thereto. Though
this was literally true, it overlooked the fact that one can have
all-electronic field-sequential color. But -- on the other hand -- CBS had
nothing other than a mechanical system to offer.

RCA was working on a "dot-sequential" system. Each line of the image was
divided into 300 (or so) pixels **, with red, green, and blue samples
alternating. This system worked fairly well -- it produced an acceptable
picture on B&W sets. But (for reasons I don't remember) color receivers had
problems displaying B&W images. As color receivers would (initially) be used
mostly for B&W viewing, this was not acceptable.

The breakthrough came when engineers at Hazeltine and GE remembered Monsieur
Fourier, and recognized that sampling the colors was equivalent to modulating
a continuous subcarrier at the sampling frequency. They "slipped a note under
RCA's door" (so to speak), and NTSC/PAL came into existence. The color
information was transmitted on a subcarrier whose sidebands were interleaved
with the luminance sidebands, to minimize interaction. ***
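
To make the Fourier point concrete, here is a small numerical sketch of my
own (Python; nothing from the standards documents): alternating color
samples behave like a subcarrier at the sampling frequency, and NTSC then
parks that subcarrier at an odd multiple of half the line rate so its
sidebands fall between the luminance harmonics.

# Sketch: why "sampling the colors" is the same thing as a color
# subcarrier, and how NTSC chose that subcarrier so its spectrum
# interleaves with the luminance harmonics.
import numpy as np

# Part 1: a flat colored area sent dot-sequentially is R, G, B, R, G, B...
# -- a periodic waveform, i.e. a "continuous" carrier at 1/3 the pixel rate.
r, g, b = 0.8, 0.3, 0.5                       # an arbitrary flat color
line = np.tile([r, g, b], 100)                # 300 samples: R G B R G B ...
spectrum = np.abs(np.fft.rfft(line - line.mean()))
print("dominant component at bin", int(np.argmax(spectrum)),
      "of", len(line), "samples  (exactly 1/3 of the sample rate)")

# Part 2: luminance energy clusters at harmonics of the line rate fH, so
# NTSC put the color subcarrier at an odd multiple (455) of fH/2.
f_sc = 315e6 / 88                             # 3.579545... MHz subcarrier
f_H = f_sc * 2 / 455                          # 15734.27 Hz line rate
print("subcarrier = %.6f MHz, line rate = %.2f Hz, ratio to fH/2 = %.1f"
      % (f_sc / 1e6, f_H, f_sc / (f_H / 2)))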

"...complete with bad commercials that repeat all night, both in compatible
color and black and white." -- Stan Freberg

The brilliance of NTSC/PAL is that their signals produce an image on B&W sets
as good as (or better than) monochrome broadcasts, and display excellent
color on a color set -- without
making any existing equipment obsolete, and without requiring additional
bandwidth.

So... why is NTSC "better" than PAL? For one thing, it has "better" and
"more" color. Although the original NTSC proposal used red and blue color
signals of equal bandwidth, it was recognized that this didn't fit with the
way the eye actually sees color.

It turns out that for a 480-line system displayed on a 21" tube, the eye
sees full color (red/green/blue) only to about 0.5MHz. From 0.5MHz to
1.5MHz, the eye sees only those colors that can be matched with red-orange
and blue-green primaries. **** The system was therefore changed to the
red-orange/blue-green (I) and purple/yellow-green (Q) color-difference axes,
the former with 1.5 MHz of bandwidth, the latter with 0.5 MHz.
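
For concreteness, here is a sketch (mine) of the conversion NTSC settled
on. The matrix is the standard published RGB-to-YIQ one; the bandwidth
figures are simply the ones quoted above.

# RGB -> YIQ, the axes discussed above.  Y is luminance; I runs roughly
# orange <-> cyan, Q roughly purple <-> yellow-green.
import numpy as np

RGB_TO_YIQ = np.array([
    [0.299,  0.587,  0.114],    # Y
    [0.596, -0.274, -0.322],    # I  (about 1.5 MHz of bandwidth)
    [0.211, -0.523,  0.312],    # Q  (about 0.5 MHz of bandwidth)
])

def rgb_to_yiq(rgb):
    """Gamma-corrected R'G'B' triple (0..1) -> Y, I, Q."""
    return RGB_TO_YIQ @ np.asarray(rgb, dtype=float)

y, i, q = rgb_to_yiq([1.0, 0.5, 0.0])         # an orange-ish test color
print("Y=%.3f  I=%.3f  Q=%.3f" % (y, i, q))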

PAL uses equal-bandwidth (1.0 MHz) red and blue color-difference signals. If
an NTSC set
fully demodulates the 1.5MHz color signal (most limit it to 0.5MHz to make
the set cheaper), more of the original image's color detail will be
displayed (though this will be visible mostly in graphics).

Much has been made of PAL's phase alternation, especially its supposed
ability to eliminate the need for a tint [sic] control. (It should be hue
control.) When was the last time you adjusted the hue control on an NTSC
receiver? 30 years ago?

This issue is confused by two factors -- the differences between European
and American distribution systems, and their studio standards.

If the transmission network has constant group delay, the hue setting should
be set 'n forget, and never need to be changed. The American system had good
group-delay characteristics -- the European did not. So switching channels
could require twisting the hue knob. But that's not all there is to it.

Non-linear group delay changes the colors in a way that cannot be corrected
simply by adjusting the hue control. All the colors cannot be "correct" at
the same time. The advantage of PAL is that these color errors "flip" with
the phase, and are complementary -- the eye "averages" them to the correct
color.

So what's wrong with that? Well, the averaging also reduces saturation.
(Mixing an additive primary with its complement pushes it toward white.)
With severe group-delay error, the image shows bands of varying saturation.
(In NTSC, there are bands of varying hue.)
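
To put a number on the averaging argument, here is a small sketch of my
own: treat the chroma of two successive lines as equal-length phasors
whose hue errors are +phi and -phi. The average lands on the correct hue,
but its length shrinks by cos(phi) -- that's the lost saturation.

# Averaging two phasors with opposite hue errors: hue is restored,
# saturation is scaled by cos(phase error).
import cmath, math

def averaged_chroma(saturation, hue_deg, err_deg):
    hue, err = math.radians(hue_deg), math.radians(err_deg)
    line_a = saturation * cmath.exp(1j * (hue + err))
    line_b = saturation * cmath.exp(1j * (hue - err))
    return (line_a + line_b) / 2

for err in (5, 10, 20, 40):
    avg = averaged_chroma(1.0, 100.0, err)
    print("error +/-%2d deg: hue %.1f deg (unchanged), saturation %.3f"
          % (err, math.degrees(cmath.phase(avg)), abs(avg)))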

The other point of confusion is that, for many years, US broadcasters didn't
pay much attention to signal quality. Cameras weren't set up properly, and
burst phase wasn't properly monitored. So when you changed channels, you
sometimes had to change the hue setting. Broadcasters finally got their acts
together, and color quality has, for some time, been pretty consistent from
channel to channel.

In short, PAL's phase alternation is an advantage with transmission systems
having poor group-delay characteristics -- a problem that did not exist in
the US. In every other respect, it is inferior to NTSC.

All of this is true, to the best of my knowledge. Corrections and additions
are welcome.


* Some dishonest manufacturers sold B&W TVs with a "color converter" jack on
the back. It wouldn't have worked, because these sets didn't have the
required IF bandwidth (AFAIK).

** No, the term didn't exist at the time.

*** Some interaction is visible with objects having fine B&W detail. The set
"misinterprets" this detail as color information.

**** This is why two-primary color-movie systems (such as the original
Technicolor) could give acceptable -- though hardly great -- results.
 

William Sommerwerck

As both PAL and NTSC are basically dead systems
What about PAL and NTSC videos, DVD/Blu-ray?
When did they die?

I meant as broadcast systems. I have plenty of NTSC DVDs, and analog cable
signals are still NTSC.

Blu-ray is its own format (1080p/24 or 1080i/60).
 

Geoffrey S. Mendelson

Meat said:
What about PAL and NTSC videos, DVD/BluRay? When did they die?

Technically video tapes are not NTSC or PAL. They have separate tracks
for luminance and chroma. The recorders all stripped them apart before
recording them and put them back together when playing them.

There is no technical reason not to build a video player with a digital
output, which digitizes the signals and presents them as a digital data
stream, without actual NTSC or PAL encoding. The field/frame rate would
be the same as the source material, but that's not the same thing.

The same with DVDs and Blu-ray. The data is encoded using MPEG compression,
which has separate information for luminance and chroma. It can be rebuilt
as red-green-blue pixels without ever going through NTSC or PAL.
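
To make that concrete, here is a rough sketch (mine) using the usual
BT.601-style coefficients; the digital offsets and headroom are ignored.

# MPEG-era formats carry Y plus two color-difference signals (Cb, Cr),
# which go straight back to RGB with no NTSC or PAL step in between.
def rgb_to_ycbcr(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 0.564 * (b - y)
    cr = 0.713 * (r - y)
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * cr
    g = y - 0.344136 * cb - 0.714136 * cr
    b = y + 1.772 * cb
    return r, g, b

y, cb, cr = rgb_to_ycbcr(0.9, 0.2, 0.3)
print("stored as     Y=%.3f Cb=%.3f Cr=%.3f" % (y, cb, cr))
print("reconstructed R=%.3f G=%.3f B=%.3f" % ycbcr_to_rgb(y, cb, cr))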

As red-green-blue cameras become more common, I expect that there will be
an eventual shift to RGB-encoded data, but that's a long way off.

Geoff.
 

Phil Allison

"William Sommerwanker = Rabid Nut Case "
The belief that NTSC is a stupid design, and PAL corrects all the
bone-headed elements of NTSC, is untrue. The original NTSC proposal was
actually PAL (I have the copy of Electronics magazine to prove it), and
NTSC
is, overall, a less-compromised design than PAL.


** Wot a putrid pile of utterly absurd verbal sophistry.

The " original NTSC proposal " has got NOTHING to FUCKING do with what
NTSC turned out to be in reality.

In * REALITY * the NTSC broadcast signal is massively compromised in
comparison to a PAL signal.

But on dark, smelly PLANET " Sommerwanker"

- any fucking absurdity is held out to be true.



..... Phil
 

Phil Allison

"William Sommerwanker Mental Retard "
You don't know what the hell you're talking about. Put up or shut up.


** YOU have put up nothing but total bollocks.

Everything YOU ever posted is 100% PURE FUCKING BOLLOCKS.

YOU are nothing but a stinking public menace and a VILE narcissistic prick.

**** Off and DIE !!!!!!!!!!
 

William Sommerwerck

Blu-ray is its own format (1080p/24 or 1080i/60).
Oh? So Blu-ray will play on a 50 or 60 Hz system
and the audio will be in sync?

Good question. I haven't looked to see whether a Blu-ray player can be set
to deliver an SD signal. I don't think it can.
 

Sylvia Else

I don't think that is actually true.

I think you'll find that was the intent. However, if the phase error is
too great, the eye averaging doesn't work so well, hence the
introduction of the delay line.

At which point you wonder why bother sending two colour signals in
quadrature if you're just going to average them with the next scan line
anyway. SECAM avoids that complexity by just going straight to the delay
line. I lived in Paris for 18 months. If there was a quality difference
between a SECAM and a PAL picture, it was far from obvious.

Sylvia.
 

Phil Allison

In * REALITY * the NTSC broadcast signal is massively compromised in
comparison to a PAL signal.


PAL has plenty wrong with it and is 'massively compromised' the same
ways as NTSC.

** More INSANE CRAPOLOGY !!!!!!!!!!


Editing in composite PAL .....


** More fuckwit, OFF TOPIC CRAPOLOGY !!

See the words " broadcast signal " - fuckhead ???

Even know what it means ???



...... Phil
 

William Sommerwerck

If the transmission network has constant group delay, the hue setting should
be set 'n forget, and never need to be changed.
I don't think that is actually true. It's been a lot of years since
I studied PAL decoding at college, but as far as I recall, the
averaging is done totally electronically, courtesy of the PAL
delay line. This is a glass block delay line of one scan-line
period, so if you run a direct and a delayed path side by side
in the chrominance channel, and then sum the outputs of both,
you arrive at an electronically averaged result of two sequential
lines, with any phase errors balanced to zero. This has nil effect
on the overall colour saturation, as this is controlled by a) the
ACC circuit, and b) the user saturation control.

The averaging can be done electronically, but there is also some visual
averaging.

I'm not sure you can remove the phase distortion without reducing the
saturation -- all the stuff I've read on PAL says the saturation is reduced --
but I won't press the issue because I haven't thought it through carefully.
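
For what it's worth, here is the algebra I have in mind, as a small sketch
of my own (it assumes both lines arrive with the same phase error, and
models the chroma of each line as the complex phasor U + jV with the V
sign alternating): the sum and difference paths of a delay-line decoder
recover U and V each scaled by cos(phi), so the hue comes out right but
the saturation is reduced.

import cmath, math

def pal_delay_line_decode(u, v, phase_error_deg):
    rot = cmath.exp(1j * math.radians(phase_error_deg))
    line_n = (u + 1j * v) * rot       # V-switch positive on this line
    line_next = (u - 1j * v) * rot    # V-switch negative on the next
    u_out = ((line_n + line_next) / 2).real       # sum path -> U
    v_out = ((line_n - line_next) / (2j)).real    # difference path -> V
    return u_out, v_out

u, v = 0.3, 0.4
for err in (0, 10, 30):
    u_out, v_out = pal_delay_line_decode(u, v, err)
    print("phi=%2d deg: U=%.3f V=%.3f (each scaled by cos phi = %.3f; "
          "V/U ratio %.3f unchanged)"
          % (err, u_out, v_out, math.cos(math.radians(err)), v_out / u_out))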
 

William Sommerwerck

I think you'll find that was the intent. However, if the phase error is
too great, the eye averaging doesn't work so well, hence the
introduction of the delay line.
At which point you wonder why bother sending two colour signals in
quadrature if you're just going to average them with the next scan line
anyway.

But you don't have to average them. NTSC doesn't. And the delay line can be
used for comb filtering.

SECAM avoids that complexity by just going straight to the delay
line. I lived in Paris for 18 months. If there was a quality difference
between a SECAM and a PAL picture, it was far from obvious.

The problem is, SECAM /requires/ the delay line because the system transmits
only the red or blue color-difference signal at any time. This is what I was
talking about -- it keeps the transmission side cheap, while making the user
pay more for their TV.

For most images, you won't see a difference. But in an image with strong
vertical color transitions, you'll see aliasing, especially when the image
moves vertically.
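
A toy model of my own illustrates the line-alternate problem: each line
carries only one of the two colour-difference signals, so the receiver
reuses the previous line's value for the other one, and a sharp vertical
colour transition lands one line late in that component.

# SECAM-style line-alternate colour, crudely modelled as "hold the
# missing component from the previous line".
def secam_receive(dr_column, db_column):
    held_dr, held_db = dr_column[0], db_column[0]   # seed to avoid a start-up artifact
    shown = []
    for n, (dr, db) in enumerate(zip(dr_column, db_column)):
        if n % 2 == 0:
            held_dr = dr              # D'R sent on this line
        else:
            held_db = db              # D'B sent on this line
        shown.append((held_dr, held_db))
    return shown

dr = [0.0] * 4 + [0.6] * 4            # colour changes abruptly at line 4
db = [0.5] * 4 + [0.0] * 4
for n, (got_dr, got_db) in enumerate(secam_receive(dr, db)):
    print("line %d: sent (%.1f, %.1f)  shown (%.1f, %.1f)"
          % (n, dr[n], db[n], got_dr, got_db))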
 

Sylvia Else

But you don't have to average them. NTSC doesn't. And the delay line can be
used for comb filtering.



The problem is, SECAM /requires/ the delay line because the system transmits
only the red or blue color-difference signal at any time. This is what I was
talking about -- it keeps the transmission side cheap, while making the user
pay more for their TV.

For most images, you won't see a difference. But in an image with strong
vertical color transitions, you'll see aliasing, especially when the image
moves vertically.

If we were building an analogue colour TV transmission infrastructure
now, then maybe we'd go the NTSC route, since it eliminates the delay
line. But it's undoubtedly true that, for whatever reasons, in earlier
times, NTSC didn't perform that well, whereas those whose systems were
PAL or SECAM got good colour pictures from day one.

Sylvia.
 

Phil Allison

"Stupider than Anyone Else Alive"

If we were building an analogue colour TV transmission infrastructure now,
then maybe we'd go the NTSC route, since it eliminates the delay line.

** Total insanity.

But it's undoubtedly true that, for whatever reasons, in earlier times,
NTSC didn't perform that well,


** The laws of nature have not changed since 1953

- you tenth witted, know nothing, bullshitting pommy bitch !!!



...... Phil
 

Sylvia Else

"Stupider than Anyone Else Alive"



** Total insanity.

You realise there are two different delay elements required in a
PAL/SECAM set?
** The laws of nature have not changed since 1953

I think engineering techniques have.

Sylvia.
 

Phil Allison

"Stupider than Anyone Else Alive"

You realise there are two different delay elements required in a PAL/SECAM
set?


** Not relevant to the point at all.

I think engineering techniques have.


** Not relevant to the point at all.

You TROLLING, ASD fucked, congenital moron !!!!




..... Phil
 

Phil Allison

** More fuckwit, OFF TOPIC CRAPOLOGY !!

See the words " broadcast signal " - fuckhead ???

Even know what it means ???

They still do some composite D-2 editing at CBS network. Or don't they
count as broadcast?


** Hey fuckwit.

In relation to television transmission - where does one find the "
broadcast signal " ???

Don't strain your tiny brain thinking too hard.




...... Phil
 

Sylvia Else

If the transmission network has constant group delay, the hue setting should
be set 'n forget, and never need to be changed.

It's not clear to me why that wasn't the case anyway. Whatever phase
error was introduced to the colour signal by the transmission system
would also affect the colour burst. If the problem could be addressed by
means of a tint control with a setting that remained stable even over
the duration of a program, it rather seems to imply that a phase error
between the colour burst and the colour subcarrier was built into the
signal at the studio.

Sylvia.
 

Sylvia Else

NTSC? No delay line? Moron. The luminance data had to be delayed to
allow time to process the chroma data. An open delay line in an NTSC
video display caused a very dark image with moving blotches of color. I
found and replaced several, in NTSC TVs and Video Monitors.

In which case you'd know that a PAL TV contains two delay lines. One
provides a short delay and addresses the difference in delay between the
chroma path and the luminance path. The other provides a full scan line
delay to allow averaging of the chrominance signal.

It should be obvious from context that "the" delay line that I was
referring to was the latter.

But I suppose calling people morons is easier than doing your own thinking.

Sylvia.
 

David

Sylvia Else said:
It's not clear to me why that wasn't the case anyway.
Whatever phase error was introduced to the colour signal
by the transmission system would also affect the colour
burst. If the problem could be addressed by means of a
tint control with a setting that remained stable even over
the duration of a program, it rather seems to imply that a
phase error between the colour burst and the colour
subcarrier was built into the signal at the studio.

Sylvia

One big problem was differential phase and gain in the
transmission path. In this case both the amplitude and phase
of the color information were influenced by the total
amplitude of the signal including the luminance. Since the
burst was at IRE 0 and the average picture content was IRE
50 or so, differential phase shifted the color hue.
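
To put a number on that (the phase-versus-level curve below is invented
purely for illustration): the burst rides at blanking level while the
picture chroma rides on the scene's luminance, so a level-dependent phase
shift rotates the chroma relative to the burst, and that difference is the
hue error.

# Hypothetical transmission path whose subcarrier phase shift grows with
# signal level (differential phase).  Burst sits at 0 IRE.
def differential_phase_deg(ire):
    return 0.08 * ire                 # made-up figure: 0.08 deg per IRE

burst_error = differential_phase_deg(0)
for scene_ire in (10, 50, 90):
    hue_error = differential_phase_deg(scene_ire) - burst_error
    print("chroma riding on %2d IRE -> hue error %.1f deg relative to burst"
          % (scene_ire, hue_error))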

David
 

William Sommerwerck

If we were building an analogue colour TV transmission
infrastructure now, then maybe we'd go the NTSC route,
since it eliminates the delay line.

PAL doesn't /require/ a delay line.

But it's undoubtedly true that, for whatever reasons, in earlier
times, NTSC didn't perform that well, whereas those whose
systems were PAL or SECAM got good colour pictures from
day one.

NTSC has always "performed well". Poor NTSC image quality was always due to
bad studio practice.
 

William Sommerwerck

** The laws of nature have not changed since 1953
- you tenth witted, know-nothing, bullshitting pommy bitch !!!

That's really going too far. Is there any way to permanently block Mr.
Allison?

By the way, it's pome, an acronym of "prisoner of mother England".
 