Ian Jackson

Geoffrey S. said:
> I don't know how well UK sets worked in the 1960's, but US TV sets were
> not capable of receiving adjacent channels at one time, so they were
> not used. For example, channel 2 was used in New York City, while the
> nearest channel 3 station was in Philadelphia, 90 miles away and too
> far to be received without a large antenna. I think the next one up
> was 5 in NYC and 6 in Philly.

Generally, UK (and even European) TV sets had a hard time with adjacent
channels. Like the USA, the off-air broadcast channels were arranged so
that, within the normal service area, there would never be an adjacent
channel which was anything like as strong as the channel(s) intended for
that area.
The same was true of cable systems. As TV sets were incapable of
operating with adjacent channels, they carried only alternate channels.
However, things changed with the advent of cable set-top boxes. These
were specifically designed to be capable of receiving a level(ish)
spectrum of maybe 30+ channels. The tuned channel was converted to a
single output channel in Band 1 (selected to be a vacant off-air channel
in the area where the STB was to be used). Essentially, all the adjacent
channel filtering was done on the output channel, so the TV set was
presented with only a single channel, thereby eliminating any problems
with poor adjacent channel selectivity.
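
As a rough illustration of that conversion, here is a minimal Python
sketch of the local-oscillator arithmetic a single-conversion STB of
that sort performs; the output channel and the cable channel
frequencies are made-up assumptions for illustration, not a real
channel plan:

    # Sketch of single-conversion STB tuning: whatever channel is
    # selected, the LO is set so the mixer's difference product lands
    # on one fixed Band 1 output channel.
    # All frequencies are illustrative assumptions (MHz).

    OUTPUT_VISION_MHZ = 55.25  # assumed vacant Band 1 output channel

    # Assumed vision-carrier frequencies for a few cable channels
    CABLE_CHANNELS = {
        "midband example": 121.25,
        "Band 3 example": 189.25,
        "superband example": 247.25,
    }

    def lo_for_channel(name):
        """Low-side LO, so f_out = f_in - f_lo and the spectrum is
        not inverted; hence f_lo = f_in - f_out."""
        return CABLE_CHANNELS[name] - OUTPUT_VISION_MHZ

    for ch in CABLE_CHANNELS:
        print(f"{ch}: set LO to {lo_for_channel(ch):.2f} MHz")

Because the output is always the same channel, one fixed filter after
the mixer does all the adjacent channel rejection, regardless of which
cable channel was selected.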
Early STBs covered only non-off-air channels, eg 'midband' (between
Bands 2 and 3) and 'superband' (above Band 3 to around 300MHz). As a
result, large cable TV systems would carry alternate channels in Bands 1
and 3 (so that they could be received directly by the TV set), and
adjacent channels elsewhere (which could normally only be received via
the STB).
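
For anyone unfamiliar with the band names, here is a small Python
sketch of where those ranges sit and which ones a traditional
12-channel TV tuner could reach; the band edges are approximate
assumptions for a European-style plan:

    # Rough sketch of the VHF band layout described above.
    # Band edges are approximate assumptions, not an exact plan.

    def classify(freq_mhz):
        if 47 <= freq_mhz <= 68:
            return "Band 1 (TV set tunes it directly)"
        if 87.5 <= freq_mhz <= 108:
            return "Band 2 (FM broadcast)"
        if 108 < freq_mhz < 174:
            return "midband (cable only - needs the STB)"
        if 174 <= freq_mhz <= 230:
            return "Band 3 (TV set tunes it directly)"
        if 230 < freq_mhz <= 300:
            return "superband (cable only - needs the STB)"
        return "outside this sketch's VHF plan"

    for f in (55.25, 121.25, 189.25, 247.25):
        print(f"{f:6.2f} MHz: {classify(f)}")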
Later on, when multi-channel cable TV was recognised as 'the way to go'
by the TV set manufacturers, TV sets themselves started being equipped
with wideband tuners - typically providing virtually continuous coverage
from 50 to 300MHz and beyond, plus the UHF TV broadcast band. At the
same time, TV set adjacent channel selectivity was improved, as they had
to be capable of receiving the adjacent cable channels.
In the 1980s, SAW filters became widely available for use in domestic TV
sets, and these virtually eliminated the problems of interference from
adjacent channels. Of course, eventually, cable TV set-top boxes also
developed further, not only providing continuous wideband coverage from
50 to 870MHz, but also becoming descramblers/decoders for pay-TV
services.
> When the US started UHF TV in the mid 1960's (all 1965 models had to
> have VHF/UHF tuners), they spaced the channels far apart; Philadelphia,
> for example, had three: 17, 29 and 48.

IIRC, at first, UHF was not very popular in the USA. Tuners were pretty
rudimentary - consisting of virtually nothing except a triode variable
frequency oscillator and a crystal diode mixer (techniques essentially
borrowed from WW2 radar technology), and this fed the input of the
existing VHF tuner. UHF transmitter powers were low, and as receiver
sensitivity was not much better than a crystal set, coverage was
minimal, so virtually no one bothered much with UHF TV. As a result, TV
sets continued to be manufactured fitted with only the traditional
12-channel lowband/highband VHF tuner.
Eventually, because of total congestion in the VHF TV bands, I believe
the FCC stepped in, and more or less forced TV manufacturers to fit the
additional UHF tuner. I understand that they did this rather
indirectly - not by requiring TV manufacturers to fit UHF tuners per
se, but instead by making it illegal for them to ship TV sets across a
state border if they did not have a UHF tuner.