R. Mark Clayton
Stephen said: Large LCD and plasma screens use "de-interlacing" to avoid this, but this
has drawbacks because it has to either display one field for twice its
original sample period (i.e. display a 1/50th second exposure for 1/25th
of a second)
So what do you think old CRTs did? Nothing, of course, but they had
phosphors with a persistence of 40 ms or more, so the lines drawn last time
had not faded when the alternate ones were drawn - which amounts to the same
thing.
Many modern CRT sets draw the whole screen 100 times a second.
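The persistence point above can be sketched numerically. This is a minimal model, not a measurement: it assumes a 50 fields/s signal (20 ms per field) and treats phosphor decay as a simple exponential with the 40 ms figure as its time constant.

```python
import math

def remaining_brightness(t_ms, persistence_ms=40.0):
    """Fraction of a phosphor line's brightness left t_ms after it was
    drawn, modelled (hypothetically) as exponential decay."""
    return math.exp(-t_ms / persistence_ms)

# One field period later (20 ms), the previously drawn lines retain
# exp(-20/40), roughly 61 % of their brightness, so both fields are
# effectively visible at once - much like a de-interlaced display.
print(remaining_brightness(20.0))
```

Under these assumed numbers, the alternate lines are still well over half brightness when the next field is drawn, which is the "amounts to the same thing" effect.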
or start doing clever things like only doing this for fine detail, or
guessing what the image should look like in the missing lines of
each field by examining the picture detail from adjacent fields and
lines. Most large screens use the "clever" approach, but none of this
processing can be done perfectly, and it tends to introduce lag and
smearing, or to exaggerate MPEG artefacts by freezing them on the screen
for twice the length of time a CRT display without de-interlacing would
show them.
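The two approaches above can be sketched in a few lines of Python. This is a toy illustration, not a real video pipeline: a "frame" here is just a list of rows of brightness values, with even rows belonging to one field and odd rows to the other.

```python
def split_fields(frame):
    """Separate an interlaced frame into its two fields (even/odd rows)."""
    return frame[0::2], frame[1::2]

def bob_deinterlace(field):
    """The simple approach: rebuild a full frame from one field by
    repeating each line, i.e. showing one 1/50 s field for a whole
    1/25 s frame period."""
    out = []
    for row in field:
        out.append(row)
        out.append(row)  # duplicate the line to fill the missing one
    return out

def interpolate_deinterlace(field):
    """The simplest 'clever' approach: guess each missing line by
    averaging the lines above and below it within the same field."""
    out = []
    for i, row in enumerate(field):
        out.append(row)
        nxt = field[i + 1] if i + 1 < len(field) else row
        out.append([(a + b) / 2 for a, b in zip(row, nxt)])
    return out
```

Real sets go further, comparing adjacent fields and detecting motion, which is exactly where the lag and the frozen-artefact problems creep in.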
Modern large screens also have to resample the whole picture from 576
lines to the 768 lines of the display, which inevitably reduces the
effective resolution to about half of 768. This is why standard definition
looks poor on an "HD ready" screen, but looks virtually the same as
768-line HD when displayed on a CRT at 576 lines from an RGB SCART.
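The resampling step can be sketched as follows. This is a minimal one-dimensional illustration, assuming simple linear interpolation (real scalers use fancier kernels): each "line" is reduced to a single brightness value, and every panel line becomes a blend of the two nearest source lines, which is where the softening comes from.

```python
def rescale_lines(lines, out_count):
    """Map out_count output lines onto len(lines) source lines,
    linearly interpolating between the two nearest source lines."""
    n = len(lines)
    out = []
    for j in range(out_count):
        pos = j * (n - 1) / (out_count - 1)  # position in source lines
        i = int(pos)
        frac = pos - i
        a = lines[i]
        b = lines[min(i + 1, n - 1)]
        out.append(a * (1 - frac) + b * frac)  # blend of two lines
    return out

# e.g. rescale_lines(source_576, 768) yields 768 lines, most of them
# mixtures of two source lines rather than clean originals.
```

Because 768 is not an integer multiple of 576, almost no output line lands exactly on a source line, so fine vertical detail is blurred - unlike a CRT driven natively at 576 lines.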
I think it is more to do with jpeg images, and with sitting too close to the
set, so that you can resolve (see) the individual pixels.