120Hz versus 240Hz

Geoffrey S. Mendelson

Sylvia said:
CRT TVs refresh at 50Hz or 60Hz (near enough) depending on region.

FYI, Computer CRT screens refresh at 60 to 85 Hz.

The main difference between CRTs and LEDs or LCDs is persistence.
CRTs have long-persistence phosphors: when they are illuminated, they
stay lit for a relatively long time. That's why the interlacing system
works; the odd lines are still lit when the even ones are illuminated.

Geoff.
 
Sylvia Else

FYI, Computer CRT screens refresh at 60 to 85 Hz.

The main difference between CRTs and LEDs or LCDs is persistence.
CRTs have long-persistence phosphors: when they are illuminated, they
stay lit for a relatively long time. That's why the interlacing system
works; the odd lines are still lit when the even ones are illuminated.

It's not that long, which is why photographs of television pictures look
so awful. Interlacing is used to avoid flicker without having to
transmit 50 or 60 full frames per second.
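
To make the field split concrete, here is a toy sketch (hypothetical Python;
a "frame" here is just a list of scan lines, nothing like real video hardware):

    # Split one frame into two interlaced fields: even lines in one field,
    # odd lines in the other. Each field carries half the lines, so 50 or 60
    # fields/s costs only the bandwidth of 25 or 30 full frames/s.
    def split_into_fields(frame_rows):
        top_field = frame_rows[0::2]      # lines 0, 2, 4, ...
        bottom_field = frame_rows[1::2]   # lines 1, 3, 5, ...
        return top_field, bottom_field

    frame = ["line %d" % n for n in range(6)]
    print(split_into_fields(frame))
    # (['line 0', 'line 2', 'line 4'], ['line 1', 'line 3', 'line 5'])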

LCDs don't flicker anyway, regardless of their frame rate. The frame rate
issue relates to addressing the judder you get as a result of the image
consisting of a sequence of discrete images, rather than one that
continuously varies.

It doesn't help that much TV material that was recorded on film is
transmitted with odd and even interlaced fields that are scans of
the same underlying image (or some variation thereof), so that the
effective refresh rate is considerably lower than the interlaced rate.
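
In the NTSC case the usual 3:2 pulldown pattern is the clearest example;
roughly sketched (illustrative Python, not any particular broadcaster's
actual processing):

    # 3:2 pulldown: four film frames (A, B, C, D) become ten video fields,
    # i.e. 24 film frames/s -> ~60 fields/s, but still only 24 distinct images/s.
    def pulldown_32(film_frames):
        fields = []
        pattern = [3, 2]                 # 3 fields from one frame, 2 from the next
        for i, frame in enumerate(film_frames):
            fields.extend([frame] * pattern[i % 2])
        return fields

    print(pulldown_32(["A", "B", "C", "D"]))
    # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D'] -- adjacent fields often
    # repeat the same film frame, so the effective image rate stays at 24/s.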

Sylvia.
 
William Sommerwerck

LCDs don't flicker anyway, regardless of their frame rate. The frame
rate issue relates to addressing the judder you get as a result of
the image consisting of a sequence of discrete images, rather than
one that continuously varies.

Not quite, otherwise the issue would occur with plasma displays. Indeed, it
would with any moving-image recording system.

The problem is that LCDs don't respond "instantaneously". They take a finite
time to go from opaque to the desired transmission level, and then back
again. The result is that the image can lag and "smear". (25 years ago, the
first pocket LCD color TVs from Casio had terrible smear, which added an
oddly "artistic" quality to sports.)

For reasons not clear to me, adding interpolated images reduces the smear.
This makes absolutely no sense whatever, as the LCD now has /less/ time to
switch. I've never gotten an answer on this.

It doesn't help that much TV material that was recorded on film is
transmitted with odd and even interlaced fields that are scans of
the same underlying image (or some variation thereof), so that the
effective refresh rate is considerably lower than the interlaced rate.

Interlaced images can be de-interlaced. Note that most product reviews test
displays for how well they do this.
 
William Sommerwerck

Dolby has a new thing -- HDR LCD that
modulates the LED backlights on-the-fly.

This is neither new, nor was it invented by Dolby.
 
bob urz

CRT TVs refresh at 50Hz or 60Hz (near enough) depending on region.

Since a TV program will only contain images (interlaced) at that rate -
or frequently less - a TV that purports to offer a higher refresh rate
will have to create the extra images by some kind of interpolation. If
it does a bad job, then the result will be unwatchable regardless of how
high the refresh rate is.

Sylvia.

It can get more complicated than that. Dolby has a new thing out: an HDR LCD
that modulates the LED backlights for brightness on the fly, in groups. That
was not possible with CCFL LCD backlights.

http://www.dolby.com/uploadedFiles/...sional/dolby-hdr-video-technical-overview.pdf
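
Very roughly, the zoned dimming amounts to something like this (a toy sketch
on one scan line; the zone size, the numbers and the simple max/rescale rule
are made up for illustration and are not Dolby's actual algorithm):

    # Toy local dimming on one scan line of grey levels (0.0-1.0). Each LED
    # "zone" covers several pixels; the LED is driven to the brightest pixel in
    # its zone, and the LCD values are rescaled so LCD * LED ~ original image.
    def local_dim(line, zone_size):
        leds, lcd = [], []
        for i in range(0, len(line), zone_size):
            zone = line[i:i + zone_size]
            level = max(zone) if max(zone) > 0 else 0.001
            leds.append(level)
            lcd.extend(round(p / level, 3) for p in zone)
        return leds, lcd

    line = [0.0, 0.0, 0.9, 0.0, 0.0, 0.0, 0.1, 0.1]
    print(local_dim(line, 4))
    # ([0.9, 0.1], [0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0]) -- one bright pixel
    # forces its whole zone up; LCD leakage around it is the "halo" in dark scenes.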


bob
 
Chris

AZ Nomad said:
I think when they refer to LEDs, it is LEDs used for backlighting
probably for an LCD.

Yes, that is how it was explained to me by a salesman, as well as what I
gathered from online info. So, apparently, it is still an LCD screen. Also,
somehow the refresh rate of the LEDs creates some sort of multiplier effect
with the LCDs; thus the higher Hz. It sure would be nice to know if this is
correct, and also why/how it enhances the picture.
Although I am far from an expert in this area (hence my original post), I
have the ability to understand just about anything that is explained
correctly. When information is presented in an ambiguous way, which is what
I have seen so far in my internet research, that is definitely a red flag that
the author probably is not knowledgeable in the subject matter.
 
William Sommerwerck

Dolby has a new thing -- HDR LCD that on the fly
Which would be fine if the LEDs corresponded exactly
to the pixels. But they don't.

I've seen at least one review that complained that local dimming produced
"halos" around objects in darker scenes. I would never, ever buy a set with
such a feature, unless it could be shut off.
 
William Sommerwerck

** And if you put the remark back into its context --
No it doesn't.

Agreed. It seemed unrelated, even out of left field. I suspect Sylvia didn't
properly express what she wanted to say.
 
William Sommerwerck

I don't know how a 240Hz "scan rate" would be achieved.
It's probably a sort of trick that the set's electronics use
to make the picture seem just that much more stable.

Actually, it's a frame rate. It can be done by interpolation, by inserting
blank frames, or a combination of the two.
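
Roughly sketched below (hypothetical Python; frames are just lists of pixel
values, and real sets use motion-compensated interpolation rather than the
straight blend shown):

    # Two toy ways to double a frame rate: blend neighbouring frames, or
    # insert dark frames. A "frame" here is just a flat list of pixel values.
    def blend_interpolate(frames):
        out = []
        for a, b in zip(frames, frames[1:]):
            out.append(a)
            out.append([(x + y) / 2 for x, y in zip(a, b)])  # halfway frame
        out.append(frames[-1])
        return out

    def black_frame_insertion(frames):
        out = []
        for f in frames:
            out.append(f)
            out.append([0] * len(f))  # dark frame shortens the time each image is held
        return out

    frames = [[10, 20], [30, 40]]
    print(blend_interpolate(frames))      # [[10, 20], [20.0, 30.0], [30, 40]]
    print(black_frame_insertion(frames))  # [[10, 20], [0, 0], [30, 40], [0, 0]]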
 
Leonard Caillouet

Phil Allison said:
"Arfa Daily"

** But they are called " LED TVs " by their makers and so are

*KNOWN BY THAT NAME* to members of the public.


Fools like YOU and Sommerwanker would complain that a bottle of "Steak
Sauce" contained no steak.



.... Phil

I guess we should refer to all LCD sets by their backlight type. That makes
the one on my wall a CCFL TV. And I guess all of those DLP, LCD, DiLA,
SXRD, and LCoS projection sets should be called mercury vapor or whatever
type of lamp they use. And the new projectors could be called LED
projectors as well, even if they are DLP devices.

The point is that referring to the set by the type of backlight it uses is
very misleading and is causing much confusion in the marketplace.

Leonard
 
Sylvia Else

Not quite, otherwise the issue would occur with plasma displays. Indeed, it
would with any moving-image recording system.

The problem is that LCDs don't respond "instantaneously". They take a finite
time to go from opaque to the desired transmission level, and then back
again. The result is that the image can lag and "smear". (25 years ago, the
first pocket LCD color TVs from Casio had terrible smear, which added an
oddly "artistic" quality to sports.)

For reasons not clear to me, adding interpolated images reduces the smear.
This makes absolutely no sense whatever, as the LCD now has /less/ time to
switch. I've never gotten an answer on this.

Many years ago (using a Sinclair Spectrum no less) I noticed an effect
whereby if a small character-sized square was moved across the screen in
character-sized steps, the eye perceived phantom squares at intervening
positions. Since the computer was not displaying these intermediate
squares, their appearance must have been due to the eye. The likely
explanation was that the eye was traversing the screen smoothly to
follow the square, but the square itself was moving in discrete steps.
So the eye was causing the image of the square to be smeared across the
retina. I was seeing this effect on a CRT screen, but the longer the
persistence of the image on the screen the worse the effect would be.
Interpolating the position of the image on the screen would reduce that
effect.
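
A back-of-envelope sketch of that (hypothetical Python; the tracking speed
and the rates are made-up numbers):

    # The eye pans smoothly after the moving square, but on a sample-and-hold
    # display the square only jumps once per frame. Within one frame the frozen
    # image therefore slides across the retina by roughly speed * hold time.
    def retinal_offsets(speed_px_per_s, frame_rate_hz, samples_per_frame=5):
        hold = 1.0 / frame_rate_hz
        offsets = []
        for k in range(samples_per_frame):
            t = k * hold / samples_per_frame       # instants within one frame
            eye_pos = speed_px_per_s * t           # smooth pursuit
            image_pos = 0.0                        # image frozen for the whole frame
            offsets.append(image_pos - eye_pos)    # where the edge lands on the retina
        return offsets

    print(retinal_offsets(600, 60))    # spreads over ~10 px per frame at 60 Hz
    print(retinal_offsets(600, 120))   # ~5 px at 120 Hz: interpolation halves the smear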

However, I can't explain why this would be less pronounced on a plasma
screen.
Interlaced images can be de-interlaced. Note that most product reviews test
displays for how well they do this.

They have to be deinterlaced for display on any screen with significant
persistence, but deinterlacing doesn't increase the underlying frame rate.
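
A minimal "weave" sketch (hypothetical Python) illustrates the point:

    # Weave deinterlacing: interleave a top and bottom field back into one
    # progressive frame. Real deinterlacers also do motion-adaptive "bob"
    # (line doubling) when the two fields come from different instants.
    def weave(top_field, bottom_field):
        frame = []
        for top_line, bottom_line in zip(top_field, bottom_field):
            frame.append(top_line)
            frame.append(bottom_line)
        return frame

    print(weave(["line 0", "line 2"], ["line 1", "line 3"]))
    # ['line 0', 'line 1', 'line 2', 'line 3'] -- one frame from two fields,
    # so the frame rate out is no higher than the field-pair rate in.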

Sylvia.
 
Sylvia Else

<snip>

Because LCD cells are painfully slow at switching, which equates to a long
persistence phosphor on a CRT, which as you say yourself, makes the effect
worse. Plasma cells are very fast, particularly now that 'pre-fire'
techniques are used to 'ready' them for switching. This is the equivalent of
having a very short persistence phosphor on a CRT. If you arrange for the
drive electronics to be able to deliver the cell drives with no more delay
than the cells themselves are contributing, then the result will be smooth
motion without any perceivable blur, which is pretty much how it was with a
standard domestic CRT based CTV.

Arfa

It seems to me that the effect would be visible on any display that has
any degree of persistence. Even if LCDs switched instantaneously, they'd
still be displaying the image for the full frame time and then
instantaneously switching to the next image. This would produce the
smearing effect in the way I've described. To avoid it, one needs a
display that produces a short bright flash at, say, the beginning of the
display period, and remains dark for the rest of the time. As I
understand plasma displays, that's not how they work.
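
In rough numbers (a hypothetical sketch; the tracking speed and flash length
are made-up values):

    # While the image is lit and the eye keeps panning, an edge slides across
    # the retina; the smear width is roughly tracking speed * time the image is lit.
    def smear_px(speed_px_per_s, lit_time_s):
        return speed_px_per_s * lit_time_s

    print(smear_px(600, 1 / 60))    # hold-type display at 60 Hz: ~10 px of smear
    print(smear_px(600, 0.002))     # 2 ms flash per frame: ~1.2 px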

Sylvia.
 
Mark Zenier

You should also be aware that there are several 'resolutions' of screen and
drive to take into consideration. Almost all TV showrooms, both here and in
the US, tend to have the sets running on at least an HD picture, and often a
Blu-ray picture. This makes them look very good at first glance. The problem is
that in normal day-to-day use, when you get it back home, you are going to be
watching standard-resolution terrestrial broadcasts on it, and on many sets
these look pretty dreadful. That is the reason that so many people are
disappointed with their purchase when they get it home, and think that it is
not what they saw in the store.

At least in my area (Seattle), all the mainstream over the air stations
are HD, now. (Each station has its own transmitter, so they don't have
shared "multiplexes" like in the UK). The typical situation is an
HD subchannel with the main program and one SD subchannel with some
secondary service (sports, weather, old movies, old TV shows, or an
SD copy of the main signal to feed analog cable).

There's a smaller number of stations that trade selection for resolution,
running four or five SD subchannels. Christian broadcasting and
speciality stuff like the secondary channels on the other stations.

The only way you'd get stuck with standard definition on the national
networks is to still be using analog cable. (I'm not familiar with
what you get with the two (subscription only) national direct broadcast
satellite providers).

Mark Zenier [email protected]
Googleproofaddress(account:mzenier provider:eskimo domain:com)
 
Sylvia Else

I think you are misunderstanding the principles involved here in producing
a picture perceived to have smooth smear-free movement, from a sequence of
still images. Any medium which does this, needs to get the image in place as
quickly as possible, and for a time shorter than the period required to get
the next picture in place. This is true of a cinema picture, a CRT
television picture, an LCD television picture, or a plasma or OLED picture.
Making these still images into a perceived moving image, has nothing to do
with the persistence of the phosphor, but is a function of retinal
retention, or 'persistence of vision'.

The fact that a sequence of still images is perceived as a moving
picture is clearly a consequence of visual persistence. And it's obvious
that things will look bad if the images actually overlap. But that's not
what we're discussing.

We're discussing why certain types of display don't do such a good job
despite having a reasonably sharp transition from one image to the next.

The Wikipedia article you cited said that even LCD switching times of
2ms are not good enough "because the pixel will still be switching while
the frame is being displayed." I find this less than convincing as an
explanation. So what if the pixel is switching while the frame is being
displayed? It's not as if the eye has a shutter, and the transition time
is much less than the eye's persistence time anyway.

Sylvia.
 
Geoffrey S. Mendelson

Arfa said:
Making these still images into a perceived moving image, has nothing to do
with the persistence of the phosphor, but is a function of retinal
retention, or 'persistence of vision'. Black and white TV CRTs used a
phosphor blend known as 'P4', and tricolour CRTs typically used 'P22'. Both
of these are designated as being short persistence types. The green and blue
phosphors used in a colour CRT have persistence times of typically less
than 100 µs, and the red around 200-300 µs.

Short and long persistence are relative terms. Compared to the P1 phosphors
of radar screens and oscilloscopes, P4 phosphors are relatively short
persistence. Compared to an LED they are long persistence.

Note that there is a lot of "wiggle room" in there; supposedly the human
eye can only see at 24 frames per second, which is about 42 ms per frame.

Also note that there are relatively few frame rates in source material:
NTSC TV is 30/1.001 (about 29.97) frames per second, PAL TV is 25. Film is 24,
which is sped up to 25 for PAL TV and slowed to 24/1.001 (about 23.976) for
NTSC TV.

Film shot for direct TV distribution (MTV really did have some technological
impact) was shot at 30/1.001 frames per second.

Digital TV could be any frame rate, but broadcasters have stuck with the old
standards: US digital TV is still the same frame rate as NTSC, and EU, etc.
digital TV is still 25 fps.

Lots of video files online are compressed at lower frame rates because of
the way they are shown. The screens still operate at their regular frame
rate, the computer decoding them just repeats them as necessary.
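
Roughly (a toy sketch; a real player also has to cope with rates that don't
divide evenly):

    # How many screen refreshes each decoded frame is shown for.
    def repeats_per_frame(source_fps, display_hz):
        return display_hz / source_fps

    print(repeats_per_frame(15, 60))   # 4.0 -- each frame held for four refreshes
    print(repeats_per_frame(25, 50))   # 2.0
    print(repeats_per_frame(24, 60))   # 2.5 -- needs a 3:2-style alternation in practice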

Geoff.
 
Geoffrey S. Mendelson

Arfa said:
Well, I dunno how else to put it. I'm just telling it as I was taught it
many years ago at college. I was taught that short persistence phosphors
were used on TV display CRTs, to prevent motion blur, and that the
individual images were integrated into a moving image, by persistence of
vision, not phosphor. I was also taught that the combination of POV and the
short decay time of the phosphor, led to a perceived flicker in the low
frame-rate images, so the technique of splitting each frame into two
interlaced fields, transmitted sequentially, was then born, which totally
overcame this shortcoming of the system. Always made sense to me. Always
made sense also that any replacement technology had to obey the same 'rule'
of putting the still image up very quickly, and not leaving it there long,
to achieve the same result.

It's more complicated than that. You only see one image, which has been
created in your brain from several sources. The most information comes
from the rods in your eyes; they are light-level (monochrome) sensors,
as it were, and they are the most numerous. This means most of what you see
is the combination of two sets of monochrome images with slightly
to wildly different information.

Then there are the cones, or color sensors. There are far fewer of them
and they are less sensitive to light, which is why night vision is black
and white.

There are also blind spots where the optic nerves attach to the retina.

None of these show up on their own; they are all integrated into the one
image you see. You never notice that you have two blind spots, you don't
notice the lack of clarity in colors (due to the smaller number of cones),
and rarely, if ever, do you notice the difference between your eyes.

If you were, for example, to need glasses in one eye and not the other, or to
have not quite properly prescribed lenses, your image will appear sharp, not
blurred on one side and sharp on the other.

Lots of tricks have been used over the years to take advantage of the
limitations of the "equipment" and the process. For example, anything faster
than 24 frames a second is not perceived as being discrete images, but one
smooth image.

The 50 and 60 fields per second (a field being half an interlaced frame) were
chosen not because they needed to be that fast (48 would have done), but to
eliminate interefence effects from electrical lights.

Color is another issue. The NTSC committee (whose approach was later adopted
by the BBC for PAL) determined that a 4:1 color system was good enough, i.e.
color information only needed to be changed (and recorded) at 1/4 the rate of
the light level.

In modern terms, it means that for every 4 pixels, you only have to have
color information once. Your eye can resolve the difference in light levels,
but not in colors.

This persists to this day; MPEG-type encoding is based on it. It's not the
red-green-blue, red-green-blue, red-green-blue, red-green-blue of a still
picture or a computer screen, it's the light-level, light-level,
light-level, light-level, color-for-all-four encoding that was used by NTSC
and PAL.
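
In code, the idea looks roughly like this (a toy sketch over one scan line,
keeping color once per group of four pixels; the pixel values and the plain
averaging are made up for illustration, and real codecs subsample in both
directions):

    # Each pixel keeps its own luma (light level); the two color components are
    # stored once per group of 4 pixels. Pixels here are (Y, Cb, Cr) tuples.
    def subsample_line(pixels, group=4):
        luma = [y for (y, cb, cr) in pixels]              # full resolution
        chroma = []
        for i in range(0, len(pixels), group):
            block = pixels[i:i + group]
            cb = sum(p[1] for p in block) / len(block)    # one Cb per group
            cr = sum(p[2] for p in block) / len(block)    # one Cr per group
            chroma.append((cb, cr))
        return luma, chroma

    line = [(200, 110, 140), (180, 112, 142), (60, 128, 128), (50, 130, 126)]
    print(subsample_line(line))
    # ([200, 180, 60, 50], [(120.0, 134.0)]) -- four light levels, one color sample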

In the end, IMHO, it's not frame rates or color encoding methods at all, as
they were fixed around 1960 and have not changed, but display technology as
your brain perceives it.

No matter what anyone says here, it's the combination of the exact
implementation of the display technology and your brain that matters. If the
combination looks good and you are comfortable watching it, then it works: a
25 fps CRT, a 100 fps LED screen, or even a 1000 fps display, if there were
such a thing, will look good if everything combined produces good images in
YOUR brain, and bad if some part of the combination produces something
"wrong".


If you think about it, the only 'real' difference between an LCD panel, and
a plasma panel, is the switching time of the individual elements. On the LCD
panel, this is relatively long, whereas on the plasma panel, it is short.
The LCD panel suffers from motion blur, but the plasma panel doesn't. Ergo,
it's the element switching time which causes this effect ... ??

There is more to it than that. An LCD is like a shutter: it pivots on its
axis and is either open or closed. Well, not quite; there is a finite
time from closed (black) to open (lit), and therefore a build-up of
brightness.

Plasma displays are gas discharge devices; they only glow when there is enough
voltage to "fire" them, and keep glowing until it drops below the level needed
to sustain the glow. Their speed depends more upon the control electronics than
on any (other) laws of physics, such as the viscosity of the medium the liquid
crystals sit in, temperature, etc.

That's the aim of LED-backlit TV screens (besides lower power consumption,
heat, etc.). They are only lit when the crystals are "open", so there is no
time when you see partially lit "pixels".

Geoff.
 
William Sommerwerck

Lots of tricks have been used over the years to take advantage
of the limitations of the "equipment" and the process. For example,
anything faster than 24 frames a second is not perceived as being
discrete images, but one smooth image.

Actually, it's 16 frames a second. However, that rate is not fast enough to
prevent flicker -- which is why silent films were sometimes called
"flickers". This is one of the reasons the frame rate was increased to 24
with the introduction of sound.

The 50 and 60 fields per second (a field being half an interlaced frame)
were chosen not because they needed to be that fast (48 would have
done), but to eliminate interference effects from electrical lights.

That's new to me.

Color is another issue. The NTSC committee (whose approach was later adopted
by the BBC for PAL) determined that a 4:1 color system was good enough, i.e.
color information only needed to be changed (and recorded) at 1/4 the rate of
the light level.

NTSC is actually 4.2/1.5, or roughly 2.8 to 1. PAL is closer to 5:1.

That's the aim of LED-backlit TV screens (besides lower power consumption,
heat, etc.). They are only lit when the crystals are "open", so there is no
time when you see partially lit "pixels".

I hate to spoil things, Geoff, but liquid crystals are quite capable of
taking intermediate positions -- that is, forming a continuous gray scale.
 
Sylvia Else

Actually, it's 16 frames a second. However, that rate is not fast enough to
prevent flicker -- which is why silent films were sometimes called
"flickers". This is one of the reasons the frame rate was increased to 24
with the introduction of sound.



That's new to me.

Well, the story I heard way back when is that it was to synchronise the
picture's vertical frequency with the mains frequency, so that
inadequacies in power smoothing produced static distortions in the
picture rather than much more noticeable rolling distortions.

Sylvia.
 