Discussion in 'General TV Discussions Forum' started by dspit1664, Dec 21, 2011.
What is Hz in layman's terms?
Does it really make much difference in a TV?
Hz = Hertz is the SI unit of frequency.
See for a full explanation:
Hertz - Wikipedia, the free encyclopedia
However, in simple terms relating to modern flat panel TVs, it is usually the frequency at which the screen is refreshed with new information.
Where it gets complicated is the source and nature of that information. UK TV standards use 50Hz interlaced signals. Without going into lots of detail about interlaced versus progressive, 100Hz had a different purpose and significance on an older CRT than it does on an LCD/plasma: on a CRT it reduced flicker by re-flashing the image to the screen twice as often, which was a good thing. Plasma/LCD panels don't flash then fade like CRTs; they stay lit until changed. This means that unless there is additional processing going on, 100Hz on a plasma/LCD is no different from 50Hz: you would just be re-updating the screen with the same information twice and nothing would change.
So the real significance of Hz on a modern plasma/LCD is how the 50Hz feed is reprocessed into 100Hz, 120Hz, 240Hz etc.
Essentially a bit of mathematical cleverness is performed, in a whole variety of different ways, to create extra frames that fill the gaps between those present in the transmission. How well this is done and the quality of the result vary massively with make/model/price etc. It can produce improvements, particularly in smoothing jerky motion, which is how it is usually marketed, but it can also produce appalling side effects and be a total disaster.
It really is a matter of pay your money and take your choice, but don't be sucked in by marketing telling you that you must have it, or that the bigger number definitely means better. Try comparing a 50Hz screen with a 100Hz one; a lot of people don't even see a difference.
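Just to illustrate the idea of "creating extra frames", here is a rough Python sketch. It uses the simplest possible method, blending neighbouring frames; real TVs use far more sophisticated motion-compensated interpolation, so treat this as purely illustrative:

```python
# Purely illustrative sketch: the crudest form of frame interpolation,
# blending neighbouring frames to create an in-between one.
# Real TVs use motion-compensated methods, not a simple blend.

def interpolate(frame_a, frame_b, weight=0.5):
    """Blend two frames (lists of pixel values) into an intermediate one."""
    return [a * (1 - weight) + b * weight for a, b in zip(frame_a, frame_b)]

def double_rate(frames):
    """Turn a 50Hz sequence into a 100Hz one by inserting blended frames."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(interpolate(a, b))
    out.append(frames[-1])
    return out

# Two one-pixel 'frames', 0.0 then 1.0: the inserted frame is the blend 0.5.
print(double_rate([[0.0], [1.0]]))  # [[0.0], [0.5], [1.0]]
```

The "variety of different ways" mentioned above is exactly where sets differ: a cheap blend like this smears moving objects, whereas motion-compensated methods try to track where objects actually moved between frames.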
Hz or Hertz is the SI unit of frequency in cycles per second.
In terms of modern flat panel TVs, it's how many times the screen refreshes per second. Note that strictly speaking Hz is not the correct term here, but it's commonly used; a more correct term is fps, or frames per second, but Hz and fps are commonly used to describe the same thing.
It is important in that the TV should be able to handle any content that it gets, and content for screens comes in many different formats:
480p at 60Hz
576p at 50Hz
720p at either 50Hz or 60Hz
1080p at either 24Hz, 50Hz or 60Hz (1080p at 24Hz is what's on Blu-ray, so it's important that any new TV handles this content well)
and 3D content coming at 2 x 1080p 24Hz, one stream for each eye.
Note that most TVs this side of the water work in multiples of 50Hz, so to display most of the content listed above a routine must be used whereby the content is made to repeat so that it fits into a multiple of 50.
How each TV does this and how well it does it affects the picture quality.
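As a rough illustration of that repeating routine, here is a simplified Python sketch. It just works out how many times each source frame would need to be shown to fit a given panel rate; this is a toy model, not how any particular TV's firmware actually does it:

```python
def repeat_counts(source_fps, panel_hz, n_frames):
    """How many times each source frame is shown on the panel.
    Uses a running target so the repeats add up to the right total
    even when panel_hz is not an exact multiple of source_fps."""
    counts, shown = [], 0
    for i in range(1, n_frames + 1):
        target = round(i * panel_hz / source_fps)
        counts.append(target - shown)
        shown = target
    return counts

# 25fps content on a 100Hz panel: every frame is simply shown 4 times.
print(repeat_counts(25, 100, 5))   # [4, 4, 4, 4, 4]
# 24fps content on a 100Hz panel: an uneven mix of 4s and 5s,
# which is one reason 24Hz material can judder on 50Hz-multiple panels.
print(repeat_counts(24, 100, 6))
```

The uneven repeat pattern in the second case is exactly the sort of thing that "how well it does it" refers to: a clean multiple (25 into 100) is trivial, while 24fps material forces a compromise.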
Yep, I know Wiki, thanks for pointing it out.
I should have been clearer about my real point in my post, but you answered it later on anyway. Many thanks, it's good to know a 100Hz box may not necessarily be better than a 60 or 50Hz one.
Many many thanks AndyCob and Andy1249
Nice, a lot of info clearly explained. I'm a lot more aware than I was before.
Most TVs this side of the water accept both 50Hz and 60Hz inputs. On the other side, however, 50Hz derivatives are not accepted.
In terms of de-interlaced progressive displays the operative terms are 25Hz and 30Hz.
OK, so the above mentioned 50 and 60Hz TVs, so how come for TVs like -
there's a big jump to 600Hz?
Perhaps if I read the fine print I'd find out; however, I'm sure all you learned peeps will have views on this.
Because it's marketing. The screen refresh rate is not 600Hz, it's only 100Hz, but because the processing breaks the screen area into 6 sub-fields they then multiply 100 by 6 to get 600. Which of course provides a nice big number for the average punter to go wow about.
So 100Hz is still better than 50/60Hz, and the 600Hz is pretty much just a marketing ploy?
Absolutely. In fact the 600Hz mode can be very annoying with some types of programme.
Please enlighten me on the 'some types of...', thanks.
Nice, a lot of info clearly explained, but the problem is you can have a capable 400Hz TV when TDT-HD is only 50Hz or 60Hz.
OK, so what is the point you're trying to make, apart from the obvious one about the Hz difference?
I'm afraid that really requires a long explanation of the difference between Hz, frames per second, fields per second, and interlaced and progressive. Just to complicate things, it's also different if your source is film as opposed to a TV broadcast, DVD or Blu-ray.
That said, here goes. The UK uses a 50Hz standard as a legacy of using the 50Hz electricity supply as a timing mechanism in old TVs.
What this actually means for a TV broadcast is that your TV receives an interlaced signal of 50 fields per second. A field is essentially half the picture contained in one complete frame: in an interlaced TV broadcast your TV receives the odd-numbered lines of the picture, then one 50th of a second later the even-numbered lines. What makes this extremely confusing is that TV cameras shoot 50 frames per second, take the odd or even lines alternately, and send them as 50 fields. This means the fields of odd lines and even lines are actually 1/50th of a second apart in time and do not make a complete frame if combined together. This has the advantage of updating the picture more often, for smoother motion, but means the fields don't line up as a complete frame, producing jagged edges.
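To make the field idea concrete, here is a tiny Python sketch of splitting a frame into its two fields and "weaving" them back together. The names and the list-of-lines model are mine, purely for illustration:

```python
def split_fields(frame):
    """Split a frame (a list of scan lines) into its two interlaced
    fields: the odd-numbered lines and the even-numbered lines."""
    return frame[0::2], frame[1::2]

def weave(field_a, field_b):
    """Re-interleave two fields into one full frame.
    This is correct for film sources, where both fields come from the
    same instant; on true interlaced video the two fields are 1/50th
    of a second apart, and weaving them produces the jagged 'combing'
    edges described above."""
    frame = []
    for line_a, line_b in zip(field_a, field_b):
        frame.extend([line_a, line_b])
    return frame

lines = ["line0", "line1", "line2", "line3"]
odd, even = split_fields(lines)
print(weave(odd, even))  # ['line0', 'line1', 'line2', 'line3']
```

This is why the de-interlacer has to know where the fields came from: the same weave operation is perfect for one kind of source and visibly wrong for the other.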
Film is shot at 24 frames per second and, for the UK, is sped up to 25 frames per second (this is why the same cut of film on UK broadcast or DVD runs shorter than the American version even though nothing has been removed; this does not apply to Blu-ray). The 25 frames are then split into 50 fields of odd and even lines, which means that for film the two fields do make a complete frame.
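The 24-to-25 speed-up is easy to put in numbers. A quick sketch (the two-hour film is just an example figure):

```python
# The 24 -> 25 fps speed-up in numbers: the same frames are simply
# played slightly faster, so the film finishes sooner.
film_fps, pal_fps = 24, 25
film_minutes = 120                      # e.g. a two-hour film

total_frames = film_minutes * 60 * film_fps
pal_minutes = total_frames / (pal_fps * 60)
print(pal_minutes)                      # 115.2, i.e. about 4.8 minutes shorter
```

Nothing is cut; every frame is still shown, just for 1/25th of a second each instead of 1/24th (which is also why the audio pitch is very slightly higher unless it is corrected).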
So what your TV has to do is receive the 50Hz interlaced picture, buffer part of it, decide if the source is film or TV, de-interlace the fields back to frames using the correct method, apply a very complicated method of frame interpolation to generate 'extra' frames to convert to whatever screen refresh rate the TV is using, combine those extra frames back with the broadcast and display them.
The potential for getting things wrong in this process is significant: misidentifying the nature of the source, selecting the wrong de-interlacing or frame interpolation options, or just doing a bad job of them and creating horrendous artefacting, ghosting and a host of other issues.
It is very telling that one of the first steps in most professional screen setup guides is to find all the settings in the TV that relate to these motion interpolation systems and turn them off.
In short, while they are sold to the customer as wonder solutions for smooth motion and improved picture quality, they are often disastrous and totally ruin whatever you are trying to watch from a quality standpoint. That said, there are also versions of these systems that genuinely provide significant improvements.
OK, so TDT HD is different from Blu-ray (you can only achieve the full benefit with a Blu-ray or an MKV rip).
Please enlighten me on the 'some types of...', thanks.
No experience myself, but references have been made in the forums to a 'smeary' appearance in fast-moving action such as sports events. The variation in picture quality across the broadcasters will also give different results after having the "400Hz" treatment.
The problem for me is that Hz is important in football, for the turf, which is green and irregular, but that comes from satellite or TDT and that is only 50Hz; for films, 100Hz is sufficient.
I want my next TV to handle fast motion without blur, or with much less blur than I currently experience when gaming. I've got a 50Hz set and the motion blur really stands out to me, especially in darkly lit sections of first-person shooters.
I thought a modern 100Hz TV would improve this situation, but after reading this thread I'm not so sure.
If you want good motion, buy a plasma instead of an LCD. AFAIK most plasmas are 600, so just ignore all the rubbish about Hz.
This is total rubbish. If they are '600' then they are 600Hz. MOST plasmas are NOT 600Hz. Nothing wrong with a plasma though, if that is your choice.
As I said earlier the 600Hz is a marketing thing it's not the refresh rate of the screen.
I can't remember the last time I saw a plasma that wasn't 600Hz. But I'm sure there are some out there.
Sorry for the thread bump, but I'd like to know why my TV says it's running 1080 on an HD channel (now I've got Sky HD) but only runs at 50Hz.
Has my LG got a function to make it 100Hz like it says it can, or is it just whatever the Sky box puts out?
I thought the HD box would give me 100Hz.
Sorry again, but I'm new to all this stuff.
Guys, guys, guys, let me try and shine some light on this wonderful conundrum. Plasma TVs will all shout 600Hz, but as some of the chaps have said, this is a marketing ploy because it is a big number. The 600Hz refers to the sub-field motion, or sub-field drive, of the TV. This is due to a plasma TV having to constantly flash the phosphor chambers within the sub-pixels in order to sustain light through the panel. You can work it out in a couple of ways, but the main one is that the panel refresh is 50Hz, or 50 frames per second (fps), and each frame is made up of 12 sub-frames, therefore 50 x 12 = 600Hz. This does not mean that the panel is 600Hz, as it is only showing a portion of each frame as it goes through the sub-frames, a bit like building a collage. The net benefit of a plasma screen being in this constant state of flux is that it has very little motion residue, as the image is never constantly shown on the screen. However, there is no way you can say that a £400 plasma is going to be better than, say, an LCD-based TV at double the price just because it has a 600Hz sub-field, because plasmas can still suffer from motion judder the same as LCD screens.
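The sub-field arithmetic above is worth seeing laid out, since it is the whole trick behind the big number (note the earlier post worked it as 100 x 6; either way the headline figure is refresh rate times sub-fields, not a real refresh rate):

```python
# The '600Hz' plasma figure as described above: the actual panel
# refresh rate multiplied by the number of sub-fields drawn per
# frame. It is not 600 full pictures per second.
panel_hz = 50        # actual full-frame refresh rate
sub_fields = 12      # sub-frames drawn per frame (varies by model)

marketing_hz = panel_hz * sub_fields
print(marketing_hz)  # 600
```

So two sets quoting "600Hz" can have quite different underlying refresh rates and sub-field counts; the headline number alone tells you very little.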
On the LCD side of things there are a number of key technologies that can help reduce motion issues. The main cause of motion issues is the residual image seen as things move across the screen, but this can be countered by some key developments. Firstly, if you have an LED-lit LCD TV, the chances are that it will have backlight scanning; for example, on a Samsung LED it is called LED MotionPlus. This technology is always worth turning on, as it will help reduce these motion artifacts by throwing up a blank frame in between each frame, thereby removing the previous frame before the next one is shown (a bit like shaking an Etch A Sketch to create a new picture). Other key technologies include Intelligent Frame Creation, which helps smooth motion by inserting interpolated frames in between the real ones in an effort to smooth out any judder. This can work to a greater or lesser extent, but it can be worth playing with, as it can improve motion for some types of programme. The key to all of this is the great statement "the proof of the pudding is in the eating": in other words, when looking for a TV, get the salesperson to put up the sort of programmes you want to watch. It is also important to make sure that they take the TVs out of the store modes, as most TVs switch off a lot of the motion enhancements in favour of making the picture look as bright as it can be. I normally use either Standard or a Movie or Cinema mode.
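The blank-frame idea is simple enough to sketch. A toy Python model (the function name and the list-of-frames representation are mine, just for illustration):

```python
def insert_blank_frames(frames, blank=None):
    """A toy model of backlight scanning / blank frame insertion:
    a blank is shown after each real frame, doubling the output rate,
    so the previous image is wiped before the next one appears
    (the 'Etch A Sketch shake' described above)."""
    out = []
    for frame in frames:
        out.append(frame)
        out.append(blank)
    return out

print(insert_blank_frames(["A", "B"]))  # ['A', None, 'B', None]
```

The trade-off, which is why some people leave it off, is brightness: the panel is dark for half the time, so the picture dims even though motion looks cleaner.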
Also, don't just go on Hz, as the figures these days are getting silly, a bit like contrast ratios did. As a general rule of thumb, LCD-based TVs will be brighter, so if you intend to use your TV in bright conditions they are worth a serious look, whereas if you intend to use the TV in darker conditions, especially for films, then a plasma will give you better blacks in most situations. If, however, you intend to go for a heady mix of both, either aim for a top-end plasma or an LED-lit TV, as they will be more versatile.
I hope this helps