I am currently looking at a new TV and have noticed that a lot of the new models offer 120Hz (my current TV offers 100Hz, which Samsung was pushing as the latest big thing at the time). Can anyone help me understand why this makes any difference, please? I have done a bit of Googling on the subject and have only confused myself more than before.

I was under the impression that video needs to run at around 24fps to trick the eye into seeing smooth motion. I also remember reading that, for historical reasons, TVs used to output at either 50Hz or 60Hz (the difference coming down to the different mains electricity frequencies in different countries). That is all fine, but if a TV screen is refreshing 50 or 60 times a second and the video is only playing at 24fps, why would you need to go up to 100 or even 120Hz? (I could understand the change if the refresh rate were less than 24fps.)
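In case it helps frame my confusion: the only pattern I have been able to spot on my own is arithmetic. Here is a quick sanity check I ran (Python just for illustration; the idea that a whole-number ratio between refresh rate and frame rate is what matters is purely my guess from Googling, not something I have confirmed):

```python
# Quick sanity check: how many screen refreshes each film frame
# would span at various refresh rates, assuming a 24fps source.
# (That a non-integer ratio is the problem is just my guess.)

FILM_FPS = 24

for refresh_hz in (50, 60, 100, 120):
    ratio = refresh_hz / FILM_FPS
    note = "whole number" if ratio.is_integer() else "NOT a whole number"
    print(f"{refresh_hz}Hz / {FILM_FPS}fps = {ratio:.3f} refreshes per frame ({note})")
```

If I have done that right, 120Hz divides evenly by 24 while 60Hz does not, but then 100Hz does not divide evenly by 24 either, which only confuses me further about what my current Samsung is doing.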