From what I understand, the fastest "true" panel refresh rate is 200 Hz. However, with Sony and Samsung TVs claiming 400 Hz and 800 Hz motion processing, I don't quite understand how that would work.

Like-for-like motion interpolation makes sense to me: if a screen is displaying 50 Hz content but refreshing 100 times a second, it has "room" to insert an extra 50 interpolated frames. But if a panel can physically display at most 200 frames per second, surely forcing it to display 400 would just add motion blur?

Can somebody explain how it actually works? I sell TVs for a living and even manufacturer reps can't explain it. They usually default to something like, "The screen is fast, and it's processing the picture, and then you get the backlight making it brighter, or something, and they add it together to make 800 Hz," which doesn't actually explain anything.
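To be concrete about the arithmetic I'm assuming (this is just a toy illustration of the "room for interpolation" idea, not any manufacturer's actual processing):

```python
# Toy sketch of motion-interpolation headroom.
# Assumption: the TV must generate (panel_hz - source_hz) new frames
# per second to fill the panel's refresh rate from the source content.

def interpolated_frames_per_second(source_hz: int, panel_hz: int) -> int:
    """Extra frames per second needed to fill panel_hz from source_hz content."""
    if panel_hz < source_hz:
        raise ValueError("panel cannot refresh slower than the source")
    return panel_hz - source_hz

# 50 Hz content on a 100 Hz panel leaves room for 50 interpolated frames:
print(interpolated_frames_per_second(50, 100))  # 50

# But a claimed "400 Hz" mode on 50 Hz content would imply 350 generated
# frames per second, far beyond what a 200 Hz panel can physically show:
print(interpolated_frames_per_second(50, 400))  # 350
```

That mismatch between the claimed number and the panel's physical limit is exactly what I can't reconcile.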