I was wondering if the desire for a native refresh rate might be misplaced? On PC monitors, 60Hz is about the minimum usable without seeing flicker. I run mine at 75Hz. It will run at 100Hz, but of course the lower you run it, the sharper the image, since the signal bandwidth requirement is lower.

So I wondered if that is why plasmas seem to have a refresh rate of 60Hz, i.e. the minimum to avoid flicker, which I guess is more noticeable on a big screen. Of course, for 50Hz material (25 fps), 75Hz would perhaps be best, since each 25 fps frame can be shown exactly three times. 100Hz would be good too, but then you'd need that much more bandwidth in the plasma electronics.

So can you see flicker at a 50Hz refresh rate? Or do all plasmas refresh at more than 50Hz?
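
To make the back-of-envelope arithmetic concrete, here's a rough sketch of what I mean (the resolution is just an example figure, and I'm assuming bandwidth scales more or less linearly with refresh rate at a fixed resolution, ignoring blanking):

```python
# Rough illustration: the pixel clock (and so the bandwidth the electronics
# must handle) scales with refresh rate, while judder-free display wants the
# refresh rate to be an integer multiple of the content frame rate.

WIDTH, HEIGHT = 1024, 768   # assumed panel resolution, purely for illustration
CONTENT_FPS = 25            # 50Hz-country film material

for refresh_hz in (50, 60, 75, 100):
    # Approximate pixel clock in MHz, ignoring blanking intervals.
    pixel_clock_mhz = WIDTH * HEIGHT * refresh_hz / 1e6
    exact_multiple = refresh_hz % CONTENT_FPS == 0
    print(f"{refresh_hz:>3}Hz: ~{pixel_clock_mhz:6.1f} MHz pixel clock, "
          f"integer multiple of {CONTENT_FPS} fps: {exact_multiple}")
```

On those assumed numbers, 75Hz and 100Hz both divide evenly into 25 fps, but 100Hz needs about a third more bandwidth than 75Hz, which is the trade-off I'm getting at.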