Does my plasma really look better after running in, if so, why?

I have a question. How do viewers actually know what the intended colours are on a typical MPEG encoded SDTV signal?

i.e. how do they know if what they are seeing on their TV screen is accurate?

You are arguing away from the point, which is change over time for a fixed configuration; that question has no relevance here. There are fixed standards for how a TV should be calibrated accurately: there is no leeway for error, there is a right and a wrong. If a screen is calibrated correctly, it is displaying the correct colours. If the screen's properties change over time, then when you revisit and remeasure the same test patterns the results will have shifted into the wrong and the screen will need recalibrating.

If so then I'll offer you the whole of the internet to find a reputable link that backs up what you are claiming. That's basically every plasma manufacturer.

Nobody apart from you cares enough about this to bother writing a white paper on it!! I'm afraid I'm not going to dedicate my time to finding anything for you; this post has taken long enough LOL!!

IMO the TV will gradually lose luminance over its life and that's one change no one will dispute but I doubt the luminance will drop much during the first few months and I'm not sure this counts as an improvement anyway.
Well this is actually a pretty damn significant admission, because luminance directly affects both the greyscale and the colour!! And indeed it drops rather quickly at first, settles over the majority of the panel's lifetime, then begins an increasing fall again once it has passed its half-life (the figure manufacturers quote, e.g. 100,000 hours, is to the point of 50% brightness compared to new).
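To make the half-life figure concrete, here is a minimal sketch assuming a simple exponential decay model. This is only an illustration of the arithmetic: real panel ageing is more complex (the post above notes the initial drop is steeper than a plain exponential would predict), and the function name and half-life value are just assumptions taken from the quoted 100,000-hour figure.

```python
def relative_luminance(hours, half_life=100_000):
    """Relative brightness of a panel under a simple exponential
    decay model: 1.0 when new, 0.5 at the quoted half-life."""
    return 0.5 ** (hours / half_life)

# At the quoted 100,000-hour figure, brightness is half of new:
print(relative_luminance(100_000))  # 0.5

# After ~600 hours this model predicts only a ~0.4% drop, which is
# one reason early luminance changes are hard to judge by eye:
print(round(relative_luminance(600), 4))  # 0.9958
```

Under this (assumed) model the percentage loss per hour is constant, so any "fast drop then settling" behaviour would have to come from a different curve for the early hours.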

Also, I recently looked at the hours/mins for my 5090 via the IR service menu and it said >600 hours since I bought it in the Spring. So my TV is pretty much run in according to the experts, yet I don't really get an impression of any improvement?
In a brightly lit environment, for example, a change in absolute black is perhaps not going to be so obvious. However, I can't believe you haven't noticed the PWM noise fade (which IMO is a massive weakness of the Pioneer screens that, thankfully, is at least not as bad after the kind of hours you have clocked up as it is when new). Perhaps settings, perhaps your eyes, I don't know fella. Does it matter that people are telling you you will get a better picture over time, essentially for free?

I don't actually see how anyone can draw any conclusions about picture quality from comparing the calibration of a fresh screen with one that has done the magical 200 hours, i.e. two different TVs that might not even have the same internal hardware revisions or firmware.

I'm talking the same screen recalibrated. Same source, settings, patterns, and measuring kit.

Also, what is being claimed anyway? If the colour drifts slightly then why don't some viewers say the picture gets 'worse' during running in?
It does get less accurate, moving from a calibrated state to an out-of-calibration one, which can be defined as worse. However, the potential for a better picture is increased. But this is only focussing on the colour and greyscale aspects, not the noise and depth of black that many people also notice. It seems a matter of semantics, not really focussing on the main argument.

I'm wondering if I should ask Agilent how long we have to run it in for before making any accurate measurements?

Why do you think these things need recalibration over time? The answer is because they change.
 
Well this is actually a pretty damn significant admission, because luminance directly affects both the greyscale and the colour!! And indeed it drops rather quickly at first, settles over the majority of the panel's lifetime, then begins an increasing fall again once it has passed its half-life (the figure manufacturers quote, e.g. 100,000 hours, is to the point of 50% brightness compared to new).

I'm not making any kind of admission here. I've made this point already on previous threads. We all know the plasma panel gets dimmer over time.

You are suggesting that gradual loss of luminance over time makes the TV image better? I'm just saying it will get dimmer.


Why do you think these things need recalibration over time? The answer is because they change.
But you haven't addressed the 'running in' aspect of my question. I'm wondering if we should run it in before using it, since you previously suggested a parallel between electronics and running in new car engines?


However I can't believe you haven't noticed the PWM noise fade (which IMO is a massive weakness of the Pioneer screens that, thankfully, is at least not as bad after the kind of hours you have clocked up as it is when new)
Can you define what you mean by PWM noise fade?

I know what PWM is and how it is used to enhance the colour palette but I cannot see how any associated noise will be reduced.

Once the TV has finished with decoding a frame it has to try and create the image on the screen.

But being a plasma it only has 8 colours, so it has to do a whole heap more processing to fool our eyes (over the course of several frames) into seeing the correct colours with minimal colour banding etc.

It does this with a diabolically clever algorithm that spreads the colour error for each pixel (from the basic 8-colour palette) over time (using subfields/PWM) and also across neighbouring pixels, and thus our eyes see something like the correct colour.

This process uses a fixed algorithm, and a by-product of it is noise. The noise is there on the screen; it's meant to be there, there's a mathematical reason for it, and it's part of the technology.

Unless the algorithm changes then the noise levels associated with PWM/dithering will stay the same.
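The error-spreading idea described above can be sketched in a few lines. To be clear, this is not Pioneer's actual subfield algorithm (which is proprietary, and the function name here is made up); it's just a minimal one-pixel illustration of the general principle: the cell can only be fully on or off each frame, so the quantisation error is carried forward and the time average converges on the target level.

```python
def temporal_dither(target, frames):
    """Approximate a grey level in [0, 1] with binary on/off frames,
    carrying the quantisation error forward from frame to frame."""
    out, error = [], 0.0
    for _ in range(frames):
        want = target + error            # level we'd like to show this frame
        shown = 1 if want >= 0.5 else 0  # but the cell is only on or off
        error = want - shown             # remember what we got wrong
        out.append(shown)
    return out

# A 30% grey: the eye averages the on/off flicker to the right level
pattern = temporal_dither(0.3, 10)
print(pattern, sum(pattern) / len(pattern))  # three 'on' frames out of ten
```

The on/off flicker this produces is exactly the kind of visible "noise" being discussed: it's a mathematically necessary by-product of hitting the right average with binary cells.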
 
Unless the algorithm changes then the noise levels associated with PWM/dithering will stay the same.

It hasn't on mine and, believe me, I'm very surprised as I'd previously dismissed the theory as absolute tosh:)
 
I've heard that LCDs get worse over time............. I'll get my coat ;)
 
