GadgetObsessed
I was at the Philips event last night and thought that I would post some details of the testing, the results and my thoughts on those results.
I wanted to start by thanking Philips for setting this up. I have never before seen a test where a manufacturer puts their set against competitors' sets and has an independent team calibrate all of the displays so they are each doing what they should.
Test Setup
Although we knew the TV models being tested (Sony A1, Panasonic 952, Philips 9002 and LG C7) this was set up as a truly blind test. All of the 55” displays were set up in a line, with their height adjusted so that the screens were all at the same level. The bezels of each set were covered so that it was not possible to identify which display was which. They even went as far as covering the screens when changing sources, to ensure that the on-screen source information on each TV wouldn’t give away the manufacturer.
Each screen was identified with a letter below it, from A to D. We were told that we were to watch a number of clips and were to identify which we thought was the best display for each clip according to our own personal preferences.
During the test the room was reasonably dark but not blacked out.
All of the displays had been calibrated by Steve Withers from AVForums, who was joined by Phil Hinton, the AVForums editor. Danny Tack, picture processing guru from Philips, led the evening.
Intent of the Test
Danny Tack started off with a description of what he was aiming to achieve with image processing. Many of us on the forums regard a calibrated image with no motion processing as the ideal. Danny, though, was aiming to demonstrate that, with good quality image processing, a TV can produce an image which is actually better to watch than one with no such processing.
For movies you ideally want to see exactly what the director intended. However, for other content such as sports, general TV viewing and lower quality sources such as streaming - additional processing really can help.
Testing Methodology
The tests were split into two parts.
The first part was to compare the TVs as they would be set up for ideal movie watching. This meant displaying a calibrated image with all the image and motion processing turned off.
The second part of the test was to compare the TVs with the same set of clips when set to Vivid. (More on this later.)
Content of the Clips
The clips covered a broad range in terms of source quality and content. The sources varied between low-quality Netflix, high-quality Netflix, SDR Blu-ray and HDR UHD Blu-ray. In terms of content the clips included one still image, baseball (to test motion handling of the ball) and a clip with a large number of big vertical and horizontal pans of detailed shots of buildings.
Note that the HDR clips were only shown during the second, Vivid, phase of the testing. The argument was that calibrating HDR is still a bit of a grey area. It wasn’t mentioned explicitly but I assume that the HDR clips were all HDR10. While the Sony and the LG support Dolby Vision, the Panasonic and the Philips do not. (Philips have recently joined the HDR10+ alliance.)
Calibrated Testing
All four sets are based upon the same LG-provided OLED panel. Therefore, the differentiating factor between the sets in terms of image quality is the image processing capability that each manufacturer can bring to the table.
Comparing these sets when fully calibrated and with the image processing disabled was therefore an interesting proposition.
In practice, during the calibrated testing, it was very difficult to spot any significant difference between any of the sets in any of the clips.
Personally, I found myself generally putting B and C down as my preference for each clip, although there really wasn’t much in it. This did make me realise that, as I was sitting fairly centrally, sets B and C had an advantage in being straight on to me. Even though OLED doesn’t have the viewing angle issues of LCD, a picture directly in front of you will always look better than one at an angle. (According to Phil Hinton there can also be some minor colour shifts even with OLED when viewing off angle.)
It was interesting to note that there were some noticeable colour differences between the sets. For example, at times the picture on set B was noticeably more red than the picture on set C. These differences arise because even a calibrated consumer set isn’t perfect for all colours at all brightness levels. Additionally, the sets vary in how much flexibility their calibration controls provide. For example, on Sony sets you can calibrate the greyscale but there is no colour management system to directly calibrate the primary and secondary colours. Philips has a 2-point greyscale control whereas all the other sets have 10-point greyscale controls.
One test that really stuck out to me was the one with the vertical and horizontal panning. Here all four sets juddered horribly. This shows that if you don’t like judder on such shots, you have to use some level of motion processing.
The overall result for me was that if you are a movie watcher viewing a calibrated set with no motion processing, then it really doesn’t matter which of these four sets you pick.
Vivid Testing
Now this was an interesting one, and there is a valid question over how fair this test is. Personally, I would avoid using the Vivid setting on any TV. Primarily, this is because the typical approach for a manufacturer is to set the Vivid mode up so that everything is cranked up to 11. Generally, I find the colours massively over-saturated and unnatural. The sharpness being cranked up gives rise to hard outlines and noise. Finally, using the maximum motion interpolation often leads to motion artefacts.
However, what Danny at Philips has done is different. He has aimed for a Vivid mode that is much more natural than a typical Vivid mode.
First impressions of the Vivid mode tests on all the sets were that everything was brighter and more saturated.
Throughout the various tests display C was generally much more natural than all of the others. It had the right amount of additional sharpening to improve the low-quality sources and a nicely saturated image without seeming too unrealistic.
One of the lower quality streaming clips was a scene from Friends. In this scene there was a dim wall in the background which was noisy and heavily compressed. Set B somehow made this wall pulse as its brightness level fluctuated. Set A was applying so much sharpening that the noise was exaggerated. Sets C and D did much better.
The motion tests were particularly significant. In the scene with horizontal and vertical panning, the juddering was removed on all of the sets bar set A. I did notice occasional minor artefacts, but these were far less of an issue to me than the horrible juddering seen previously. Both sets B and D struggled with the baseball test, which showed a baseball moving across a green field. Set B had some double-ball issues. On set D the baseball momentarily disappeared!
There are potentially other things to discuss about what we saw but I don’t want to cover these here as this is long enough already and there is the whole question of the validity of testing all the TVs in Vivid mode.
Additional Testing
At the very end there was a test of the HDR clips again, this time in Cinema mode rather than Vivid. This set of tests wasn’t originally planned; it was prompted by discussion around the different tone mapping used by the different sets during the HDR Vivid testing. In some high contrast scenes it was very clear that different manufacturers apply very different HDR tone mapping in Vivid. For example, what looked black on one set was dark grey on another.
Now, watching in Cinema mode is a much more realistic test, as many people will use this mode and manufacturers generally set it up pretty well.
Watching this test reminded me of the calibrated test at the beginning. While there were differences between the sets, they were subtle and it wasn’t clear which set, if any, was better than the others. Unfortunately we didn’t get to watch the non-HDR clips in Cinema mode.
The Results
Calibrated testing - Set C was the winner but not by much. The results were fairly mixed and “No preference” did well. In discussions among the group it became clear that generally preferences were not strong for any of the sets. Overall, I don’t think that the win for C was very significant (statistically speaking) - especially as it had the advantage of being one of the two central sets.
Vivid Testing - set C walked away with this one with nothing else coming close.
No results were taken for the Cinema mode HDR testing.
The Reveal - which set was which?
A - Sony A1
B - Panasonic 952
C - Philips 9002
D - LG C7
My Thoughts on the Results
It is fair to say that the test proved the point that Danny was trying to make. Image processing, when done well, can significantly improve an image, especially when you have issues such as judder, noise or low-quality sources. So if you have some particular issues with what you are watching, then it is at least worth trying some of the image processing options on your TV, especially for non-movie content.
Philips (Danny) should be congratulated on coming up with a Vivid mode that adds sharpness, motion smoothing, etc, in a way that really can improve some content without many of the issues with the Vivid mode of other manufacturers.
Personally, I have some other conclusions to draw from the test.
Firstly, if you watch a calibrated image with processing turned off it really makes very little difference which set you choose. Go for the one that matches your budget and personal preference for brand, aesthetics or features - such as Ambilight or Dolby Vision. The same may also apply if you watch in Cinema mode. However, without watching the full set of clips - especially the motion ones - and without knowing what levels the other motion and sharpening settings were at, it is difficult to be definitive.
Secondly, most TV companies have really awful Vivid modes. This, in itself, isn’t really an issue, as most of us on these forums would never use Vivid mode anyway. I regard it as a mode only used on the shop floor.
So we cannot necessarily say from this test that the Philips has the best image processing overall - only that it has the best processing when set to Vivid.
For the other three sets, had different settings been used, we may not have seen some of the issues that we saw.
A much fairer test to determine which display has the best picture quality overall would have been to set the image processing for each set to a more realistic level. Danny himself pointed out that on the judder test there was a setting on the Sony that would have made the image just as smooth as on the other sets, but it wasn’t used because it would have resulted in other motion issues - which we didn’t get to see. Would the Panasonic and LG have dealt better with the baseball clip with different motion settings applied?
So overall, we cannot say from this test that Philips have the best OLED or the best picture processing. I am not in any way denigrating the Philips in saying this; it is just that a test based only on the Vivid settings is a very limited test.