I was at the shootout and thought that I would post my thoughts on the night. As an enthusiast it was an interesting event to attend. It was also a great opportunity to hear directly from a manufacturer (Philips’ Danny Tack) and to meet AV Forums’ very own Phil Hinton and fellow forum members.
Essentially, this was a repeat of Philips’ 2017 shootout but this time with the latest 2018 models from each manufacturer.
I want to start by thanking Philips and TP Vision for running this event. It is great to see a manufacturer directly engaging with enthusiasts. In this case, the group that would be voting on the shootout results were all AV Forums members. Running such an event is also a canny move by Philips/TP Vision, as these shootouts attract a lot of interest on the net - especially among the kind of TV enthusiasts that buy OLEDs.
Disclaimer
This comparison was called a shootout - and yes, we were comparing 4 calibrated sets from different manufacturers with the same test clips. However, the following points need to be kept in mind when interpreting the results of this event:
Calibration - the TVs were independently calibrated in their Cinema/Movie mode by Phil. For the first half of the shootout this Cinema mode was used. However, for the second half of the test the displays were not in this calibrated mode. Instead they were all in Vivid/Dynamic mode.
Philips chose the test clips - it is likely that Danny has used some of these same clips when refining Philips processing algorithms.
Given the above, the overall results of the shootout cannot be interpreted as meaning the winner is the best 2018 OLED TV overall. To answer that question, all of the TVs would have had to be set up in the best way for the content shown for the whole test - and that is unlikely to have been Vivid mode.
Personally, I have no issue with Philips choosing to run the test in this way. Enthusiasts may want a completely independent test with each TV set to its own optimum combination of settings. However, no manufacturer is likely to sponsor such a test. In Philips’ defence, Vivid is often the default on the shop floor and so is the mode that most buyers see when actually choosing a TV. Given this, Vivid mode presumably reflects what manufacturers actually think is the “best” mode for making their TV stand out in a shop display of many TVs.
Now, with that disclaimer over, on to the test itself…
Test Setup
The competing models in the test were all 65" sets as follows:
Philips 803
Sony AF9
LG C8
Panasonic FZ802
Philips went to great lengths to ensure that the shootout was set up as a truly blind test. All of the displays were set up in a line, with their height adjusted so that the screens were all at the same level. The bezels on all 4 sides of the sets were covered so that it was not possible to identify which display was which. Philips even went as far as covering the screens when changing sources, to ensure that the display of the source signal on each TV wouldn’t give away the manufacturer.
Each screen was identified with a letter below it, from A to D. We were told that we were to watch about a dozen clips and were to identify which we thought was the best display for each clip, according to our own personal preferences.
Danny Tack, picture processing guru from Philips in Holland, led the evening.
Testing Methodology
The viewing tests were split into two parts.
The first part of the test was to compare the TVs as they would be set up for ideal movie watching. This meant displaying a calibrated image in Cinema/Movie mode, with all the image and motion processing turned off.
The second part of the test was to compare the TVs with the same set of clips when set to Vivid.
Content of the Clips
The clips covered a broad range in terms of source quality and content. The sources varied between low-quality Netflix, high-quality Netflix, and 4K SDR and HDR.
Note on Comparing Sets
Even when set up next to one another and showing the same content, it is surprisingly hard to compare TVs. I found that in practice I could only really compare the sets one pair at a time. As the clips themselves were quite short, even though each was played twice, it was often difficult to spot significant differences. It didn't help that the clips generally had frequent scene changes: by the time you had got a good impression of how one TV displayed a scene, it was too late to see the same scene on one of the other sets.
Calibrated Testing
All four sets are based upon the same LG-provided 2018 OLED panel. (Although, more on this later.) Therefore, the differentiating factor between the sets in terms of image quality is the image processing capabilities that each manufacturer can bring to the table.
Comparing these sets when fully calibrated and with the image processing essentially disabled, was therefore an interesting proposition.
One thing did stick out immediately. Even though all 4 sets were calibrated, set C had a significantly warmer colour tone than the other three.
Personally, I found myself generally putting B and C down as my preference for each clip, although in many clips there was very little between the 4 sets. This did make me realise that, as I was sat right at the front and off to the right, set A was at a significant disadvantage from my viewing position. Even though OLED doesn’t have the same viewing angle issues as a VA LCD panel, a picture directly in front of you will always look better than one at an angle.
The first comparison was a still image of a woman with her sand-covered palms facing up towards the camera. Set B was noticeably sharper than all of the others. While I liked this for a still image, I did feel that this level of sharpness would have been too much for a moving image.
One notable test made heavy use of vertical and horizontal panning across bright and detailed backgrounds, shown in 24p. The shooting of this test footage was commissioned by Philips and was specifically intended to highlight fast-panning judder at 24p. Here all four sets juddered horribly. This shows that if you don’t like judder on such shots, you have to use some level of motion processing.
The warmer colour tone of set C worked both for and against it. In the scene showing a Dutch music festival, to me the extra warmth improved the skin tones compared to the other sets, which seemed slightly blue in comparison. However, in the HDR bar scene from Passengers, the background had a lot of red and the skin tones were already quite red. In this case the extra warm colour tone made set C look unrealistically red. In the Planet Earth scene the extra warmth made grass that should have been green look a bit brown.
In the HDR clip of the Lego movie, set B had the brightest specular highlights of any of the sets.
Overall though, the differences between the sets (other than the colour on set C) were subtle, and in many cases it was really difficult to pick a best set.
If you are a movie watcher, viewing a calibrated set with no motion processing, then there isn't a great deal to choose between the 4 sets.
Vivid Testing
The displays were then set to Vivid mode and the same test clips were shown.
There is a valid question over how fair a test this is. The typical approach for a manufacturer is to set the Vivid mode up so that all the processing is cranked up to 11. This can often result in oversaturated colours, motion artefacts and haloes caused by over-sharpening.
In many of the tests there was again not much to choose between the sets. However, some of the tests did show significant differences.
For example, one of the tests showed bright shots of buildings and interiors with lots of fine detail. In this test sets C and D looked awful. Often they made the fine detail pulse in a very strange way. Sets A and B looked much better.
However, this cannot be used as evidence that sets C and D had any issues with displaying fine detail. All it shows is that these sets increase sharpness far too much in their Vivid modes, resulting in some nasty side effects when displaying fine detail.
For the scenes where the use of Vivid didn’t seem to particularly disadvantage any set, I again didn’t find any set being head and shoulders ahead of the others. However, generally sets A and B did seem slightly better.
The motion tests were particularly significant. In the scene with horizontal and vertical panning, the juddering was removed on all of the sets. I did notice occasional minor interpolation artefacts, but these were far less of an issue to me than the horrible juddering seen previously.
The Results
Calibrated testing - Set B was the winner, but by a small margin. The results were fairly mixed and “No preference” did well. Overall, I don’t think that the win for B was very significant (statistically speaking) - especially as it had the advantage of being one of the two central sets.
Personally, I would call this phase of the testing too close to call - it would end up as a draw.
Vivid Testing - B won this phase by a much bigger margin. It generally had a more pleasing and more natural image. Crucially, it didn’t have any scenes where the use of Vivid resulted in obvious issues, giving it an advantage.
Combining the results from the two sets of tests meant that the winner on the night was set B.
The Reveal - which set was which?
A - LG C8
B - Philips 803
C - Sony AF9
D - Panasonic FZ802
My View on the Results
Calibrated Movie Viewing
If you only watch a calibrated image with processing turned off, then these tests did not show an obvious winner. (Although set C stuck out as either good or bad depending upon the scene.) Go for the set that matches your budget and personal preference for brand, aesthetics or features - such as Ambilight or Dolby Vision.
However, we didn’t get any chance to look at some features of the competing sets, such as Sony’s and LG’s dynamic HDR tone mapping, which apply even when calibrated and could have made a difference in certain scenes.
Vivid Mode Testing
Philips has a Vivid mode that is more natural and pleasing across a wide range of sources.
Surprise Activities
After the results were announced there were two additional activities. The first was a comparison between a Philips 903 OLED and a Samsung Q9FN LCD with Full Array Local Dimming.
Danny walked us through these tests and pointed out the key differences. The initial tests were in calibrated mode and used a different set of clips to those used in the main shootout.
I was surprised that, in calibrated mode, the image generally had more saturation and contrast on the OLED than on the LCD.
One of the main points of note in these tests was how poorly FALD LCDs do when you have generally dark scenes with relatively small bright highlights, or large areas of just-above-black detail.
For example, in scenes of fireworks and of a car’s bright instrument display against an almost black background, the highlights were much brighter and more impactful on the Philips than on the Samsung.
To prevent blooming, the Samsung was essentially keeping the zones containing the bright areas relatively dim, significantly reducing the brightness of those highlights.
There was also a scene of a black glossy dog against an almost black background. In this case the low brightness level overall again caused the Samsung problems. It simply ended up being very dim overall and crushed away a lot of the detail to black.
These tests were repeated in Vivid mode, again highlighting the same differences. There was also a scene from the Life of Pi with some dimmer areas and a very bright area showing a window. The Samsung looked very poor in this bright scene as well: it blew out the detail in the bright areas while crushing detail in the dark areas.
Again, bear in mind that these are selective examples. OLED is always going to be significantly ahead of LCD when it comes to bright details against a dark background. An LCD with around 500 dimmable zones cannot compete against the 8.3 million individually dimmable pixels of an OLED. FALD cannot help much when trying to display a relatively small area of white pixels adjacent to black pixels. Even for a FALD LCD, the Samsung has a reputation for crushing black detail. However, it is a fair set to choose, as it is pretty much the only FALD set available right now. (Sony’s ZD9 is discontinued and the ZF9 hasn’t hit the shops yet.)
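To put that zone comparison in perspective, here is a quick sanity check of the arithmetic (my own sketch, not something shown at the event - the 500-zone figure is the approximate count commonly quoted for this class of FALD set):

```python
# Each OLED pixel is individually emissive, so a UHD panel is
# effectively a local-dimming grid with one zone per pixel.
uhd_width, uhd_height = 3840, 2160
oled_zones = uhd_width * uhd_height
print(oled_zones)  # 8294400 - i.e. roughly 8.3 million

# A FALD LCD with ~500 zones lights large blocks of pixels together,
# so any small highlight drags thousands of neighbours up or down with it.
fald_zones = 500
pixels_per_zone = oled_zones // fald_zones
print(pixels_per_zone)  # 16588 - pixels sharing each backlight zone
```

This is why a small bright highlight on a dark background forces the FALD set to choose between blooming (zone bright) and a dimmed highlight (zone dark): the whole block of roughly 16,000 pixels gets one backlight level.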
However, while the group were discussing the failings of the LCD, the HDR clips started to play - and on these the LCD really did well, as its extra brightness advantage started to show.
Philips 903 Sound
The second surprise activity was hearing the Philips 903 in action. It was a difficult environment for the set: a very large room with a very high ceiling. The set was also further from the wall than would have been ideal - about 2 feet.
Despite this, the sound produced was really impressive. Bowers and Wilkins have done an excellent job in creating one of the best-sounding TVs. Philips also said that there will be future sets that push the B&W aspect further, i.e. giving B&W more freedom to make more significant changes to the physical design of the set to incorporate larger and even better-sounding speakers. The aim is to move beyond a “TV with great sound” and toward “a truly audio-visual product.”
Is the Philips Using a Different Panel?
Officially all of these sets are using 2018 LG OLED panels.
However, one review of the Philips 803/903 has shown that the sub-pixel structure of the Philips is not the same as that of the other sets using 2018 panels. (Anyone with a decent macro lens and camera can photograph a screen close up and have a look at the pixel structure. I have tried it myself.)
The people I sat next to and I also noticed that when the sets were turned off, the Philips (B) looked different. While the other three sets were completely black with black reflections, the Philips had a very slight tint. This was pretty subtle and I doubt that I would have noticed it other than by having all the sets next to one another.
So it is possible that the Philips is the first manufacturer to start using a new LG panel.