Planetary Reference versus "Filmmaker" Mode

PlanetaryReference

Active Member
I wanted to offer a counterpoint to the “Filmmaker” mode discussion in your recent AV podcast.

I have this mode on my 2020 LG UN7300PTC TV. I've tried it and was confused as to how this could be a quasi 'reference' setting: it was much too red and dark, the 'warmth' level (Warm 2) was much too high, and it needed to be set to 'Medium' (one step warmer than 'Cool') to even begin to look visually acceptable (and it still didn't). So I concluded, "What were they thinking?"

This is not what the screen looks like when I go to the cinema.

I listened to the remarks in the podcast yesterday and decided to give it another go, but found the same things: I had to make a LOT of adjustments just to get it looking acceptable with a recent movie.

Podcast Samsung HW-Q950T Soundbar Review


I’ve come to the conclusion that this “Filmmaker Mode” setting is a low-fidelity filtering effect. Maybe you’re thinking I’m an ignoramus who doesn’t understand calibration concepts or the technical nuances of colour, and who thinks his eyes are more objective than, or can outperform, sensor measurements and the associated parameter setup. OK, guilty as charged - but you’re still wrong. So allow me to explain and offer a different view of the topic. I’m a nobody in AV, just a customer who wants the best bang for my dollar. This is also not about the features manufacturers want to show off (white with “too much” blue, high backlight or contrast, etc.) to sell their gear in big-box retail displays.

With sound system tuning I always aim for the sound of real life. In real life I don’t hear a lot of 20 to 50 hertz sound. It’s there, but it's rare, or so low-level that it’s barely noticeable (and if it were prominent we'd want it gone!). For example, a real gunshot has far more 3 kHz, 1 kHz and 100 Hz amplitude in its transients than it has 30 Hz content, whether up close or from 500 metres away. Less bottom end is accurate. Yet that's often not how surround sound and movie audio bias the amplitude across the frequency response.

Plus everything gets compressed to hell too, so the bottom end always comes up to very unrealistic levels, and your neighbours are no longer happy campers. That is anything but accurate, natural, realistic high-fidelity sound reproduction. But it is how movies (unfortunately) do it. If a main battle tank firing were 'realistic' or natural, people would be deafened and would leave the cinema very early, screaming and talking about class-action suits.

'Filmmakers' and video and sound studios only produce what's acceptable to see, and hear, in theaters.

I often detest the sound presentations in modern movies, things like the Transformers action movies or the Avengers digital AV bubble-gum movies, for instance. The audio is frankly hopelessly unnatural: it’s glib, badly over-compressed and over-processed, and becomes subwoofer-biased for drama and audience 'effect'. It's not that this adds nothing, it does add something, but I probably need to turn my sub down below music playback levels in order to watch it without annoying the people at the end of the street.

That’s poetic licence by sound people, urged on by the people funding the movie, who want maximum bums on seats. They get away with it, and such plastic audio cr_p comes replete with horrifyingly glib 'action' music scores as well. These do sell, but mostly I don’t like it. When a cake is overcooked you don’t eat it, you throw it away, or you see if some kid likes it (who will regard you as their friend for life).

The same applies to Filmmaker mode. It’s not entirely cr_p, but it is not the slightest bit accurate; it is a product of the movie theatre's display format, a screen that reflects a lot of photons back at you. If the screen were as bright as real life it would be blinding within a dark cinema, in the same way that a real sniper's rifle shot, depicted accurately, would make the audience’s ears ring for two days. No one wants that. So what the filmmakers produce is an attenuated, tuned video EFFECT.

At the South Pole, when the sun is not below the horizon (i.e. outside the long night), it's still not far above it. This produces early-to-mid-morning-like lighting, and the colour hues and textures associated with a low angle of incidence, where the blue light is scattered out of the direct beam more than the red. Unless it is high Summer. Then it's bright, with blue-dominated colours. The sun is 6,500K of course, altered by its ray path through more atmosphere (though with much less humidity to absorb it), but the sky in high summer is also BRIGHT BLUE, washing the ground with blue photons. This is on top of the 6,500K of direct sunlight illumination, so the blue light is altering both the colour and the brightness in the process. OK? So there is a lot of blue in the real world on a bright day, and a very blue-white wash from clouds within a mostly blue sky.

That’s natural.

It looks nothing like Filmmaker mode’s redness or darkness at midday; it's not even close. The real comparison for my TV is not with some film, it's with a high-quality unfiltered still-camera image, which is both accurate and has no philosophical, artistic or funding commitment to 'Filmmaker' effect perfection. It just shows, as accurately as possible, what real-world colour and lighting ranges actually look like.

This camera image is what I want my high fidelity 'monitor' to look like.

But I don’t even need this image to balance my TV to real-world colours and lighting levels; all I have to do is look out the window to get a reference image of what my TV should be calibrated to.

The same applies to audio. When I was younger (and more bamboozled by the BS), I used to chase 'perfect' audio reference calibration, until it occurred to me that my ears are the best surround system I'll ever hear: all I have to do is open the windows and doors and listen, and I have the perfect natural audio reference to tune my system against. Even now, when editing sounds, I make sure I can hear the natural world, to keep my internal "reference monitor" calibrated.

Using my own senses works better for audio and video setups than all the other high-falutin malarkey constantly put about as the proper way of doing it. I also use my nose for cooking, and my own senses invariably get that right too; I don't need to time anything, I can smell when I've got it right.

"Trust your feelings Luke!" :nono:

I’m not saying solid calibrated starting points are not essential; they are. For instance, Filmmaker mode offers BT.1886 as its reference gamma option. The other display modes do not have that option, so with those I would typically use 1.9 gamma, but the BT.1886 gamma in Filmmaker mode is a better setting to work from.
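For what it's worth, BT.1886 is not a plain power of 1.886: it's the ITU-R display EOTF, a 2.4 power law whose coefficients are derived from the panel's measured white and black luminance. A minimal sketch of the published formula (my own illustration, with assumed luminance values, nothing from LG's firmware):

```python
def bt1886(v, lw=100.0, lb=0.0, gamma=2.4):
    """ITU-R BT.1886 EOTF: map a normalized signal v (0..1) to luminance in cd/m^2.

    lw = screen luminance at white, lb = at black (assumed example values).
    """
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma            # user gain
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))  # black lift
    return a * max(v + b, 0.0) ** gamma

# With a perfect black (lb = 0) this reduces to lw * v ** 2.4:
print(bt1886(0.5))  # ~18.9 cd/m^2 on a 100-nit panel
```

With a non-zero measured black level the curve lifts the shadows slightly, which is part of why it suits dim-room movie viewing better than a fixed 1.9 or 2.2 gamma.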

And in audio, a flat, smooth response and accurate drivers do matter as calibrated starting points. But what speaker ever produces this once room modes and attenuation are involved? None do. The same goes for the rooms TVs are displayed in: there are always tuning compromises due to room lighting, colour and reflections. The display's colours will never be perfect, but tuning helps a lot to remove the obvious calibration problems.

Whatever the case, with baseline calibration assistance we learn to negate both the audio and video depiction issues that any particular room or building structure presents us with, selectively tuning away from an ideal reference calibration to compensate. The point being that the Filmmaker audio and video effects aren’t even close to what I want to see and hear. They are not the ideal, nor the zenith of display and audio depiction.

So to get a good baseline setup on my 2020 43-inch LG UN7300PTC TV, used with this PC, these are the settings I give it:

[Filmmaker Mode]
Backlight 60
Contrast 62
Brightness 59
Sharpness 20 (Needed on PC and in games to clarify, plus works well on TV)
Colour 50
Tint 0
Gamma BT.1886 (I used to do this with Dark Room mode at 1.9, but now use Filmmaker mode for its BT.1886 gamma option)
Colour Warmth = Medium
No dynamic contrast
Black level ‘Low’
No smoothing, motion or noise reduction, etc.

This works very well for PC use, TV, and Game console video depictions.

Then I visually set a 2-point (low and high luminance) white balance using a whiteness and colour YouTube calibration video that I downloaded. This allows me to tune white balance visually, very accurately (yes, it works). The result looks very natural when I compare it to the REFERENCE PLANET's graphic truth that I see out my window, and also inside my windows, for hue, blueness and brightness/contrast level. This is the calibration video I've used:

HDR Display performance test and calibration video


I also created and used 4K still test-pattern images. I created a range of blended black, grayscale, white and RGB gradients and their intermediates, and included contrast strips 64 and 128 brightness levels lower, of the same colour, within them, grading from 255 to 0 both horizontally and vertically.
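For anyone wanting to reproduce that kind of test image, a gradient with an embedded contrast strip is easy to generate programmatically. A rough sketch (numpy; the exact layout, strip placement and the 64-level offset are my guesses at what's described above):

```python
import numpy as np

W, H = 3840, 2160  # 4K UHD canvas

# Horizontal grayscale ramp, 255 down to 0 across the full width.
ramp = np.linspace(255, 0, W).astype(np.uint8)
pattern = np.tile(ramp, (H, 1))

# Embed a contrast strip 64 levels darker than its surround (clipped at 0),
# occupying the middle fifth of the image height.
strip = pattern[H * 2 // 5 : H * 3 // 5, :].astype(np.int16) - 64
pattern[H * 2 // 5 : H * 3 // 5, :] = np.clip(strip, 0, 255).astype(np.uint8)

# Save with e.g. Pillow: Image.fromarray(pattern, mode="L").save("ramp.png")
```

A second strip 128 levels lower, vertical ramps, and per-channel RGB versions follow the same pattern.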

As it turned out, I didn’t need the test images as much for fine tuning, because the above video let me tune the white balance, colours, brightness and contrast very precisely. But the test images did then confirm that the tuning was near perfect for a solid 2-point reference. So I have a ‘perfect’ set of reference sources for setting white balance, brightness, contrast, backlight and colour levels visually. My TV’s colours and grayscale are visually calibrated to match the REFERENCE-PLANET outside my home, plus the REFERENCE-PLANET inside it.

Human sensors, looking at the planetary reference and at the screen, tell me that the screen looks pretty darn accurate.

Filmmaker mode, as a one-button-press option for the TV, looks nothing like the reference-planet outside or inside my home. That's unacceptable; it’s not even close to the “high fidelity” philosophy we pursue for audio playback to match the planetary-reference truth of our ears. So why would 'Filmmaker' represent an acceptable 'standard' for my TV's display? The Filmmaker version on my 2020 LG UN7300PTC is unacceptable as a display setting, even for movies.

The acceptable display tuning can only be the planetary reference, inside and out, not a merely compromised entertainment video source. The same goes for the natural surround sound my ears hear every second, which beats any surround system ever created; it’s the closest to audio perfection there will ever be.

[Yes, I use cotton buds, and pop my ears regularly. I’m that extreme about high-fidelity! :laugh: ]

The planet is illuminated by a 6,500K light ‘colour’ or ‘warmth’ 'glow'. Yes it is, but it also has half a hemisphere of bright, luminous blue sky above as its secondary lighting wash, plus bright, perfectly white clouds (grayscale) bathing the entire landscape in a massive excess of white and blue photons, on top of that 6,500K ‘warmth’!

So 6,500K is not accurate as a planetary reference colour calibration standard to aim for.

The net ‘colour’ of the 'reference planet' is far bluer than the 6,500K sunlight contribution to surface colouration! The colours are thus biased away from this 6,500K ‘ideal’ by the blue-sky photon wash and the white clouds brightening everything up. It's also altered by the angle of incidence of the sunlight and how much atmosphere it must pass through, which changes with the seasons and the time of day.
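Incidentally, colour science backs up the 'bluer than direct sunlight' point: the CIE daylight illuminants (the D series) model exactly this sun-plus-sky mix, and their chromaticity is a standard polynomial function of correlated colour temperature. A sketch using the published CIE daylight-locus formula (a standard formula, nothing specific to any TV):

```python
def daylight_chromaticity(cct):
    """CIE daylight-locus xy chromaticity for a CCT between 4000 and 25000 K."""
    t = cct
    if 4000 <= t <= 7000:
        x = -4.6070e9 / t**3 + 2.9678e6 / t**2 + 0.09911e3 / t + 0.244063
    elif t <= 25000:
        x = -2.0064e9 / t**3 + 1.9018e6 / t**2 + 0.24748e3 / t + 0.237040
    else:
        raise ValueError("CCT out of range")
    y = -3.000 * x * x + 2.870 * x - 0.275
    return x, y

# D65, the video white point, corresponds to ~6504 K on this locus:
print(daylight_chromaticity(6504))  # ~(0.3127, 0.3291)
```

Note that D65 is defined from measured daylight including skylight, which is why it already sits noticeably blue of a blackbody at the same temperature.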

And are those daytime lighting sources ever bright where I live, in the tropics! I can get sunburnt in 10 minutes in Summer and 20 minutes in Winter. The abundant bright BLUE end of the spectrum does this, not the red end, i.e. there is a lot of blue light in the reference planet outside, and it also comes inside. My interior never exhibits any red-filter wash effect, except at dawn or dusk.

Titanium-white paint in sunlight hurts your eyes to look at, and inside it’s quite bright too, and these Filmmaker red hues are nowhere to be seen. It’s predominantly white with a slight blue wash, with more green than red present.

So the natural light is NOT a warm reddish wash!

It could not be further from the warm wash of red tones that this Filmmaker mode setting is. It is unfortunately a gimmick (on this TV), and it is not superior as a reference setting, even for movies. My own settings are far better: chalk and cheese.

[ I do wonder why this LG TV has such rubbish standard display settings. The reference-planet is right there in front of us; it’s actually hard not to get it approximately right. But not one of the standard settings on this TV is even close to what I see in the planetary-reference colour palette and its relative luminosity level. Yet the TV is capable of displaying a high-fidelity depiction of this planetary-reference palette and its levels. Clearly the TV was engineered by competent people and can achieve much more than it does with factory default settings. The settings I've provided above for this particular panel do actually look natural; they're bright, and they are neither a red-wash nor a blue-wash. The LG in-shop marketing display setting is a comparative joke next to the settings above.]

So the reference-planet outside shows that this “Filmmaker mode” is low-fidelity; it's an effect applied over the planetary-reference colours and correct gamma, on top of an excessively dull contrast and a much-too-low brightness setting for the panel. It's bad. It may work in a cinema context, perhaps, but it’s certainly not working on my TV panel. If I went to the cinema and it looked like that, I would ask for my money back.

No matter how serious they are about their art or calibration, it’s not even close to accurate compared to the imagery I see when walking to the shops.

Something tells me the filmmaker is an artist and is not interested in visually real imagery. If I watch their movie, though, it will be on a TV tuned to the planetary reference, so they’d best not over-indulge their artistic depictions of a fictional world too much.

But I also don’t want titanium-white paint in sunlight depicted on a TV to hurt my eyes, so no, I don’t want an 8,000 cd/m² TV panel, any more than I would want a sniper’s rifle going off in the same room as my ears. My human senses need less of that intense end of the AV high-fidelity spectrum. The Dolby Vision format and its range is probably all I could tune, and visually endure, on a daily basis.

Most people cannot cope with high levels of 16 kHz sound on a daily basis either. Nor can I cope with a constant excessive sub-bias in music or movie audio. And nor can I put up with an excessive darkness bias in videos or movies. This “Filmmaker mode” does in fact bias the TV panel toward excessive darkness. I can’t put up with that any more than a family can put up with a poorly tuned sub thudding and rumbling the home while they listen to Dr Phil save suburban American families from subs which are apparently driving whole streets of them totally cray-cray.

I like balanced lows in audio, and I like balanced black levels in video. No TV panel or sub can do this acceptably with a factory setting; some are better than others, of course. So we need TVs and subs with adequate low-level ranges, then we tune those to the room. Teenagers like poorly tuned (usually untuned) subwoofer audio, but the people with enough money to buy such systems mostly don’t like such crappy subs or sound media products. They don't like music or movies that botch the sub frequencies and get them way out of balance with the “planetary reference” audio for the same sound types. I don’t much like black-blacks on monitors either (unless carefully tuned). What I want my TV to depict is what I see at night when I walk around, inside or out, and then see a dark object or area.

This Filmmaker mode does not do that at all.

It’s quite cr_p. The settings I gave above do this much better, especially once white balance, colour and the resulting grayscale are tweaked with the video I've linked above. Some white balance settings needed a LOT of alteration away from their ‘0’ levels to get a 2-point balance at both the very low and very high monitor output levels. I'm not providing those white balance settings because every panel is a little bit different.

I will only ever want what the world looks and sounds like, not what some ‘film‘ mode made it look like to an executive editor or producer looking for an ‘effect’. That’s fine as art, but I don’t care for it and am not going to tune my TV display to it. Filmmaker mode is not all rubbish; I am using it because its BT.1886 gamma option is a more refined standard, so it’s not as if the editing-studio approach offers no potential for calibration improvement on TVs, and I hope that sort of refinement keeps coming. But I hope future versions of this sort of 'mode' are made realistically functional for TV panels, because this standard “one-button-press setting” amounts to a dull red-filter colouration “effect”, with nowhere near the “high fidelity” imagery this particular display is capable of.

My TV first needs to produce a “planetary-reference” midday, and midnight, dusk and dawn, to even be close to high-fidelity for PC, TV, movie and games monitoring. This Filmmaker mode looks terrible for all of these monitoring applications.

Bluer and brighter is a valid endpoint to tuning; just look out your window on a sunny day, near midday, and what do you see?

Forget configuration ideals, look and listen to what’s going on in the physical world and fine-tune according to your own senses, as only your senses are going to be looking and listening to the results of such tuning. What looks and sounds natural? Do I like this? It’s hard to go wrong if the answer is yes.
 

[email protected]

Well-known Member
That's quite a remarkable first post... I think the one thing I have taken from this, purely from the first couple of lines (not bothered with the rest), is that if you want better quality, don't buy the cheapest and most basic of sets.
 

Sloppy Bob

Distinguished Member
You want to see movies and hear movies the way they are at the cinema.

Don't watch them on a bottom range IPS panel TV with a soundbar.
 

sagaris99

Well-known Member
So basically, this is a forum for snobs?
Snobs, no.

You bought an entry-level LG IPS (which aren't renowned for the most accurate picture at the best of times) and somehow seem disappointed that the image quality isn't close to your specific requirements and standards. The level of detail and knowledge you have is great, but the product you've bought has no capability of meeting those requirements. A backlit LED/LCD TV, be it IPS or VA, simply cannot replicate colours in their true form, especially that of daylight (6500K), and calibrators recommend using warm colour temps as they get the 'closest', but no TV I'm aware of can meet true 6500K.

It'd be the same as buying a Dacia Sandero and being disappointed you can't take Eau Rouge/Raidillon at Spa giving it full chat and not end up in a wall upside down.

If you're after a planetary reference standard product, you're either looking at buying the £35k Sony X300 reference monitor, or probably the Panasonic HZ2000 - but even these will be tough to meet the standard you require.
 

PlanetaryReference

Active Member
I'm not expecting miracles from this TV; it's just a monitor for a PC that sometimes gets used for TV, movies or a game. I'm not looking for video-editing standards, just to tune away the visually obvious low-fidelity aspects. It responds well to tuning, producing a far better display than could reasonably be expected.

But as it's incapable of displaying any useful version of 'Filmmaker' mode, that sort of mode would be best left off all but high-end units. I can make the mode look good, but frankly the "Dark Room" option also has a BT.1886 gamma option and maybe looks a bit better.

I have not seen anyone say that "Filmmaker" mode looks good or is worth using, so I was surprised to hear it cited on the podcast as having redeeming features. Recommending it as a one-button-press solution needs some qualification.

Looking at the forums, I realize there are many people here who like to pay 10 times more to get a 3% improvement in some parameter. I've learned that proper tuning of what I have, from a basically 'good enough' product, can bridge most visual inadequacies, and no one could tell there was a marked difference in quality without a direct side-by-side demo, to spot an ultimately 'meh' visual difference.
 

Sloppy Bob

Distinguished Member
As above. What you're wanting and expecting from the equipment you have is unrealistic. Obviously, having a "Filmmaker mode" on your TV is purely a box-ticking marketing exercise, the same as it would be if they had done on your TV what they do with many others and put Dolby Vision on it, even though the TV isn't remotely capable of benefitting from DV, as it's not a FALD set and doesn't have the brightness to pull it off.

Filmmaker mode on an OLED is a far distance from a budget IPS panel.

You've just joined, made 2 statements and basically gone out of your way to insult the staff/owner of the site, and now its members, who are apparently mugs for paying more for products.
Enjoy your stay.
 

Thug

Moderator
It's good to have a discussion, but just make sure it's respectful guys, not only to the site admin/owners, but also to each other.
Please don't let this descend into a slanging match or it will end up being closed.
 

Derek S-H

Distinguished Member
OP - you do make some interesting points in your initial post, but there are some problems with your assertions:

1. Ageing - I'm now 56 and my eyesight and hearing are not the same as they were when I was 25, and they never will be. What I see and hear now is unique to me, so just looking out of the window and listening to life, though laudable, is not a universal experience.

2. Representation - I think most of us know that what we see and hear using our AV gear is a version of reality that has been presented to us, and that that version has been manipulated to make it more acceptable and even enjoyable. I have never heard a real explosion in my life, nor would I want to. I've never heard the sound of strings or a piano playing in the background whilst I have a heart-to-heart with a friend. These events, these experiences, are mediated by others but I'm fully aware that this is going on!

3. Choice - there is a basic set of universally agreed standards that govern both film and music reproduction, known colloquially as the creative intent. And these standards are, in my opinion, fairly reachable and reproducible using domestic equipment. Absolute standards using professional, reference-quality technology would be prohibitively expensive, and even then there's no guarantee you'd like what you were seeing or hearing. So what do people do? They can use the existing standards if they want to, or they can just ignore them completely and watch everything on Vivid or boost the bass!

4. Equipment - you initially wrote a compelling argument regarding finding your own standards using everyday life as a benchmark. That's great, but as others have stated, that's never going to be achievable with your existing TV so it kind of undermines your position. If you'd spent thousands on reference-quality gear then stated your points regarding the accuracy of representations, you might then be more convincing.

I do see what you're trying to say (I think!), but I don't think you've said anything too original or unusual that people who like AV haven't already thought of. And people who aren't bothered about any of this wouldn't be on this site in the first place!

I do hope we haven't put you off the site and will stay and continue to post. I use the site a lot even though, funnily enough, I am probably the least technically-minded person you'll ever meet. But I do want my content to look and sound good to me, after that I'm not too bothered about stats or numbers or graphs or measurements, I just want stuff to get out of the way so I can enjoy my films and music. :)
 

Dodgexander

Moderator
Filmmaker mode is just a gimmick to me. You can already achieve the same thing by switching off enhancements. I have heard the podcast (and read Steve's article) and I don't really agree with the way it's made out to be.

I think the key thing to take out of it is the bit Steve talks about 'different manufacturers takes on it will mean different things' (to paraphrase).

I think what people take out of this is that picture accuracy is going to be improved on TVs. That is at least how it reads to me, but it's not how I feel it actually is at all; as far as I've seen, nothing set by the 'mode' states that it does anything other than switch the TV to the correct preset and turn off any enhancements.

It is not a magical cure for TVs with poor out of the box picture accuracy, as the best picture mode on those TVs can still be poor.

But I'd love to be proved wrong and I sincerely hope that is what it turns out to be in the long run.

Although I personally am a cynic, and I've already seen the UHD Alliance fail with their HDR Premium/HDR Ultra specifications. I can see their intention is good, but the set of rules they define for Filmmaker mode, just like the rules required for a Premium/Ultra badge, are not comprehensive enough and don't offer enough of an incentive for manufacturers to play ball.

To the OP, probably what you notice is just poor out of the box accuracy, this won't be changed by using filmmaker mode since your TV probably already has poor out of the box accuracy in its best picture mode. What I'd suggest you do if you still can is to return the TV and purchase a replacement based on picture accuracy instead. If it's strictly a TV for SDR content, Panasonic's GX800/HX800 is probably a better choice. As would Hisense models.

For HDR and accuracy you really need to think about spending a lot more money, think LCD TVs like the Sony X950G/X950H or an OLED from Panasonic (other OLEDs also aren't far off with accuracy either though).

Btw you mention your TV is a UM7300 but that model is not a 2020 model. Do you mean UN7300, and which size do you own? This TV is one of LG's lowest-range models; depending on the size you bought, LG's strong points with these TVs are cost, smart TV features and better-than-average viewing angles. They are not strong with picture accuracy, contrast or blacks.

If you want to improve picture accuracy, forget any adjustment by eye and either venture into learning display calibration with equipment yourself, or pay for pro calibration from a local professional.
 

PlanetaryReference

Active Member
Filmmaker mode is just a gimmick to me. You can already achieve the same thing by switching off enhancements. I have heard the podcast (and read Steve's article) and I don't really agree with the way it's made out to be.

I think the key thing to take out of it is the bit Steve talks about 'different manufacturers takes on it will mean different things' (to paraphrase).

I think what people take out of this is that picture accuracy is going to be improved on TVs. That is at least how it reads to me, but it's not how I feel it actually is at all; as far as I've seen, nothing set by the 'mode' states that it does anything other than switch the TV to the correct preset and turn off any enhancements.

Agreed, and if the colour reproduction is as unbalanced as reputed, that would account for the overly red hues. Nevertheless, I don't think anyone is actually going to use Filmmaker mode as a one-button-press setting. So yes, a gimmick.

I'm also wondering which TV LG optimized this Filmmaker mode's backlight, brightness and contrast settings on, because it sure wasn't this one. So why is that setting even on this TV as a selling 'feature'? It's actually more like a reason not to buy it.

It appears none of the presets on this TV were actually set up on this TV; they all look so bad. This TV is far more capable than anyone would think from looking at the available defaults. I'm not sure why manufacturers always get that so wrong. You'd think they'd get someone who knows what they're doing to at least set up a 'standard' preset properly for customers.

People here, and elsewhere, recommend getting a pro to set up the screen. That should have happened at the factory, for everyone! Yes, I realize there's variation between panels, but not so much that you couldn't ship a very good standard preset for a model, one that gets close to what it can ultimately deliver.

To the OP, probably what you notice is just poor out of the box accuracy, this won't be changed by using filmmaker mode since your TV probably already has poor out of the box accuracy in its best picture mode.

Yes, I thought that when I set the 2-point white balance visually with the video linked above. Some of the high and low RGB settings needed to get the whiteness visually accurate, compared to the real world, were well away from the neutral levels (this was not the case with my previous Samsung panel; its neutral positions were close). But once set for a 2-point balance, visually, the colours did become very accurate, and metallic and coloured 3D graphical objects in particular looked much smoother and more life-like.

It did cross my mind that manufacturers don't setup the low-end TVs properly from factory because they need the high-end unit to look so much better, and noticeably better colour balanced for the extra money asked. Otherwise it's hard to justify the extra cash outlay in the display area, if the top end TV is not configured to be noticeably better than the low end TV looks.

I suspect the low-end TVs provide far more visible improvements when tuned, than the top-end models do.

What I'd suggest you do if you still can is to return the TV and purchase a replacement based on picture accuracy instead. If it's strictly a TV for SDR content, Panasonic's GX800/HX800 is probably a better choice. As would Hisense models.

I looked at the available 2020 TVs under $1200, and this one stood out as having the better image and feature set. I'll keep it, as it's working very well now that I've wrung everything out of it via tuning (short of finer multi-point tuning), and it performs well in all the applications I use it for.

I'm not actually disappointed with it - quite the reverse, it's performed very well. Its 'reference' Filmmaker Mode sure blows, though. I'll learn more about TVs before the next HDMI 2.1 eARC TV I purchase late next year, at a higher price level.

For HDR and accuracy you really need to think about spending a lot more money, think LCD TVs like the Sony X950G/X950H or an OLED from Panasonic (other OLEDs also aren't far off with accuracy either though).

I was led to believe, before buying it, that this LG TV would have a marginal HDR effect due to its lacklustre brightness spec, but I found this not to be the case. It may be the case with the standard presets (yes, the standard HDR preset is absolute sh_t!), but when properly tuned the HDR effect is good, much better than I expected from it. So HDR, despite the myth, is not a weak point of this TV, though I have read several people repeating that meme. The claimed lack of brightness is very much exaggerated (probably by salespeople trying to bid people higher); it's not a problem in practice.

In fact, I wouldn't want the HDR effect any stronger; it's a strong effect on this panel. So maybe people have been repeating this general meme without seeing it on a properly tuned version of the same panel. HDR is certainly not its weak point.

Btw you mention your TV is a UM7300 but that model is not a 2020 model. Do you mean UN7300, and which size do you own? This TV is one of LG's lowest-range models; depending on the size you bought, LG's strong points with these TVs are cost, smart TV features and better-than-average viewing angles. They are not strong with picture accuracy, contrast or blacks.

You are correct, my mistake; on checking, it's a 2020 43-inch LG UN7300PTC, being used mostly as a PC display, for PC-based movies, with some TV use and occasional 4K Xbox One X use.

[I edited the model number in the original post.]

If you want to improve picture accuracy, forget any adjustment by eye and either venture into learning display calibration with equipment yourself, or pay for pro calibration from a local professional.

Well, that's part of the point I'm making: I've found this to be neither necessary nor true. Yes, I could spend 1/10th of the cost of the TV to get a guy to come around and set the white balance far more comprehensively than I have, while he shakes his head and baits me to buy something 3 to 4 times the price. But I've already wrung most of the potential gains out of it by setting it visually.

I've repeatedly seen people state that you cannot set white balance visually. And I accept that's certainly true for anything beyond 2-point. But it's absolutely not true for a solid two-point setup. Getting the two-point settings correct, visually, made a world of difference to this TV's image quality. In effect, it rectified almost all of what was wrong with its colour accuracy and balance. Yes, pro multi-point calibration would nail the mid-range levels down even closer, but would I even notice a difference? Maybe, but probably not.

I also looked at the test-pattern graphics I'd created to check the white balance and colour calibration, and their visual clarity and contrast are now extremely good from low to high. I learn much more about calibration and tuning by doing it myself and finding ways to make it work than by listening to people tell me it can't be done. This does work, and it has fixed all of the noticeable, visibly detectable deficiencies.

But I'll spend some time working out how to do this more comprehensively to get clearer images. Don't underrate how clear and accurate the colours are now in 3D games, for instance, from doing that visually. The difference between what it does now and the default settings is vast.

This is part of the point of my OP: human senses are capable of setting colour very accurately to a planetary reference out the window (and to a decent calibration video), in the same way human ears can set sound very accurately to the sounds of the real world. I'm very familiar with tuning a 31-band EQ by ear. Our senses have much more resolving and tuning capacity than people give them credit for. And the results are usually better than a reference calibration result.

IMO, it's pure myth that human senses lack the sensitivity to determine whether a 2-point white balance setting should go up or down when using the calibration video linked in the OP. The display difference after doing that is night and day in 3D HDR games, for instance. This does work, and it solves most of the underlying greyscale and colour display issues with such cheap panels.
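To make the 2-point idea concrete for anyone following along, here's a toy sketch of what the two controls per channel actually do. The correction values are invented for illustration, not my TV's real numbers:

```python
# Toy model of 2-point white balance: each RGB channel has an
# offset (shifts the dark end) and a gain (scales the bright end).
# Correction values below are invented for illustration.

def two_point(signal, offset, gain):
    """signal is a 0..1 video level; returns the corrected level, clipped."""
    return min(max(signal * gain + offset, 0.0), 1.0)

# Hypothetical fix for a panel that runs too red and slightly weak in blue.
corrections = {"r": (0.00, 0.96), "g": (0.00, 1.00), "b": (0.01, 1.02)}

for level in (0.1, 0.5, 0.9):  # dark, mid and bright grey steps
    rgb = {ch: round(two_point(level, off, gain), 3)
           for ch, (off, gain) in corrections.items()}
    print(level, "->", rgb)
```

Note that the mid-grey step moves too, just by less: that's exactly why multi-point calibration exists, and also why a solid 2-point done by eye still removes most of the visible cast.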

It's a skill people should develop, not be dissuaded from.

--

[ Moderators, I fail to see where I've 'insulted' owners of the forum by critiquing a podcast's points with counterpoints. If that constitutes an 'insult' around these parts, I must have woken up in the wrong universe. There was no 'insult'. If I'd meant to insult them, there would be no ambiguity involved. Didn't happen. ]
 
Last edited:

PlanetaryReference

Active Member
OP - you do make some interesting points in your initial post, but there are some problems with your assertions: ...

... I do hope we haven't put you off the site and will stay and continue to post. I use the site a lot even though, funnily enough, I am probably the least technically-minded person you'll ever meet. But I do want my content to look and sound good to me, after that I'm not too bothered about stats or numbers or graphs or measurements, I just want stuff to get out of the way so I can enjoy my films and music. :)

No, your remarks were fair, thank you.
 

Sloppy Bob

Distinguished Member
I was led to believe, before buying it, that this LG TV would have a marginal HDR effect due to its lacklustre brightness spec, but I found this not to be the case. It may be true when using the standard presets (yes, the standard HDR preset is absolute sh_t!), but when properly tuned the HDR effect is good, much better than I expected from it. So HDR, despite the meme I've seen people repeat several times, is not a weak point of this TV. The claimed lack of brightness is very much exaggerated (probably by sales people trying to upsell), and in practice it's not a problem.

In fact, I wouldn't want the HDR effect any stronger; it's a strong effect on this panel. So maybe people have been repeating this general meme without ever seeing a properly tuned version of the same panel. HDR is certainly not its weak point.

But you're not getting HDR, you're getting brightness (maybe). All the HDR tag on your TV is doing is allowing you to watch HDR content without the washed-out grey-green colours of early 4K non-HDR TVs.

HDR is about contrast, but as your TV has no dimming zones at all, the entire backlight is either bright, dark or in between, as opposed to having separate zones, each at a different level of bright or dark, which is what gives the contrast between light and dark.
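The zone idea can be put in rough numbers. A sketch, with the luminance and leakage figures invented purely for illustration:

```python
# A frame as a list of zones, each wanting a target luminance in nits.
# Without local dimming, one backlight level serves the whole frame and
# some of it leaks through the "black" pixels; with zones, each area's
# backlight tracks its own content. All figures are invented.

LEAKAGE = 0.001  # fraction of backlight luminance leaking through dark pixels

def no_local_dimming(targets):
    backlight = max(targets)  # must serve the brightest highlight
    return [max(t, backlight * LEAKAGE) for t in targets]

def local_dimming(targets):
    return list(targets)      # idealised: every zone hits its target

frame = [500.0, 500.0, 0.05, 0.05]  # two highlights, two near-black zones

g = no_local_dimming(frame)
z = local_dimming(frame)
print("global backlight:", g, "on-screen contrast %d:1" % (max(g) / min(g)))
print("zoned backlight :", z, "on-screen contrast %d:1" % (max(z) / min(z)))
```

The highlights are identical in both cases; what the zones buy you is that the dark parts of the same frame can actually stay dark instead of sitting on the leakage floor.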

This is a representation of the difference.


No Local Dimming.................................................................................Local Dimming..
not-all-hdr-is-the-same (1).jpg
local-dimming-banner-label.jpg
 
Last edited:

Broken Hope

Active Member
Yeah, HDR without local dimming, or without being OLED, just can't get the difference between light and dark correct.

As for Filmmaker Mode, it's based on standards. The SDR standard for luminance is 100 nits, which is why the backlight will be set low compared to other modes on the TV. It's correct for the standard, but can take some getting used to compared to modes that set the backlight much higher.
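The 100-nit point is easy to put in numbers. A sketch using a plain power-law gamma for simplicity (BT.1886 is the actual SDR transfer standard, and the 350-nit 'shop preset' figure is invented):

```python
# SDR is mastered against a 100-nit reference white, so a
# standards-correct mode looks dim next to a bright preset that
# simply scales the whole range up. Plain gamma 2.2 used here for
# simplicity; BT.1886 is the actual SDR transfer standard.

GAMMA = 2.2

def sdr_nits(signal, peak_white):
    """Map a 0..1 video level to displayed luminance in nits."""
    return peak_white * signal ** GAMMA

for level in (0.25, 0.50, 1.00):
    reference = sdr_nits(level, 100)  # Filmmaker-style, standard-correct
    shop_mode = sdr_nits(level, 350)  # invented bright-preset peak
    print(f"signal {level:.2f}: {reference:6.1f} nits vs {shop_mode:6.1f} nits")
```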
 

Dodgexander

Moderator
@PlanetaryReference

I think you need to take the time to research HDR: what it's about and why, if you buy an LCD TV, you need higher peak brightness and good local dimming. If you are satisfied with how HDR looks on your current TV, I can only think you'll be flabbergasted when you see how it's actually meant to look...although the argument is kind of pointless if you are only buying a smaller TV, since there are no smaller TVs with capable HDR hardware.

Regarding by-eye calibration, simply put: you're wrong. You may be happy with the adjustments you made by eye, but you cannot determine whether the adjustments you made cause more errors in other areas of the picture. Part of using professional equipment to calibrate is so you can take a measurement, make adjustments and compare afterwards. You track each change you make to see if it improves or reduces picture quality. When you judge only by eye, you can only see that you have changed what you already perceived to be a weakness. For all you know, you could have damaged accuracy elsewhere by changing it, or your own perception of what it should be is just wrong. If you touch white balance controls yourself, you can cause more harm than good.
There's a reason there's a professional calibration industry, and whilst I agree TVs should come pre-calibrated from the factory, a particular weakness of the model you purchased is out-of-the-box accuracy; if you'd opted for a different model instead, you probably wouldn't feel the need to make any adjustments at all.

Having said that, you seem to be happy with the results, and that's all that matters. Your criticism of Filmmaker Mode is deserved, but it's not the first time something like this has existed, and from a marketing perspective there's far too much advertised on TVs nowadays that shouldn't be. I don't see any future in Filmmaker Mode, but I do agree there should be a push towards TVs being calibrated correctly out of the box.

As for what Filmmaker Mode does on your current TV: it probably sets gamma and the white point as close to the industry standard as it can (among its built-in picture modes), and it will also disable any kind of noise reduction or motion interpolation, leaving the image as free from alteration as possible. What it won't do is make a TV with poor picture accuracy suddenly have better picture accuracy; for that you need to buy a TV that scores well in reviews in this area.
 

PlanetaryReference

Active Member
But you're not getting HDR, you're getting brightness (maybe). .. This is a representation of the difference.

No Local Dimming.................................................................................Local Dimming..
View attachment 1353065 View attachment 1353066

Thank you for taking the time to produce those to illustrate your points.

A real night sky above a city is never black like this, and it's a good example of why I'm not keen on really deep blacks on screens at night, unless it's an indoor scene where there's no ambient light at all, in which case why is there a 'scene' where you can't see anything? I mentioned this dislike for dark blacks in the OP. The night sky is never black like this contrast effect, at least not in any Western city. Perhaps it is over the DPRK. It's usually an annoyingly well-lit sodium street-lighting hue, scattering off aerosols and producing an amber wash. It does not make colours "pop" with high contrast; it tends to alter and wash them out a bit.

i.e. more like the left side image.

Also, high-rise buildings are never dark like this local-dimming example. Local dimming, used as an effect, is pulling out the natural ambience and replacing it with... nothing. Why? It's then much less than it should be. In comparative audio terms, this would be like pulling 15dB out of all frequencies below 75Hz. Yes, that would indeed be "high contrast", but I would not like that either. There has to be some balanced detail below 75Hz, preferably extremely well-tuned and nuanced detail.

So to me, your non-local-dimming image is by far the more accurate and "higher-fidelity" representation, even if it could have more finely tuned low-light nuance to it.

And your TV is no doubt somewhat more capable of approximating a planetary reference than mine is, so why don't you look at the planet as a reference, outside and inside, and tune your TV to that? How would actual higher fidelity be an inferior display? Because some manufacturer, sales person or calibration specialist said the optimization of the effects is "better-er"?

I'm not interested in the sophisticated finer points of effects; I'm only interested in the visual results. Is it accurately representative of the world or not? i.e. is it actually a "high fidelity" reproduction, in as much as a consumer product can manage?

Why would I let effects dictate the tuning?

If you keep accepting less, you get Transformers-franchise versions of AV 'reality' as the commercial and domestic 'standard', and people who base product reviews and impressions on the flavour of the latest AV bubblegum effect fads.

The reality is that most TVs can already do a very good job of tuning to a planetary-reference standard, but some are a bit better at it than others. I accept that your TV could do it better than mine.

Other tuning approaches are effectively using gimmicks that produce lower-fidelity video.

So if I were to purchase your preferred model of TV, I would tune it to a realistic planetary-reference appearance, and end up with an image that looks much closer to the "without local dimming" image (but a bit finer tuned). So I don't feel like I'm missing out on much here, and can quite happily live without local dimming, though I'm not suggesting I don't want it to aid in fine-tuning the dark end of my next TV's display.

As for HDR brightness, on this panel it's more than enough. The brightness effect is strong, and I would not want it any stronger; in fact it's possibly a bit too strong at present. I suspect a lot of people have been bamboozled by sales people's "how to sell a more expensive TV" memes with respect to this insufficient-brightness claim, rather than taking an objective look at the function, utility and performance of it themselves.

With a better multi-point calibration, the HDR brightness effect will no doubt look even cleaner and punchier. So to me, the local dimming feature is not essential to a good HDR effect, and the exaggerated contrasts are antithetical to a high-fidelity planetary-reference tuning, which is more visually accurate than those local-dimming examples. [Visual accuracy, like truth, can be relative, personal and a bit annoying to have pointed out in unsolicited ways, eh?]

So the added 'sparkle' of HDR-as-contrast-effect does not attract me at all, as I would not tune a TV to look that way for night-time local dimming. To me it's overcooked and has spoiled the fidelity of the captured urban image. The real world does not look like that, any more than the planet has a 7dB boost from 31Hz to 42Hz.

So this for me is not about HDR or local dimming as contrast effects. It is about visually tuning a balance which is accurate, visually convincing and realistic.

But if it instead highlighted the very dark greys rendered as "black", that would be a valid criticism I would accept. I would like more detail in the darkest 5% of the dark parts of movies, and I would use local dimming to fine-tune for that detail, but never to make blacks blacker as a fake contrast against pretty lights and movement, as an effect, for visual drama. I cannot live with that, day to day, any more than I can live with a poorly tuned sub being on all day. So what good are these effects to me? The reality is, a more subtle HDR effect is what I require, and this TV delivers it, though I do wish it had subtle local dimming too. Calibration matters, but subtle tuning to approach visual 'reality' matters more. Contrast effects should be avoided; they're as attractive and accurate to me as "dynamic contrast".

I expect you now better understand what I mean by a representative planetary-reference tuning for my AV.

This is why I was a bit mystified, if not triggered, by this "Filmmaker Mode" being put forward as a 'reference' appearance or setting. OK, it's an effect designed for cinema, but it's not what I would ever want my TV to look like. And your images above look nothing like it, BTW. What is outside, and inside, 24/7/365 is all the reference I need to tune a TV, and this will in fact produce a more representative, high-fidelity portrayal of what the world looks like than what your ultra-high-contrast HDR TV currently produces.

I'm a bit over the AV sparkle thing, in the same way I am over poorly tuned audio and unrepresentative or exaggerated surround sound. And no, I do not have a soundbar, as you originally supposed. Though I was interested in this podcast's review of the soundbar, whose early user impressions and detail were informative and well set out, I thought. He certainly did not mark it down in audio terms, or regard it as inferior, because it was a soundbar. I found that interesting and surprising.

I would be pleased (even if you're still feeling offended, or probably even more offended now; sorry, I'm not trying to do that) if you'd acknowledge that there's validity to this as a visual reference for tuning display settings accurately. And that super-duper features, effects, contrasts and the 'technically ideal' calibration are not the goal I'm interested in here. 6500K was once an ideal tuning standard, but as I pointed out in the OP, the real world does not look like 6500K, and nor does the real world sound like most new high-end audio systems.

Though I have zero doubt I could get them pretty close to it, using my ears alone.

In the end, are we interested in representative high-fidelity AV, or just in being bedazzled by the latest effects?
 

PlanetaryReference

Active Member
Regarding by-eye calibration, simply put: you're wrong. You may be happy with the adjustments you made by eye, but you cannot determine whether the adjustments you made cause more errors in other areas of the picture. Part of using professional equipment to calibrate is so you can take a measurement, make adjustments and compare afterwards. ... What it won't do is make a TV with poor picture accuracy suddenly have better picture accuracy; for that you need to buy a TV that scores well in reviews in this area.

I agree with most of what you said. But not this. You are strictly technically correct, but in practice you are not. Plus, I did clearly say that a basic calibration is needed as the starting point for tuning to the planetary reference out the window, but that my senses are what will be carefully and experimentally fine-tuning it from there, as my senses are what will be using the tuned result. So why would I not fine-tune the calibration with my eyes?

I'm one of those people who used to hear 'audiophiles' (i.e. predominantly pompous salesmen) in 'Hi-Fi' stores preach vigorously against the appalling audio evil of using a graphic equalizer (eek!) to tune an audio system away from what the audio engineers and the album "master" intended for me to hear.

I of course rolled my eyes and bought a high-end equalizer from them anyway, then had a good laugh at them once I got outside the store. Those sales people were completely wrong about audio calibration then, and now everyone readily recognizes this (and laughs at their silliness too).

But it sure took a long time for people to listen to their ears instead of the salesmen's memes. The audio system I bought was far more "high fidelity" with the equalizer than without it. That was heresy back then, shameful even to own an EQ, but no one would doubt it now. This belief was the diametric opposite of what the 'audiophiles' and other people pushing AV sales lines were telling everyone else.

More or less the same is true for TV and AV tuning now.

Our senses do work; our ears and eyes are the best AV systems we will ever possess, and they are constantly telling us the AV truth.
 
Last edited:

Dodgexander

Moderator
Well, it's good that you are happy with your adjustments... and I'm honestly happy you are. As you say, you feel they have made a good difference, and there's only one opinion that matters when it comes to how you think your own TV looks.

But this is a discussion forum and an AV enthusiast forum at that. We are all here because we love the technology and we love to talk about the ins and outs of it. We enjoy educating ourselves in one way or another when it comes to the finer details.

There's ultimately no right or wrong answer if you are happy with your TV, but I just wanted to point out that errors are easily made by eye, and that is why pro calibration exists.

I still disagree with your points regarding HDR and local dimming. The result of these technologies, when used well, is a much, much more accurate and realistic picture. It shouldn't result in anything artificial at all. But you will understand that if you ever do own a FALD TV (or OLED) in the future, because the difference really is night and day. Other technologies, such as dynamic contrast adjustments, do indeed do what you describe, but not local dimming... and not HDR.

Subtle HDR isn't really a thing. By having a TV without good HDR hardware you aren't getting a more 'subtle' picture. All you are doing is clipping detail out of the picture that was intended to be there, or raising black levels and crushing shadow, or both. When HDR content is mastered in the studio, it's designed to be shown with bright highlights and realistic blacks; the difference makes for great contrast, and improves colour between each step in contrast too.
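The clipping point can be sketched in numbers, with all peak figures invented: content mastered up to 1000 nits hitting a panel that tops out around 300, either hard-clipped or rolled off at a knee.

```python
# HDR mastered to 1000 nits on a ~300-nit panel (figures invented).
# Hard clipping collapses every highlight above the panel peak to the
# same value; a roll-off keeps some gradation at the cost of brightness.

PANEL_PEAK = 300.0

def hard_clip(nits):
    return min(nits, PANEL_PEAK)

def roll_off(nits, knee=0.75):
    """Linear below knee*peak, then compress the excess into the headroom."""
    k = knee * PANEL_PEAK           # shoulder starts at 225 nits
    if nits <= k:
        return nits
    excess = nits - k               # how far past the shoulder the source goes
    headroom = PANEL_PEAK - k       # 75 nits of panel range left
    return k + headroom * excess / (excess + headroom)

for scene in (100, 400, 700, 1000):
    print(f"{scene:4d} nits mastered -> clipped {hard_clip(scene):5.1f}, "
          f"rolled off {roll_off(scene):5.1f}")
```

Note how 400, 700 and 1000 nits all clip to the same 300: that's the "detail intended to be there" disappearing, while the roll-off at least keeps the highlights distinguishable.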

By using a TV with limited brightness and no wide colour gamut, like your LG, you are basically negating the point of having HDR at all, and you will probably find the non-HDR version will actually look better.

It's not just salesmen that promote pro calibration (or even audio EQ, for that matter) but owners too. It's a big deal and means a lot to a lot of people. So much so that there are auto-EQ systems built into every AVR you can buy now, and the more you spend on an AVR, the better the EQ you get. Now, if by mentioning EQ you are talking about music EQ, that is something else entirely. Home theatre and music are two very different things.
 

addyeddy

Active Member
I think we all have different interpretations of what's a good picture and what isn't, and if it works for the viewer then it's fine.

I bought a cheap, end-of-line Hisense H65U9AUK, and every time I watch a 4K disc or even a Blu-ray, I'm knocked out by the picture impact, especially with HDR on 4K. I know critics/reviewers will say I'm not seeing the best HDR effect because it doesn't support Dolby Vision or HDR10+, but at a cost of 699, I've no complaints.

However it does for me and at the end of the day, that's all that matters.
 

Gabe777

Active Member
Make the picture look pleasing to YOU.

All this cr@p is just that. Everyone sees differently. Their brains process things differently.

No one tells me how to EQ Tidal HiDef music. Nobody!

Get it how you like it. And stop messing...start watching.
 

PlanetaryReference

Active Member
Make the picture look pleasing to YOU.

All this cr@p is just that. Everyone sees differently. Their brains process things differently.

No one tells me how to EQ Tidal HiDef music. Nobody!

Get it how you like it. And stop messing...start watching.

You realize this thread has been dead for almost a year? Nevertheless, I stand by everything I've said; it's entirely valid. As are the points made by others.

TV calibration is a waste of time? Sure mate - NOT! Anyone who has done it knows it makes all the difference to the image. At that point, even very small further adjustments can make a significant difference to image clarity and visual reproduction fidelity.

Should people using 4K movie cameras and production gear take your advice and stop calibrating to carefully developed broadcast industry standards? No, capture and playback both require calibration, and the ultimate reference point, or reality check, is the world around us. A/V is just an attempt to replicate scenery and sound at locations other than where they were originally captured. That requires calibration of A/V equipment. It's also the reason new microphone and camera technology is constantly developed: to increase the fidelity and quality of both in affordable reproduction equipment. And these capture technologies are always designed to more closely approximate established technical standards of human vision and hearing at a commercial price point. There is no room for slack attitudes to A/V fidelity and standards in that market competition.

Should we just buy a secondhand 2014 TV that looks 'good enough' to watch a movie on, and be satisfied with that? Nah, no one here would do that, unless they can't get anything better for their money.

As for EQ, there is such a thing as a full-range 'flat' speaker response across the standard human hearing range; it's what high-end studio monitors are built to deliver, aiming for an industry standard, and to do it better at the cheapest price. And what's certain is that if a home audio system or TV cannot get close to that flat full-range response, then music and audio tracks will not sound authentic, life-like and detailed compared to the real-world original performance or sound score (and even that is just the absolute minimum starting point for calibrating an audio system).
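The "flat as baseline" idea is mechanical enough to sketch. The measured deviations below are invented; a real measurement would come from a mic and pink noise or a sweep:

```python
# Graphic-EQ correction toward a flat target: each band's cut/boost is
# simply the inverse of the measured deviation from the target level.
# Measured dB figures are invented for illustration.

TARGET_DB = 0.0  # flat reference level

measured_db = {31: -7.0, 63: -3.5, 125: 1.0, 250: 2.5, 500: 0.5,
               1000: 0.0, 2000: -1.0, 4000: -2.0, 8000: 1.5, 16000: -4.0}

# A deficit gets a boost, an excess gets a cut.
correction = {band: TARGET_DB - dev for band, dev in measured_db.items()}

for band, db in correction.items():
    print(f"{band:>5} Hz band: {db:+.1f} dB")
```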

Best not waste your breath naively arguing for lower fidelity of reproduction in A/V equipment; it isn't acceptable, and it never will be. If we all had that attitude, no one would bother with A/V discussions, comparing notes, solving issues, reading product reviews and comparisons, or buying better A/V gear.
 
Last edited:

Gabe777

Active Member
You realize this thread has been dead for almost a year? Nevertheless, I stand by everything I've said; it's entirely valid. As are the points made by others.

TV calibration is a waste of time? Sure mate - NOT! Anyone who has done it knows it makes all the difference to the image. At that point, even very small further adjustments can make a significant difference to image clarity and visual reproduction fidelity.

Should people using 4K movie cameras and production gear take your advice and stop calibrating to carefully developed broadcast industry standards? No, capture and playback both require calibration, and the ultimate reference point, or reality check, is the world around us. A/V is just an attempt to replicate scenery and sound at locations other than where they were originally captured. That requires calibration of A/V equipment. It's also the reason new microphone and camera technology is constantly developed: to increase the fidelity and quality of both in affordable reproduction equipment. And these capture technologies are always designed to more closely approximate established technical standards of human vision and hearing at a commercial price point. There is no room for slack attitudes to A/V fidelity and standards in that market competition.

Should we just buy a secondhand 2014 TV that looks 'good enough' to watch a movie on, and be satisfied with that? Nah, no one here would do that, unless they can't get anything better for their money.

As for EQ, there is such a thing as a full-range 'flat' speaker response across the standard human hearing range; it's what high-end studio monitors are built to deliver, aiming for an industry standard, and to do it better at the cheapest price. And what's certain is that if a home audio system or TV cannot get close to that flat full-range response, then music and audio tracks will not sound authentic, life-like and detailed compared to the real-world original performance or sound score (and even that is just the absolute minimum starting point for calibrating an audio system).

Best not waste your breath naively arguing for lower fidelity of reproduction in A/V equipment; it isn't acceptable, and it never will be. If we all had that attitude, no one would bother with A/V discussions, comparing notes, solving issues, reading product reviews and comparisons, or buying better A/V gear.
I didn't say don't calibrate. My OLED and QLED are calibrated.

I simply said stop listening to the people who think you're meant to have your settings like this or like that.... don't use SOE etc. It's BS and you know it.

Set it up how you like it.

And as for being a year old... so what? Don't get your point. Information is timeless, especially regarding general concepts. Heck... consoles are often 5 years old... if I want an answer to an issue in 4 years' time, I'm pretty sure posts from 3 years earlier will still be relevant to the PS5, for example.
 
