CES 2018 News: Dolby Vision gains momentum despite HDR10+

Discussion in 'General TV Discussions Forum' started by Steve Withers, Jan 12, 2018.


    1. BAMozzy (Distinguished Member, UK)
      I can see HDR10+ having a LOT of benefit to companies - especially those that may not have the budget to innovate or do a lot of R&D. They will benefit from the knowledge and R&D of that Alliance - which can also help content makers too.

      Dolby Vision will never lose - not unless HDR10 itself moves to a higher bit depth. Even when TVs reach the maximum colour volume, so Dynamic Metadata adds nothing, Dolby Vision's higher bit depth will still give it an edge over HDR10 and HDR10+.

      Dynamic Metadata may be its 'biggest' appeal at the moment, while TVs still have to tone map the colour volume down to the limits of current HDR panels, but once a TV has the full colour volume, dynamic and static metadata both map the content accurately to the PQ curve. The better the TV's colour volume, the less you need Dynamic Metadata and therefore the less you need HDR10+. Because DV is 12-bit, it will still be relevant even when TVs no longer need to tone map; at that point 12-bit becomes its biggest appeal.
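      To put a rough number on that 10-bit vs 12-bit point, here is a small Python sketch of the PQ (SMPTE ST 2084) curve and the size of one quantisation step at each bit depth. It assumes full-range, single-channel code values (real video uses a narrower legal range), so treat the figures as indicative only:

      Code:
# PQ (SMPTE ST 2084) EOTF constants
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code, bits):
    # Normalised code value -> absolute luminance in nits (cd/m2)
    n = code / (2 ** bits - 1)
    p = n ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def nits_to_code(nits, bits):
    # Inverse EOTF -> nearest integer code value at the given bit depth
    y = (nits / 10000) ** m1
    return round((((c1 + c2 * y) / (1 + c3 * y)) ** m2) * (2 ** bits - 1))

for target in (1, 100, 1000, 4000):
    for bits in (10, 12):
        code = nits_to_code(target, bits)
        step = pq_to_nits(code + 1, bits) - pq_to_nits(code, bits)
        print(f"around {target} nits, {bits}-bit: one code step is ~{step:.3f} nits")

      Two extra bits give roughly a quarter of the step size at any given luminance - that is the whole of the 12-bit advantage being described.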
       
    2. Roohster (Distinguished Member, Derbyshire)
      Basically, we're stuck in yet another format war, where the only losers will be... us.

      Got a DV capable tv? Bad news, it won't do HDR10+.
      Got an HDR10+ capable tv? Guess what? No DV.

      Then we have the studios taking sides, manufacturers taking sides, and content providers (Netflix, Amazon etc) opting for one or the other.

      I know a lot of the public won't care and will buy whatever tv the salesperson wants to sell them, and many will believe whatever they're told.

      This is all making my head hurt, as this was going to be the year I took the plunge with UHD. At this rate I just feel like hanging onto my trusty old GT60 plasma (which still looks incredible) for another year.
       
    3. simonlewis (Well-known Member, Stockport)
      That's exactly how I feel. I was hoping to upgrade my TV this year, but I might as well wait until next year; you never know, I may walk away with an 8K TV.
       
    4. raduv1 (Distinguished Member, Kent)
      Yup, this is where I am with the 2018 displays: happy to carry on with my HDR10 display until we have displays, spinners and AVRs that do it all.

      With streaming services and UHD BD studios choosing one side or the other, I'm not upgrading until there is a victor or an all-in-one hardware solution.

      Same with HDMI 2.1: why upgrade this year when in 2019 we will likely see the new HDMI spec in all kit?
       
      Last edited: Jan 14, 2018
    5. BAMozzy (Distinguished Member, UK)
      Except we aren't, really! Both DV and HDR10+ MUST carry an HDR10 base layer, so if your TV supports neither you still get HDR10 and won't miss out on any HDR content. LG also have a pseudo HDR10+ mode - it analyses the picture and adjusts the tone mapping, basically giving the same result as Dynamic Metadata. So if you buy an LG with DV, you can still watch HDR10+ content as HDR10 and let the TV add the dynamic adjustment itself.

      It's not like HD DVD vs Blu-ray or VHS vs Betamax, because ALL discs MUST carry HDR10 as a base layer so that EVERYONE with an HDR TV gets HDR playback. The better the HDR TV, the less you need Dynamic Metadata anyway.

      It seems people have NO idea what this means and think that because Studio A is backing DV whilst Studio B is backing HDR10+, they will miss out on some HDR films depending on which TV they bought. The point is you will NOT miss out on HDR at all. Every release MUST have HDR10 as its 'base', with an extra layer on top that adds either DV or HDR10+, but you will ALWAYS get at least HDR10.

      If it is that concerning, buy a DV TV that also offers a mode that analyses the picture and adjusts the tone mapping. That way you get the effect of HDR10+ on ANY HDR10 content, whether it carries the extra HDR10+ layer or not.
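      For illustration, that 'analyse the picture and adjust the tone mapping' idea amounts to something like the Python sketch below: measure the scene's own peak from the pixels and hand that to the tone mapper instead of the title-wide maximum. The 99.9th-percentile measure, the 700-nit panel and the function names are all illustrative assumptions - no manufacturer publishes the actual algorithm:

      Code:
import numpy as np

def measured_scene_peak(frame_nits, percentile=99.9):
    # Estimate the scene's effective peak from the pixels themselves, using a
    # high percentile rather than the absolute max so a few stray pixels
    # don't drive the whole curve (the 99.9 figure is an arbitrary choice).
    return float(np.percentile(frame_nits, percentile))

def tone_map_target(frame_nits, display_peak=700.0):
    # If the measured peak already fits the panel, the scene can be shown 1:1;
    # otherwise the tone mapper only has to compress down from this scene's
    # peak, not from the title's brightest moment.
    return min(measured_scene_peak(frame_nits), display_peak)

# A dim interior scene graded inside a film that elsewhere hits 2500 nits:
rng = np.random.default_rng(0)
dim_scene = rng.uniform(0.05, 550.0, size=(2160, 3840)).astype(np.float32)
print(tone_map_target(dim_scene))   # ~550: this scene fits the panel and passes through 1:1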

      The ONLY two formats that are 'important' are HDR10 and HLG. With those, whatever HDR content is offered - broadcast, streamed or on physical media - you will not miss out. If you want a slightly more optimised experience, which tends to benefit only the darker scenes and is much more noticeable on 'weaker' HDR TVs than on much higher peak brightness sets (as those have a far larger colour volume to tone map into), then maybe look at a TV with DV or HDR10+ for the Dynamic Metadata. Like I said, it's probably better to get one with DV and a pseudo HDR10+ mode, BUT if the TV you prefer has HDR10+, any DV movies will still deliver HDR10.

      Neither DV nor HDR10+ gives higher peak brightness or a wider colour gamut. If a TV can map the content 1:1, then Dynamic Metadata is redundant: it follows the same PQ curve, delivering the same HDR quality (apart from DV's 12-bit depth, which may help in some scenes), and won't map the content any differently to static metadata because they ALL follow the same curve.

      To put it another way, you can play Dolby Vision content on an HDR10+ TV - it just reverts to the HDR10 base layer - and the same is true for HDR10+ content on a TV that supports DV. Therefore it's not really a 'format war': you don't have to pick one side and miss out on the HDR content that backs the other.

      It's more like the PS4 Pro and Xbox One X being able to play each other's games, just at base PS4/XB1 level rather than in their 'enhanced' versions. That's hardly a 'war'; it's more of a 'bonus' for picking a side.
       
      Last edited: Jan 13, 2018
    6. Toon Army (Well-known Member, Kinver, South Staffs)
      Amazon Prime Video to stream HDR10+ on Samsung 4K TVs

      Admittedly this is the US but a sign of things to come.
       
    7. d10brp (Well-known Member)
    8. MahaRaja (Well-known Member, London)
      Many DV shows on Netflix are not so good, according to some members here. I thought the DV HDR experience was supposed to be the 'best'. ;)
       
    9. BAMozzy (Distinguished Member, UK)
      In the US, DV is just as widespread as it is here. It's not a 'default' but, as here, it's available on the same select TV sets. I know Panasonic are not available in the US, but Samsung are, and they are not offering DV there either - in fact, HDR10+ was rolled out to the US first, on Samsung.

      If you have a DV TV, I think the content will automatically play the DV version rather than the HDR10 base layer - but that's what you want: the source playing at the highest standard your TV supports, without having to manually find and select the right version.
       
    10. Roohster (Distinguished Member, Derbyshire)
      I know that HDR10 will still be there as a base spec, but why the battle between HDR10+, DV et al? (And yes, it is a battle; companies are taking sides.)

      It's like saying your Blu-ray has a base stereo audio track. Yes, you can hear audio, but you're not getting the most out of your system.
       
    11. xmbs (Active Member)
      Unless I've missed something, Amazon are still not streaming DV in the UK...
       
    12. BAMozzy (Distinguished Member, UK)
      If it's like base stereo, then Dolby is offering 'Dolby Stereo' and others an 'enhanced stereo', but ALL of them are still STEREO with the same peaks and range as base stereo. Maybe some of the quieter moments seem a little 'brighter' with Dolby or Enhanced, but you can't hear any real difference in the louder moments. Regardless, you are still getting stereo (not 5.1, 7.1 or Atmos, but still stereo).

      Companies may well be taking sides, deciding which format suits their needs (and price structure) best. However, it's not like betting on Betamax or VHS, HD DVD or Blu-ray, where picking one meant you might miss out on some studios' films completely. ALL DV and HDR10+ releases will play on ALL 4K HDR TVs, Blu-ray players etc., and if your kit can't read or display that 'extra' layer, you get HDR10 as a minimum - it won't, for example, refuse to play; it will still give you HDR content. If you buy a 'poor' or low colour volume HDR TV, then Dynamic Metadata matters more - and, like I said, you can get a DV TV that basically adds Dynamic Metadata to HDR10 itself, so it's essentially as good as having support for both.

      I really don't see what the issue is. All Dynamic Metadata does is try to optimise each scene for your display. If your TV has a small colour volume, Dynamic Metadata can help - mostly in darker scenes, where peak brightness doesn't reach its maximum. Under static metadata these can appear a bit darker, because they are scaled down to keep headroom for scenes later on that may hit the peak.

      Basically, if you have a 700-nit TV and the content reaches 2500 nits, static metadata tells the TV to display that 2500 nits at its maximum 700 nits and scale everything else down from there. A scene later on that peaks at 600 nits might now only hit 400 nits because of that scaling. Dynamic metadata still only has 700 nits to work with, so it will also scale that 2500-nit scene down to 700 nits, but it knows the 600-nit scene fits within the TV's capability and may map it 1:1, making it seem brighter. The pseudo HDR10+ processing LG offer (and I think others will add something similar in 2018) does the same - it sees that the scene can be mapped 1:1 and adjusts its tone-mapping algorithm to do so. NONE of these will 'enhance' the content beyond the curve, so the most you will ever get is 1:1 mapping.
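      As a toy version of that arithmetic in Python (a flat linear scale is used purely for illustration - real tone-mapping curves protect the mid-tones, which is why the 400-nit figure above is gentler than what a flat scale gives - the point is just that dynamic metadata re-anchors the scaling to each scene's own peak):

      Code:
DISPLAY_PEAK = 700.0      # the 700-nit TV in the example above

def static_scale(nits, title_max=2500.0):
    # Static metadata: the whole film shares one scale factor anchored to the
    # title's brightest highlight, so a 600-nit scene is dimmed along with
    # everything else to keep headroom for the 2500-nit moments.
    return nits * min(1.0, DISPLAY_PEAK / title_max)

def dynamic_scale(nits, scene_max):
    # Dynamic metadata: each scene is scaled against its own peak, so a scene
    # that already fits inside 700 nits passes through 1:1.
    return nits * min(1.0, DISPLAY_PEAK / scene_max)

print(static_scale(600.0))                      # 168.0 nits - dimmed
print(dynamic_scale(600.0, scene_max=600.0))    # 600.0 nits - shown as graded
print(dynamic_scale(2500.0, scene_max=2500.0))  # 700.0 nits - still capped by the panel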

      If, however, you have a 3000-nit TV, static metadata will follow the PQ curve 1:1 throughout. Dynamic metadata will also see that every scene can be displayed 1:1 and follow the same curve, giving exactly the same HDR result. It won't matter whether you have HDR10+, DV or just HDR10, because they ALL map the content 1:1 - that 2500-nit scene will be 2500 nits on ALL formats, that 600-nit scene will be 600 nits on all formats.

      If you only have a 2000-nit TV, the static tone-mapping algorithm could display everything up to 1000 nits at 1:1 and map the 1000-2500-nit content down to 1000-2000 nits. That still gives a MORE optimised HDR experience than a 700-nit TV. The 600-nit scene looks the same, but a 1000-nit scene will look better than a 700-nit TV can show it because it's all 1:1, and a 1500-nit scene will look better too - even if it only displays at, say, 1400 nits because it's scaled down to fit the 2500-nit master, more of it is at 1:1 and better optimised than on a 700-nit DV/HDR10+ TV.
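      That knee behaviour can be sketched directly; the half-of-display-peak knee and straight-line roll-off below are arbitrary simplifications rather than any real TV's curve, but they reproduce the gist of the 700, 2000 and 3000-nit examples:

      Code:
def static_knee_map(nits, display_peak, title_max=2500.0, knee_fraction=0.5):
    # One fixed curve per title: 1:1 up to a knee (here half the display's
    # peak - an arbitrary choice), then a straight-line compression of the
    # rest of the title's range into the remaining headroom. Once the display
    # peak meets or exceeds the title max, the curve collapses to 1:1.
    if display_peak >= title_max:
        return nits
    knee = display_peak * knee_fraction
    if nits <= knee:
        return nits
    return knee + (nits - knee) * (display_peak - knee) / (title_max - knee)

for peak in (700.0, 2000.0, 3000.0):
    out = [round(static_knee_map(x, peak)) for x in (600.0, 1000.0, 1500.0, 2500.0)]
    print(f"{peak:.0f}-nit panel:", out)
# 700-nit panel:  [391, 456, 537, 700]
# 2000-nit panel: [600, 1000, 1333, 2000]
# 3000-nit panel: [600, 1000, 1500, 2500]

      Which matches the argument: the 2000-nit panel leaves everything up to 1000 nits untouched, and the 3000-nit panel never tone maps at all, so dynamic metadata has nothing left to do.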

      To go back to your 'stereo' analogy: it's like having a very weak stereo that can't play basic stereo as it was meant to be heard, with Dolby or 'enhanced' stereo boosting the quieter moments to give a bit more volume than standard stereo offers. If you have a high-quality system with decent speakers, standard stereo, Dolby Stereo and Enhanced Stereo all sound the same, and you are getting the most out of the content regardless of format. With a weak system you are nowhere near the quality any of them can properly offer, because everything is compressed to fit the limitations of your equipment; but because the quieter moments aren't using as much of its capacity, they can be boosted a little to get better audio from poor kit. To get the most from stereo you need a decent setup, and the better it is, the less of a boost the quieter moments need, until you reach the full capacity of the audio - at which point Dolby and Enhanced become pointless, because you are hearing the audio EXACTLY as intended anyway.

      If you are looking to buy a very low-spec HDR TV (compared to the mastering spec), which has to tone map the content down to fit its hardware limitations, then having some form of dynamic metadata to decide how best to fit that content in makes sense. But if your hardware has a much greater colour volume, meaning far fewer tone-mapping compromises (if any), the need for dynamic metadata shrinks or disappears entirely.

      To put it in resolution terms: if you have 4K content but only a 720p screen, you have a basic super-sampling algorithm to scale it down well. Dolby and Samsung (for want of an HDR10+ equivalent) come along and say they can enhance that, so some scenes focus the super-sampling on, say, a person's face - analysing the pixels to render the eyes or mouth more accurately - while more general scenes, where face accuracy isn't needed, are super-sampled evenly. For 1080p owners, the need to pay special attention to faces drops because there are more pixels and less super-sampling anyway. By 1440p the difference between standard super-sampling and the Dolby/Samsung 'enhanced' version is negligible and barely noticeable, because there is far less super-sampling to do - those eyes and mouths are already much more accurate than 720p could manage, even 'enhanced'. At 4K there is NO super-sampling, so no need for DV's or Samsung's enhanced options at all.

      It ONLY matters if you are looking at cheap and/or poor-quality (as in colour volume) TVs. A 4000-nit, 100% DCI-P3 TV has a MUCH greater colour volume than a 700-nit, 95% DCI-P3 TV; the smaller set has to tone map the content down by a significantly greater margin, so the 'enhanced' layers of HDR10+ and DV help it much more. But they CANNOT exceed the limitations of the hardware and improve 'every' scene over basic HDR10, because they are still operating within the same hardware parameters - the same brightness and colour gamut, the same colour volume. You cannot get more colour volume out of a TV with DV or HDR10+ - maybe a higher APL in some scenes, but still not necessarily as good or as 'optimised' as basic HDR10 on a TV that meets or exceeds the content's colour volume.

      The need for HDR10+ diminishes with better-quality hardware. The benefits of DV change depending on the hardware: on 'weaker' TVs the main benefit is Dynamic Metadata, while at the higher end 12-bit is the only benefit. Regardless, you still get HDR anyway.
       
      Last edited: Jan 14, 2018
    13. blueboy1873 (Active Member, Bishopton)
      Or an AV manufacturer could simply include all flavours of HDR... and be done with it :smashin:
       
    14. BAMozzy (Distinguished Member, UK)
      Of course that is an option - and it's not as if the DV-associated TVs would have to pay extra to add HDR10+, it being a royalty-free format - although it may require joining the HDR10+ alliance and sharing their expertise, and it's unclear whether that would affect their deal with Dolby.

      As far as players go, it's already happening with Panasonic's new player.

      It's still not 'essential', because TVs that analyse HDR10 and adjust their tone mapping scene by scene are doing the same thing HDR10+ does, and you get that benefit whether the content is HDR10+ or just HDR10.

      If your TV has a substantial colour volume anyway, HDR10+ won't add any benefit, so it's not 'necessary'. DV will add a little benefit because of the higher bit depth, but from a viewer's perspective the benefits of Dynamic Metadata are negligible to non-existent, and HDR10 on such a set will give a much more optimised HDR experience than DV/HDR10+ on a low-performing TV. It really only matters if you are looking at buying a TV with a low colour volume.

      The point is, colour volume is the most important thing when buying an HDR TV. Of course, having decent, consistent or even perfect blacks helps the consistency of the HDR experience: the best colour volume delivering the best bright scenes can still be ruined by under-performing dark scenes where the black level is compromised by the backlight. OLEDs may not have the same impact in bright scenes, but the dark scenes look great, so they're very consistent. However, they also have a low colour volume and are therefore likely to benefit more from Dynamic Metadata.

      If a TV could match OLED in dark scenes but also have the colour volume needed to avoid tone mapping altogether, Dynamic Metadata would be redundant: HDR10+ would look exactly the same as HDR10, and to a lesser degree so would DV (which at least has the better colour depth). All three formats would map the content to the display 1:1, so NO difference.

      Like I said, if you are looking to buy a TV with a low colour volume then both formats matter more, but you can at least get around the lack of HDR10+ on an LG because it will essentially apply 'dynamic metadata' to HDR10 content itself. At best it will be better than plain HDR10 and at worst not quite as good as HDR10+ - though you would need two otherwise identical TVs to compare, since a set with a brighter panel has a bigger colour volume and better optimisation anyway. You also get the benefit of lifting ALL HDR10 content towards HDR10+ levels, not just titles that carry the extra layer.

      At the end of the day, if a TV came out with 10,000 nits and the full colour gamut but only HDR10 and HLG, I would buy that over a 1000-nit TV with both DV and HDR10+, because the HDR is always at its most optimal - even with DV and HDR10+, some scenes on the lesser set will fall short of optimal because of its limited colour volume and the need to tone map. Personally I would still prefer my 10,000-nit TV to have DV - especially with a 12-bit panel to take full advantage of the greater bit depth - but I would have no need of HDR10+'s Dynamic Metadata, or even a 'pseudo HDR10+' mode. Even DV's dynamic metadata would see that the TV can display EVERY scene at 1:1 and follow the same PQ curve.

      The TL;DR: DV and HDR10+ matter if you intend to buy a TV with a low colour volume. A TV with the 'full' colour volume makes HDR10+ redundant, and the only benefit left from DV is its 12-bit colour depth. The lower the colour volume, the greater the need for dynamic metadata.
       
    15. The Luggage (Active Member)
      Do they not also (theoretically) matter for projectors?

      Like Steve posted earlier in the thread, theoretically they should benefit from dynamic metadata.

      If Dolby has no plans for projectors, I'm curious if HDR10+ will be on future projectors.
       
    16. BAMozzy (Distinguished Member, UK)
      They matter for any display that has to tone map large colour volume content down to its own, more limited, colour volume.

      Colour volume is determined by the colour gamut and the luminance that can be achieved. The bigger the gamut, the better the colour volume, BUT the brighter you can show those colours, the bigger the volume too. The best way to think of it is like a Toblerone, where the triangular base represents the gamut and the length represents brightness: a larger triangle and a longer bar both increase the volume it can hold.
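      To make the Toblerone concrete, here's a crude Python sketch that scores colour volume as the gamut triangle's area in CIE 1931 xy multiplied by peak luminance. It's only an illustration of the analogy - proper colour volume measurements are made in a perceptual colour space - and the panels compared are hypothetical:

      Code:
# CIE 1931 xy chromaticities of the gamut primaries (red, green, blue)
PRIMARIES = {
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(points):
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

def toblerone_volume(gamut, peak_nits):
    # "Triangle base times length": gamut area x peak luminance.
    return triangle_area(PRIMARIES[gamut]) * peak_nits

reference = toblerone_volume("DCI-P3", 700)     # a 700-nit, 100% P3 panel
for gamut, nits in [("DCI-P3", 700), ("DCI-P3", 4000), ("Rec.2020", 10000)]:
    ratio = toblerone_volume(gamut, nits) / reference
    print(f"{gamut} at {nits} nits: {ratio:.1f}x the reference volume")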

      If your TV/projector/monitor has a small colour volume, high colour volume (HDR) content has to be tone mapped down to fit it. Static metadata gives a very consistent image, but because it has to keep room for the brighter scenes, darker scenes aren't optimised as well as they could be (though they stay consistent with the rest of the movie). Dynamic metadata adjusts each scene to the capacity of your display, giving darker scenes more impact than static metadata would on the same display.

      Whether it's a projector, monitor or TV, the same principle applies: the better the colour volume, the more closely you can track the PQ curve and the less you need to tone map. The less tone mapping, the less need for Dynamic Metadata, because the image is already closer to the master. 1,000 nits is comparable to roughly 3,426 ANSI lumens - and that is the 'minimum' level HDR content is mastered to - while 2,000 nits is comparable to roughly 6,582 ANSI lumens (taken from Light Output - TVs vs Video Projectors), so it looks like you would need a very powerful projector to deliver the content as the director intended, especially for the MANY films mastered to 4,000 nits.
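      Those lumen figures can be sanity-checked with the standard relationship for an idealised, gain-1 matte screen: lumens ≈ nits × screen area (m²) × π. The quoted 3,426-lumen figure corresponds to roughly a 60-65 inch image on that basis; the 100-inch screen and unity gain in the sketch below are my own assumptions, not from the linked article:

      Code:
import math

def nits_to_lumens(nits, screen_area_m2, screen_gain=1.0):
    # Luminous flux the projector must land on the screen to reach a given
    # luminance, assuming an idealised matte (Lambertian) screen. Real
    # screens, lens losses and calibration all move the number.
    return nits * screen_area_m2 * math.pi / screen_gain

area_100in = 2.21 * 1.25   # ~100-inch 16:9 screen, roughly 2.76 square metres
for target in (1000, 2000, 4000):
    print(f"{target} nits on a 100-inch screen: ~{nits_to_lumens(target, area_100in):,.0f} lumens")

      Which is why even a genuinely bright projector ends up tone mapping far harder than a mid-range HDR TV.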

      Therefore it stands to reason that an 'HDR' projector has a small colour volume, needs to tone map, and could benefit from HDR10+ and/or DV. I don't see why a projector would be any different from other displays in that regard.
       
    17. The Luggage (Active Member)
      Which is why I hope next-gen projectors (which will still need to tone map) will support dynamic metadata. But maybe Epson, JVC, Sony etc. will now wait and see which format "wins". If there were just one format, maybe support would arrive quicker.
       
    18. BAMozzy (Distinguished Member, UK)
      It's not really a case of which format will win. Long term, DV will win. Why? Because if we ever reach the point where 'every' TV can deliver the full colour volume, there is NO need for HDR10+ - none whatsoever - whereas with Dolby you at least get the better bit depth. Also, as TV processors improve, the chances are they will be able to analyse the image and adjust the tone mapping themselves - the same thing HDR10+ offers - with the bonus that it works on standard HDR10 content too, so it actually covers more content than HDR10+ may.

      HDR10+ is more about the here and now, with TVs unable to reach the colour volume of the content being made. It's a bit like chequerboard rendering of 4K: the hardware is too limited to deliver native 4K, but once it is capable there's no need for chequerboarding - well, not at 4K; it might then be used to reach 8K. HDR is slightly different, though, as it seems there won't be an 'HDR20' or 'sHDR' - a next leap that would need Dynamic Metadata to bridge the gap until the technology catches up with the maximum spec again.

      If you only have a 10-bit projector, you could equally get by without DV or HDR10+ - especially if the processor can analyse the image and adjust its tone mapping on the fly. Both DV and HDR10+ must carry HDR10, so the TV/projector/monitor would receive the HDR10 base layer, essentially create its own 'metadata' and adjust its tone mapping, giving you the main benefits of both DV and HDR10+ without needing either.

      While there is such variation in display colour volume, both DV and HDR10+ have a place. If you own a maximum colour volume display the need is basically non-existent (at least as far as dynamic metadata goes), but plenty of people won't, so I can see both continuing for many years. DV obviously has the better colour bit depth, but that will become the only difference; HDR10+ becomes 'pointless' for some, as does any processor that analyses images and adjusts the tone mapping, because there is NO tone mapping left to do. It's like dedicating part of a GPU to object tracking to improve chequerboard rendering: pointless once the GPU has the power to render native 4K.

      Like I said, it makes just as much sense for projectors as for any other display with a similar colour volume to use some form of Dynamic Metadata to improve some scenes (mostly the darker ones). It doesn't have to be DV and/or HDR10+, though, if the projector can analyse a scene, create its own dynamic metadata and adjust its tone-mapping algorithm. You won't miss out on DV or HDR10+ content either way, as both carry HDR10 versions and the projector will adjust the tone mapping anyway, so you get the same (or at least a very similar) enhanced HDR experience. Even without any of that, you still get an excellent HDR10 experience. The better the display, the better HDR10 looks and the less need for dynamic metadata, with the biggest benefit going to the weaker displays (as far as colour volume goes).
       
    19. xfiniti64 (Member, Lithuania)
      HDR10+ is free; for DV you have to pay and pay for a licence. Companies like Samsung do not want to pay a third party like Dolby for their technology. HDR10+ will win because it is royalty-free: one by one, manufacturers will start adding HDR10+, as it is an update to the already free HDR10. HDR10+ is an addition to HDR10.
       
    20. choddo (Well-known Member, Surrey)
      That all depends on whether Sony (for example) believe that paying for DV gives them a marketing advantage that outweighs the cost.
       
    21. xfiniti64 (Member, Lithuania)
      If it weren't for PlayStation, Sony would be bankrupt. Sony is just rebranded LG: open up the new TVs and you will find an LG panel with different firmware and maybe some extra processors. All they can really do is rebrand an LG TV. Guess why Sony added DV support? LG TV INSIDE!
       
    22. 200p (Active Member)
      Though there's no reason every UHD-BD has to have HDR at all - they could surely release them as SDR. It is mandatory for the player to be able to play back HDR10, though.

      From the spec:
      It doesn't say it's mandatory for the discs to include HDR10 (or do you have a link to an updated spec or an official source saying it does, or that it does when Dolby Vision is on the disc)?
       
      Last edited: Jan 17, 2018
    23. Toon Army (Well-known Member, Kinver, South Staffs)
      They may also get a discount from Dolby for not using HDR10+.
       
    24. choddo (Well-known Member, Surrey)
      OK, first, I'm not sure that's relevant to my point. I could have used LG as my example.

      Secondly, what you say is true of everyone. I'm not sure it's fair to ding Sony, or any other non-Korean-government-subsidised TV maker, for having to adapt to financial reality and buy in components more cheaply than they could make them themselves.
       
    25. xfiniti64 (Member, Lithuania)
      Why use proprietary technology, where you have no opportunity to change anything and have to be a little pony jumping to someone else's rules, when you can have something open and free? I hate stuff like DV, because it is one BIG DRM (digital rights management) scheme: it has checks and re-checks, and you need ideal conditions from end to end. Proprietary nonsense.
       
    26. SunnyIntervals (Active Member, UK)
      I thought LG made all the OLED panels, so of course Sony are using them and applying their own firmware. Nothing new there. Samsung is one of the few who don't.

      And why wouldn't a company use DV if it's better than HDR10, which it is? HDR10+ only exists because Samsung don't want to pay for the licence, and because they just love doing their own thing! At least proprietary means someone is maintaining it; open standards often get messed up when companies start tweaking them for their own ends.
       
    27. choddo (Well-known Member, Surrey)
      Well, DV is technically slightly superior (whether that makes any difference on screen rather depends on someone creating HDR10+ content to compare it with), but your personal dislike of it is a breath in the hurricane of global consumer tech marketing.

      And I say this as someone who has a Panny OLED that hasn't even got the HDR10+ update yet and will never get DV.
       
