ARTICLE: What is HDMI Version 2.1?

Discussion in 'Cables & Switches' started by Steve Withers, Jan 1, 2018.


    1. Kenzo85 (Active Member)
      I'm not sure whether this has been covered or not: does the new standard ensure a fast handshake (under 3-4 seconds)? With the future JVC PJs it's certainly on my list of things that need to be fixed or improved.
       
    2. Rambles (Distinguished Member)
      Ah, yes, I have other devices that can switch refresh rate. My HTPC, for example, which I can switch myself, and the Xbox One actually seems to manage it well. BUT my point is that, in general terms, VRR could potentially solve a lot of problems across the board if all HDMI 2.1 devices auto-match refresh rate during video playback. I don't see anything referring to this in what I have read, just gaming.
       
    3. jfinnie (Well-known Member)
      It won't help. As @stevelup says, the problem has been solved ever since HDMI was a thing (in fact, even before that; PAL/NTSC DVD and laserdisc players also worked properly). The issue is lazy device and app manufacturers giving paying customers the finger and not implementing what should be basic functionality.
       
    4. Rambles (Distinguished Member)
      Interesting. It is not just a case of switching refresh rate at the start and end of playback, though, as the refresh rate can alter during the course of the video; like gaming, I guess, it is not absolutely fixed.

      Most HTPC software players introduce some kind of audio or video frame drop, or speeding up / slowing down of the audio or video, to keep playback smooth and in sync.

      Could VRR, implemented on the player and display device, make this no longer necessary?
       
    5. jfinnie (Well-known Member)
      You are confusing two different things.

      There is no video content which changes framerate mid-content. It only changes at certain points (like menus).

      The issue with HTPC playback is that usually the hardware video and audio clocks are not locked to each other, so they drift relative to each other and lipsync goes out. This is why HTPC software has the option to stretch the audio to match the video (or drop frames).
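      As a rough illustration of the scale of the problem (a sketch in Python; the clock errors below are invented for the example, real oscillators are typically off by tens of ppm):

      # Illustrative only: two free-running clocks, each slightly off nominal.
      video_hz = 23.976 * (1 + 30e-6)   # display clock running 30 ppm fast
      audio_hz = 48000 * (1 - 20e-6)    # audio DAC clock running 20 ppm slow

      # The relative drift accumulates into an ever-growing lipsync error:
      drift = (video_hz / 23.976) - (audio_hz / 48000)   # 50 ppm
      print(f"lipsync error after one hour: {drift * 3600 * 1000:.0f} ms")   # ~180 ms

      # One common fix: resample the audio so it tracks the video clock instead.
      print(f"resample 48000 Hz audio to {48000 * (video_hz / 23.976):.2f} Hz")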

      VRR doesn't help with this because it can't add an arbitrary delay that would have to keep growing in either direction for an indefinite period.

      The QMS (quick media switching) function does look like it will be useful in reducing sync times for frame rate changes, but it will still need device and app vendors to implement the rate switch in the first place.
       
    6. NinjaMonkeyUK (Well-known Member)
      VRR is for content that actually has varying frame rate, not for switching between different refresh rates. Your BD player simply isn't outputting 24p correctly for Netflix; this may be down to the settings you are using or it may be because the player doesn't switch to 24Hz for streaming apps.

      VRR is mostly used in gaming because the frame rate changes dynamically depending on the complexity of the scene being rendered. Streamed / disc-based video has a constant frame rate.
       
    7. Rambles (Distinguished Member)
      :facepalm:

      Some video has a constant frame rate and some video does not:

      Bandicam - Recording Software for screen, game and webcam capture

      https://gizmodo.com/why-frame-rate-matters-1675153198

      Even where the frame rate is constant, not all display devices are currently able to match it exactly, e.g. 23.976 vs 24 fps.
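      To put numbers on that mismatch (simple arithmetic, shown as a Python sketch):

      # If 23.976 fps content is shown on a display locked to exactly 24 Hz,
      # the surplus refreshes force a duplicated frame at regular intervals.
      content_fps = 24000 / 1001            # "23.976" is really 24000/1001
      surplus = 24.0 - content_fps          # extra refreshes per second
      print(f"one duplicated frame every {1 / surplus:.1f} s")   # ~41.7 s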

      Therefore, variable refresh rate technology could be helpful for video playback as well as gaming: not only to switch to a different refresh rate at the start of playback, but also to keep it matched during playback where the frame rate is variable.

      My question is: will the new variable refresh rate technology allow manufacturers, should they choose to implement it this way, to have the HDMI connection dynamically alter the refresh rate of the display device to match the frame rate being output by the source device, so the two stay perfectly matched?
       
      Last edited: Jan 2, 2018
    8. ChuckMountain (Well-known Member)
      I am surprised this hasn't already been posted, but there are no certified cables beyond 8m for High Speed or Premium certification, as they generally all fail.

      1080p will work at longer distances than that, though it can be hit and miss, and 4K won't at full bandwidth.

      The problem with a lot of the Amazon pages and reviews is that they cover cables of various lengths, so at under 8m they will be fine.

      It's the cable, not the wire, that is certified, so while the cables may physically be the same, the added length will cause failures.
       
    9. Lesmor (Distinguished Member)
      As I said in my OP, for projector owners who need longer lengths,
      "It becomes an expensive potluck minefield"
      Answer this: how can an HDMI cable work with an Epson but fail on a JVC?
      But the attitude seems to be just get over it, or spend stupid money on a dongle that may or may not work.
       
    10. ChuckMountain (Well-known Member)
      It doesn't work properly on your Epson either if it is dropping the picture (is that your OP you are referring to?); it's lucky it's working at all at 10m.

      The signal gets worse over distance, and being able to extract what's left of a signal depends on what the chipset is doing. Some chipsets will function better than others, in the same way that some dongles claim to drive the signal further.

      It's only going to get worse with HDMI 2.1, when the cables have to carry even more bandwidth. When we get to 48Gbps then we are going to have only really short cables :censored:
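      To see why 48Gbps is needed, a back-of-envelope figure (a Python sketch counting active pixels only; blanking intervals and line coding push the on-wire rate higher still):

      # Payload for 4K @ 120Hz with 10-bit RGB, active pixels only.
      width, height, fps, bits_per_pixel = 3840, 2160, 120, 3 * 10
      print(f"{width * height * fps * bits_per_pixel / 1e9:.1f} Gbit/s")   # ~29.9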

      The HDMI org recommendation, afaik, is fibre for longer distances. Nothing over 8m has certification regardless, and you are not going to find anything. If a cable supplier is claiming certification then it's the proverbial bull dung.
       
    11. Lesmor (Distinguished Member)
      I didn't say or suggest that the supplier of my particular cable claimed certification of any kind, only that it ticked all the boxes for HDMI 2.0.
      At the end of the day it's livable, but it's also a lesson learned.
      When it comes to the hype of new tech, buyer beware, because it never works as advertised.
       
    12. ChuckMountain (Well-known Member)
      Yes, I agree, but surely if it doesn't tick the certified box then it should not be advertised as such, even if shorter cables do. Secondly, for those of us needing longer cables, in some cases it's plain misadvertising.
       
    13. Lesmor (Distinguished Member)
      At that time there was no certified box to be ticked.
      Completely agree, no more to be said.
       
    14. ChuckMountain (Well-known Member)
      Premium Certification was released in 2015; I wonder what we are going to get for HDMI 2.1...
       
    15. jfinnie (Well-known Member)
      More confusion and open season for the cable cowboys...
       
    16. jfinnie (Well-known Member)
      Again, I think you're confusing things.

      The Bandicam footage isn't "real video content" - by which I mean studio-derived material whose correct presentation people really care about - it is screen recordings of games. I.e. gaming, which is where the variable frame rate stuff is targeted.

      And any display that can't display a given frame rate isn't going to be helped by variable frame rate, because it won't be able to vary to the required frame rate (!).

      Frame rate switching and native frame rate support is all that is required, and has been a solved problem for years, in any device worth watching video on. The only issue is device and service / app manufacturers being lazy or judging that they have better stuff to do, and these advances won't help. "Real" video content (so far) has no need for variable frame rates.

      In fact, the biggest help for frame rate switching and streaming content distributed at native 24p has probably been the >limitations< of HDMI and various SoC chips. Due to 4K 60Hz being "too fat" for many people's devices, in order to offer 4K at all, devices and service providers have had to do 24p for 4K while also supporting 60Hz. This has been the biggest catalyst for multi-frame-rate support.
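      The "too fat" point is easy to check with rough numbers (a Python sketch, active pixels only; HDMI 1.4's usable video data rate is about 8.16Gbit/s):

      # Active-pixel payload at 8-bit RGB vs the ~8.16 Gbit/s HDMI 1.4 budget.
      def gbps(width, height, fps, bits_per_pixel=24):
          return width * height * fps * bits_per_pixel / 1e9

      print(f"4K24: {gbps(3840, 2160, 24):.1f} Gbit/s")   # ~4.8  -> fits
      print(f"4K60: {gbps(3840, 2160, 60):.1f} Gbit/s")   # ~11.9 -> doesn't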
       
    17. Rambles (Distinguished Member)
      No, I'm not confusing things; I think you're looking at things from a fixed perspective of how you understand them to be at the present time. I am discussing the new technology of variable refresh rate and debating how it might be useful in the future, and the possibilities for its implementation.

      If you don't know the answers, it's fine to say that you don't know, or to just not reply; either would be preferable to accusing me of 'being confused'.

      So, thinking about it, how would the devices in the chain know whether the content was a game or a video? Maybe VRR will just work with video content in the same way as it will for games.

      There is no such thing as real video and not real video. Video is just video and some of it has variable frame rate.

      As I have already mentioned, from an HTPC perspective there have long been issues with some graphics cards not being able to match refresh rate to video content exactly. Likewise, most display devices only support a selection of resolutions and refresh rates, e.g. 1080p/50, 1080p/60, 1080p/24. With VRR implemented, does it mean that these display devices will have a fluid refresh rate response, so will be able to support any refresh rate?

      This is a significant change for those of us who use HTPCs for viewing video content. As well as fixing unreliable or non-existent refresh rate switching between local and streamed content (also demonstrated by my Blu-ray player, for example), variable refresh rates could also be supported by the display device for video content that has them. This might include non-commercially produced video.
       
      Last edited: Jan 3, 2018
    18. Rambles (Distinguished Member)
      This is my point: with variable refresh rate, it seems that a display device does not have fixed refresh rates, but variable ones. This article seems to imply that the implementation within G-SYNC monitors means the monitors can support any refresh rate between 30Hz and 144Hz. If the frame rate drops below that, it will use a multiple, e.g. 40Hz for 20fps:

      Nvidia G-SYNC – variable refresh rate technology | PC Monitors

      So, in the case of using a PC for video playback, with an HDMI 2.1 VRR-capable graphics card and display device, could we see an implementation where I could watch a YouTube video at 60fps, then a film at 23.976fps, then some home video with a variable frame rate, then some iPlayer at 50fps or 25fps, and the display device dynamically adjusts its refresh rate, so no more judder?
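      The frame-multiplying behaviour that article describes is simple to express (a hypothetical Python sketch; the 30-144Hz window is taken from the article, and real panels vary):

      # Map an arbitrary content frame rate into a VRR window by repeating
      # each frame an integer number of times (hypothetical sketch).
      def vrr_refresh(content_fps, vrr_min=30.0, vrr_max=144.0):
          if content_fps > vrr_max:
              return None                # outside the window entirely
          multiple = 1
          while content_fps * multiple < vrr_min:
              multiple += 1              # e.g. 20 fps -> 40 Hz, each frame shown twice
          return content_fps * multiple

      for fps in (23.976, 25, 50, 60, 20):
          print(f"{fps} fps -> {vrr_refresh(fps)} Hz")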
       
    19. jfinnie (Well-known Member)
      I'm not attacking you; there is, as far as I understand it, a fundamental difference between the goals of VRR and HTPC audio/video frame rate locking. Variable sync features have been available for a while outside of HDMI, and they haven't been adopted for video playback because of this.

      This is from the author of the HTPC video renderer madVR (@madshi) and explains why this is marketed as a gaming feature and is unlikely to be useful for HTPC audio/video sync adaptation: PCs running Windows don't have the precision at the OS level to adjust the sync timing on the fly to meet some desired target (such as syncing the video up to the audio in an otherwise free-running setup), which leads to odd-looking motion for "real" video, where you have an expectation of exactly where and when the motion should go in the next frame.

      MadVR Custom Video Mode - Perfect AV Sync (without Video Clock)

      What you are talking about needs a different feature to VRR. VRR is about presenting frames when they are available, by manipulating when you send VSYNC, and avoiding VSYNC landing at unfortunate times for the rendering, which would otherwise cause tearing because the whole frame wasn't ready.

      To align video and audio using this kind of feature you need to be able to pre-render the frames and get the graphics card to present them at precisely the point in time you want using some sort of hardware timestamps, which as far as I'm aware isn't offered by the graphics card manufacturers.
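      For illustration only, here is what such an interface might look like; the API below is entirely hypothetical, which is the point - nothing like it is exposed by current graphics drivers:

      import time

      class HypotheticalGpu:
          # Stand-in for a timestamped-presentation API that does not exist today.
          def hardware_clock(self):
              return time.monotonic()
          def present_at(self, frame, when):
              # Imagine the driver scanning this frame out at exactly 'when'.
              time.sleep(max(0.0, when - time.monotonic()))
              print(f"present {frame} at t={when:.4f}")

      def play(frames, fps, gpu):
          t0 = gpu.hardware_clock() + 0.05      # small lead time to queue frames
          for n, frame in enumerate(frames):    # pre-rendered, shown on exact ticks
              gpu.present_at(frame, t0 + n / fps)

      play(["frame0", "frame1", "frame2"], 24000 / 1001, HypotheticalGpu())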

      And for standalone players, they just need to "do the right thing", which they've been able to do for years. Standalone players are nearly always used with displays targeted at video use which do all the correct refresh rates needed out-of-the-box.
       
      Last edited: Jan 3, 2018
    20. jfinnie (Well-known Member)
      It might prevent the gross jerkiness currently seen, but it won't be good enough for HT aficionados, it seems, because it looks to rely on setting the overall refresh rate to the highest rate (with no guarantee of sync to the audio), and then you will have the above-mentioned accuracy issues when adjusting the sync to match the lower frame rates.
       
    21. Rambles (Distinguished Member)
      But that might be because they have only been available in fairly niche monitors that are aimed at the PC gaming market.

      If VRR is a hardware capability that is built into HDMI 2.1 devices, and commercial mass-market TVs have this hardware feature available to them, then it unlocks the potential for this technology to be used for video playback.

      The madVR discussion is not really relevant, because it is talking about the hardware limitations in currently available devices. HDMI 2.1 and VRR may change that in future devices.
       
    22. jfinnie (Well-known Member)
      The madVR discussion is very relevant I think. History would say @madshi is at the cutting edge of things and understands well what is and isn't possible, and if this feature is coming to an "HTPC near you" it will probably be via madVR.

      HDMI 2.1 and VRR don't say anything about this from what I understand - the HTPC change needed is at the graphics card hardware and driver API level, to allow frames to be displayed at a precise moment in time, and that isn't what VRR is about. You're right, this functionality >could< be built on top of VRR at the back end in the graphics card, but it isn't a part of VRR or HDMI 2.1.

      Time will tell if any GPU vendors see this as being something worth doing, but I wouldn't hold my breath for this being the start of the solution for HTPC frame rates. The GPU vendors have shown time and time again that this stuff just isn't that important to them because it is good enough as it is for the 99.999% of PC graphics card customers.
       
    23. Rambles (Distinguished Member)
      Ha, okay, I am getting the debate that I was after, which is good, but we are still a bit apart in our thinking on the issue.

      My point is that the refresh rate matching between source device and display device could just happen at a hardware level - via HDMI - without the need to faff about with software solutions such as madVR, which I am very familiar with.

      Of course, it is down to how manufacturers choose to implement and exploit the potential of this new technology. HTPCs are becoming more niche, but streaming boxes are becoming more popular. Using DLNA for playing back local content is awful in my experience, mainly because of the refresh rate judder. So, if HDMI 2.1 is implemented within the source (PC, streaming box, disc player, Fire Stick etc.), which could then play back locally stored content as well as stream from a variety of VOD providers, and the HDMI 2.1 TV auto-adjusts to dynamically match the refresh rate to the frame rate of the video being played, that would be good, no?
       
    24. jfinnie (Well-known Member)
      It could, but as I say, I just can't see it happening, partly because there are decades now of legacy here, and partly because of the 99.999% rule above.

      For what you are saying to be possible in the HTPC, all the audio and video decoding would need to happen in the graphics card together. This is a big deal, and brings with it licensing and a ton of work. There is no way I could see the graphics card vendors wanting to own that when it works now for the 99.999%.

      Effectively, products like this existed 20 years ago and were killed off by software solutions - system-on-chip boards on an ISA / PCI card that you could send the video and audio to together (RealMagic etc). They disappeared in favour of "free" software solutions which could leverage basic hardware acceleration in the graphics card for the video and do the audio in software on the CPU, achieving more or less the same job. Software has never worked as well, but they couldn't convince enough people that audio sync was worth paying for. Commercial realities kick in.

      As has been said a few times now in the thread, I think your DLNA point is just down to you not having the >right< players. Network streamers and players exist right now which switch to the correct video frame rate for each piece of media, if you hunt them out. Again, the 99.999% rule means some manufacturers just don't think it is important, so you are right that for many the experience is poor. Sony, for example, don't think it is important to properly support 50Hz streaming in their players, yet they do support 24 and 60 - so Netflix looks great and iPlayer rubbish. These are just software decisions taken by someone in a product management team somewhere.

      I think the nicest thing to come out of correct VRR / QMS will be less obvious refresh rate changes, and the possibility of avoiding lengthy HDMI resyncs for manufacturers who choose to implement refresh rate changes in the first place. But they need to decide that the rate changes are important and implement them, which they're not always doing right now.

      Edit: In a recent thread on a related subject, a couple of people popped up who preferred having the refresh rate wrong to letting the player switch rates and incur the HDMI sync delay. My wife is an example of someone who genuinely doesn't notice incorrect frame rate video at all. Heck, I doubt we're even in the 0.001%! :)
       
    25. Rambles (Distinguished Member)
      It's not that I have the wrong player; I am using my Blu-ray player as an example of a consumer product that doesn't work as well as it should or could. It is brilliant at playing back discs, but rubbish at DLNA and streaming. My Xbox One, for example, is rubbish at playing back discs but great at streaming.

      I think you are overcomplicating what I am suggesting. Nothing in the source device would have to change, apart from it being HDMI 2.1 and VRR compatible. What I am suggesting is (possibly) a utopian future....

      What if the source device and the TV complete an HDMI 2.1 handshake? Then you put the TV into 'VRR mode' or 'gaming mode' and it dynamically adjusts the refresh rate to match the frame rate being output by the source device. Surely it would have to do this for VRR to work for gaming purposes, so why not video?
       
    26. jfinnie (Well-known Member)
      ...said every salesman, to every engineer, ever... :p

      As you've seen, many manufacturers (your own player is a case in point) can't even be bothered to make stuff work properly at the moment when there are perfectly good technical solutions at their disposal to do so, and this new tech doesn't make it any easier.

      We'll see which way this unfolds in the fullness of time, but if HDMI.org (who exist for the purpose of coming up with standards, promoting them, and collecting royalties) aren't even promoting VRR for this use, then I really don't think it is going to happen - probably both because it isn't the appropriate tech to solve the audio-syncing problem on the PC, and because, even if it could be made to work, there's no strong commercial driver for it.

      The QMS switching, on the other hand, might be enough to appease those who reject refresh rate switching purely because of the HDMI resync delays, encouraging more people to watch at the correct refresh rate, which might make incorrect rates more obvious and less acceptable to the general public (raising the standards expected). That would definitely be very welcome.
       
    27. Rambles (Distinguished Member)
      @jfinnie So, to summarise then, it looks like we have agreed that the answer to my original question above then is... maybe :)
       
    28. jfinnie (Well-known Member)
      Maybe... lol. We're at different ends of maybe I think :). I'm at the "pigs maybe will be flying soon" end of maybe for the reasons cited (nothing a couple of rockets couldn't sort out...!). QMS is definitely going to happen for video, I don't think we'll see any useful implementation of VRR for video media (disc, streaming service, UPNP) playback, and I don't think it will fix audio / video sync on PC platforms either. It would be nice to be proved wrong :).

      Edit: BTW, this is from the HDMI.org FAQ for 2.1:
      https://www.hdmi.org/manufacturer/hdmi_2_1/
       
      Last edited: Jan 3, 2018
    29. Joe Fernand (Distinguished Member, AVForums Sponsor)
      Lesmor said 'Answer this: how can an HDMI cable work with an Epson but fail on a JVC?' - ask JVC :devil:

      As Chuck says, High Speed and Premium High Speed certification topped out at 8m for any copper cable, so anything over 8m is non-certified, and it is pot luck once you try to send 2160p over a cable which is working perfectly for 1080p.

      For anything longer than 8m where you need or want some certainty about a cable working once installed, you need to go to Active or Hybrid Optical, and I don't see that changing once devices carrying some of the new features of HDMI 2.1 arrive.

      Ideally the new Ultra High Speed certification programme will include Active and Hybrid Optical cables!

      Joe
       
    30. Johno0885 (Active Member)
      See Steve Withers' review - HDMI 2.1 cables might be fibre optic; it would make sense to use it.
      The standard has not been finalised yet, even for the mainstream UK market.
      What would I know? I live in third-world Australia, where we get all the old tech as new.
      It claims to support 10K content at 120 frames per second, with 48 gigabits per second of bandwidth.
      Dolby Atmos support (via eARC), and Dynamic HDR on a frame-by-frame basis.
      A smoother experience for VR due to its increased bandwidth.
      Wait and see what comes out in 2019 TVs, players and game consoles.
      Yet again: a good quality cable, with a software update.
       
      Last edited: Feb 19, 2019
