That is an interesting discussion. But let me share my thoughts.
The collection of HDR content available does not fully utilise even the base 1,000-nit mastering level, with many titles such as Mad Max providing only brief moments of luminance that exceed the maximum brightness of most HDR displays. This is not a coincidence, and it is not the full representation of HDR's potential. With some knowledge of photometry, it is easy to understand how important the luminosity of both the format and the display output is. We live and breathe light on a daily basis (unless you live in a cave somewhere haha), and any change in it is noticeable; it can shift the visual clarity and enjoyment of nearly any environment. If we want to capture the natural environments we live in and reproduce them with a realistic, impactful gradation that provides, to some degree, the same satisfaction as being there, then the capture and reproduction must match as closely as possible the lighting characteristics of the scenes they are taken from.
The sun itself is in excess of 100,000 nits. The moon is between 1,000 and 2,500 nits. An average cloudy sky is around 2,000 nits. A typical photographic scene in full sunlight is 5,000 nits.
Now, most of these are extreme examples of lighting conditions that do not represent what we really require for good visual reproduction, especially since one of them would hurt your eyes after even a short exposure. They also do not account for the light and dark adaptation of the retina in our eyes: the same adaptation that makes your 1,000-nit phone display look fine during the day, yet blinding at night when you wake up and glance at the same screen, squinting to reduce the light reaching your eyes, then gradually easing as the retina adapts.
However, when a standard 60 W incandescent light fixture and an overcast photographic scene both reach or exceed 1,000 nits, there is obviously some visual benefit to a display meeting this level of luminance. It represents a very common lighting characteristic, seen both day to day and in typical filming locations.
Moving on to how this relates to current technology and content today, and its potential for the future.
Some of the best TVs today can only reach roughly 600-700 nits of full-scene brightness, and even with local dimming they barely scrape 1,000 nits in very limited windows. This ultimately places a limit on the ideal HDR mastering level. It is in everyone's best interest to develop content to suit today's technology rather than to future-proof it, because of the issues with tone mapping and the limited luminance output of current displays: mastering content at a higher luminance places more work on the tone-mapping system, and the larger that gap is, the more the image degrades.
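To illustrate why a larger gap between mastering peak and display peak degrades the image, here is a minimal sketch of a generic soft-knee tone-mapping curve. The knee and peak values are illustrative assumptions, not any particular display's actual curve: below the knee the signal passes through unchanged, and above it highlights are rolled off toward the panel's peak, so the further the source exceeds the display, the more highlight detail gets squeezed into the same few nits.

```python
import math

def tone_map(nits: float, knee: float = 400.0, peak: float = 700.0) -> float:
    """Compress scene luminance into a display's capability (illustrative).

    Below the knee the signal passes through unchanged; above it, an
    exponential roll-off asymptotically approaches the display peak.
    The curve is continuous and has slope 1 at the knee.
    """
    if nits <= knee:
        return nits
    return peak - (peak - knee) * math.exp(-(nits - knee) / (peak - knee))

# A 1,000-nit highlight and a 4,000-nit highlight come out only about
# 40 nits apart on this hypothetical 700-nit panel -- the detail
# between them is largely crushed.
for scene in (300, 1000, 4000, 10000):
    print(f"{scene:>6} nits -> {tone_map(scene):6.1f} nits on display")
```

The exact curve shape varies by manufacturer; the point is only that everything above the knee shares a shrinking slice of output range.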
As for tone-mapping systems: they are ultimately needed whenever content breaches what current technology can display, a ceiling that will keep rising over time. There is certainly content available now that does so, and it will cause the signal to be processed by the display's transfer function. Many displays' tone-mapping curves begin compressing the signal around 400-500 nits, including Panasonic's HDR Optimizer, so anything breaching that level will pass through the tone-mapping system to prevent irreparable clipping of highlights.
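For reference, HDR10 signals are encoded with the SMPTE ST 2084 PQ curve: the display's EOTF turns code values back into absolute nits, and anything above what the panel can reproduce is handed to the tone mapper. A sketch of the PQ EOTF, with the constants taken from the ST 2084 specification:

```python
def pq_eotf(signal: float) -> float:
    """SMPTE ST 2084 (PQ) EOTF: non-linear signal in [0, 1] -> luminance in nits.

    PQ encodes an absolute range of 0 to 10,000 nits, far beyond what
    any current panel can reproduce -- hence the need for tone mapping.
    """
    m1 = 2610 / 16384       # 0.1593017578125
    m2 = 2523 / 4096 * 128  # 78.84375
    c1 = 3424 / 4096        # 0.8359375
    c2 = 2413 / 4096 * 32   # 18.8515625
    c3 = 2392 / 4096 * 32   # 18.6875

    p = signal ** (1 / m2)
    return 10_000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

# A full-range signal hits the format's 10,000-nit ceiling:
print(pq_eotf(1.0))  # 10000.0
# A signal value of 0.75 already corresponds to roughly 1,000 nits,
# beyond most displays' sustained capability.
print(pq_eotf(0.75))
```

This is why "breaching" the display is so easy: most of the PQ code range sits above what any consumer panel can actually emit.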
At the end of the day, tone mapping is your friend right now. It works, it's needed, and in the future it will be even more important for older displays as HDR content gets scaled up to match display technology as it progresses.
Sir... As someone who seems to know a lot... You wouldn't be so kind as to give me HALF A CLUE about this would you? Backlight MAXIMUM for HDR? Having a laugh??