MadVR on PC as a Video Processor

Looks a good deal if you are prepared to buy used; I just assume any 2nd-hand card has been caned though :)
 
Their warehouse stuff is usually items returned within the 30-day window. Sometimes the boxes are a bit bashed, other times the kit looks brand new. They have a no-questions-asked returns policy, so anyone can return stuff within 30 days like most online places, but Amazon make it really easy. They also extend that to warehouse stock, so if it turns up and after looking at it you're not sure, you just return it for a full refund. My 1070 Ti came from them that way and has the full normal warranty too...
 
certainly wins on performance per pound

Unfortunately I think a 1070 won't cut the mustard, and the cost of the higher-end options is a large step, so you have to make a big jump up the GPU scale to deal with it. Highlights recovery + being able to turn off the luminance trade-quality option = your wallet will feel it :)
 
Won't a 1070 be as fast, if not slightly faster, than a 2060 Super? I'm not sure and am no expert, but I don't think madVR can use any of the newer RTX features, so you're looking at cores and clock speeds in reality. Benchmarks for games that can use the additional RTX features aren't necessarily representative of madVR.
 
Gaming benchmarks have the 2060 Super as marginally slower than a 2070 and a bit faster than a 1080; the 2070, meanwhile, is 30% faster than a 1070. Reports suggest a 1080 can deal with all the options, so I think a 2060 Super is a reasonable bet for now.

The Envy uses the tensor cores, so it seems reasonable to think they will be a useful feature in the future.
 
Good evening @Thatsnotmynaim, do you have a link to Manni's settings for madMeasureHDR ver. 110? I am going to try 110 out now.

I cannot find a download link, can someone please share it? Thank you
 
It is OK, I just found ver. 110.
 
Would someone be kind enough to post here a screenshot of those L80 curves for the latest beta, please?
The original thread is miles long... I managed to find an early version of L80, but from a beta that didn't have all the current columns for HSTM.

PS: this new tone-mapping option is a beast indeed; my overclocked 1080 Ti cannot handle it without reducing/disabling some other settings I had enabled previously :(
 
Yeah, so many changes going on that it's hard to keep up. Pretty sure Icaro uses the L80. I use these too, having the same PJ as him (change display nits and DT to suit).
 

Attachments

  • Icaro LS10500.png (250.9 KB)
Thank you very much!
I don't have a projector, will test these with a FZ800 OLED.
A couple more things though:
- My understanding was that "min target/real display peak nits" and "no compression limit" should be set to the same value (I'll use 750) but in your screenshot the latter is set to 0?
- I admit I didn't quite get the purpose of "dynamic target nits".... guess I'll keep it at the default 50 (I see you did) or disable it altogether.
 
When I asked what the relationship is between 'real display peak nits' and 'dynamic target nits', this was the reply, more or less:

"The peak display nit is just that, it tells madVR how much brightness you have. The DT is how bright you want the picture.

You need to adjust DT to your peak brightness and your taste. The lower DT is, the brighter the picture and the less room you have for highlights (less HDR effect). The higher DT, the dimmer the picture but the more room for highlights. So for the same person, as peak brightness for the display goes up, DT should go up too in order to keep the same brightness but make more room for highlights."

Not sure what 'no compression limit' does? All the builds I've been following have this set to '0'.
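The quoted tradeoff can be sketched roughly like this. To be clear, this is just an illustration of the two opposing effects described above, not madVR's actual tone-mapping math; the function and variable names are made up:

```python
def dt_tradeoff(display_peak_nits: float, dynamic_target_nits: float):
    """Illustrative only (not madVR's algorithm).

    A lower DT brightens the overall picture (bigger gain) but leaves
    less room for highlights above DT; a higher DT dims the picture
    but preserves more of the highlight range before compression.
    Returns (brightness_gain, uncompressed_up_to_nits).
    """
    brightness_gain = display_peak_nits / dynamic_target_nits
    uncompressed_up_to = dynamic_target_nits
    return brightness_gain, uncompressed_up_to

# Example with a 750-nit display:
# DT=250 -> brighter picture (3x gain), highlight room only up to 250 nits
# DT=750 -> dimmer picture (1x gain), highlight room up to 750 nits
```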
 
Ah yes, I think I saw that response about DT in the original thread; I didn't notice the username of the person who asked :) Still, to me it's a pretty confusing (unclear) answer...
Okay, thanks! I'll play around with it to see what makes sense with OLED.

"no compression limit" I took as "the threshold from which madVR starts clipping highlights down", hence the recommendation to keep it in line with "target/peak nits". I'll give 0 a try anyway.
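If my reading is right, the setting would behave like a simple clip threshold, something like the sketch below. This is my assumption about the behaviour, not confirmed madVR internals:

```python
def apply_compression_limit(nits: float, limit_nits: float) -> float:
    """Assumed behaviour of 'no compression limit' (not confirmed):
    clip brightness at the limit, with 0 meaning 'disabled', which
    would explain why the shared builds leave it at 0."""
    if limit_nits <= 0:
        return nits               # 0 = no limit, pass through unchanged
    return min(nits, limit_nits)  # clip anything above the threshold
```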
 
A couple other things:

Is there any way to control/alter video levels through MadVR, other than 0-255 and 16-235?
ffdshow had a "levels" section where you could modify the min and max levels of the content, with a nice real-time RGB histogram. It was very handy for movies where the black level got screwed up during the transfer: some need 1 instead of 0, others a bit more, like 6 or 8.
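For reference, the kind of remap that levels filter performed can be sketched like this (an illustration of the general black/white-point remap, not ffdshow's actual code):

```python
def remap_levels(value: int, black_in: int = 16, white_in: int = 235,
                 black_out: int = 0, white_out: int = 255) -> int:
    """Map an 8-bit code value from one black/white range to another,
    clamping to [0, 255]. E.g. set black_in=6 for a transfer whose
    black floor sits at code 6 instead of the correct value."""
    scaled = (value - black_in) * (white_out - black_out) / (white_in - black_in) + black_out
    return max(0, min(255, round(scaled)))

# Standard 16-235 video range expanded to full 0-255:
# remap_levels(16) -> 0, remap_levels(235) -> 255
```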


The second thing is probably more player-specific than for MadVR, but does anyone know if it's possible to modify the font type and size for PGS subtitles?
They are hideously large and, as far as I can tell, MPC-BE can only modify the layout of SRT subtitles, not PGS ones.

Thanks!
 
Latest build has the popular HSTM curves built in. Makes for some easy experimentation.

I've also now done my first successful 3dlut calibration with DisplayCal.

I'm absolutely blown away with the results. Finally happy with contrast, colour, black levels, UHD playback etc etc.

Feel like my setup has come into its own now.
 
I gave build 113 a try but in my case (1080ti and Panny OLED) it oversaturates colours and pumps up contrast in HDR for both pass-through and dynamic tone-mapping :(
 
Any hints for making madVR process/upscale a sharper image for a PJ like the HW40, or downscale 4K content better?

It already looks great, but it can always do with a bit more improvement I guess :D
 
No harm if you have a PC that's got a little power behind it, but if you're looking for image improvement with the Sony 40, then a Darbee would be a nice cheap way to do that.
 

Any specific settings should I set in madVR to take advantage of my PC? (8086K/2080)
 
Hello guys.

I own an LG B9 TV and a pc with the RX 5700 XT video card.

Do you recommend in the madVR settings to select "tone map HDR using shaders" without selecting "video output in HDR format"?

I would do an HDR> SDR conversion on the LG B9 TV using madVR tone mapping.

In this way I would also solve the problem of the washed-out colours caused by the RX 5700 XT when the TV switches to HDR mode.
Is it a good idea, or is it better to enable video output in HDR format?
 
If converting to SDR then you do not want the TV to switch to HDR mode, as you would be double tone mapping. If you have an nVidia card you can select this in madVR, but without one you may have to do it in the Windows control panel. Also, have you tried letting the TV do it? The gen 9 LGs have dynamic tone mapping built in anyway...
 
There is a whole thread on this subject at MadVR in use with LG OLED Thread - Doom9's Forum

I think some people like to tone map to something like 600-700 nits in madVR and then let the TV do whatever is left in HDR mode (for reasons to do with how the TV responds in HDR vs SDR modes). I haven't read the thread in detail though, so best to review it yourself.
 
Great input so far. I am really having trouble locating the current madVR beta (113, I think). Been searching through AVS the last few days. Any help would be appreciated.
 
