Answered Lumagen Radiance Pro 4242

DarkBlade

I'm assuming that this can be set up to manage both a 4K projector and a 4K TV (with different settings etc.) - is this correct?

Can the output be connected to an AVR before the signal goes to the projector/TV, or are they meant to bypass the AVR altogether?
 
Yes, using the memories function you can have different settings for TV and projector, with different LUT, aspect ratio, etc.

I have 2021 and have the projector connected direct to Lumagen for critical viewing on HDMI2, and the TV and projector connected via the AV amp on HDMI1. (so 2 cables going to the PJ - one from Lumagen and one from amp). This allows me to have a "pure" HDMI output from Lumagen to the PJ for critical viewing, but still have the option of bringing up the amp menus on the PJ.
 
I do something very similar to JFinnie with a 2041, routing the signal direct to the projector from the Lumagen and the rest via my AVR. I actually have three displays connected to my 2041 using an HDMI splitter on the output of my AVR: I use Memory A for the main TV, Mem B for the projector when zoomed for 2.40:1, Mem C for the projector for 16:9 content, and Mem D serves the TV in another room.

I also use the auto calibration function with Chromapure linked to my Lumagen which saves hours of fiddling (other than an initial set up of the basic settings on each display).

I'm sure I'll end up with a Pro myself in the future; having owned three Lumagen products so far, I find them indispensable, and using them with 3 displays helps me to justify the cost.
 
I have a Radiance Pro and it sits at the heart of my Home Cinema setup doing its thing.
Fantastic bit of kit which gives you superb PQ once your TV or projector is calibrated.
It even auto-detects the aspect ratios of Blu-rays and UHDs to fit my scope screen.
 
If you get a Harmony remote and persevere a little you can make the whole thing really slick - I have separate TV and PJ activities and have inserted extra commands into the activity startup / shutdown which send the relevant commands to the Lumagen and AVR to choose the correct inputs / outputs and memory settings.
 
I've been looking to buy a Radiance Pro for some time, but haven't made up my mind yet. What I'm missing in my old Radiance is a global gamma adjustment that doesn't mess up the black and white clipping points, and input/output bit resolution shown on the info window. I also wonder if the Pro can maintain 12 bit through the pipeline; the older Radiances max out at 8 bit, and I'd like to be prepared for potential 12-bit UHD HDR without having to buy a new processor.
 
I am sure it can take 12 bit in and carry it all the way out; it runs at over 16 bit in places, I believe. You need the 18GHz input and output modules to do it at 50 and 60Hz, of course. For confirmation you can email Lumagen support and they can confirm the exact current bit depth of the pipeline. I think there are further precision updates planned, but I imagine they are waiting until the final features are in before they see what headroom they have left to work with.
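As a rough, hypothetical illustration of why internal precision above the interface bit depth matters (this is not Lumagen's actual pipeline), quantising a single video level at different bit depths shows how the rounding error shrinks as the processing gains bits:

```python
# Hypothetical sketch, not Lumagen's pipeline: quantisation error
# at different bit depths for one video level.

def quantize(value, bits):
    """Round a 0.0-1.0 signal level to the nearest code at `bits` depth."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

signal = 0.18 ** (1 / 2.4)  # mid grey put through a 2.4 gamma encode

for bits in (8, 10, 12, 16):
    err = abs(quantize(signal, bits) - signal)
    print(f"{bits:2d} bit: rounding error {err:.2e}")
```

Each extra bit roughly halves the worst-case rounding step, which is why processing headroom beyond the 10- or 12-bit interfaces helps avoid accumulating visible banding through a chain of adjustments.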
 
Last I heard from Jim it was a 10-bit pipeline, identical to the 2143. It would be interesting if that has improved; he mentioned needing more for HDR. So I guess I'll have to contact him to find out where it's at.
 
It was 10 bit at the start, as it was the 2143 firmware, I guess. Within a few months they had already started increasing accuracy, and then they decided to do a big increase, and I know they plan more. As I said, some of the pipeline is already way beyond 10.
 
Just to add: if you have an AVR that supports HDMI 2.0a and HDCP 2.2 along with the appropriate resolutions/frame rates/colour depths, you could use that to switch your sources, go for the 4240, and use the 2 video outputs on the Radiance Pro to feed your display and PJ. However, if you ever require support for watching UHD content at 50Hz or 60Hz, you'll need the 18GHz output card. This only has 1 video output though, with the 2nd output being audio-only. That issue could be solved with a decent HDMI switch, of course. It's an option, but one that may not work for you.
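For anyone wondering where that 18GHz requirement comes from, a back-of-envelope TMDS bandwidth estimate sketches it. This is a simplification: HDMI TMDS uses 8b/10b coding (10 wire bits per 8-bit symbol) across 3 data channels, and the 4400x2250 timing totals assumed here are the standard CTA-861 values for 4K:

```python
# Simplified HDMI TMDS bandwidth estimate: 3 data channels, 10 wire
# bits per 8-bit symbol, pixel clock scaled up for deep colour.

def tmds_gbps(h_total, v_total, fps, bits_per_component):
    pixel_clock = h_total * v_total * fps * bits_per_component / 8  # Hz
    return pixel_clock * 3 * 10 / 1e9  # 3 channels x 10 wire bits

# 4K 4:4:4 8-bit with standard 4400x2250 blanking totals
print(f"4K60: {tmds_gbps(4400, 2250, 60, 8):.2f} Gbps")  # ~17.82 -> needs the 18GHz card
print(f"4K24: {tmds_gbps(4400, 2250, 24, 8):.2f} Gbps")  # ~7.13  -> fits a 10.2Gbps link
```

4K at 24Hz squeezes under the older 10.2Gbps (340MHz) limit, which is why only the 50/60Hz case forces the 18GHz output card.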

Once you have a Radiance VP, you'll wonder how you ever lived without it, and you'll not be able to live without it in the future. I'm on my third (a 4242) and they keep getting better.

Paul
 
Just read that the Radiance Pro line will not get Dolby Vision. So, as someone posted, let's hope DV will be short-lived, because the Radiance Pro will be outdated if DV becomes the leading HDR format that everybody has.
 
I think it highly unlikely DV will be the de facto HDR standard... HDR10+ will have the same dynamic metadata features as DV... and HDR10+/HDMI 2.1 is just a firmware update for the Pro range to support that stuff.
 
As it looks, nothing's sure; nobody knows how this HDR battle ends. DV is 12 bit and supposed to be the high-quality HDR format, and we all know how the market and consumers always go for the highest specs. So DV might pull it off; right now it seems that all the player and TV manufacturers will implement DV. Just look at all the attention to the OPPO 203's DV capability when it came out.
 
Hmmm, I'm surprised the Pro line is not getting DV; it must be a licensing cost thing and surely not a tech limitation?
Surely they could update the cards and increase the cost to give customers a choice?
 

These are the words from Jim (Lumagen):

"We do not think Dolby is going to license us to support Dolby Vision. This is direct from Dolby Labs. So no plans to support Dolby Vision."

I guess that leaves a slim chance that it's not completely impossible.
 
Just got a Radiance Pro set up by Gordon. This is my 2nd one and, as stated above, once you have had one you never go back!! Mine is running both my PJ and OLED panel.

Get the best out of one by getting GF to do your calibration and setup
 

Adding a Lumagen to your projector: is it a huge step up in picture quality or are the differences subtle, and does it justify the cost?
I'm a little hesitant about purchasing one because many times I've purchased home cinema equipment after reading reviews and listening to enthusiasts all saying a particular piece of kit is amazing, and then personally not really seen what all the fuss is about.
 
I think they're a great addition with either a professional calibration or you learning to drive it yourself (depending on how you are inclined).

If you look through display reviews, many will tell you that, for example, when calibrated they managed a dE (a measure of colour difference) of less than 3 for most of the gamut, which is claimed to be at the threshold of vision. I dispute that, as I can very easily see by eye colours (particularly in the greyscale) that are off by well under 2. (In fact, the whole idea of dE was that 1 dE was a noticeable difference; no idea where 3 came from.) It seems to me it is a blunt rule of thumb that doesn't apply equally to every part of the colour gamut.
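For anyone unfamiliar with the metric, the classic CIE76 delta-E is just the Euclidean distance between two colours in CIELAB space. A hypothetical worked example (the colour values are illustrative, not taken from any review):

```python
import math

# CIE76 delta-E: straight-line distance between two CIELAB colours.
def delta_e76(lab1, lab2):
    return math.dist(lab1, lab2)

reference = (50.0, 0.0, 0.0)   # target mid grey (L*, a*, b*)
measured  = (51.0, 1.0, -1.0)  # slightly too light and slightly tinted
print(round(delta_e76(reference, measured), 2))  # 1.73
```

Later formulas (dE94, dE2000) reweight the lightness and chroma terms precisely because a single Euclidean threshold isn't equally visible across the gamut, which fits the point about 3 being a blunt rule of thumb.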

3DLUT calibration - like you get in a Lumagen - is a given in the professional content production industries. They're using extremely high quality reference broadcast monitors, and even then they calibrate them with a 3DLUT. It is the only way to get a truly accurate colour volume.
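To make the 3DLUT idea concrete, here is a minimal pure-Python sketch of how a processor applies one: find the lattice cell a colour falls in, then trilinearly blend the eight surrounding LUT entries. The 5-point grid and data layout are illustrative only, not the Radiance's actual LUT size or format:

```python
# Illustrative 3D LUT application via trilinear interpolation.

def lerp(a, b, t):
    return a + (b - a) * t

def apply_3dlut(rgb, lut):
    """Map an (r, g, b) triplet (each 0..1) through lut[N][N][N] -> (r, g, b)."""
    n = len(lut)
    lo, frac = [], []
    for c in rgb:
        p = min(max(c, 0.0), 1.0) * (n - 1)  # position on the lattice
        i = min(int(p), n - 2)               # lower cell corner per axis
        lo.append(i)
        frac.append(p - i)
    r, g, b = lo
    fr, fg, fb = frac
    def corner(dr, dg, db):
        return lut[r + dr][g + dg][b + db]
    out = []
    for ch in range(3):
        # interpolate along blue, then green, then red
        c00 = lerp(corner(0, 0, 0)[ch], corner(0, 0, 1)[ch], fb)
        c01 = lerp(corner(0, 1, 0)[ch], corner(0, 1, 1)[ch], fb)
        c10 = lerp(corner(1, 0, 0)[ch], corner(1, 0, 1)[ch], fb)
        c11 = lerp(corner(1, 1, 0)[ch], corner(1, 1, 1)[ch], fb)
        out.append(lerp(lerp(c00, c01, fg), lerp(c10, c11, fg), fr))
    return tuple(out)

# An identity LUT should return every colour unchanged
n = 5
grid = [i / (n - 1) for i in range(n)]
identity = [[[(grid[i], grid[j], grid[k]) for k in range(n)]
             for j in range(n)] for i in range(n)]
print(apply_3dlut((0.2, 0.5, 0.9), identity))  # ≈ (0.2, 0.5, 0.9)
```

A calibration run replaces the identity entries with measured corrections, so every colour in the volume gets its own fix rather than relying on a handful of greyscale and CMS controls.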

To give you an idea; I'm by no means an expert calibrator but this is the kind of error distribution I can achieve by manual means using the controls built in to my JVC X30. The taller and further to the left the chart is, the better... Approx 30% of the measured points are between 1 and 3 dE.

[Chart: dE error distribution from manual calibration]


And this is what I can achieve with a 3DLUT generated in Lightspace and loaded into my Lumagen 2143. No measured points above 1dE in this case (sometimes 1 or 2 are just over)

[Chart: dE error distribution with the Lightspace 3DLUT loaded into the Lumagen 2143]


Don't get me wrong, the manual calibration does look good, but the 3DLUT looks much better to my mind. I'd go for accurate colour over resolution any day of the week.
 
I have had 2 lumagens now and they do make a difference, regardless of what people think.
You need a calibrator who knows what they are doing and for me GF is the ONLY guy I would consider. He knows these things inside out.

If you have purchased decent equipment you want it running at its best. To be fair, I did not run my Epson 10500 for long prior to the Lumy install and did not have time to really study it, as this PJ is nice out of the box. But there is a difference across UHD and HD discs; the Lumy is certainly providing changes to the picture. I also need the Lumy for my 2.35:1 format screen, not just for making my picture look good.

They are an expensive item for sure and I understand the hesitancy in the purchase; I too genuinely struggled even knowing that I had the Radiance XD doing a fantastic job on my previous HD setup.

I would be surprised if you did opt for one and did not notice the difference, but for me it's down to the TRUST and EXPERIENCE I have had in Gordon over the years, and I have to say his advice has been 100% thus far.

 
Today it's hard to justify the Radiance Pro at its hefty price; it's hard not to consider putting some of its cost towards a better display device.

The Radiance is justified to correct for bad display behaviour. Most high-end displays these days track far better than a few generations ago, and with the error possibilities in the Radiance, not to mention the multiple pieces of software involved constantly being debugged (both the processor firmware and the auto-calibration software), I'll say there is no such thing as a perfect calibration these days.

If you have an entry-level or older display, you might very well get a much better one for less than the price of a Radiance, and in the end get better overall results.

If I was in the market for a quality image, I would first try to get a display device that doesn't need an external processor.

As far as UHD goes, I'd be careful investing too much in the Pro unless you have money to burn; it will most likely be outdated shortly, at this moment there is no sign of DV HDR support, and you should also consider the possibility of 4K processors getting much cheaper in the near future.
 
Lumagen have a very good update policy for all their devices.
Every display benefits from proper calibration, and a Lumy gives it that extra.
At the end of the day it's the poster's call; you have your opinion, I have mine. He has asked a very valid question, and one I toyed with greatly, but in the end, as said above, I went ahead and am very happy with the results.
 
BTW I don't have money to burn... it was a SUBSTANTIAL purchase in that it blew a massive hole in my AV room budget and there is still an element of reeling from it :(

but like everything else it will be paid off soon enough :)
 
I have 2 Radiance processors here as well, but if you can't do it yourself, count on a hefty extra bill to get a pro to touch up your calibration a couple of times a year if running any lamp-based projector.
The dE value doesn't tell you much about image quality, so yes, for displays with poor tracking and calibration controls the Radiance is nice to have. But be careful not to replace one problem with another: remapping a complex image often has a price, and an external plus an internal processor/CMS in series is not optimal if you have a capable display.
 
