Darbee Visual Presence Darblet Owners' Thread

Status
Not open for further replies.
Is "broadcast HD" film, video or both?

Maybe it's as basic as the content of each frame, or certain types of artistically stylized content, in terms of how effective the result is judged. IIRC Darbee mentioned avoiding certain types of movies even on Blu-ray.



Avi

Sorry - whether it appears different when broadcast compared with the same 'item' on Blu-ray is what I meant to say.

In other words I was really asking whether the method of delivery would be expected to affect PQ.
I know my Sky HD STB will not do any disruptive de-interlacing with HD, but the delivery process is dependent on many factors that a BD player does not have to face.

Just curious.
 

I think so.

Sky HD and even Sky SD content can look outstanding at times, but HD is often let down by compression/bit-rate artefacts that can be very noticeable on certain types of image, especially as image size increases. Beyond inherent source issues, the display/VP de-interlacing can also have an impact, as can variables related to per-source display chain calibration. The type of Sky HD box also has an impact on what can be achieved with SD material.
 
I'm a bit late to the party but may as well share my thoughts after 4 weeks use.

I'm watching 99% Blu-ray on my HD350 in a velvet-lined dedicated room. I bought the JVC to replace a Panny AX100 as, due to its limited blacks, it had become all but unwatchable to me. To say the HD350 was a revelation in this regard would be an understatement!

I'm probably like many others in that I'll find faults given long enough, and sure enough, in certain areas my viewing has left me a little underwhelmed for a while now. As far as black levels & dark scene performance go, I'm still uber chuffed. Motion isn't a strong point of the JVC's, but I've always been fine with it. It is in the areas of detail, sharpness & image depth in brighter scenes that I've found myself growing increasingly dissatisfied over time.

I actually annoy myself with this and have posted a thread in the past about 'watching films in the wrong way', basically finding myself scrutinising the image rather than just enjoying the film.

I'll cut to the chase - I am really enjoying the Darblet. I won't get dragged into whether it changes the director's intent etc etc - I'm certain it does in certain ways - but I really like what it's doing to my image.

One of the first films I watched with it (Hi-Def 45%) was Batman Begins. The last time I watched it, I was put off by how flat and soft it looked. This time, the image was clearly better defined and had true depth. This isn't a remarkable looking disc and the Darbeefied image wasn't spectacular, but I just watched the film all the way through and enjoyed it without being put off by the flatness, which normally 'takes me out of' the film experience.

I had a play with a few other movies at this setting, and it's true that the better the material you throw at it, the better the result. Tron Legacy in particular looks great without it, but astonishing with it, yet still totally natural.

Another film I watched recently was 'Legend of the Guardians', which is reference-quality CG, and I spent the entire movie thinking it didn't look as sharp as I'd expected, and had no depth. Straight to the point - I watched 5 mins with the Darbee set to 80% Full Pop, and it was utterly stunning. Switching it on and off, it clearly altered the image to a remarkable degree, and I'm sure it technically altered the director's intent. Being CG, though, it isn't as obvious that something looks 'wrong', and the phenomenal increase in detail, sharpness and a palpable sense of depth totally transformed the viewing experience for me.

Perhaps the most surprising thing for me was watching the Star Wars prequels. It's startling how bad the CG is in the first movie; at times it's almost like watching 'Sky Captain', with the flat, washed-out looking CG not blending at all well with the live action. It can be off-putting, so (and I know a lot of you will be shocked!) I ended up watching it through on Full Pop at 60%. It's fair to say that this setting can give faces a waxy appearance, but entire scenes take on a far greater depth, and I'm finding myself drawn into the picture as a whole rather than concentrating on little details, so I'm rarely noticing the waxiness. To me, it is worth trading off some skin tones for the other benefits it brings.

I watched the remainder of the trilogy at Game Mode 60% and thoroughly enjoyed the experience, again it was just the sense of depth that drew me into the image.

For me, I very much appreciate the comments made by some other users that it enables the D-ILA image to look a little more like a 3-chip DLP, or even a bit more plasma-like, by enhancing perceived ANSI contrast or MTF. It may not be everyone's cup of tea, but for me it really works. Due to time and money I've foregone calibration and won't be upgrading for probably 3 years. In truth the later JVC generations have improved incrementally, but still fundamentally share the same strengths and weaknesses as my HD350. The only feature the newer models have which really interests me is e-shift, so hopefully when the time does come to upgrade, this or 4K may be available on a more keenly priced model.
 
My understanding is that the film makers are encouraged to put their own stamp on a film in an 'anything goes' type of way.
This encourages originality and I have to say I am usually an admirer of their final product.

However it does seem to be a somewhat 'cavalier' approach to an original storyline (often a book or short story), that they have presumably selected as worthy of filming.

This is in stark contrast to the 'distortion' charges levelled at the Darbee, where 99% of the original is retained and very selective, perhaps subtle, changes are made to video only.

If technology was made available that changed the voice characteristics of certain actors, I wonder what the reaction would be?

'Director's intent' is really a red herring. The objective of a calibrated system is that what you see at home is what they saw in the mastering suite (a 'reference' calibration where the thing you are 'referring' to is the mastering suite product). If you deliberately change the former by introducing an 'effect' then you no longer have a reference calibration. This is fine - it then becomes a 'Preference' system instead of a 'reference' system and anyone is entitled to their own preferences - just like some people prefer the contrast wound up to eye-bleeding levels etc. Some of us though want a Reference system and that means adhering to REC 709, D65 and so on.

The discussion that "our equipment at home isn't as good as the equipment in the mastering suite" is pointless too - it may be the case, but deliberately moving even further away from reference doesn't somehow magically take you closer to it.

BTW, when I say "you" it's not personal - it just sounds daft to keep saying "one" but please take that as read.
 
Stuart: Given that your room sounds like a nice batcave, I wonder if your brighter-scene issues might be down to gamma. The HD series is known for gamma dropping well below what it is marked at (2.2 often measures less than 2.0, for example). This will give you better shadow detail perhaps (maybe too much, even) but will make the brighter scenes washed out. In a batcave a '350 should be able to take a measured 2.3 gamma, and that was how I ran mine when I used the 'tent'. I used to compare lower gammas with the calibrated 2.3 setting on paused scenes, and there was extra 'pop' and depth compared to 2.1 or lower pre-calibration. Nothing for or against the Darbee in this case, but the base calibration might help; then add the Darbee on top.

Secondly (and a use for the Darbee which might help): I used to find that any sharpness or detail setting above 0 on my old '350 would cause a very slight 'micro judder', most noticeable on end credits, but it must have been there for other moving objects too. Try setting the '350's sharpness and detail controls to 0 and, if necessary, increase your Darbee's settings instead: it may give you the sharpness you want but without the motion effects. Certainly worth a try anyway, or some combination of the two devices' settings. The '350 tends to create 'halos' or ringing even at very low sharpness settings, whereas the Darbee apparently doesn't, so it might be preferable to use the non-ringing sharpener. :)

The above is also an example of how a proper calibration before adding a Darbee might get more out of the overall setup, rather than using it as a band aid to fix something that isn't really broken: the Darbee could then enhance the resultant image, perhaps with lower settings, or perhaps with even better subjective results.
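Kelvin's "marked 2.2 often measures under 2.0" observation is easy to check with any meter: the per-point gamma implied by a luminance reading is log(Y/Ywhite)/log(stimulus). A minimal sketch (the readings below are invented for illustration, not real HD350 measurements):

```python
import math

def effective_gamma(stim_pct, y, y_white):
    """Per-point gamma implied by a luminance reading at a given stimulus.
    Solves y / y_white = stim ** gamma for gamma."""
    stim = stim_pct / 100.0
    return math.log(y / y_white) / math.log(stim)

# Hypothetical readings (cd/m2) from a projector marked "gamma 2.2"
readings = {10: 1.10, 30: 9.0, 50: 26.0, 70: 50.0, 90: 81.0}
y_white = 100.0

for pct, y in sorted(readings.items()):
    print(f"{pct:>3}% stim -> effective gamma {effective_gamma(pct, y, y_white):.2f}")
```

These made-up readings all come out around 2.0 despite the "2.2" label, which is exactly the washed-out-brights situation described above.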
 

Cheers Kelvin, I had already switched sharpness & detail down to zero.

I think so much of this is down to personal choice. I am aware that the gamma drops considerably over time (thanks to reading some of your other posts!) and know that a calibration might help to add some image depth back in, but calibration is something that whilst being very tempted, I've chosen to forego.

The main reason for this is that I know from past experience that it won't end there - if I have a calibration then the next thing will be a Lumagen Mini 3d to sort the colours out......pretty soon I'll be a few grand out of pocket. It's like an illness the constant tinkering and I've had to draw a line somewhere!

The big attraction of the Darbee is that for a modest fee I gain an improvement that won't be lost when I change the bulb, or the projector, and which doesn't require periodic adjustment. Certainly with more disposable income and a lot more time on my hands, calibration would be a step worth taking for me!

Cheers
 
Just to tempt you a little more: Even an old i1LT sensor that has drifted as far as colour temp is concerned still measures gamma accurately. Find one cheap in the classifieds (and don't bother trying to set colour temp with it) plus (free) HFCR software and have a play with sorting the gamma out using one of the custom gamma memories, if you have the time and patience...

Or just enjoy the Darbee. :D
 

Sorry if it's in the wrong forum section, but I tend to use a lower gamma on the lower grayscale steps and raise it from 30 IRE upwards.
My TV display has better black-crush control that way without sacrificing anything at high contrast.
Does the Darbee automatically do a similar thing in the real-time picture sections it feels need 'adjustment'?

I'm sure AVI will tell me its not that simple.
 

Changing a gamma target as described at a given stim level remains consistent in its relationship, unlike DVP. Have you compared EOTF calibrated to BT.1886?

I can only speculate about how DVP decides what to do (the info isn't published or part of the patent) and by how much. It may treat the same data level differently, resulting in pixels that are supposed to be 40 being changed to 70, 60, 20, 55 or 35 in some parts of the frame, changing the uniformity relative to the original relationship. Now apply that to a range of data values that may vary within the same frame and be different again between frames depending on the specific image makeup.

Avi
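The behaviour Avi describes - the same source level coming out at different values depending on where it sits in the frame - is characteristic of any local (rather than global) operator. The toy 1-D sharpener below is NOT the DVP algorithm (which is unpublished); it just illustrates how a pixel at 40 can be left alone in a flat area yet shifted when it sits on an edge:

```python
def local_enhance(row, strength=0.5):
    """Toy 1-D local contrast boost: push each pixel away from the
    mean of its immediate neighbours. Not the DVP algorithm."""
    out = []
    for i, v in enumerate(row):
        left = row[i - 1] if i > 0 else v
        right = row[i + 1] if i < len(row) - 1 else v
        local_mean = (left + v + right) / 3.0
        out.append(round(v + strength * (v - local_mean)))
    return out

flat = [40, 40, 40]          # no local detail: values unchanged
edge = [20, 40, 70]          # the same value 40, but on an edge
print(local_enhance(flat))   # -> [40, 40, 40]
print(local_enhance(edge))   # the middle 40 is shifted
```

So a single input level has no fixed output level, which is why it can't be described by any one EOTF curve.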
 
We've not been able to really confirm it, but I suspect that the Darbee works on much smaller areas than what you're talking about. In fact I used to do the same thing with my projector: 20 IRE and below would drop to 2.2 or 2.1 gamma, then transition to 2.3 above that.
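The split-gamma target described above (lower gamma at 20 IRE and below, transitioning to 2.3 above that) can be written as a simple blended target; the numbers here are illustrative only, not a recommendation:

```python
def blended_gamma(stim_pct, low_gamma=2.1, high_gamma=2.3,
                  knee=20.0, ramp=20.0):
    """Per-stimulus gamma target: low_gamma at/below the knee,
    linearly ramping to high_gamma over `ramp` percentage points."""
    if stim_pct <= knee:
        return low_gamma
    if stim_pct >= knee + ramp:
        return high_gamma
    t = (stim_pct - knee) / ramp
    return low_gamma + t * (high_gamma - low_gamma)

def target_luminance(stim_pct, y_white=100.0):
    """Target cd/m2 for a stimulus, using the blended gamma."""
    return y_white * (stim_pct / 100.0) ** blended_gamma(stim_pct)

for pct in (10, 20, 30, 40, 60, 100):
    print(f"{pct:>3}% -> gamma {blended_gamma(pct):.2f}, "
          f"target {target_luminance(pct):.2f} cd/m2")
```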

I don't see why you couldn't have the gamma tweak and use the Darbee on top, as I think you'd get different results. Technically I'd love to look into it some more. I once read up on how Dolby noise reduction works; once I realised it was very critical of levels, I discovered why my tape deck at the time sounded muffled whenever I recorded using Dolby. I then matched the playback levels exactly (back in those days the PCB had potentiometers I could adjust, and I had access to signal generators and oscilloscopes). The result was I got it spot on: future recordings sounded great, and I could use the noise reduction and gain the hiss reduction rather than living with it and turning Dolby off.

In a similar fashion, just understanding how something works and what it works on empowers me: When I first got into calibration I was baffled by terms such as gamma and gamut. By creating differently calibrated settings and comparing them (mostly with paused scenes) I gained an appreciation of what effect they had on the overall image. From this I discovered I liked the gamma of 2.3 on brighter scenes and the lower gamma 'trick' on low IREs.

Having said that, I'd probably just try the Darbee on different settings and leave it where I like it best, but I doubt I'd leave it there without doing some research...I can't help it. :D
 
Changing a gamma target as described at a given stim level remains consistent in its relationship, unlike DVP. Have you compared EOTF calibrated to BT.1886?

I can only speculate about how DVP decides what to do (the info isn't published or part of the patent) and by how much. It may treat the same data level differently, resulting in pixels that are supposed to be 40 being changed to 70, 60, 20, 55 or 35 in some parts of the frame, changing the uniformity relative to the original relationship. Now apply that to a range of data values that may vary within the same frame and be different again between frames depending on the specific image makeup.

Avi

Thanks as always for this explanation.

First, no, I have not yet tried BT.1886, more out of ignorance than anything.
I understand it does 'correctly' what I've been guessing at manually in the low IRE areas?
Do you still aim for, say, 2.2 and it automatically adjusts?

Regarding Darbee:

So the usually small number of affected pixels do in fact stay within the display's calibrated parameter extremes and don't 'un-calibrate' it?

As I understand it, a variable number of pixels are affected, following the displayed video as triggered by the DVP's algorithm.
Therefore a large majority of pixels usually/always remain unaffected, dependent on the source video content.

In my layman's terms, is that fair?
 
@KelvinS1975

Thanks for the detail and your comments.

I see that now: adjusting low IREs forces all of the display to behave that way wherever darker areas are evident, whereas the Darbee is very selective, targeting only certain video-dependent areas.
 
Regarding Darbee:

So the usually small number of affected pixels do in fact stay within the display's calibrated parameter extremes and don't 'un-calibrate' it?

I guess it depends what you define as small. Using some of the images posted in this forum and the null comparison, the number affected may be a significant % of the overall image in some circumstances.

If a source value "A" that, properly calibrated to a specific target, should read "A" may with DVP read B or C or E or F... etc., I don't see how that can be viewed as correct and not affecting calibration. It's definitely affecting EOTF (gamma) relative to what the source should be. Something else to consider is how a VP may pre-shape data to achieve a desired display-chain calibrated target.

AFAIK DVP attempts to avoid clipping, but looking at some of the images there appears to be some range compression, i.e. there's only so far you can move values relative to other values. So it appears to avoid near-reference black and white, but potentially may change everything in between on the fly.
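That inferred behaviour - full effect mid-range, tapering off near reference black and white to avoid clipping - could in principle look something like this toy headroom-scaled boost. Purely illustrative; this is not the actual DVP processing:

```python
def headroom_scale(v, lo=16, hi=235):
    """0..1 factor that shrinks as v approaches video black (16)
    or video white (235), so a boosted value cannot clip."""
    if v <= lo or v >= hi:
        return 0.0
    return min(v - lo, hi - v) / ((hi - lo) / 2.0)

def safe_boost(v, delta):
    """Apply an enhancement delta, scaled down near the range limits."""
    return round(v + delta * headroom_scale(v))

print(safe_boost(125, 20))  # mid-range: nearly the full boost applied
print(safe_boost(225, 20))  # near white: the boost is heavily compressed
print(safe_boost(235, 20))  # at reference white: untouched
```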


Therefore a large majority of pixels usually/always remain unaffected, dependent on the source video content.

In my layman's terms, is that fair?

In some cases potentially yes, and in some cases no. In the null comparison linked above, what % of the image appears unchanged?

Until we have more detailed analysis of raw HDMI data captures it's hard to say objectively, and again it will vary by frame content.

Avi
 
I understand it does 'correctly' what I've been guessing at manually in the low IRE areas?
Do you still aim for, say, 2.2 and it automatically adjusts?

It calculates target luminance based on the actual measured MML and peak white performance of your display. Its target exponent is 2.4, but its aim is perceptual uniformity, meaning it can change the entire luminance range of targets or be a classic 2.4 power function, depending on your display's MML/peak white performance.

Once the total range of values is calculated, the calibration target is to achieve the calculated target luminance at specific levels, but this may be a problem if the display has a limited range of gamma adjustment points.
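The target calculation described above follows directly from the EOTF in Annex 1 of ITU-R BT.1886: L = a * max(V + b, 0) ** 2.4, with a and b derived from the measured white (Lw) and black (Lb) levels. A minimal sketch:

```python
def bt1886_targets(lw, lb, gamma=2.4):
    """ITU-R BT.1886 EOTF: L = a * max(V + b, 0) ** gamma, with the
    a and b coefficients derived from measured white (lw) and black (lb)."""
    lw_g = lw ** (1.0 / gamma)
    lb_g = lb ** (1.0 / gamma)
    a = (lw_g - lb_g) ** gamma
    b = lb_g / (lw_g - lb_g)
    return lambda v: a * max(v + b, 0.0) ** gamma

# 100 cd/m2 peak; compare a near-zero black to a 0.03 cd/m2 black
deep = bt1886_targets(100.0, 0.0005)
grey = bt1886_targets(100.0, 0.03)
power = lambda v: 100.0 * v ** 2.4   # classic power law for reference

for v in (0.1, 0.3, 0.5):
    print(f"{v:.1f}: power {power(v):6.2f}  "
          f"deep-black {deep(v):6.2f}  raised-black {grey(v):6.2f}")
```

With a very low black level the curve converges on a straight 2.4 power function; with a raised black level the low-end targets lift, which is the "adjusted for perceptual uniformity" behaviour described above.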

Avi
 

Thanks again Darren,

So not a flat gamma on all stimuli I guess.

I normally use a Duo on an 11-point grayscale, so probably an option for me?
 

It may be a flat 2.4 if your display's measured MML/peak white supports it, but my guess is in many cases it will be adjusted in an attempt to maintain perceptual uniformity. If the display has a low MML a potential issue is probe low-level sensitivity, but the likes of the i1 Display Pro should be OK down to around 0.003 cd/m2.

11 point adjustment should be an option.

Avi
 
This, of course, is basically an extremely selective (and clever in the extreme) version of the Pioneer Kuro's DRE, to my eyes.

Something I and others have been very familiar with now for... years.

In fact, "FULL POP" at about 60% is basically about the same as the Dynamic Range Expander on its horrific HIGH setting.

DRE was something I basically couldn't come to terms with over the years; we never really got on. I tried the LOW setting, but no, it wasn't really for me.

Although it gave a 'perceived' enhancement to sharpness and contrast definition.

Just as this does.

I nearly always turned it off.

But I remember always thinking it would have been nice if Pioneer had introduced finer steps. But of course these were expensive enough to design and manufacture as it was.

But it's here now, and at least I can find a setting on the KRP that I am happy with.

Still can't get over the fact that sometimes it still looks better off, though.

But I love it immensely when it does work.

:thumbsup:
 
The following was posted on AVS Forum. A question was asked of a working director/DP living in New York:

"rblnr
Would you mind to be asked, as a Director or DP how you view this technology using it perhaps to view one of your past projects? Improved experience? Does it override in negative fashion any specific processing aspects you recall putting some effort into? Often, if yes? Obviously that's a big issue for a lot of people against it philosophically. A lot of us have really wanted to hear from an actual director using one in his own system I'm sure.

Thank you"

"I have an action film I shot that would be a good piece to check its effects on, will do that in the next week or two.

Not having done that, my impressions are that the issue is less about altering the artistic intent, and more about addressing possible weaknesses in the playback system. The action film I'm referring to was very carefully color corrected using the latest iteration of the DaVinci correction hardware/software -- the film was actually shown as a demo by DaVinci at their booth at last year's broadcasting convention. The colorist does all the Batman movies among many others -- pretty talented guy, you might say. We were working with the 4K raw files out of a Red One camera using very good (sharp) glass for the shoot. I say all this just to give an idea of the quality of the source material and the subsequent care in post-processing.

The corrected 4K master was absolutely stunning viewed on a 4K playback system. Knocking it down to bluray is obviously a big step back but that's where we're at now. My guess, and I will A/B to confirm this, is that the Darbee will give just a bit of the higher resolution feel of the 4K, without really changing anything else materially -- color palette the same, apparent contrast and tonal gradation, etc. That's an easy net plus to me. It's also just another layer of processing among those already in the playback chain. The bluray player processes, the PJ or TV processes, many have additional processors/scalers in the chain. To me, ad nauseam, I'm getting an apparent sharpness boost w/o penalty. That's good processing. I haven't done the work to A/B it on charts and so forth as some have here, but I gather the consensus is that the effects are benign in real world viewing.

I A/Bd using the new bluray of Jaws (tremendous IQ btw) -- I preferred the image with around 35% or so on the Darbee HD setting vs. zero. My guess is that Spielberg would too and wouldn't view it as compromising intent at all, rather just appearing to come out of a better playback system. I'll point out again that my gripe with the RS2 is a slight softness vs. what I see with some DLP PJs.

Sorry if I was repetitive and long-winded -- did an all-nighter and lack the energy to properly edit myself."
 
:thumbsup::thumbsup: Whilst we do not, and probably never will, know the authenticity of the "director", it sounds to me like what most movie producers would say on the subject of the Darblet.

Anything that adds to the viewer's experience of their artistic creation is bound to be welcomed, even if not everyone agrees, as it can be switched on or off at will.
 
So not a flat gamma on all stimuli I guess.

Without going too far off topic.

These examples may provide a visual indicator of how ITU-R BT.1886 affects EOTF, based on the display's actual native performance, compared to a standard power law.

All the examples are based on 100 cd/m2 peak white but with different MML (assuming a light-controlled environment). The MML in the second image is around the lowest measurable using a device such as the i1 Display Pro (0.003 cd/m2), and the third image is based on a "native" full-field CR of around 100,000:1.

I've also included power law with BLC (black level compensation) for comparison, which is what I think you may have been referring to earlier re your display.

This area of calibration is probably the one that causes most confusion and has a significant impact on the perceived image.
 

Attachments: EOTF Emaple A.jpg, EOTF Exmaple B.jpg, EOTF Example C.jpg
While a well-calibrated system shows an obvious and appreciable difference over an uncalibrated one, differences in the capabilities of different displays, as well as the viewing room itself, produce equal if not greater differences. This is why I think many are misguided about what it means to be "as the Director intended". I guess for that you should recreate the very room the Director 'proofed' his film in for every blu-ray that you purchase! If you put a SIMS DLP in the same room as a JVC LCoS (and yes, I have seen this), both calibrated, the remaining differences are startling (and note I didn't say one is better).

This is why I keep pointing back to the simplest and most fundamental differences between display devices, the intrascene and on/off contrast variances and how much those alone take you away from what the "Director intended". I also quoted a specification for a very very expensive Sony 4K cinema projector and how that isn't even close in terms of contrast specifications to a home projector.

Assuming for one minute that the quote above really is from a Director, I think the most pertinent sentence is
"Not having done that, my impressions are that the issue is less about altering the artistic intent, and more about addressing possible weaknesses in the playback system. "

I cannot endorse this point enough as it is precisely what I have been harping on about the last few weeks but more eloquently phrased. Particularly, you need to understand that blu-ray is NOT that close to the original master. And blu-ray even in 4K form never will get close enough due to the limitation of blu-ray capacity and therefore enforced compression of that medium. We need true uncompressed (or lossless compressed) movie reproduction (as we do for sound) and this isn't happening in the near future unfortunately.

In my personal opinion, the Darblet is capable of ruining a good film. In fact when you first switch it on, with its defaults, it will ruin anything you play! I have been critical that the defaults reflect a device never intended to get into the grubby hands of a videophile or home cinema enthusiast. Its originally intended market is something else, and I haven't quite figured out "who".

However it is also my personal opinion that, set to a much more subtle level of adjustment (which will vary enormously depending on the display type, particularly the likes of a plasma vs a projector), it can indeed obfuscate, or even perceptually eliminate, some of the deficiencies in the end-to-end playback environment, from blu-ray source, to the display, to the viewing room.
 
Good post Jon.:thumbsup:
 
Obfuscate.

Now there is a verb you don't hear every day :thumbsup:

Has anyone had any issues with the V3 Darblet taking on a life of its own, resetting itself back to factory settings with the loss of the "advanced" settings, and lighting back up and flashing like a Christmas tree?
 
Is there somewhere that explains the differences between Versions 1, 2 and 3 of the Darblet, or can someone list them?

I understand that one can differentiate between V1 and V2 by the 1% incremental changes in the Advanced menu - how does one differentiate between V2 and V3?

Thanks.
 
kbarnes70 said:
Is there somewhere that explains the differences between Versions 1, 2 and 3 of the Darblet, or can someone list them?

I understand that one can differentiate between V1 and V2 by the 1% incremental changes in the Advanced menu - how does one differentiate between V2 and V3?

Thanks.

The only thing I have picked up on so far is that it is compatible with a Lumagen VP.
 