HDR Confusion

Xymox

Well-Known Member
Apr 16, 2019
425
374
170
www.appletvx.com
I am confused. How is HDR better with current displays? I have a Sony VPL-GTZ380. If the imager chip itself can only do 12 bits (that's the imager chip, not the HDMI format), then the dynamic range is 12 bits, because the last device in the signal chain, the imager chip, is 12 bits. So how does HDR provide any benefit in a system with a fixed bit depth?

Or... is HDR just the Emperor's New Clothes?

No matter what I feed this projector, or a Sony OLED, HDR ends up looking worse than SDR.
 
HDR on a projector is always a challenge as projectors don't have the brightness to display HDR's extended dynamic range fully. Colour gamut is often limited as well. A projector is never going to show what HDR is capable of. Is HDR a case of Emperor's New Clothes? Absolutely not; however, current implementations of HDR can be poor, which undermines the results. With projectors, using something like the Lumagen Pro is almost mandatory to get the best from HDR. On TVs you are at the mercy of the vendor's implementation. Things are definitely improving with each generation and HDR, done right, can look fantastic. But yes, it can also certainly look poor.
 
HDR on a projector is always a challenge as projectors don't have the brightness to display HDR's extended dynamic range fully. Colour gamut is often limited as well. A projector is never going to show what HDR is capable of
Are you sure?
 

Attachments

  • atc_with_people.jpg
HDR is absolutely wonderful on a decent projector if set up well.

I rarely watch anything but HDR these days; it makes a huge difference.
 
HDR on a projector is always a challenge as projectors don't have the brightness to display HDR's extended dynamic range fully. Colour gamut is often limited as well. A projector is never going to show what HDR is capable of. Is HDR a case of Emperor's New Clothes? Absolutely not; however, current implementations of HDR can be poor, which undermines the results. With projectors, using something like the Lumagen Pro is almost mandatory to get the best from HDR. On TVs you are at the mercy of the vendor's implementation. Things are definitely improving with each generation and HDR, done right, can look fantastic. But yes, it can also certainly look poor.

Hmm... So the bit depth of the imager chip is still important. I am an SMPTE, SID and SPIE member, so I have some grasp of the topic.

How exactly, in technical depth, is HDR an improvement with a limited bit depth set by the imager chip or output driver chip for fixed displays, a fixed top brightness, and the dynamic range limitation of a room?

It does not make black blacker, it does not make white brighter; there are only 12 bits between white and black on the imager chip. If you start rendering an image where white is defined as less than full output on the imager chip, you lose bit depth.

Because the display can't reach the full supernova HDR definition, clipping is then implemented in controlled ways to try to use the space it does have. This introduces its own math issues.
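A minimal sketch of what that controlled clipping usually looks like in practice: rather than hard-clipping everything above the display's peak, a tone-mapping curve passes values through up to a knee point and rolls the rest off smoothly. The display peak, knee position and 1000-nit mastering peak below are illustrative assumptions, not any particular vendor's algorithm.

```python
# Toy comparison: hard clip vs. a controlled highlight roll-off.
# Display peak, knee point and mastering peak are assumptions for illustration.

def hard_clip(scene_nits, display_peak=100.0):
    """Everything above the display's peak is crushed to peak white."""
    return min(scene_nits, display_peak)

def soft_rolloff(scene_nits, display_peak=100.0, knee=0.75, master_peak=1000.0):
    """Pass values below the knee through unchanged, then compress the rest of
    the assumed mastering range into the headroom above the knee."""
    knee_nits = knee * display_peak
    if scene_nits <= knee_nits:
        return scene_nits
    headroom = display_peak - knee_nits
    excess = (scene_nits - knee_nits) / (master_peak - knee_nits)
    return knee_nits + headroom * min(excess, 1.0)

for nits in (50, 80, 150, 500, 1000):
    print(nits, hard_clip(nits), round(soft_rolloff(nits), 1))
```

The hard clip turns 150, 500 and 1000 nits into the same white; the roll-off keeps them distinct, at the cost of the math issues mentioned above (highlights no longer sit at their mastered levels).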

I can see HDR being good for managing HDR displays for outdoor use, like the signs in Vegas, but I don't see how it's anything but bad for home use.

So far all I see is the Emperor's New Clothes.

And then there are the myriad terrible standards and implementation issues.

To me this looks like a marketing thing to sell displays and all the HDMI gear in-between.

I know exactly what I am doing with calibration and settings, and I can produce a FAR better picture with SDR than HDR.

The brightness of the top Sony projector can now exceed the dynamic range you would want in a room, and do so in a very linear fashion. I DO NOT want more dynamic range. Top white in the scene should exactly match the top white of the projector. Black should be off. There are 12 bits MAX in the material and 12 bits max in the imager chip. Why do I need HDR?
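One hedged technical counterpoint worth noting: HDR code values are not spread over the display's own range the way SDR's are. SDR gamma maps codes relative to whatever the display's peak happens to be, while HDR10 uses the SMPTE ST 2084 (PQ) curve, which maps codes to absolute luminance from 0 to 10,000 nits, with roughly half the codes spent below about 100 nits. A minimal sketch (the PQ constants are the published ST 2084 ones; the 100-nit SDR peak and full-range coding are assumptions):

```python
# 10-bit PQ (SMPTE ST 2084) EOTF vs. a simple SDR power-law gamma.
# The 100-nit SDR peak and full-range code values are illustrative assumptions.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_to_nits(code, bits=10):
    """PQ code value -> absolute luminance in nits."""
    p = (code / (2**bits - 1)) ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def sdr_to_nits(code, bits=10, peak=100.0):
    """Relative gamma-2.4 code value -> nits on an assumed 100-nit display."""
    return peak * (code / (2**bits - 1)) ** 2.4

for code in (0, 256, 512, 768, 1023):
    print(code, round(sdr_to_nits(code), 2), round(pq_to_nits(code), 2))
```

Half-scale PQ (code 512) lands around 92 nits, while the top half of the code range covers roughly 100 to 10,000 nits, which is exactly why a display that tops out far below 10,000 nits has to tone-map rather than use the codes one-for-one.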
 
The projection screen is your "black" when the projector is off, right?
So this will be the blackest you will ever get! The moment you switch on your projector, light (false light) will reach the screen and will "pollute" your extreme black (it is not black anymore). Now you have a limited dynamic range whether you like it or not. In the case of a laser PJ or CRT this can be limited, but it is still valid...
 
You seem to have convinced yourself that SDR is better - at least for you. In that case, enjoy it and you'll save yourself a lot of frustration.
 
The projection screen is your "black" when the projector is off, right?
So this will be the blackest you will ever get! The moment you switch on your projector, light (false light) will reach the screen and will "pollute" your extreme black (it is not black anymore). Now you have a limited dynamic range whether you like it or not. In the case of a laser PJ or CRT this can be limited, but it is still valid...

Exactly. Plus other factors like the lens and light reflected around the room back onto the screen.

This is my point. Dynamic range is fixed by a large number of factors: physical, electrical and optical. This is also true for flat panels and phones.
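A rough worked example of that ceiling, with made-up but plausible numbers, showing how a little stray light compresses the on-screen dynamic range no matter what the signal format can describe:

```python
# Illustrative numbers only: an assumed projector/room combination.
peak_white   = 100.0   # nits of peak white on screen (assumed)
native_black = 0.01    # nits from the projector's own black level (assumed)
room_spill   = 0.05    # nits of stray/reflected light landing on the screen (assumed)

native_contrast    = peak_white / native_black
effective_contrast = (peak_white + room_spill) / (native_black + room_spill)

print(f"native on/off contrast:     {native_contrast:,.0f}:1")
print(f"effective in-room contrast: {effective_contrast:,.0f}:1")
```

With those numbers a 10,000:1 projector ends up delivering roughly 1,700:1 in the room, which is the fixed range any tone mapping ultimately has to live inside.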

So I am having a really hard time understanding why I need HDR, in fact why anyone needs HDR. If it were more bits between my (or anyone's) black and white, OK, that MIGHT be perceptible. The overhead in bitrate tho would end up crushed by compression back down to the same kind of bitrate as SDR. So... again, I don't see a need for HDR.

Now *IF* we ever overcome the physics of being able to do a supernova-bright pixel right next to a 0.001 fL pixel, then I absolutely see the need for HDR.
 
You seem to have convinced yourself that SDR is better - at least for you. In that case, enjoy it and you'll save yourself a lot of frustration.

I am still open minded. I am confused tho as to why everyone has this overwhelming *NEED* for HDR. To me this seems to be a case of the Emperor's New Clothes, and if so, I want to confirm my understanding of HDR and that it offers no benefits to anyone. I also want to have this thread up so this can be discussed and fully debated. I am also tired of people switching displays and things like AppleTV over to HDR and then spending zillions of hours working out that it is an implementation issue, when that might not be the case at all.

So, please excuse my being unhappy with HDR. I have zillions of hours into this now over the years, and I have reached a point where I want to deal with it in a definitive way.

I have not looked into Dolby Vision yet technically. I do know Dolby makes a TON of money on it tho.
 
Standard dynamic range is "limited" to 8 bit and HDR is 10 bit, so here you already have a difference, but what does it mean?
In order to see information in a "gray" area (because black is no longer black) you can do some trickery, if you have enough range (lumens). If you "beef up" the signal you get a bit more info in the darker area, but in the bright area info gets lost as everything turns white(ish). To remedy this we can leave the upper end of the (white) range as is and concentrate on the darker side (add more contrast and brightness there only); this can only be done if that extra information is also recorded. Now we can make a mix of a dark and a bright image, taking the best of both. The sky is now darker and contains more info, as clouds are clearly visible and not just a blurry white blob. The darker areas can be clearer, as we allow more light into the lens (which would clearly wash out all the bright parts), but in the "mix" both will be clear.
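To put a number on the "more info in the darker area" point, here is a quick comparison of the luminance step between adjacent code values near black for 8-bit SDR gamma versus 10-bit PQ. The 100-nit SDR peak, full-range codes and the specific codes chosen (both pairs sit around 0.1 nit) are assumptions for illustration; the PQ constants are the published ST 2084 values.

```python
# Luminance step between adjacent code values near black:
# 8-bit SDR gamma (assumed 100-nit peak) vs. 10-bit PQ (ST 2084).
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_nits(code, bits=10):
    p = (code / (2**bits - 1)) ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def sdr_nits(code, bits=8, peak=100.0):
    return peak * (code / (2**bits - 1)) ** 2.4

# Both pairs of codes sit near ~0.1 nit of luminance.
print("8-bit SDR step near 0.1 nit :", round(sdr_nits(15) - sdr_nits(14), 4), "nits")
print("10-bit PQ step near 0.1 nit :", round(pq_nits(65) - pq_nits(64), 4), "nits")
```

On those assumptions the PQ steps in the deep shadows are a few times finer than the 8-bit SDR steps, which is the recorded extra shadow information described above.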
 
Well said.

Another way of looking at it is this: there's more contrast with HDR. I've been doing home theater projection since 1996 and I've never had a more filmic presentation than I do now - with HDR. I get more depth and perceived contrast than I do with SDR material.

Of course you have to know how to process HDR. If you try to view it natively, most projectors will make a mess of it. The trick is to convert it to SDR using finely tuned algorithms and display it in the proper color space. I use madVR (I also have a Lumagen Pro) and the results are fantastic.
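For anyone curious what that conversion involves at the simplest conceptual level, the chain is roughly: decode the PQ code values to nits, compress the scene range into the display's peak, then re-encode for an SDR gamma. The sketch below is only that concept with assumed numbers (full-range 10-bit input, 1000-nit master, 100-nit target, a generic Reinhard-style curve); it is not what madVR or a Lumagen actually does.

```python
# Very rough conceptual HDR10 -> SDR sketch. Assumed: full-range 10-bit input,
# 1000-nit master, 100-nit SDR target, simple Reinhard-style compression.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_decode(code, bits=10):
    """PQ code value -> absolute luminance in nits (ST 2084 EOTF)."""
    p = (code / (2**bits - 1)) ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def tone_map(nits, target_peak=100.0, source_peak=1000.0):
    """Compress the assumed mastering range into the display's range."""
    x = nits / target_peak
    x_max = source_peak / target_peak
    y = x * (1 + x / (x_max * x_max)) / (1 + x)   # extended Reinhard curve
    return target_peak * min(y, 1.0)

def gamma_encode(nits, peak=100.0, bits=8):
    """Relative SDR light -> 8-bit code value using a 2.4 power law."""
    return round((nits / peak) ** (1 / 2.4) * (2**bits - 1))

for code in (0, 300, 512, 700, 1023):
    nits = pq_decode(code)
    print(code, round(nits, 1), "nits ->", gamma_encode(tone_map(nits)))
```

The quality differences between real processors come down to how that middle step is tuned (per-scene analysis, gamut mapping, dithering and so on), which is why implementations vary so much.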

The proof is in the pudding - you need to see it with your own eyes. HDR was a frustrating mess until some very smart people figured out (and not overnight) how to make the most of it on a projector setup.
 
I worked with CRT PJs only (a long time ago) and that was difficult enough to set up, but at least it was straightforward.
The results can be amazing when done correctly (almost 3D with a high-end projector, very noisy though).
I have not yet seen any convincing projection other than CRT (not even in cinemas, as they do not use reels anymore).
Unfortunately CRTs need some adjusting once in a while, plus in order to get a bright image you will need to stack a few... no fun!
 
I started with a Runco CRT. I had it for 8 years!

I personally could never go back to CRT though.
 
I still own a Sony VPH1292 with an HDMI input card. Just no space to set it up, but I would like to use it again.
 
I worked with CRT PJ's

I will respond to all these comments later tonight when I have time to fully address them all. I am still quite clear on my view that HDR should be turned off. If you're not reaching peak white from your HDR material, you're losing bit depth from the imager. I have some background doing 35/70 on a home screen, so I know what that looks like.

I made the best CRT ever made; it was reviewed by various magazines and took product of the year in Perfect Vision and Stereophile's Guide. I have some fairly good credentials. https://www.xymox1.com/Resumex/Press/

I will respond fully to each comment later tonight.
 
I still own a Sony VPH1292 with an HDMI input card. Just no space to set it up, but I would like to use it again.

We may know each other from back in the day :)

I preferred the Marquee as a base CRT. I still have 2 post houses using my CRT.

So if you're saying CRT is the cinematic reference, then I am your guy, as I made the best CRT.
 
I personally could never go back to CRT though.
Sadly there is no reason to. I have 3 of my projectors, fed by HD-SDI and an HDMI secret box. They are limited of course to standard HD. While they can do high scan frequencies, it's a worse picture because of an interaction between the phosphor and the scan time.

From a technical standpoint, CRT has ZERO digital artifacts, so it's great to look at for some things because it adds NOTHING. A CRT is also very good for gaming because of a complete lack of any lag. The downsides tho are numerous: non-linearity, resolution, and illumination into the corners are really terrible. They cannot produce enough fL to really do the job. They are fun to watch tho. They do have an analog nature, and for a large number of technical reasons are easier to watch for long periods. Laser and DLP have weird interactions with the eye and iris, and this produces more fatigue.

The newest Sony that I have been playing with tho is really stunning. The VPL-GTZ380 is impressive.
 
The newest Sony that I have been playing with tho is really stunning. The VPL-GTZ380 is impressive.
It should be for $80k. :)
 
I preferred the Marquee as a base CRT.
I have heard this from a distant friend of mine too, that these are very nice "bases" to start with....
 
Standard dynamic range is "limited" to 8 bit and HDR is 10 bit, so here you already have a difference, but what does it mean?
In order to see information in a "gray" area (because black is no longer black) you can do some trickery, if you have enough range (lumens). If you "beef up" the signal you get a bit more info in the darker area, but in the bright area info gets lost as everything turns white(ish). To remedy this we can leave the upper end of the (white) range as is and concentrate on the darker side (add more contrast and brightness there only); this can only be done if that extra information is also recorded. Now we can make a mix of a dark and a bright image, taking the best of both. The sky is now darker and contains more info, as clouds are clearly visible and not just a blurry white blob. The darker areas can be clearer, as we allow more light into the lens (which would clearly wash out all the bright parts), but in the "mix" both will be clear.

So your position is that the issues are all in implementation? So the standards body failed? Licensing should have tested the devices to confirm proper implementation? Another good reason not to use it: the source material and source devices are kinda random. What a bad standard. I think we can all agree on that.

I am kinda a stickler for matching the pixels and bits to the source device. No math. No processing. As that kinda got obliterated by compression for streaming I suppose it's kinda pointless, but I don't want extra math in my picture.

Streaming services are going to cap bit rates, so HDR will get crunched by most everyone down to about the same as SDR, and 4K gets crunched down to 2K bit rates because people don't care about quality on an iPhone. So pushing HDR down a pipe will only look worse than SDR, because you're still at the same bit rate and now you have added extra overhead, and away goes detail in motion.

Count me as crazy, but a good 1080 Blu-ray, using a fully modded Oppo into the really good scaling engines in these Sony projectors, can be jaw-dropping, especially in fast-moving action scenes with tons of detail. 40 Mbps of 1080p24 allows real resolution and bit depth to stay intact.
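Some rough arithmetic on that, using the 40 Mbps figure from above and an assumed 16 Mbps 4K stream (streaming rates vary; that number is purely an assumption for illustration):

```python
# Rough bits-per-pixel-per-frame comparison. The 40 Mbps Blu-ray figure is from
# the post above; 16 Mbps is an assumed typical 4K streaming rate. This ignores
# codec differences (HEVC vs. AVC), so it overstates the gap somewhat.
def bits_per_pixel(mbps, width, height, fps):
    return mbps * 1_000_000 / (width * height * fps)

print("1080p24 Blu-ray @ 40 Mbps:", round(bits_per_pixel(40, 1920, 1080, 24), 3), "bits/pixel")
print("2160p24 stream  @ 16 Mbps:", round(bits_per_pixel(16, 3840, 2160, 24), 3), "bits/pixel")
```

Under those assumptions the stream has about a tenth of the bits per pixel to work with, which is where the motion detail goes.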

The only good 4K I have seen is carefully encoded SLOW-MOVING demo clips. Real-life material from real sources can look worse than 1080.

I am all for 10-bit 4:4:4, but the bit rate must go way up on the source material to accommodate it. Otherwise, it's junk to me.

So, HDR... I still don't see any reason to use it, at least not with these display devices and sources. Real HDR would make the sun in a video equal in lumens on the screen, and 0.0001 fL would be the same from source to screen. There should be an exact fL match to ANY real-life situation. Sorta a Pantone for luma. When I get out my Photo Research meter I should get the same numbers for everything from the real scene to the screen.

Sorry I am rambling. I am overall dismayed by the current state of video to consumers.
 
