HDR Confusion

I have heard this from a distant friend of mine too, that these are very nice "bases" to start with....
Indeed.. 1 GHz to the tube. Very few parts. CLEAN signal path. The Sony has too many jungle chips. Too much processing.

I dropped in 200 parts. Tons of crazy caps. Better op-amps. Better DAC for control voltages. Lots of fine detail changes.
 
So overall, I guess my thinking is: no more adding things to video. Less is better... In fact we need to back up and remove things, BECAUSE the Mbps is fixed. 40 Mbps 1080p24 can look way better overall than 40 Mbps 4K HDR / DV.. And it's never 40 on streaming, more like 20..
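The bitrate point is just arithmetic: at a fixed bitrate, every extra pixel dilutes the bits available per pixel. A rough sketch (my numbers; constant bitrate assumed, audio and container overhead ignored):

```python
# Back-of-the-envelope: average encoded bits available per pixel at a
# fixed video bitrate. Illustrative only; real codecs allocate bits
# unevenly across frames and blocks.

def bits_per_pixel(mbps, width, height, fps):
    return (mbps * 1_000_000) / (width * height * fps)

# The same 40 Mbps spread over 1080p24 vs UHD at 24 fps:
print(f"1080p24 @ 40 Mbps: {bits_per_pixel(40, 1920, 1080, 24):.2f} bits/pixel")  # ~0.80
print(f"UHD24   @ 40 Mbps: {bits_per_pixel(40, 3840, 2160, 24):.2f} bits/pixel")  # ~0.20
print(f"UHD24   @ 20 Mbps: {bits_per_pixel(20, 3840, 2160, 24):.2f} bits/pixel")  # ~0.10
```

So a 20 Mbps 4K stream has roughly an eighth of the bits per pixel of a 40 Mbps 1080p24 encode, before HDR's extra bit depth even enters the picture.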
 
You lose me here, as I know the theory but have never tried any of this in "real life".
I came from the ProAV environment, where any "consumer grade video" was a no-go (I know this is not so for everyone).
All I can say is that all display devices have limits, and to overcome or hide these we need some tools (external or internal processing).
We also did this in the past with CRTs (mosquito filters, line doublers, scalers, what have you).
I must say that some scalers did a really nice job...
 
I have had a 100% change of mind.. Turns out HDR can be STUNNING. JAW DROPPING.. I finally found correctly mastered material. It's the only material I know of that is DEAD ON CORRECT and TRULY stunning..

The demo material on this disc. For some reason I never wandered down to look at the demo material; I just used the patterns. I spent the whole day playing with the new Sony. Got the new Apple TV. HDR through it still looked like complete ****.. I was out checking everything using the disc and flipped down to the demo material. 10,000-nit / BT.2020 HDR10 - the Oppo does not do HDR10+ or Dolby Vision.

I fell off the chair. This was the best material on a screen I have ever seen. The projector being so good made it better than anything I have seen previously in a post house. It easily exceeded 70mm.

FULL brightness for the first time on HDR for me. Correct gamma. Color rendering in real 10-bit. It was shocking....

SO... WHY DOES ALL HDR LOOK SO HORRIBLE WHEN FOR SURE IT CAN BE DONE CORRECTLY ??

SO... I take back my comments. HDR can look truly shocking.. As long as you have 100 Mbps and it's all slow-moving material.

WHAT A HORRIBLE STANDARD THAT HAS LED TO THIS MESS.
 
SO... WHY DOES ALL HDR LOOK SO HORRIBLE WHEN FOR SURE IT CAN BE DONE CORRECTLY ??
Isn't it with all consumer stuff ?
 
OK... I talked to a friend who is SMPTE and very deep on these subjects.. I am going to explore this way more..

So... Correct me if I am wrong.. This is my understanding right now with what is wrong..

HDR material is rendered with a target luminosity? It's not an object model where you can render the material on the fly into the luminosity space of the display device; that would require way too much CPU power. So a target is picked and the material is rendered to it. The target many people seem to be using is for a display that physics has yet to enable. This allows the material to be rendered correctly at some point in the future, when displays have reached the ability to do supernova down to 0.0001 fL.

So the material we see today is rendered, when they create the file, for a device no one has, and when displayed on current displays it looks awful because it ends up not having any dynamic range on the actual imager chip in use. Stretching it out to cover the actual display range is almost not possible and results in all sorts of issues. While it's possible to make it look "OK", sorta, it's not making full use of the bit depth of the imager chip and not using the full dynamic range of the display device UNLESS it's rendered with a luminosity target that is close to the display...
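The absolute-luminance part of this is concrete: HDR10's PQ transfer function (SMPTE ST 2084) maps real nits, referenced to a 10,000-nit peak, onto the signal range. A quick sketch using the published ST 2084 constants (the display peaks in the loop are my example numbers) shows how much of the code range a real display can actually reach:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> [0, 1]
# signal value. Constants are the ones defined in the ST 2084 spec.
m1 = 2610 / 16384          # 0.1593...
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits):
    y = nits / 10000.0     # PQ is referenced to an absolute 10,000-nit peak
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

for peak in (100, 600, 1000, 4000):
    v = pq_encode(peak)
    print(f"{peak:>5}-nit display peak -> signal values up to ~{v:.3f}")
```

A 100-nit display tops out around half the PQ signal range, and even a 1,000-nit display only reaches about three quarters of it; everything above that is code values the display can never show and must be tone-mapped away.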

Do I have that right ?

There is no way to fix this or adjust out a bad rendering..

I am going to go on a rant in the days ahead about how this abhorrent "standard" made it into the world. I want to go look up who was responsible for this mess..

Further.. Dolby Vision also seems to use this same general model ?

The only way this will work correctly is a new standard that renders the video into the dynamic range of the display device on the fly. That seems way too mathematically intense to be implemented any time soon.. Sorta an Atmos for video..

It IS possible to render into different luminosity spaces and offer these up as a choice, like the S&M disc does. So it's possible for an Apple TV, for example, to know what your choice of luminosity target is and then pick the file to play based on this. It's also possible the consumer could pick it.. This would be a good temp solution.. Of course the source material people would need to render a number of files, grade each one, and store and offer all of these, taking up like 7 times more space. So the business end of this might be the real problem.
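The player-side selection logic for that scheme would be trivial; the cost is entirely in producing and storing the extra grades. A sketch of the "pick the closest pre-graded file" idea (file names and peak targets here are hypothetical):

```python
# Hypothetical: several masters of the same title, each graded to a
# different peak-luminance target. The player picks the one closest to
# the display's measured peak without overshooting it.
grades = {
    100: "movie_100nit.mp4",
    600: "movie_600nit.mp4",
    1000: "movie_1000nit.mp4",
    4000: "movie_4000nit.mp4",
}

def pick_grade(display_peak_nits):
    # Brightest grade that does not exceed the display's peak;
    # fall back to the dimmest grade for very dim displays.
    at_or_below = [t for t in grades if t <= display_peak_nits]
    target = max(at_or_below) if at_or_below else min(grades)
    return grades[target]

print(pick_grade(650))   # a 650-nit panel gets the 600-nit grade
print(pick_grade(80))    # a dim projector falls back to the 100-nit grade
```

Each grade would still need its own color pass, which is exactly the "7 times more space" (and 7 times more grading labor) business problem.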

HDR is abhorrent.
 
Isn't it with all consumer stuff ?
It is.... BUT most of my life has been spent taking high-end post and cinema gear and using it in homes. I was stuffing DLP Cinema heads into homes back in the day..

These pics show the stack I was using when I was developing my projector. You can see the Pioneer laser disc player, the 99, in the middle of the stack on the left...

I also did DLP. This was a pic from a setup I did where we were looking at CRT vs DLP: the D5 tape machine for reference material in HD, and the DLP cinema head with the mini lamp house. HD-SDI also went down to the CRT projector, where I had an HD-SDI input card made for it. The CRT was the winner; it was really more cinematic and easier to watch over two movies.
 

Attachments

  • CurrentGear.jpg (148.1 KB)
  • DCineShootOut.jpg (303.9 KB)
OF COURSE... HDR was developed by the Consumer Technology Association.. No doubt sales and profit drove this..

It was these b******S https://en.wikipedia.org/wiki/Consumer_Technology_Association

But I am certain it was a smaller group of people within that group. I want to call out the exact individuals responsible for this mess.. HDR10+ was of course created by 3 companies who all wanted to sell gear..
 
Just to be clear... There is NO SUCH THING as an HDR display. There is no physics that can produce it..
That is why I stated that we need some kind of "trickery" here.... rDin mentioned Lumagen Pro in his post to do this...
 
That is why I stated that we need some kind of "trickery" here.... rDin mentioned Lumagen Pro in his post to do this...

But the damage is done already, right, as the rendering is into the wrong luminosity space. Ultimately the bit depth in the material, the dynamic range, is already compromised. Doing the math to stretch it out, cut off the stuff at the top, and kinda align the bottom will vary with each piece of material. A normal user is not going to fiddle with adjustments for each thing they want to watch. If you settle on a compromise set of levels, some stuff will be wrong. The math itself introduces issues.

This is not how a standard should work. I should not need to apply math to fix a signal for HDR/DV. That's crazy. While, OK, it might help for that material once it's adjusted, unless there is a bit-for-bit passthru on the Lumagen and it's got really esoteric HDMI receiver and driver chip design, putting a Lumagen into the path WILL cause some degradation vs a physical cable.
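To make the "stretch and cut off the top" math concrete, here is a minimal generic soft-knee tone map. This is NOT Lumagen's actual algorithm (that is proprietary); the knee position is my example parameter. It shows why highlights above the display's range survive as detail but lose almost all separation:

```python
# Generic soft-knee tone map: scene luminance (nits) -> display luminance
# (nits), for a display that can't reach the grade's peak. Illustrative
# sketch only; real processors use more sophisticated curves.

def tone_map(nits, display_peak=600.0, knee=0.75):
    """Pass through 1:1 below knee*display_peak, then compress
    everything above the knee into the remaining headroom."""
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits
    headroom = display_peak - knee_nits
    excess = nits - knee_nits
    # Rational roll-off: asymptotically approaches display_peak.
    return knee_nits + headroom * excess / (excess + headroom)

print(tone_map(200))    # below the knee: unchanged
print(tone_map(1000))   # a 1,000-nit highlight squeezed under 600 nits
print(tone_map(4000))   # 4,000 nits ends up barely brighter than 1,000
```

A 1,000-nit and a 4,000-nit highlight land within a few dozen nits of each other on a 600-nit display, and since the curve depends on the content's peak, a single fixed setting is wrong for some material, which is the per-title fiddling problem above.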

My complaint is about how P*** Poor the HDR and DV standards are. I also don't like metadata that can alter the picture on the fly based on the director's intent and mood. How a DP sees the master will be on a very specific display that will not be like a consumer rendering device. So this whole idea is crazy UNLESS you have STRICT standards on every aspect of the display and you enforce those certifications.

Your Results WILL vary is not the way a standard should work.

I have respect for Lumagen, Jim is a great guy, and it's a great product. But in my case I don't like adding math into my signal chain.
 
I had a discussion today with someone who really knows HDR.. One thing is for sure: HDR is not a standard, because each DP gets to pick the nits. Fail..

Also I discovered something interesting today.. So I need to confirm this.. The story goes that back before HDR10 existed, Dolby was trying to sell Dolby Vision to everyone from film industry people to consumer manufacturers, and no one wanted to pay the steep licensing. Dolby then came out with a free-to-use system, HDR10. I have to think, if this is true, Dolby knew exactly what they were doing and wouldn't mind at all if HDR10 failed and eventually everyone had to go to Dolby to save them with Vision...

If that's the case... Jeez... I suppose it's legal, but, that's just nasty..
 
