I gotta tell ya, the picture with the AppleTV X is so amazing! But I still find myself sometimes comparing SDR vs HDR. Sometimes it's a close call, though HDR never seems to win out. Tonight, watching Season 4, Episode 1 of "For All Mankind", let there be no doubt. Especially the scene with the asteroid they are trying to tow to Mars. The blackness of space, the stars in the background, the asteroid, the astronauts - the black level, the detail, the quasi-3D picture - it's not a contest. As good as the Dolby Vision picture looks, the SDR picture no question looks better: more resolving, better everything! At least on the AppleTV X. Note I am using it with a 2019 LG C2 OLED, a Gigafoil v4 (converts ethernet to optical and back to ethernet, placed ahead of the AppleTV X), and a Fidelizer EtherStream2 network switch. Thanks a lot, Chris!
You're welcome, Steve.
I like hearing from Steve. As I mentioned, he approached this more like a reviewer, taking months of A/B/C/D tests and lots of real-world use to fully evaluate the ATVX. While I have gone over the HDR myth many times, he took it to task and is STILL doing A/B comparisons... I love it.
He is right about the ethernet feed somehow affecting the picture. I have NO idea how this can be, technically, but I have heard from everyone who has tried it that it does matter. While my Switch X was meant for audio use, tons of ATVX owners have also fed an ATVX from it and report big jumps in picture and sound. A number of people have bought one just for the ATVX.
I have been battling the HDR myth since my first encounter with it, when Sony shipped its first laser projector. It was then that I jumped fully into the technical side of it and realized video HDR was a lossy compression scheme made for a type of display that may never exist - a full HDR-range display. It did not have more bits. It had bits that described where to put the bits once a display could light up a pixel literally as bright as the sun. Since an uncompressed full range from a no-moon night scene to a full daylight scene would require at least 24 bits per color per pixel, a lossy compression was required, tuned to what the "average" person would not notice. NO ONE wants to capture, store, transmit or stream a 24-bit-per-color picture. That kind of thing would require at least 1Gbps. The SDR range matches real-world TVs without any lossy compression math beyond the streaming compression itself.

Because a true HDR display, which does not exist even in the lab yet, would have a huge 24-bit range, the 10 bits have to be laid out across that brightness range in a way the average viewer finds acceptable. So HDR metadata (bits that describe bits) lays out where a director thinks the 10 bits look best in an HDR world. They can only guess at how it will look, though, because no true HDR displays exist. So today's HDR "compatible" TVs do tone mapping to take that HDR signal and squish it back down into the SDR space of a current LG/Sony/Panasonic OLED. There is no standard for tone mapping, so every TV maker does it differently, and every display shows an HDR-encoded picture differently. In fact, the monitor the director and post house use to judge HDR pictures also differs from home units in nits and tone-mapping math. So all HDR-encoded content looks a bit different from screen to screen because of the lack of standards. Dolby Vision is just a fancier version of HDR, as are HDR10 and HDR10+.
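To put some rough numbers behind the "bits that describe bits" idea, here is a small Python sketch. Only the ST 2084 (PQ) constants come from the published spec; the 700-nit display peak, the roll-off curve, and the sample code values are my own assumptions for illustration, not any TV maker's actual tone-mapping algorithm.

```python
# Illustrative sketch only: the SMPTE ST 2084 (PQ) EOTF used by HDR10/Dolby Vision,
# plus a made-up roll-off standing in for a TV maker's proprietary tone mapping.
# The 700-nit display peak and the roll-off shape are assumptions, not anyone's real curve.

PEAK_PQ_NITS = 10_000.0          # luminance ceiling the PQ curve can encode

# ST 2084 constants
M1 = 2610 / 16384                # 0.1593017578125
M2 = 2523 / 4096 * 128           # 78.84375
C1 = 3424 / 4096                 # 0.8359375
C2 = 2413 / 4096 * 32            # 18.8515625
C3 = 2392 / 4096 * 32            # 18.6875

def pq_eotf(signal: float) -> float:
    """Map a normalized 10-bit PQ code value (0.0-1.0) to absolute luminance in nits."""
    e = signal ** (1 / M2)
    return PEAK_PQ_NITS * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

def naive_tone_map(nits: float, display_peak: float = 700.0) -> float:
    """Toy roll-off squeezing PQ luminance into a real display's range.
    Every TV maker uses its own (unpublished) curve; this one is invented here."""
    return nits * display_peak / (display_peak + nits)

for code in (128, 256, 512, 769, 1023):          # sample 10-bit code values
    nits = pq_eotf(code / 1023)
    print(f"code {code:4d}: PQ says {nits:8.1f} nits -> displayed ~{naive_tone_map(nits):6.1f} nits")

# Rough scale of the "24 bits per color, uncompressed" argument:
raw_bps = 3840 * 2160 * 3 * 24 * 24              # 4K, 3 channels, 24 bits each, 24 fps
print(f"raw 24-bit/channel 4K24 stream: ~{raw_bps / 1e9:.1f} Gbps before any compression")
```

Running it shows that roughly the top quarter of the 10-bit code range is reserved for highlights above ~1,000 nits that most panels cannot fully reach - that gap is exactly what each maker's own tone mapping has to paper over in its own way.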
SDR, though, is much simpler. None of the HDR lossy compression followed by decompression through non-standardized tone mapping. None of the math. You get what the director and the post people saw. SDR matches the display natively. The SDR range is very pleasing and comfortable for the eye. I personally don't want a TV as bright as the sun. I don't want to go from dead dark to literally looking at the sun. I don't want my irises to work that hard. I do not want more nits. I find some TVs too bright as it is.
SDR done right can produce STUNNING pictures that look 3D, look Pantone-accurate, and draw a wow in nearly every scene.
I feel HDR stands for High Dynamic Revenue. I believe HDR was pushed onto consumers like the Emperor's New Clothes. HDR sold an incredible number of new TVs. The original idea of HDR for video on TVs came from the Consumer Electronics Association, not SMPTE.
HDR/HDR10/HDR10+/Dolby Vision currently have no reason to exist except to sell TVs. When/if a real HDR display technology comes out, then _maybe_ HDR will be useful for mapping by-then-older SDR material into a 24-bit HDR space. By then, hopefully, our streaming can reach at least 1Gbps and support a native HDR display space. Until then, SDR is the way to go.
IMHO...