AppleTV X - I am playing with something new

One thing I wanted to ask you, Chris, is how you deal with having only 1 HDMI output? If you connect the ATVX directly to your display, how do you get the audio out? Does the display have an HDMI output? If so, that's probably compromising the audio signal.
I use a Lumagen 5348 which splits it and reclocks. But ideally, the ATVX would provide two HDMI outputs. What do you think?

Yes, a very good question... Experimentation is required to get the best audio and video in each system.

Ideally... In a surround system, I use a Datasat RS20i to switch audio/video, and its output goes to the projector. I don't use a Lumagen even tho I have one. For me, the Lumagen was used to try and make HDR work. As I now know HDR in depth and have no need for it with any display I know of, the Lumagen was extra processing I did not need with the Sony 380. I believe a pure path is best.

Projectors can benefit from a Lumagen. It can do cool stuff with gamma and a list of other things. So using a Lumagen is really specific to each projector and its use. But I would not use it for HDMI switching. Audio is killer important for me, so I want the ATVX plugged directly into the Datasat.

From the 300+ clients who have the ATVX I have heard many ways to hook it up.

2ch... Someone with a killer 2-channel rig can drop an OLED between the speakers and go fully 2-channel. This produces the best sound; Atmos and DD involve a lot of processing. Putting the ATVX into 2-channel is a fairly amazing experience. Getting PCM out of the ATVX can be done two ways. The display's optical out, plugged into a high-end 2-channel DAC, can be crazy good. It's also possible to get devices to pick off the 2-channel PCM from the HDMI. That does put a device inline in the HDMI path, but lots of 2-channel people like this option and I have reports it REALLY sounds great.

Atmos... Obviously a Lumagen can be used for HDMI switching and can output an audio-only HDMI stream. This is what a lot of ATVX users do. Also, a lot of ATVX clients hook it directly to a Trinnov or other surround processor, which then sends video to the display.

What I like to do for sound eval is, hahaha, different... Besides using an insane-level Atmos system/room, I have found the clearest way to evaluate mods and sound quality is, ummm, not normal, hahaha... I hook an ATVX up to my HIGHLY modded Oppo's input. The Oppo has insane mods, including a rubidium clock. I then pick off 2-channel SPDIF coax and feed that to my HIGHLY modded Levinson 40 setup for 2-channel. This highly modded gear is way better than anything you can buy. No way to make this gear commercially, tho, because of the labor. It took me 6 months to mod the Levinson and who knows how long on the Oppo. Hand-made gear like this produces better sound than anything that comes out of any kind of production. Buying 200 resistors of one value, measuring each one for all sorts of things, and picking 1 out of the 200 just can't be done in any kind of production. So I use this rig to really listen to the sound. I check all this, tho, using a Datasat RS20i and a lot of other gear.

So the best performance for sound REALLY depends on each setup. Some experimentation is required; there is no easy single answer, because each system is different. For example, HDMI cable length will produce different results with different gear. I realize, tho, that rewiring to do these A/Bs is annoying... I am kinda crazy and ALWAYS do TONS of A/B/C/D/Es and pick out what is best.

Your projector has really good Lumagen profiles, and I understand it really is a good thing.

"Professional calibration" can be perilous. It's possible for a calibrator to get a device to measure perfectly and look awful. This is a deep subject, but a projector should never be set any lower in light output than factory, because this reduces contrast ratio due to losses in the optics and light path. Bit depth on the imager chip can also drop if brightness is adjusted away from factory. This of course VASTLY varies by device. BUT I have seen too many calibrators make a device measure perfectly and look awful.
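To make the contrast-ratio point concrete, here is a toy model. It assumes part of the black floor is a fixed stray-light component (room reflections, scatter in the light path) that does not scale down with output; all numbers are made up for illustration, not measurements of any real projector.

```python
# Toy model of the calibration point above: if some of the black floor is
# fixed stray light, dimming the projector scales the peak (and the panel's
# own black) but not the fixed floor, so measured on/off contrast drops.
# All values are illustrative assumptions, not real device data.

def contrast_ratio(peak_nits: float, native_black: float, stray_light: float) -> float:
    """On/off CR with black floor = panel black (scales with output) + fixed stray light."""
    return peak_nits / (native_black + stray_light)

# At factory light output...
factory = contrast_ratio(100.0, native_black=0.01, stray_light=0.02)
# ...vs the same projector dimmed to half: panel black halves, stray light doesn't.
dimmed = contrast_ratio(50.0, native_black=0.005, stray_light=0.02)

print(f"factory CR {factory:.0f}:1  dimmed CR {dimmed:.0f}:1")
```

Under these assumptions, halving the output drops the measured contrast ratio well below half the loss you might naively expect to avoid.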
 
I have also seen some weird stuff. Like HDR gets turned on, and the show is "in HDR" yet there is no HDR metadata. So it just shows the native SDR stream.

Yes - no doubt there is fake or shoddy Dolby Atmos on some streaming shows as well as fake or shoddy Dolby Vision or HDR. And some really good Dolby Vision particularly on Appletv Plus. And I suspect if you set to SDR that some shows may stream at lower video bitrate or without the Dolby Vision or HDR metadata. Maybe a factor why a great show like "Death And Other Details" on Disney Plus simply looks better to me, much better, in SDR than in Dolby Vision!
 
One thing to keep in mind: I have a VPL-GTZ380 handy, and my Sony OLED A90J just blows away the projector for picture quality. The OLED shows so much more contrast ratio it's insane, and this makes it look so much clearer. Projectors have loss that is unavoidable because of the light path and optics. A direct-emission OLED pixel is a huge difference and has stunning resolution vs. pixels through the optics. You get into MTF and optical issues with a projector.

So the differences become FAR more clear and objective when using a direct-emissive display like a Sony OLED.

Of course, a screen and projector allow for a far more immersive experience with a big screen, and you can get speakers behind the picture. A huge flat panel is a big issue acoustically.

I use two displays to eval pic. Well, three... The Sony A90J: best pic I know of. But I have not had hands-on time with a Panasonic OLED. I also use a Panasonic plasma 1080p display. The reason for this is the Panasonic has basically no video processing, so I can see the pic native. Getting back at a good distance, the plasma can be shockingly good. I also check things on the Sony 380, but for best viewing it's all about the Sony OLED.

Doing TONS and TONS of A/B/C/D/E with all sorts of sources and mods and cables and .... the Sony OLED is my tool for this.

Oh well. My Sony 380 projector should be up in a few months finally (14' wide screen) in my renovated theater. I will live with the immersiveness. And an AppleTV X for the theater (as well as for my basement 65" LG OLED system).
 
So Chris, a sidebar, since you mention Lumagen (I will be using the 5348 in my theater): have you played with MadVR? Any impressions, including vs. the Lumagen?
 
I can tell right away if the content is actually in HDR because I switch picture modes on the projector. If the extended color space (i.e., for HDR) is not used, the colors are off. I've never seen the Lumagen be wrong - if it's seeing an HDR signal from the ATVX, it will show it, or not.

I would think that nearly 100% of DPs and Post houses will edit on an HDR display if they are releasing in HDR.

Well, there is no such thing as an HDR display. Science can't do that yet. The real spec is a MEASURED 1,000,000:1 CR. If a device uses tone maps, it's not HDR. A tone map is used to fit an HDR pic into an SDR space. Projectors have the lowest CR. Remember, HDR is a perceptually lossy compression. The actual brightness range is like 20 bits. Because a 20-bit-per-color depth (60 bits total) is impossible to deal with in a camera, in post production, and then in a streaming environment at 20 Mbps, HDR compresses this space using a perceptual model of an "average human" to map out where 10 bits can be spread around to give the impression of a wider dynamic range. The extra data is just metadata: data describing the data.
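The perceptual mapping described above can be sketched with the SMPTE ST 2084 PQ curve that HDR10 and Dolby Vision use: an absolute range from deep black up to 10,000 nits (roughly 20 stops) is squeezed into 10-bit code values along a perceptual model of human sensitivity. This is a minimal illustration of that standard curve, not of any particular display's behavior.

```python
# SMPTE ST 2084 (PQ) constants, exactly as defined in the spec.
m1 = 2610 / 16384          # ~0.1593
m2 = 2523 / 4096 * 128     # ~78.84
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_oetf(nits: float) -> float:
    """Absolute luminance (cd/m^2, 0..10000) -> normalized PQ signal in [0, 1]."""
    y = max(nits, 0.0) / 10000.0
    return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2

def to_10bit(signal: float) -> int:
    """Quantize the normalized signal to a full-range 10-bit code value."""
    return round(signal * 1023)

# Note how the code values are spent: most of the 10 bits go to the low end,
# where human vision is most sensitive, not spread linearly across the range.
for nits in (0.001, 0.1, 1, 100, 1000, 10000):
    print(f"{nits:>8} nits -> code {to_10bit(pq_oetf(nits))}")
```

Roughly half the code range is used below about 100 nits, which is the perceptual trick: 10 bits giving the impression of a far wider dynamic range.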

So this HDR metadata *can* be picked up when the scene is shot. Many times it's not. It's added later as a best guess while looking at a tone-mapped display. Each display has a different set of tone maps, as there are no tone-map standards. So a DP will look at a display that has unknown tone maps, and will make decisions on the picture based on tone maps that have no standards.
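A minimal sketch of why the no-standards point matters: below is one classic tone-mapping curve (a simple Reinhard-style roll-off). Real displays each ship their own proprietary variants, so the same HDR highlight lands at a different brightness on every make/model. The peak values here are rough illustrative numbers, not measurements.

```python
# Simple Reinhard-style tone map: compresses scene luminance so the output
# approaches the display's peak asymptotically instead of hard-clipping.
# There is no standard curve; this is just one common textbook choice.

def tonemap(nits: float, peak_nits: float) -> float:
    """Map scene luminance (nits) into a display whose peak is peak_nits."""
    x = nits / peak_nits
    return peak_nits * x / (1.0 + x)

# The same 1000-nit mastered highlight on two different displays:
for peak in (150.0, 700.0):   # e.g. a projector vs. an OLED (rough numbers)
    print(f"peak {peak:>5} nits -> highlight rendered at {tonemap(1000.0, peak):.0f} nits")
```

Two displays with different peaks (or a different curve entirely) show wildly different pictures from identical metadata, which is exactly the situation the DP is guessing against.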

They know all this. Well, mostly... They tend to stick to SDR, as that is standardized and the displays are standardized. Post production will then deal with the HDR later and make best guesses.

If you have not read my paper on HDR I covered this.. https://www.appletvx.com/HDR.pdf

Color space is a different discussion. I like bigger color spaces. BUT the mess with luma makes the color space kinda moot. Of course the color space needs to be supported by the display, otherwise you get remapping.

Gamma is another consideration and another discussion.

The remapping of HDR into a different luma space depends on the metadata. There is a bunch of material where they just pass the SDR luma through as is. Sorta... You can choose how you remap the native SDR pixels. Some material gets wildly different levels in HDR, where tone maps have to bring it back into range on a display like a projector. Some material stays in the SDR range. This is all HDR.

I choose to ignore HDR remapping. I just want the native SDR. Like Steve mentions, mostly, material with HDR metadata looks better if you ignore it and do SDR. BUT, as I mentioned, who knows... If a DP and post only use a tone-mapped display and never look at the native SDR, then I could see an HDR stream looking more correct to what the DP saw when mastering.

What I have seen is, when the two are close... the HDR looks a bit artificial. Kinda AI-like. A bit cartoony. Too much processing for me.

BUT all this... is very system-dependent.

YES, the Lumagen does grab every flag and does a killer job with everything. But just using the Lumagen, you can't see what remapping is occurring per pixel/per frame.

As SDR looks killer, I just turn off HDR and I am happy without switching anything. BUT Steve could be right. I am watching these same shows, so I will fool around with settings.
 
So Chris, a sidebar, since you mention Lumagen (I will be using the 5348 in my theater): have you played with MadVR? Any impressions, including vs. the Lumagen?
Oooo... Good question... I have not personally played with the MadVR. I know some clients have one, but I have not heard anything from them. So I do not know. Jim at Lumagen, tho, is REALLY smart. It's a good product. I would need to do a direct A/B, with C being no processing.

I would start with the 380 and settings changes. Get that as good as you can, and then play with adding processing. I believe "less is more," but it's hard to tell...
 
So a Lumagen, and maybe the MadVR, is way better at tone mapping than most displays. So if you're going to do HDR, then a processor is most likely better than the built-in tone maps on a display. A video processor might also be better at gamma and color space than a display is. So there ARE applications for a video processor.

I would not use it as an HDMI switch, tho. I would switch with a Trinnov or Datasat and put the Lumagen between the display and the Trinnov or Datasat. IMHO...
 
I only have the Apple 2021 box, and the Dolby Vision content is amazing on Apple TV+. I have seen some poorly implemented HDR, usually with live sports. If you prefer the SDR, it is likely because the HDR was poorly implemented. It has nothing to do with the box or the display. My 83" A90J OLED is an amazing set. I have a 77" 95L in another home that looks great too, but it's not yet calibrated. The preset professional setting is damn good, though.
 
Projector focus also makes a big difference; perhaps not surprisingly.
Once I learned how to nail focus, and fed the ATVX clean power and clean Ethernet, OMG!
 
Yes - no doubt there is fake or shoddy Dolby Atmos on some streaming shows as well as fake or shoddy Dolby Vision or HDR. And some really good Dolby Vision particularly on Appletv Plus. And I suspect if you set to SDR that some shows may stream at lower video bitrate or without the Dolby Vision or HDR metadata. Maybe a factor why a great show like "Death And Other Details" on Disney Plus simply looks better to me, much better, in SDR than in Dolby Vision!

The bitrate for HDR is the same as SDR. An HDR stream is just the SDR stream plus HDR metadata, which is very, very little bitrate. The whole point of HDR is to use about the same bitrate as SDR, which is why they do perceptual-based lossy compression. The other advantage is it's one stream for SDR or HDR: the device on the receiving end either uses the metadata or not. Using the developer mode on an ATV, you can see bitrate and CODECs and stuff, and HDR does not use a higher bitrate. It's the same stream. Having two streams is what the Content Delivery Networks want to avoid. They want one file for everything. One stream to rule them all. HDR does this.
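A back-of-envelope check of the "very, very little bitrate" claim: HDR10 static metadata (the ST 2086 mastering-display color volume plus MaxCLL/MaxFALL, carried as SEI messages) is on the order of a few dozen bytes, repeated only occasionally, against a video stream of roughly 20 Mbit/s. The byte count and repeat rate below are approximations for illustration, not extracted from any spec table.

```python
# Rough estimate: HDR static metadata bitrate vs. the video bitrate.
# Assumed numbers: ~34 bytes of metadata payload, repeated at every
# 2-second keyframe, against a typical 20 Mbit/s 4K stream.

VIDEO_BITRATE_BPS = 20_000_000   # typical 4K streaming bitrate
METADATA_BYTES = 34              # approx. ST 2086 payload + MaxCLL/MaxFALL
REPEATS_PER_SECOND = 0.5         # assume it rides along with each 2 s keyframe

metadata_bps = METADATA_BYTES * 8 * REPEATS_PER_SECOND
overhead = metadata_bps / VIDEO_BITRATE_BPS
print(f"metadata overhead: {overhead:.8%} of the stream")
```

Even if these assumptions are off by an order of magnitude, the metadata stays a vanishing fraction of the stream, consistent with seeing identical bitrates in the ATV developer overlay.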

An LG display is a tad odd in that, for some material, SDR can look terrible and HDR looks bad, but for some reason Dolby Vision looks great. For LG, clearly DV is a very valid choice. But on Sony OLEDs, Dolby Vision on the same stream looks way worse than SDR. Each display is dealing with these things differently. It's a complete lack of real standards.

For me, with my devices, SDR has worked really well. It's good enough; I don't ever feel a need to flip to other formats to see what they look like.

BUT... you could well be spot on: some material is rendered differently and mastered using various methods and equipment with different tone maps, so the series, or post house, or even a DP, might have a favored display with its own tone maps that the DP likes. But that experience can't be matched, as each make/model of TV implements its own tone maps and other things, like gamma and more stuff in the HDR specs. Even the same tone maps and settings can't be applied to every device, because each device is different. Then you might have a MadVR or Lumagen. At least with those there is some hope of getting closer with different tone maps, as long as the tone maps in the display are also accounted for.

SO... no doubt, using an ATVX you can see every detail in all this, and it SHOULD be that each bit of content looks different. And yeppers, in this new environment without standards, some stuff is gonna look different...

All that said... for me, just leaving it on SDR with the Sony is just shockingly good. I don't feel a need to switch things.
 
Hi Chris:
I think you may have been away in December so perhaps overlooked my below questions. I’m still curious. Thanks.
======

December, 2023

Hi Chris:
Alex from UpTone here. I am just now catching up on your success since our technical conversations via phone and e-mail back in April 2022, when you were just beginning to experiment and measure with the ATV boxes.
Wow. Congratulations on your success!

So I have a question for you:
In watching your video on the Aruba WAP, I noted your disdain for WiFi 6 (802.11ax) and preference for the simpler (potentially quieter/less current draw bursty) WiFi 5 (802.11ac).
Then, when I went to see which generation of Apple TV 4K is sitting next to my LG OLED B7, I found it is the first generation 4K model, A1842.
As you can see from the specs (https://everymac.com/systems/apple/apple-tv/specs/apple-tv-4k-5th-generation-2017-specs.html) it is based on the A10X Fusion processor--and is just WiFi 5 (802.11ac).
Looking at the iFixIt.com teardown of ATV A1842 I see that, just like the A2169 you have been modding, this slightly earlier 4K unit (which I bought the newer remote for long ago) has the exact same SMPS, machined metal divider plate, isolated wifi antenna corners, etc. A few chips differ of course, but I have not had any issues with its responsiveness or ability to run the latest tvOS or current 3rd-party apps.

Thus my question is: Have you considered modding and comparing the A1842? Might be fun.

The other question I have--for you and other Apple TVX owners--is with regard to audio extraction:
In our living room with TV I do not have a full A/V system with all-in-one HDMI receiver (yuck!) or high-end A/V separates (all my $ and tech are down the hall in my custom listening studio ;)). And really for best video performance I would expect that direct into the projector or monitor would be best.
Yet since Apple dropped the TOSLINK audio output jack at the introduction of the 4K models, users are left with either:
a) Running the HDMI into some A/V processor box (yes, I know that a lot of your clients probably have some megabuck equipment for that);
or
b) First running the HDMI from the ATV into an HDMI audio extractor box. That is what I do. I bought the best 4K/high-speed rated cheap one I could find. This: https://www.amazon.com/gp/product/B074HHSJVN/

But I can not imagine that the extractor I am using is doing the video any favors, and thus wonder if the visual gains from your ATVX system (or my own pondered 12V conversion mod--fed with a spare UltraCap LPS-1.2) would be lost.
Your thoughts?

Cheers,
Alex C.
 
An LG display is a tad odd in that, for some material, SDR can look terrible and HDR looks bad, but for some reason Dolby Vision looks great. For LG, clearly DV is a very valid choice. But on Sony OLEDs, Dolby Vision on the same stream looks way worse than SDR. Each display is dealing with these things differently. It's a complete lack of real standards.
My basement 65" OLED is a 2019 LG. So what Chris points out above may help explain some of the funky visual observations I have had!
 

WHATS UP ALEX !!!!!!!!!

It has been a bit. We should talk on the phone :) It's always fun to talk :)

Sorry I somehow missed your post :(

Thank you, it's been a fun adventure. I have learned a LOT of interesting things, some of which seem to be unique. I say unique because I have passed some of my observations by some top people engineering stuff these days, and they told me I should patent them :)

The A1842 can be modded as well. As can any ATV, and benefits will occur. The A2169 looked to be the last good gen they were going to make, which now looks to have been true. The A2169 has a better CPU and some features missing on the A1842. I also wanted to be as current-gen as I could for long-term support.

While the little cap bank on the ATVX is really important, I do SMD rework and replace a bunch of parts, including two clocks, using lower-phase-noise parts. I also use better regulators and change some other stuff. All of this is very specific to the A2169.

I did a LOT of really intense work on it, the most demanding EE work I have ever done. It was WAY more interesting and deep than I could have imagined. I needed to impedance-match power supply rails to the chips in ways I don't think anyone has done. Digital chips, of all flavors, are VERY complex in their power impedance spectra. CPUs, for example, "rumble" as they think in the 5 Hz to like 200 Hz range. There is also current spectra up to really high frequencies, topping out at like 5 MHz. These are very random, bursty, and spiky. Matching up this sorta high-current impedance spectra produced stunning changes in pic and sound quality when addressed per chip.

Each set of chip rails is also fed from a PWM regulator. These are a whole insanity by themselves. I can discuss more on the phone. But the PWM gets mixed up with the current draw and makes for a big mess. So the frequency of the PWM is really important; its spectra becomes part of the overall impedance equation. Along the way I discovered some important things to address all that. The result is a very low-impedance power rail at the chip for all types of current-spectra draw. It also results in a huge drop in noise at the chip pin and much tighter regulation. I am leaving out a few really key things I ended up employing in this solution. The linear supply is R-core based and also not normal. There is an LM431 being used in a very abnormal way to do full-bandwidth remote sensing; the sense works out to 1 MHz. This is all one tuned system, measured at the chip rails. Even the diameter, type, and length of the wire matter. When dealing with 1 MHz, you end up doing impedance matching, and the power cable becomes a transmission line with its own impedance.
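The rail-impedance idea above can be sketched numerically. A real decoupling capacitor is a series R-L-C (capacitance plus ESR and ESL), so its |Z| is V-shaped vs. frequency; keeping a rail low-impedance across the whole bursty current spectrum means combining parts whose valleys cover different bands. The part values below are generic illustrations, not the actual ATVX mod values, and the parallel combination uses magnitudes only, which ignores phase and anti-resonance effects.

```python
import math

def cap_impedance(f_hz: float, c_farads: float, esr_ohms: float, esl_henries: float) -> float:
    """|Z| of a real capacitor modeled as a series R-L-C."""
    reactance = 2 * math.pi * f_hz * esl_henries - 1 / (2 * math.pi * f_hz * c_farads)
    return math.hypot(esr_ohms, reactance)

def parallel_z(magnitudes):
    """Parallel combination of |Z| values (magnitudes only -- a rough approximation)."""
    return 1 / sum(1 / z for z in magnitudes)

# Illustrative parts: a bulk electrolytic (big C, higher ESR/ESL) and a
# 33 uF MLCC (tiny ESR/ESL, effective at high frequency).
bulk = (4700e-6, 0.030, 15e-9)   # (C, ESR, ESL)
mlcc = (33e-6, 0.003, 1e-9)

for f in (100, 10e3, 1e6, 5e6):
    z_bulk = cap_impedance(f, *bulk)
    z_mlcc = cap_impedance(f, *mlcc)
    print(f"{f:>9.0f} Hz  bulk={z_bulk:.4f}  mlcc={z_mlcc:.4f}  both={parallel_z([z_bulk, z_mlcc]):.4f} ohm")
```

The bulk part holds the rail down at the low-frequency "rumble" end while the MLCC takes over in the MHz range, which is the intuition behind tuning the whole network as one system measured at the chip rails.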

You may have also played with all this, your products are really cool and clever in engineering.

Ultimately, looking at jitter of the buses and data lines on the PCB, I vastly lowered noise and jitter on all the data buses. The noise on the supply pin ends up on the data lines. Regulation also changes where a 0/1 state transition occurs, so regulation and noise really affect jitter between chips. And also HDMI.

Fun to discuss deep tech stuff with someone who understands it :) The above stuff also applies to any of the digital world today.

So 6 months of tuning all that, with tons of rework and testing, ended up with a very specific set of mods that really only work for the A2169.

Extraction... Yes, I have even been asked if I can add digital audio back in. I can't. Even if the chip were somehow stuffed in, it would need a driver, and I can't, and won't, touch the firmware.

The crazy benefits from all the dejittering are surprisingly robust through devices. For example, I ran two of them through a Crestron DM HDMI distribution: a big 16x16 matrix and 200-foot runs of Cat 8 to boxes on the other end. An A/B between the stock ATV and the modded ATVX was breathtaking. The better HDMI signal and clocking at the source affects everything downstream.

BUT yes, it ALL matters. HDMI cables are now very clear, and every brand and make looks different. Power cables seem to matter somehow. TONS of people also report Ethernet matters. I tell people to treat it like they would a high-end DAC and not like an AppleTV.

Let me go look up something on this thread. Or maybe it was on the Audionet thread. There is a high end extractor I have not tested yet. Let me get the device most people are using.

The best way is to just plug it into a surround processor and then pass the video out of it to the display.

Extraction has been done

With your awesome engineering experience, get into your extractor, or into a good one, and make it better. I am certain you can do wonders. For a quickie, just stuff 33uF multilayer ceramic caps after the PWM regulator on the chip rails. Measure the noise with a spectrum analyzer and scope.

OH... scopes... I ended up doing a lot of renting of an insanely expensive Tek MSO. For analyzing Ethernet, HDMI, and buses it was great, but it was HORRIBLE at looking at noise on power supplies. Like TERRIBLE. These things "acquire" and do statistics and "digital phosphor," and it all results in ****. You should grab an analog Tek off eBay and restore it. You really can't use a digital scope for noise analysis. Nothing like a CRT and all-analog electronics. I ended up with a few Tek 7904s and about every module. I can go to 1 GHz and do 10 uV/div differential, and I can see EVERYTHING.

I will give you a call tomorrow to catch up.

Go mod your ATV. It's physically annoying, tho. It's a fun experience. Just doing like 5000uF inside the ATV, using some WIMAs, and then feeding it 14.15 V will be a very surprising jump in pic and sound. That started me down the rabbit hole.
 
My basement 65" OLED is a 2019 LG. So what Chris points out above may help explain some of the funky visual observations I have had!

I think ATVX owners are seeing, and hearing, more than even post and DP people are seeing, so it's no surprise we are uncovering all sorts of stuff that post and DPs most likely could not see. I have recently become really aware of the depth-of-field focus on Star Trek: The Original Series. It's quite obvious now. TOS looks really great. Even on Pluto.
 
Audio extraction.

A number of people I know are using this for 2-channel extraction into high-end DACs.

This has come up and looks really nice. I do not have any direct experience with it, tho.

Linears can be used by both.

The other method is to use the optical out from a flat panel or proj and go to a DAC or surround processor.

A limited method I use in the lab: I feed an Oppo's rear HDMI input with the ATVX, and the Oppo then provides SPDIF. This is limited because there are video limitations on the Oppo's rear HDMI input. In the lab I feed a Panasonic plasma 1080p, so it's no problem there.

I have cracked open the Key Digital to look at modding it; it's stuffed full of boards with not much room. The JVB Digital I have not looked at, but it seems pretty good right away. But maybe someone might make a nice extractor :) (Alex, hint)

But dealing with HDMI, you're gonna need to either buy a $100,000 Tek/R&S or rent one, which ain't cheap. Not to mention you would need new designs with each new flavor of HDMI. However, I think we are mostly topped out with HDMI now.

What I see as the future will be things like a RAVENNA Network ( AES67 / SMPTE-2110 ) IE IP based. Like Dante.
 
Hi Chris:
Alex from UpTone here. I am just now catching up on your success since our technical conversations via phone and e-mail back in April 2022, when you were just beginning to experiment and measure with the ATV boxes.
Wow. Congratulations on your success!

I have to give a shout-out to Alex. He was asked by one of his clients if there was anything that could be done to help an AppleTV. We had talked a few days earlier, so Alex gave him my info, and that resulted in my first real sale of an ATVX. So thank you, Alex.

I had a LOT of beta testers that had them, but they were all people I knew. Alex turned me onto my first actual sale.
 

Chris, so you, as the Mad Scientist video expert (which I concede that you indeed are), prefer Sony OLED vs. LG OLED, and Sony gives the best, most accurate picture? How about the new Samsung OLEDs? And I understand Sony is discontinuing OLED and going to higher-brightness mini-LED, I think this next year. How might this affect your opinion? I'm asking for any of us with AppleTV Xs in particular who are interested in getting the OLED with the Mad Scientist best picture!
 
Sony is not getting out of the OLED business. This year their flagship TV will be a mini-LED, but they will continue with OLED too. I have an 83" A90J Master Series and a 77" 95L. Both have staggeringly great image quality.
 
Sony is not getting out of the OLED business. This year their flagship TV will be a mini-LED, but they will continue with OLED too. I have an 83" A90J Master Series and a 77" 95L. Both have staggeringly great image quality.

Technically, I think you are now correct. Though it may be that Sony will not be making efforts to noticeably improve their OLEDs.
Why Sony pulling out wouldn't kill OLED TV - CNET
 
