AppleTV X - I am playing with something new

@Xymox I haven't personally compared 4K vs HD in the last year, and I know Apple has revised their system considerably in that time.

Do you still prefer HD over 4K with the current ATV OS?
 
A TV locks to an HDMI signal, and so does a surround processor. The HDMI source's clock and its jitter propagate into the rest of the TV or surround decoding, so the HDMI source device is effectively your master clock. Feeding a super clean HDMI signal downstream means a lot of those systems end up with less jitter.

There is a LOT more going on besides HDMI signal clean-up. On an ATVX the CPU, RAM, HDMI chip and Ethernet are all dejittered, and the power supply to each subsystem is vastly cleaner and more tightly regulated. This results in much cleaner clocking of data through the whole system. One measurable, visible result is video frame decoding jitter: each frame sent out is more periodic, so each decoded frame ends up very evenly spaced in time.
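To make the frame-spacing idea concrete, here is a minimal sketch (hypothetical numbers, not a measurement of any ATVX): given the presentation timestamps of decoded frames, the spread of the frame-to-frame intervals is the decoding jitter being described.

```python
# Illustrative sketch only: compute how evenly spaced a run of frames is.
import statistics

def frame_interval_jitter(timestamps):
    """Return (mean interval, standard deviation of intervals) in milliseconds."""
    intervals = [(b - a) * 1000.0 for a, b in zip(timestamps, timestamps[1:])]
    return statistics.mean(intervals), statistics.stdev(intervals)

# Example: 24 fps content should show intervals near 41.7 ms; a lower
# standard deviation means less frame-to-frame timing jitter.
mean_ms, jitter_ms = frame_interval_jitter([0.0, 0.0417, 0.0834, 0.1255, 0.1668])
print(f"mean interval {mean_ms:.2f} ms, jitter {jitter_ms:.3f} ms")
```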

If you're using a surround receiver/decoder, you can plug the ATVX directly into it. That is a vastly better way to decode sound than using the TV's optical or eARC output for surround. eARC is a whole additional set of issues and a whole additional HDMI link that degrades sound versus hooking the ATVX directly to the receiver.

You also gain a lot of control over the source material. For example, there was an issue with some Amazon series content, and I was able to switch the ATVX to 4K 23.98 and this mostly fixed it.

You can choose SDR/HDR/DV, or even things like 2K vs 4K.

So having a separate streaming source device has a lot of advantages over a built-in app, AND it will have better picture and sound for a list of reasons, only some of which I covered.

An app running inside a TV has a lot going on. The CPU in a TV is doing other things besides just streaming, so those interrupts cause unavoidable jitter in various parts of the process. The TV is a terrible electrical environment: the power supply rails are shared with other systems and are noisy and poorly regulated, and the RF noise environment means the data bus and other linked systems pick up noise that ends up as jitter. The ATV is also a much more powerful hardware platform than what you find in a TV, so apps can do more demanding things; for example, I can play 700 Mbps video streams off a local server. So TV apps are a lot less suited to high-performance use.
This is incredibly helpful and very insightful. I overlooked a few obvious details, but learned a bunch just from this. Thank you for the info and I look forward to purchasing one soon!
 
@Xymox I haven't personally compared 4K vs HD in the last year, and I know Apple has revised their system considerably in that time.

Do you still prefer HD over 4K with the current ATV OS?

Do you mean HDR vs SDR? Or actual HD (2K) vs 4K, or even 8K?

Both are good questions..

The HDR vs SDR vs Dolby Vision question is quite topical as it turns out..

I recently redid part of a section of my lab/production area. I had been using a computer monitor and a 4-port HDMI switch to check incoming ATVs and do burn-in once I mod them. The computer monitor was fine because I do not judge picture on that setup; I just need to see them working, update firmware, etc. I got a Sony 42" A90J and swapped out the computer monitor. After setting up the unit, the ATV asked if I wanted to "Try out Dolby Vision".. Ok, ok, sure... mainly to test the new 4-port HDMI 2.1 switch.. So it flipped into Dolby Vision mode. This looked surprisingly good. I have another A90J with older firmware; I factory reset it and looked at DV on it. Sure enough, the newer firmware looked better. I updated the older set's firmware and, yep, it looked better too.

So: factory reset on the TV, then the ATVX set to Dolby Vision. This defaults to Dolby Vision "bright".. which in my lab makes sense. After a lot of playing with SDR and DV, DV turned out to be reasonable, at least so far on the material I have tried. This is with the ATVX set to match frame rate but with match dynamic range OFF.. meaning it is always in Dolby Vision. I made minor changes to the A90J, the normal stuff like the motion settings.

SDR can still look better, but it has gotten a lot closer. Most likely the 18.x Apple firmware updated DV too. Dolby Vision can make things look a bit unnatural; colors look more "cartoonish" rather than natural. Dolby is doing some gamma manipulation to make the image more appealing to the eye even though that is not the way it ACTUALLY looks. So if you are a post-production person who wants to keep tone maps out of the picture, since each one is different for every TV make and model, SDR is better than pretty DV, so SDR is still best. But it is much closer now, and many people might actually prefer DV. At least on the Sony A90J/K..

It's getting harder and harder to stay in SDR. DirecTV 4K channels, for example, REQUIRE HDR/DV, not just 4K. Some other apps do this too. It's sorta like: OK, open-reel tape is better, but it's impractical.

A quick refresher.. HDR/DV does not carry any more picture data; it's the same bitstream. HDR and DV take the SDR data and use metadata to spread the existing data out over a bigger brightness range. However, there is no such thing as a real HDR TV; the science can't do that yet, because it requires far more contrast range than any current display technology can deliver. The TV takes the HDR/DV metadata and then uses a tone map, specific to that set, to guess where best to map the brightness. There is no standard for tone maps, so this is all over the place. TVs today are effectively SDR devices, so giving them straight SDR, without the whole SDR > HDR/DV > tone map > TV chain of math and conversions, can produce really good pictures.
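As a rough illustration of what a tone map does, here is a sketch using a generic extended-Reinhard roll-off. This is NOT Dolby's algorithm or any TV maker's actual curve (those are proprietary and all differ), and the peak-nit numbers are assumptions.

```python
# Hedged illustration only: a generic extended-Reinhard style roll-off that
# squeezes content mastered for a bright peak into what the panel can show.
def tone_map(scene_nits, content_peak=1000.0, display_peak=700.0):
    """Compress content mastered to content_peak nits into a display_peak-nit range."""
    x = scene_nits / display_peak          # luminance relative to what the panel can show
    w = content_peak / display_peak        # input level that should land exactly at panel peak
    y = x * (1.0 + x / (w * w)) / (1.0 + x)   # extended Reinhard curve
    return min(y * display_peak, display_peak)

# A 1000-nit highlight lands at the 700-nit panel peak, while a 100-nit
# mid-tone passes through with only a small amount of compression.
print(f"{tone_map(1000.0):.0f} nits, {tone_map(100.0):.0f} nits")
```

Every set picks its own curve like this (or something far more elaborate), which is why the same HDR/DV title looks different from one TV to the next.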

So for a pure picture, seen the way it was encoded and without extra processing, 4K SDR is the way to go.

DV is eye-catching but not accurate, it varies by make and model of TV, and on a badly mastered HDR/DV movie or show it can look horrendous and so dark you can't even see it.

So for reference work on video material: SDR.. For casual viewing that is not accurate but eye-pleasing: DV..

___________

2K HD vs 4K

I know there are people out there watching HD who swear it's way better than 4K. While that sounds crazy, it's not. Your bitstream from a service is going to be about the same for both, because content owners are hyper-focused on costs and the bandwidth to stream content is one of the biggest costs. Mbps = $/sec... What REALLY controls picture quality is the amount of compression.

A 4K picture is 8,294,400 pixels in monochrome, at a minimum of 8 bits per pixel: 66,355,200 bits per frame. At 24 frames per second that is 1,592,524,800 bits per second, about 1.6 Gbps, without color. 4:2:2 chroma will double this at minimum. So uncompressed 4K is about 3 Gbps, conservatively..
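For anyone who wants to check the arithmetic, here is the same back-of-envelope calculation in a few lines of Python:

```python
# Reproducing the numbers above: 8-bit monochrome 4K at 24 fps, then 4:2:2 chroma.
width, height = 3840, 2160
pixels = width * height                     # 8,294,400
bits_per_frame = pixels * 8                 # 66,355,200 (8-bit luma only)
fps = 24
luma_bps = bits_per_frame * fps             # 1,592,524,800 ≈ 1.6 Gbps
chroma_422_bps = luma_bps                   # 4:2:2 adds two half-resolution chroma planes
total_bps = luma_bps + chroma_422_bps       # ≈ 3.2 Gbps uncompressed
print(f"{total_bps / 1e9:.2f} Gbps uncompressed, vs a ~30 Mbps stream "
      f"≈ {total_bps / 30e6:.0f}:1 compression")
```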

So compression starts right at the camera. Then more compression for storage.. Then cascading compression from various stages along the way..

Final playout to your device is around 30 Mbps on average. That roughly 100:1 compression takes its toll.. Cascading decompression and recompression also take their toll..

So the pipe at the end, the stream to you, is REALLY important to quality, and it's not going to get bigger because the costs grow exponentially.

OK, if our pipe into the house is fixed at 30-50 Mbps, is it better to have fewer pixels and less compression, or more pixels and more compression? There is a good argument that keeping compression lower beats having more pixels.
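One way to see the trade-off is to work out how many compressed bits each pixel gets per frame at a fixed stream rate; the sketch below assumes a 30 Mbps stream and 24 fps, which are illustrative numbers.

```python
# At a fixed stream rate, every extra pixel means fewer compressed bits per pixel.
def bits_per_pixel(stream_mbps, width, height, fps=24):
    """Average compressed bits available per pixel per frame."""
    return (stream_mbps * 1e6) / (width * height * fps)

print(f"4K: {bits_per_pixel(30, 3840, 2160):.3f} bits/pixel")   # ~0.15
print(f"2K: {bits_per_pixel(30, 1920, 1080):.3f} bits/pixel")   # ~0.60, four times as much per pixel
```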

Compression works partly by lowering the resolution of things in motion. That's only one aspect of compression, but a big one.. So a 4K picture with a lot of motion drops in resolution because of compression. In fact, VERY little actual material is REALLY full 4K. Studios have figured out that people can't really see 4K, and they can still say it's in 4K even though after compression it's mostly 2K or less..

So 4K is pretty much never really 4K.

So 2K (HD) needs less compression to fit through the same size pipe. With lower compression it can look pretty darn good, and its actual resolution stays closer to a true 2K.

So watching things in 2K is not as crazy as it sounds...

BUT.....

What the TV does with 2K/HD becomes critical: upconversion, scaling.. There are a lot of patents on this, and it costs money for a TV maker to do it well. LG spends no money on this and their upconversion looks terrible; you need a Lumagen to play good 2K/HD on an LG. A Sony 4K, however, can look really good, because they spent the money and engineering on upconversion. I can easily fool someone into going "OMFG, that's the best pic I have ever seen" by using a highly modded Oppo playing 1080 Blu-ray on a Sony GTZ380.. They all look shocked when I tell them it's 1080 HD. So watching things in 2K instead of 4K COULD make sense and even be better IF your TV handles upconversion well.. It varies..

On an ATVX you can go to video settings, pick any resolution and try these things out. The list is incredibly long, and some of the more esoteric options are way down the list.

_______



There is another way to watch video. But I am fairly insane on this and it's extremely impractical.

I have an insanely modded Panasonic ST60 plasma that uses boards from a VT and has a ton of really extreme mods. It's so modded it's illegal, because it radiates RF so intensely it's really a 200-watt radio jammer. No joke: I knock out AM radio for a one-block radius with it on. BUT OH MAN, the pic is insanely good..

While plasma has serious limitations like bit depth (some banding) and flicker, because it REALLY displays the frames as cleanly defined flashes like a film camera with a shutter, its other qualities are just stunning..

This was before a TV could really process a video signal, so it just takes the pixels and displays them. It's 2K, so I can feed it a 2K signal that matches the panel's pixels exactly, with no scaling required. 4K gets downconverted, which is MUCH easier and better quality than upconversion.

I have done countless blind A/Bs for people, including lots of my SMPTE friends who are HIGHLY technical and VERY versed in picture quality, and EVERYBODY prefers the plasma over an OLED. Well, with a caveat: they all complain, of course, about some banding from the display panel being bit-starved. BUT after doing an A/B, they still prefer the plasma.

So while I have a GTZ380 and a few generations of Sony OLED handy, the best pic comes from a highly modded plasma that doubles as a radio jammer.

I like using the plasma for development work because I can see subtle picture differences on it that I can't see on a 4K device.
 
Thank you for that! This may be one of the "top 10" posts I've ever read about HT video quality.

I must have misinterpreted something you wrote previously, and it may not even have been in these forums. Personally I don't like HDR because it's so stinkin' bright it hurts my eyes. But... I could've sworn you advocated for HD rather than 4K, and I've actually been using that on my ATV-X. Call me oblivious, but I've been quite happy with it.

I guess I'll try some more experiments...
 
So: factory reset on the TV, then the ATVX set to Dolby Vision. This defaults to Dolby Vision "bright".. which in my lab makes sense. After a lot of playing with SDR and DV, DV turned out to be reasonable, at least so far on the material I have tried. This is with the ATVX set to match frame rate but with match dynamic range OFF.. meaning it is always in Dolby Vision. I made minor changes to the A90J, the normal stuff like the motion settings.
But if you leave it on Dolby Vision always, then it's converting non-Dolby Vision HDR and also SDR video to Dolby Vision. Isn't this BAD, picture-quality-wise?
 
But if you leave it on Dolby Vision always, then it's converting non-Dolby Vision HDR and also SDR video to Dolby Vision. Isn't this BAD, picture-quality-wise?
You would think so, huh? BUT in *theory* Dolby could just pass the SDR straight through to the SDR TV..

I watched some Pluto TV: Star Trek: The Original Series, Mission: Impossible, other very SDR, 2K things. They looked OK. Was SDR better? For me, yes. But after over 25 years of picture evaluation and being in front of the best displays and picture sources ever made, I am jaded and can find LOTS of faults with ANY picture.. Well.. a good print of Baraka in 70mm on a small screen with the right glass and xenon,, well,, that is pretty damn good. I suppose carbon arc would make it better, but I doubt that will ever happen. I am a fan of spectral reproduction, so I think all these displays based on tri-stimulus are lacking in ways we don't have good science for yet. And LED/laser-based razor-thin spectra, I feel, are a long way from real spectral reproduction.. IMHO.. So I think ALL image reproduction is lacking, and maybe somewhere in the future we will figure out how to do full-spectrum reproduction, not just razor-thin tri-stimulus.

So far, for me, 70mm film made and processed decades ago just leaves all electronic reproduction in the dust.

IMHO...
 
Switch Xs getting ready to ship.. I run 20 TERAbytes through them before they leave..

I'm thinking this is my next system move. Do you have any APs available to go with them? I assume Bill at GT Audio is still the right contact for a purchase...

Edited to add: I believe we have to source our own modules for the SFP ports, correct? What's the current "preferred" version and are there any recommended suppliers?
 
I'm thinking this is my next system move. Do you have any APs available to go with them? I assume Bill at GT Audio is still the right contact for a purchase...

Edited to add: I believe we have to source our own modules for the SFP ports, correct? What's the current "preferred" version and are there any recommended suppliers?
Bill can give you all of this info.
 
I had some family in from Minnesota staying with me for a few weeks in warm Arizona, so I had to set up the AppleTV-X in my basement Atmos system so that it would sleep and turn off the system. I still got to watch a bit down there for a few weeks. The catch is they left 10 days ago, and I reset the AppleTV-X to never sleep. Well, 9 days later, for the past few days, WOW! I could again hear the sonic improvement and see the video improvement by simply leaving the AppleTV-X always on, never to sleep. I'd say it took a good 9 days or so for this change to fully register!
 
Interesting, thanks Steve.
I leave mine on all the time too but I restart daily, before every use. Any thoughts from Chris on a daily restart and effect on A/V quality?
 
Interesting, thanks Steve.
I leave mine on all the time too but I restart daily, before every use. Any thoughts from Chris on a daily restart and effect on A/V quality?
Daily restart has no effect on audio and video quality as far as I can see and hear. Also, I find the AppleTV-X (or regular AppleTV 4K) funky at times and have to restart it anyway, so it's a good idea to restart daily. It's also a good idea at least once a month to leave it off for a minute, then restart, as this allows the random access memory to clear (at least I read something to this effect, not being a computer whiz myself).
 
Yep, VERY good reasons to make sure it stays on all the time. Honestly I find this true for all high-end electronics. Just IMHO, of course. The reason is thermal stability. ALL electronic parts drift with temperature, and this can be rather dramatic. Thermal changes also mean physical size changes.. things normally get bigger with heat and smaller with cold. This matters way down at the chip level, as the silicon changes in very subtle ways. Things like clocks rely on materials that change with temperature.

A device does not just "warm up". Waves of heat spread out from the parts that generate heat, and different materials have different thermal coefficients, so these heat waves are not uniform. In ultra slow motion it's like dropping rocks into a pool of water where there are a number of obstacles and different depths, and where the water has different densities in different areas. So a "warm-up" in any piece of electronics is very dynamic as it reaches a steady state.

Heat takes time to penetrate components with low thermal conductivity, ceramic caps for example.

Eventually it all reaches thermal equilibrium. That is also the state everything is aligned to, and the state things like clocks are tuned for.
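As a toy example of why clocks care about temperature, here is a sketch with assumed numbers; the 27 MHz frequency and the 0.5 ppm/°C tempco are illustrative, not measurements of any ATVX part.

```python
# A hypothetical illustration: a small temperature coefficient in parts-per-million
# turns into a real frequency error and a picosecond-scale shift in the clock period.
def clock_drift(freq_hz, tempco_ppm_per_c, delta_t_c):
    """Frequency error (Hz) and period shift (picoseconds) for a temperature change."""
    ppm = tempco_ppm_per_c * delta_t_c
    freq_error_hz = freq_hz * ppm * 1e-6
    period_shift_ps = (1.0 / freq_hz - 1.0 / (freq_hz + freq_error_hz)) * 1e12
    return freq_error_hz, period_shift_ps

# Assumed example: a 27 MHz video clock with a 0.5 ppm/°C tempco warming by 10 °C.
err_hz, shift_ps = clock_drift(27e6, 0.5, 10.0)
print(f"{err_hz:.0f} Hz off nominal, period shifted by {shift_ps:.3f} ps")
```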

This is also true of my Switch X.

I think this is true of all electronics that really work at the cutting edge of electronic engineering. Once you work out all the other engineering issues, things like thermal drift, thermal noise, 1/f noise, shot noise and a number of other physics issues become the much more difficult engineering challenges.

It is indeed VERY important to keep the ATVX on all the time. Make sure that little white light on the ATV is always on.

Hmm... Restart... Well, yes, Apple has had its bugs. I am using the 18.4 dev beta now and it seems fine; nothing broken that I know of. But supposedly there is more coming besides the Snoopy screen saver. The things that get me and cause me to restart are lip sync and, at times, apps that get just borked enough that only a restart will get them back. The lip-sync issue comes and goes and I have no idea what causes it. If I use Pluto and set it to keep playing 24/7, it will get out of sync over a week; that's repeatable. Other times it's just way off, and usually a pause and play can get it back, sometimes not.. This is actually an industry issue, and SMPTE is even working on standards, as there is no standard to be sure lip sync is correct when playout for a consumer occurs. There are tools to manipulate it, but no end-to-end standard to make it dead on.

I have not heard/seen a restart change anything.

I admit, it's kinda appealing to start fresh each time.

My family uses an ATVX. That means a bunch of series on all sorts of platforms every night. The only complaint is when an app stops working, OR OF COURSE the dreaded "you gotta log in again" that things like Netflix and TCM throw ALL the &^%&^% time.. I have found 20 apps running on their ATVX to be kinda normal. Now and then an app misbehaves; I have taught them how to close and restart the app. This is also rare.

As far as full power cycles: wowee. The family unit is right now at 7 months on 24/7. I am running dev betas on it with auto-update on, so it does restart for those updates, and of course running auto dev updates is VERY dangerous. But it helps me stay aware of any issues, as they use it for at least 6-10 hours every day. It's on all day doing things like DirecTV Stream and news and stuff.
 
I want to talk for a second about illegal content and things like sideloading.

The ATVX warranty is voided if you sideload anything. The reason is that sideloading gives an app Apple won't allow direct access to tvOS. This *could* corrupt things that I can't fix and that a full reset and re-download can't fully wipe. So this is a big no-no.

The ONLY reason to sideload is to load apps that play content illegally.

This is widely popular on the Nvidia Shield.

Illegal content is also downloaded and played off a local NAS. This has its own issues, as a lot of that content is in formats needing codecs that Apple specifically does not support.

The real issue here, besides the felony involved per title, is that the user IS traceable, despite VPNs and the like. This has all happened before. Napster was all the rage and everyone downloaded crappy-quality music. But one day the music industry unleashed lawyers with the power to make money off people violating the law. Music studios set up Napster sites where they put up music and then collected the identities of whoever downloaded it. This is coming for all these people downloading and streaming illegal content.

The ATVX is about the quality of video and audio, and this illegal material is crap. IMHO, people who can afford an ATVX can afford to pay for streaming services and movie rentals, which are top-notch quality.

So I am opposed in every way to someone buying an ATVX and using it for illegal material. I also know people in the business, and this is stealing from them. So I want to be clear: if I think someone is going to use an ATVX for these kinds of uses, I will not sell them a unit, for a big long list of reasons.
 
And since I am posting..

A number of clients are saying the Switch X improves picture and sound on the ATVX. I am NOT trying to promote it; I have more sales than I need already and do not want to get ANY bigger.

But I admit I do see/hear an improvement with a Switch X, and so have others.. WHY? I DO NOT KNOW, and honestly, I still find it hard to explain why.

BUT.. Some notable guys in the industry who have had ATVXs for a while have recently talked about it.

 
Hello there @Xymox,

I have been following your interesting posts. Your work and research are really impressive. Wishing you all the best and great success!
Wanted to ask a simple question if you wouldn't mind?
Theoretically, could a wireless access point located far from the high-end system, in a different room, but physically connected to the LAN as a layer-2 bridge to the main (ISP-provided) router that sits near the hi-fi system (WiFi is disabled on that router), introduce interference into the network?


 
Hello there @Xymox,

I have been following your interesting posts. Your work and research are really impressive. Wishing you all the best and great success!
Wanted to ask a simple question if you wouldn't mind?
Theoretically, could a wireless access point located far from the high-end system, in a different room, but physically connected to the LAN as a layer-2 bridge to the main (ISP-provided) router that sits near the hi-fi system (WiFi is disabled on that router), introduce interference into the network?

Just adding that your timing is perfect: I'm debating if I should do the WiFi-X with my Switch-X order.

My AP would also be remote from the Switch-X and stereo/home theater equipment.
 
Some of you may be curious about this:
I just installed the Schnerzinger EMI and GRID. These are devices that remove EMI and clean the AC (phase, ground, noise) without using capacitors and filters. There is a thread on their products in this forum.
What really amazed me in regard to video is that it noticeably removed noise from the picture. My projector (JVC NZ9/25LTD) looked more like a panel: the small amounts of random noise that I could normally see were gone!
Keep in mind that the ATVX is already powered from a Shunyata Denali V2, and I have a Taiko Ethernet Router and Switch with the ATVX connected to that via Ethernet. Each of these already improves the noise noticeably. I was therefore not expecting to see such an improvement from Schnerzinger's EMI and GRID.
I have no affiliation to the company. Highly recommended!
I have a JVC NZ8. I just received my AppleTV X a few days ago. I'm having a hard time getting a good picture. SDR seems to be OK, but anything else is not sharp. Can I ask what settings you have on your NZ9 and your AppleTV X?
 
Thank god... There is work progressing toward a standardization of picture controls for doing setup. Tontoverde has the same issue we all have: the settings, and the combinations of settings, are like a Rubik's cube, nearly impossible to navigate even for someone like myself or other SMPTE engineers and calibrators. I have seen calibrators get things really wrong. Things are a huge mess.

There is a video on how to set up an NZ8 that starts off by dropping laser power to 0 for SDR but keeping laser power at about the factory setting for HDR10+. WHY? In the comments this "expert" explains that SDR does not need the brightness!?!.. What he is really doing is dropping SDR brightness to be closer to the dysfunctionally low brightness of HDR, at least IMHO. It makes no sense. This video shows the issue discussed in the SMPTE paper linked below.

TONS of those settings are nearly impossible to figure out. But it's not just JVC, of course; Sony has all manner of settings that are completely unintelligible. People who really work with each model of device WILL eventually figure most of it out, but then you get the issues with combinations of settings. For example, it's possible to lose bit depth (the number of steps) in brightness and in the levels of each color. So while it might be possible to better adjust gray scale, the bit depth might suffer. I have not encountered a "calibrator" who understood what the adjustments did to bit depth measured off the screen. This is just one example.

Generally, the fewest changes away from factory, but done in the right way, yield the best results. One has to be VERY VERY careful with anything that adjusts brightness (light output). The reason is two-fold. 1) Depending on the imager chips and how all this works, you will most likely end up using less of the full dynamic range of the imaging system; this can show up as a loss of bit depth or other more analog-type effects. 2) The other very physical issue with dropping brightness, mostly in projectors, is that you ALWAYS end up with less contrast ratio because of losses in the optical path: you end up with less signal-to-noise ratio in the light path. So IMHO you always make sure to use the entire contrast-ratio range of a display. Maximize the digital and analog electronics and the optical path. This is a mistake I see almost all "calibrators" make.
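Here is a minimal sketch of the bit-depth point: digitally scaling brightness down in a fixed-bit-depth pipeline collapses input codes onto fewer output codes. The 8-bit pipeline and the 0.5 gain are assumptions for illustration, not any particular display's processing.

```python
# Count how many distinct output codes survive a digital gain below 1.0.
def surviving_codes(scale, bits=8):
    """Distinct output codes left after scaling every input code by 'scale'."""
    levels = 2 ** bits
    return len({round(code * scale) for code in range(levels)})

print(surviving_codes(1.0))   # 256 distinct gray-scale steps
print(surviving_codes(0.5))   # ~129: roughly half the steps are gone, even if the
                              # gray-scale balance still measures "correct"
```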

Changes in contrast ratio can look like a decrease in sharpness to the eye.

Display makers KNOW all this. Factory settings can be close, in some ways, to maximizing the actual light-output bit depth and luminance resolution.

On top of this whole big mess of cryptic settings, so complex that SMPTE engineers can't fully figure them out, there are "experts" on the web doing things that are just CRAZY wrong.

On top of that.... most displays today keep a full, separate set of settings for each video type. So if you are viewing HDR and it switches to content in Dolby Vision or SDR, a whole different set of settings is applied.. Most displays therefore require 2 or 3 complete sets of settings. Lots of people miss this and think that when they adjust something like brightness it affects everything. But along comes another show, everything looks different again, and the setting they changed seems to have vanished.. They reset it, and only later does the set switch back to some other content type and go back to the other settings..
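A crude way to picture this (hypothetical names and values, not any real TV's menu system): think of the TV as holding one settings bank per signal type and silently swapping banks whenever the incoming signal changes.

```python
# Sketch only: per-signal-type settings banks, so a tweak made while watching SDR
# never touches the HDR10 or Dolby Vision banks.
picture_settings = {
    "SDR":         {"brightness": 50, "contrast": 90, "motion": "off"},
    "HDR10":       {"brightness": 50, "contrast": 95, "motion": "off"},
    "DolbyVision": {"brightness": 45, "contrast": 95, "motion": "off"},
}

def active_settings(signal_type):
    """The bank the TV applies for the signal type it currently detects."""
    return picture_settings[signal_type]

print(active_settings("SDR"))  # adjust this bank and DV content still uses its own
```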

It's all so ****** that it's no wonder even smart, savvy consumers resort to going to "experts" on YouTube and websites for settings. WHO CAN BLAME THEM. It creates a fear of messing things up, which is a real possibility.

As each display has completely different settings, and each model from a manufacturer can be different, there is no way to simply educate yourself ISF-style and go adjust your display.

Thankfully, Sony OLEDs come out of the box really close in most respects; just a few simple adjustments can make them amazing. Projectors are far more complex because of their lower contrast ratio. Even the best projectors ever made have limited CR simply because of the optical elements.

So this SMPTE-led mission to standardize settings and make them sensible and common is WAY overdue.. I pray this effort that Disney has championed will work out...

SMPTE paper https://assets.swoogo.com/uploads/4654942-673f4ae2e7d03.pdf
SMPTE presentation of this paper. https://assets.swoogo.com/uploads/4654936-673f4ac81219a.pdf
 
Hello there @Xymox,

I have been following your interesting posts. Your work and research are really impressive. Wishing you all the best and great success!
Wanted to ask a simple question if you wouldn't mind?
Theoretically, could a wireless access point located far from the high-end system, in a different room, but physically connected to the LAN as a layer-2 bridge to the main (ISP-provided) router that sits near the hi-fi system (WiFi is disabled on that router), introduce interference into the network?


Thank you, Surfing_dude. I consider all this my lifelong passion; my electronics are my art form.

I did a video showing and discussing how WiFi can affect audio. Normal access points BLAST RF at the maximum legal power and have taken bigger and bigger chunks of spectrum to transmit more "speed".. An access point is best thought of as a radio transmitter emitting more garbage than a plug-in switch-mode power-supply wall wart. We all know those are bad news, so an access point is also a bad device.

"RF power decreases logarithmically with distance, meaning that doubling the distance results in a significant drop in power received, following the inverse-square law"

So the RF power induced into other devices as noise drops dramatically with distance. BUT... if you can receive the WiFi at the distance involved, then that RF signal is also being picked up by interconnects and metal gear and bits. While it is a tiny signal, it is present, even if you only pick up one bar on a phone.
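For a sense of scale, here is a free-space estimate (real rooms reflect and absorb, so treat the numbers as rough): received power falls about 6 dB for every doubling of distance from the access point.

```python
# Free-space path loss at 5 GHz: FSPL = 20*log10(4*pi*d*f/c).
import math

def fspl_db(distance_m, freq_hz=5.0e9):
    """Free-space path loss in dB at a given distance and frequency."""
    c = 299_792_458.0
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

for d in (1, 2, 4, 8):
    print(f"{d} m: {fspl_db(d):.1f} dB of path loss")
# 1 m ≈ 46.4 dB, 2 m ≈ 52.4 dB, 4 m ≈ 58.4 dB, 8 m ≈ 64.5 dB
```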

When I do a dedicated listening room, I aim for an RF anechoic chamber. In the best room I made, I used all sorts of materials and crazy stuff to block ALL RF from the room: things like 3-foot-thick conductive concrete walls, floor and ceiling. Those walls had 3 layers of 1" rebar on 10" centers with every intersection welded. These triple cages encompassed the whole room inside the concrete. Each cage was grounded at one point and then tied to a grounding system that included 14 chemical ground rods, each 40 feet deep. The rebar acts as a Faraday cage WAY down into VLF frequencies. Nothing is perfect, but inside the room there is a stunningly low amount of RF. This also meant the power system all had to go into steel conduit to keep power-line noise from getting into the room, with lights and outlets done the way you do an MRI scanner room. This extreme treatment was meant to keep the RF of the outside world out of the room. I used a high-end spectrum analyzer a lot in that room, and of course there is no cell service in there. Once it was fully RF dead, I could then put in WiFi. This all led to knowing WiFi really well and learning what it does.

RF is best avoided. There are literally millions of channels of RF in the air around us, and TONS of noise sources.. WiFi and cell service are the loudest, and WiFi is really loud. Because of the inverse-square law, having an access point close to gear is BAD..

Get it as far from your gear as possible.

The WiFi X was tuned in the above room. It uses a single very narrow 5 GHz channel, with a max speed of 130 Mbps. Speed = bandwidth = interference. 130 Mbps is fast enough for nearly anything in this use. It is adjustable, of course, all the way to 1 Gbps on 2.4/5 GHz, but that is not recommended. Among other things, I used a Kronos turntable and a CH Precision phono preamp and looked at the RF induced into that gear to verify its use close to sensitive equipment.

That EXACT model of Aruba access point is the right one to use, after testing, well, everything I think.. Only the older versions of this Aruba access point are best; HP bought Aruba and IMHO screwed it all up. So I have access to a stock of NOS Arubas, and I load custom firmware and a config made for audio use.

I give away these WiFi Xs; I do not make money on them, and we don't really sell them separately. A WiFi access point is REQUIRED with a Switch X because of its isolated network. So the WiFi X is required for everything except maybe when the Switch X is used ONLY for an ATVX. It's an optional item when ordering because people also stack two Switch Xs, and then you only need one WiFi X.

So the WiFi X is actually a well-researched device that took a bunch of time to work out firmware and config. It was also the end result of years of experience in an RF-dead room.

Why WiFi affects audio devices.


Why the Aruba is best..

 