Does anyone understand the need for all of these digital components?

Laptop with a better power supply (an old Linn Dirak 19 V DC; it happens to match the voltage ;)). The Fritzbox router also got a better power supply, from Keces.
I run foobar for ripping, streaming and CD/DSD playback; there are so many good plugins for this program.
Remote control from the couch with the foobar Android app. USB out into a Mutec 3+ reclocker, then 110 ohm AES/EBU into the Metronome DAC. Much better sound than USB direct.
Close to analog playback when the same master is used for the CD and LP.
 
I have an Apple Vision Pro that I use to stream very high resolution VR movies from Apple TV. Guess what, it started to pixelate. It choked, and it spluttered, and it struggled. Not just VR movies, but also Netflix, Disney+, MAX, etc. What was the problem? Well, as it turns out, the Apple Vision Pro is not a Wifi 6E device (that's only on the iPhone 15 Pro and iPhone 16, and some MacBook Pros and the new Mac minis). So, it had negotiated a lower-bandwidth transmission protocol, which I had to manually reset. Finally, it found the right negotiation protocol at 5 GHz, and all is well again.

Moral of the story: in real-time content streaming, there is no such thing as bit-perfect streaming, either in video or in audio. It does not exist, and it cannot exist.
This is not correct. Streaming is actually bit-perfect.
When you see pixels, that's most likely noise or dropout related. The bits are getting there, though.

You don’t need Wifi 6, let alone 7 to stream content to Apple Vision Pro. I’m not supposed to say this, but I (now) work for Apple. You also cannot stream from ATV to Vision Pro as far as I know. My point is that if the stream isn’t playing correctly, it is not related to your Wifi speed or the resolution of the file. It’s probably interference, especially since you say that 5 GHz solved it.

The issue is not that the stream isn’t bit perfect (it is). The issues are noise and timing of the bits.
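For what it's worth, this is easy to verify yourself. Here is a minimal Python sketch (my own illustration, nothing from Apple): send a chunk of data over a local TCP socket, the same transport that HTTP-based streaming rides on, and hash it at both ends. TCP's checksums and retransmissions guarantee the received bytes are bit-identical to what was sent; noise can delay packets, but it cannot silently flip bits.

```python
import hashlib
import socket
import threading

# ~1 MB of test "stream" data standing in for audio/video payload.
PAYLOAD = bytes(range(256)) * 4096

def server(listener):
    # Accept one connection and send the whole payload in 4 KB chunks.
    conn, _ = listener.accept()
    with conn:
        view = memoryview(PAYLOAD)
        while view:
            sent = conn.send(view[:4096])
            view = view[sent:]

listener = socket.create_server(("127.0.0.1", 0))
port = listener.getsockname()[1]
threading.Thread(target=server, args=(listener,), daemon=True).start()

received = bytearray()
with socket.create_connection(("127.0.0.1", port)) as client:
    while chunk := client.recv(65536):
        received.extend(chunk)

# The hashes match: every bit arrived exactly as it left the sender.
assert hashlib.sha256(received).digest() == hashlib.sha256(PAYLOAD).digest()
print("bit-perfect:", len(received), "bytes")
```

The timing of individual packets varies with network conditions, but the content does not; that is the distinction between delivery jitter and data corruption.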
 
So this switch between my router and my Antipodes gear, is it likely to improve the sound?
All networks are different and I don't use an Antipodes, but a demo of a Synergistic Research switch made a surprising improvement to the noise floor (my assumption) when used with a Grimm streamer/DAC (which is known to mostly be immune to noise carried along ethernet). Needless to say, you don't need to know how this works (I don't) nor believe that it might (I didn't), in order to try one (with return privileges).

Of course, there are many such switches. My impression of the SR is that it does not add its own sound, just lets the DAC do its thing more effortlessly. The SR typically includes a decent power cord and ethernet cable, but if you have a preferred brand of those cables available, I suggest you try them. Both the power cord and ethernet cable make a difference (I was skeptical about ethernet cables until I tried a few).
 
I am referring to the Taiko Audio network Switch and Router. They each made a significant improvement to the audio quality. These things are highly dependent on the design of the circuit, so I can’t speak for any and all of these “audiophile” routers and switches. Most are probably snake oil or have limited benefit… best to try them in your system in a way such that you can return them if you’re not happy.

Taiko actually offloads most of the network related processing to the Router, and the connection to the Switch is not via Ethernet. They really went crazy (in a good way) with the circuit engineering.
 
Wow, have you ever streamed video from Netflix or Amazon or Hulu, etc.? Did you never notice the pixelation of the video?
I have never seen pixelation or freezing from Netflix or Prime Video when connected via LAN cable. I have seen it with wi-fi, but that was due to the poor wi-fi transmission in my home, which was built using a lot of concrete and rebar.

I once did a homebrew bandwidth test. I streamed 1080P Netflix from the internet and DSD 512 (8X DSD, 512 times the sampling rate of a CD) from my NAS, simultaneously, through a 100Mbps dual pair ethernet cable. My streamer and TV were both connected to an EtherRegen. Both systems performed flawlessly. Disconnecting the TV link had no impact on sound from the DAC. Your "need for speed" is nonsense.
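The arithmetic behind that homebrew test is easy to check. A quick back-of-the-envelope in Python (the 5 Mbit/s figure for 1080p Netflix is my assumption, roughly in line with Netflix's published HD estimate):

```python
# Back-of-the-envelope bandwidth check for the test described above.
# DSD64 runs at 64 x 44.1 kHz = 2.8224 MHz, 1 bit per sample per channel;
# DSD512 is 8x that rate.
DSD64_RATE_HZ = 64 * 44_100          # 2,822,400 samples/s
CHANNELS = 2                         # stereo

dsd512_mbps = DSD64_RATE_HZ * 8 * CHANNELS / 1_000_000  # 1 bit per sample
netflix_1080p_mbps = 5               # assumed figure for 1080p streaming

total = dsd512_mbps + netflix_1080p_mbps
print(f"DSD512 stereo: {dsd512_mbps:.2f} Mbit/s")
print(f"Combined:      {total:.2f} Mbit/s on a 100 Mbit/s link")
```

DSD512 stereo comes to roughly 45 Mbit/s, so even together with an HD video stream the link runs at about half capacity, which matches the flawless result.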
No streaming algorithm that works in real-time could possibly ever be guaranteed to work in a bit-perfect mode. That's impossible.
Ethernet transmission is not real-time. Packets are queued by the sender. If packet delivery is delayed too long or lost, there will be, per my previous post, "ugly distortion or a dropout". That is what you are seeing with your pixelated Netflix issue. Fix your home network.
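To illustrate the buffering point, here is a toy Python model (entirely my own sketch, not any real player's code) of a playback buffer absorbing network jitter: packets arrive irregularly, playback consumes them at a fixed rate, and a dropout occurs only when the buffer runs dry.

```python
import random

# Toy model: each tick, playback consumes one packet while delivery is
# jittery (0-2 packets per tick, averaging 1). Bits are never altered,
# only late; a glitch happens only when lateness outlasts the buffer.
random.seed(1)

BUFFER_TARGET = 50      # packets of pre-buffered audio/video
buffer_level = BUFFER_TARGET
dropouts = 0

for tick in range(10_000):
    arrivals = random.choice([0, 0, 1, 1, 1, 2, 2])  # jittery delivery
    buffer_level = min(BUFFER_TARGET, buffer_level + arrivals)
    if buffer_level == 0:
        dropouts += 1   # nothing to play: an audible/visible glitch
    else:
        buffer_level -= 1

print("dropouts with 50-packet buffer:", dropouts)
```

With a reasonable buffer the player rides out ordinary jitter; sustained delivery problems, like the pixelated Netflix case, mean the network is falling behind for longer than the buffer can cover.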
 
So, these algorithms are adaptive. They try their best to recover the signal. I'll give you a concrete example. I recently "upgraded" my house wifi to Wifi 7 (Netgear Orbi 7), and also upgraded my internet to 1 gig download and my cable modem to DOCSIS 3.1 (Motorola). Great, but this technology is pretty newfangled. I have an Apple Vision Pro that I use to stream very high resolution VR movies from Apple TV. Guess what, it started to pixelate. It choked, and it spluttered, and it struggled. Not just VR movies, but also Netflix, Disney+, MAX, etc. What was the problem? Well, as it turns out, the Apple Vision Pro is not a Wifi 6E device (that's only on the iPhone 15 Pro and iPhone 16, and some MacBook Pros and the new Mac minis). So, it had negotiated a lower-bandwidth transmission protocol, which I had to manually reset. Finally, it found the right negotiation protocol at 5 GHz, and all is well again.
I just noticed this post and am somewhat confused, because you specifically mentioned a negotiation protocol at 5 GHz.

Previously, when I was on WiFi 5 or below, there was often so much interference from my neighbors that I had to split the 2.4GHz and 5GHz WiFi bands and lock my devices to the 5GHz band to get good speeds and no drops over WiFi. I also tried my best to ensure that my own WiFi and my neighbors' didn't overlap in the 2.4GHz and 5GHz range. I did a lot of manual tweaking on my router.

But once I started with WiFi 6, I found it's way better to just let the router combine 2.4GHz and 5GHz and occupy as much bandwidth as possible. The band steering from WiFi 6 onwards is so good that I get much better throughput by putting 2.4GHz and 5GHz (and 6GHz for 6E and 7) under a single network name and letting the router figure out which device uses which band. Even my older non-WiFi-6 2.4GHz devices, like my Anova oven, lock on to the WiFi 6 network much better than when I manually separated out all the bands.

So when I read that you had to manually change the transmission protocol, specifically to 5GHz, I wonder if you're running your WiFi 7 network as if it were an older WiFi 5 (or earlier) network and getting suboptimal WiFi 7 performance. I actually did that at first with my WiFi 6, until I started experimenting further.

And no. My Vision Pro doesn’t pixelate on WiFi6 with this setup.
 
