@treitz3
I'm not schooling anyone... just pointing out, in a "Measurement Based Audio Forum", that you cannot "measure" an impact of ethernet on streaming data. The measurement side of this is not in dispute: the bits arrive intact unless the hardware itself is faulty.
I have, indeed, done development of audio over IP, so I have "measured" this, and I provided some of my work background to support my claim. It was important when we were laying out the first and second generations of multimedia over IP (in the 90s, working with ATM, optical, ethernet, Firewire, USB, satellite, cellular, cable, ISDN and DSL) to ensure that we could eliminate this. There was a LOT of money flowing into the field at the time. Cable, satellite and telcos all wanted a piece of entertainment streaming to the home.
Indeed, the biggest issue we used to have was the use of USB 1 by many commercial manufacturers until very late in the game. We knew, and measured, the issues with the limited bandwidth of USB 1 and its synchronous mode. We knew, and measured, that asynchronous mode over USB 2 was the solution, but the cheapskate consumer audio manufacturers were way behind the curve. Oddly, the more High End they were, the worse they were, because of their low sales volumes and the relatively high cost of their non-recurring R&D: they couldn't afford to keep up with the fast pace of the computer electronics industry.
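To put rough numbers on the bandwidth point, here's a back-of-the-envelope sketch. The figures are just standard PCM arithmetic against the nominal 12 Mbit/s of USB 1 full speed, not measurements from that era, and real isochronous transfers get noticeably less than the nominal bus rate:

```python
def pcm_bitrate(sample_rate_hz, bits_per_sample, channels):
    """Raw PCM payload rate in Mbit/s (no protocol overhead)."""
    return sample_rate_hz * bits_per_sample * channels / 1e6

USB1_FULL_SPEED = 12.0  # Mbit/s, shared bus, before protocol overhead

for rate, bits in [(44_100, 16), (96_000, 24), (192_000, 24)]:
    mbps = pcm_bitrate(rate, bits, channels=2)
    print(f"{rate / 1000:g} kHz / {bits}-bit stereo: {mbps:.2f} Mbit/s "
          f"({mbps / USB1_FULL_SPEED:.0%} of the USB 1 bus)")
```

Plain 44.1/16 fits with room to spare, but 24/192 stereo wants roughly three quarters of the entire shared bus before any overhead, which is why hi-res over USB 1 was a non-starter.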
By Y2K the problem was solved. At the time, if you wanted high quality streaming sound you used at least 100Mbit ethernet (from the modem) and USB 2, PCI or Firewire. And a player with a big data buffer to eliminate any timing mismatch between data input and output. As it turns out, the size of the memory buffer was the single most important factor, as it could absorb the burstiness of a bad datalink (cellular was the worst at the time).
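The buffering idea is easy to demonstrate. Here's a toy simulation (the numbers are mine, picked for illustration, not from any real player) of a playout buffer fed by a bursty link and drained at a steady rate:

```python
def simulate(prefill, pattern=(0, 0, 0, 40), consume=10, ticks=400):
    """Frames arrive in bursts (pattern repeats) and drain steadily.

    The link's average throughput (10 frames/tick here) exactly matches
    consumption; only its burstiness differs. Returns how many frames the
    player wanted but didn't have (underruns = audible dropouts).
    """
    level = prefill
    underruns = 0
    for t in range(ticks):
        level += pattern[t % len(pattern)]   # bursty arrivals
        take = min(level, consume)           # steady DAC-side drain
        underruns += consume - take          # starved this tick?
        level -= take
    return underruns

# Same link, same average rate; only the buffer depth changes:
print(simulate(prefill=5))    # shallow buffer: dropouts (prints 25)
print(simulate(prefill=30))   # deep buffer: zero underruns (prints 0)
```

The link never changes between the two runs; the deep buffer simply rides out the gaps between bursts, which is exactly why buffer size mattered more than anything else on a bad datalink.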
It still is. But most datalinks are quite robust nowadays, and memory is dirt cheap.
As an aside... CD players also had a similar issue with jitter, remember? I recall that Arcam (I think) was the first to put a small RAM buffer in to decouple the I/O: the clock used for reading the CD and the clock driving the DAC were separate. This is pretty much the same thing we do between a data link and a DAC.
BTW, as an audiophile, I provided just a hint of the complexity of the LAN in my own home. I stream data off my PLEX servers and I hear no issues with ethernet. I also mount the file systems and "pull" using VLC and Foobar, and I hear no impact on the music. My current system uses GigE drops to all wired ethernet interfaces and 802.11n for the wireless devices. I can assure you that neither has any impact on audio playback.
I do hear a difference between VLC and Foobar, but that's the quality of the decoding. I prefer Foobar. Tidal HiFi's own player is very good as well... The biggest issue I have with all of those players is that Chromebooks can't play higher than 24/48, whereas Android and Windows do support bit-perfect playback. But that's the drivers in the OS, not the fault of the decoding engines.
I hear differences between the op amps in a DAC. I suggest that all High End DACs should support op-amp rolling by providing a socketed interface, not an SMD setup.
But these are issues in the audio drivers in the software, not the data link.
If someone wants to dispute this, fine with me: provide some proof, or at least qualify your experience in this part of audio.
My simple claim is that you cannot hear an impact from ethernet streaming on your audio playback. If you hear one, then the problem is elsewhere in your system, most likely in a ground/noise plane.
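The bit-perfect part is trivially checkable at home: hash the file on the server, hash what arrives over the network mount, and compare. A minimal sketch, where a scratch file and a local copy stand in for the real server file and the real pull over the share:

```python
import hashlib
import os
import shutil
import tempfile

def sha256_of(path, chunk=1 << 20):
    """Stream a file through SHA-256 without loading it all into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Stand-in for a real track pulled over the LAN: TCP either delivers the
# exact bytes or the transfer errors out, so the copies should always match.
src = tempfile.NamedTemporaryFile(delete=False, suffix=".flac")
src.write(os.urandom(1_000_000))
src.close()
pulled = src.name + ".pulled"
shutil.copyfile(src.name, pulled)

bit_identical = sha256_of(src.name) == sha256_of(pulled)
print("bit-identical" if bit_identical else "transport corrupted the data")

os.unlink(src.name)
os.unlink(pulled)
```

Run the same comparison against your own rip and its network-mounted copy: if the hashes match, the ethernet leg handed your player exactly the bytes on disk, and anything you hear differently is downstream of the data link.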
But as always, it's up to you: if you're happy with it, go for it. After all, I love a bit of negative 2nd order harmonic myself. Heck, sometimes I even enjoy just a little of the old 3rd order harmonic too... it makes the drum kit sound faster... seriously, and yes! You can measure that!
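And since harmonics came up: measuring them really is simple. A toy example with a synthetic 1 kHz tone whose harmonic levels I picked purely for illustration (not figures from any real amplifier):

```python
import numpy as np

# One second of a 1 kHz tone at 48 kHz, with a dash of 2nd and 3rd
# harmonic mixed in, then each harmonic's level read off the spectrum.
fs, f0, n = 48_000, 1_000, 48_000
t = np.arange(n) / fs
x = (np.sin(2 * np.pi * f0 * t)
     + 0.010 * np.sin(2 * np.pi * 2 * f0 * t)   # 2nd harmonic at -40 dB
     + 0.003 * np.sin(2 * np.pi * 3 * f0 * t))  # 3rd harmonic near -50 dB

# With n == fs the bins are 1 Hz apart and every tone lands on an exact
# bin, so no window is needed; dividing by n/2 recovers sine amplitude.
spec = np.abs(np.fft.rfft(x)) / (n / 2)
for k in (1, 2, 3):
    level_db = 20 * np.log10(spec[k * f0])
    print(f"harmonic {k}: {level_db:6.1f} dB")
```

The spectrum gives the injected levels straight back, which is the whole point: if a component really adds 2nd or 3rd order harmonic, an FFT of its output will show it.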
Have fun.