How COULD upgraded Ethernet cables make a positive difference? What's behind it?

I am not at all using this as a strawman. I know the argument about noise and asked, way back in the thread, for data on who has measured it. Answering Carl's question has nothing to do with dismissing that argument.
From my perspective you have employed here & on other forums at least two & possibly 3 strawman arguments, consistently:

- you continuously simplify & misinterpret, despite all the explanations given to you, the meaning of noise in this context
- as a consequence you create a strawman argument that DAC manufacturers are negligent in not isolating their devices from such a SIMPLE noise issue
- you have in the past shown by your measurements, your misinterpretation (which I now believe to be willful) of noise in this context
- you introduce data corruption as a red herring, not as anything now relevant to the thread - only apparently relevant to you & cjf, who, btw, states this on ComputerAudiophile: "I'm a believer when it comes to using good quality cables of any flavor. Ethernet cables are a perfect medium for carrying the really ugly stuff upstream to the devices that they attach to." So I'm not sure why he's acting as a dupe for your continued strawman arguments?
 
John, you keep resorting to personal commentary. I am not interested in that as I mentioned back in the thread. It is all noise, angst and frustration on your part that doesn't interest me to address.

As for Carl acting as my dupe, I am going to ask you to be more professional and not resort to such name calling.
 
Ah well, I did try!

I already stated where this research is at but it appears that you don't seem to understand? There are various stages in any research - the first of which is a broad question for research - such as "can different ethernet cables audibly affect sound?" which needs to be narrowed down & honed into a testable hypothesis - mostly through observation & fitment of these observations to known mechanisms - often extending these mechanisms.

At least one possible hypothesis has been formulated but testing of it requires some specific skills & measurement techniques.

However, with or without measurements we always need to personally evaluate with listening so in essence you want measurements but this is not a prerequisite for audibility - in fact I would say they are relatively orthogonal.
I have said three times what I don't understand. I am looking for a *name*. The name of the person and what work has been done so far. You have posted half a dozen times but still have not said who the person is that is doing this research.

What is the hypothesis you talk about and what is the reason for its secrecy?
 

So you have looked at the work that has been done by the two engineers I referenced?
What are your exact questions?
To anyone reading this thread who doesn't have an agenda, it is very clear what the hypothesis is - S&M just restated it.
 
What work? You gave this link for Chord just today: http://www.head-fi.org/t/702787/chord-hugo/12795#post_12262342

There is nothing in there with respect to Ethernet cables. And what is there is stated as a summary position. There is no measurement that he speaks of and certainly no detail of any listening tests to back what he says occurs audibly. I wouldn't call that "research."

As for John, what link is there for him on the topic of Ethernet cables we can read?
 

Trolling of the highest order, Amir

You asked what work was being done in the area & you are unwilling to investigate the leads I gave you. If you bothered to search you would find measurements.

You claim to not understand the hypothesis being presented here & show no interest in trying to investigate it.

I call that trolling.

As I said, Amir you are willfully blind & no one can open your eyes despite all your faux pleadings for others to present "data" which you 'claim' to be interested in.

Let's see what excuses you use to reject these references to noise floor modulation & audibility?

"Noise Modulation in Digital Audio Devices" from Richard Cabot of Audio Precision:
Many methods exist for characterizing nonlinearities in digital systems including harmonic distortion, intermodulation distortion and level linearity. These techniques, while useful for engineering purposes, are not based on psychoacoustic principles and do not predict the audible performance of the equipment. The noise modulation technique offers an alternative which is modeled on the human hearing process and provides good correlation with subjective measurements. Results on typical equipment will be compared with those obtained using conventional techniques.

Or this from Stereophile, "Noise, Modulation, & Digital/Analog Conversion",
where he also references Louis Fielder of Dolby Labs:
Psychoacoustic research by Louis Fielder at Dolby Labs (those guys know something about noise-floor modulation!) indicates that noise-floor shifts of 2dB are audible. Further, Dr. Cabot's paper asserts that the ear is very sensitive to shifts in the noise floor's spectral balance; changes on the order of 1dB are reportedly audible.
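To put those dB thresholds in concrete terms, here is a quick back-of-envelope check (Python, purely illustrative; `level_change_db` is my own helper name):

```python
import math

def level_change_db(v_ref, v_new):
    """Level change in dB between two RMS noise voltages."""
    return 20.0 * math.log10(v_new / v_ref)

# Amplitude ratio corresponding to the 2 dB shift Fielder found audible:
ratio = 10 ** (2.0 / 20.0)
print(f"2 dB is a x{ratio:.3f} amplitude change "
      f"(about a {(ratio - 1) * 100:.0f}% increase)")
```

So the reported audibility threshold amounts to only about a quarter increase in noise amplitude.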
 

Where is the reference to Ethernet cable in any of that John? There isn't any.

This is what I asked you a few pages back:
You have some data on that relative to the topic of this thread, i.e. differences between Ethernet cables?
This is what you answered:
If by data you mean measurements then no, it's too early - this is still at the investigative stage & like any such investigation evidence of effect & circumstances are being accumulated.

Where is anything that you presented above referring to Ethernet cables?
 
Well, I see you can be led to water but you still refuse to drink or as Dorothy Parker is claimed to have said “You may lead a whore to culture but you can’t make her think”

I presume you accept that noise floor modulation & its spectral makeup are audible, but I would ask you to explicitly state whether you do or not?
 
Perhaps, it would help to reiterate a key point. The only mechanisms (of which I'm aware) that can affect the sound across a digital audio signal interface in a normal operating environment are 1) bit-errors, 2) jitter and 3) common-mode noise coupling to the analog circuits.

1) Different ethernet cables will effectively have the same bit-error-rate across the short, low-rate digital audio links commonly in domestic use. Therefore, bit errors are very unlikely to be responsible for any perceived sound differences.
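The bit-error point above is easy to quantify. A rough sketch, treating bit errors as independent and using an illustrative BER of 1e-12 (a figure commonly cited as a design target for gigabit links, not a measurement of any specific cable):

```python
def p_any_bit_error(ber, bits):
    """Probability of at least one flipped bit, assuming independent errors."""
    return 1.0 - (1.0 - ber) ** bits

# One hour of 16-bit/44.1 kHz stereo PCM:
bits_per_hour = 2 * 16 * 44_100 * 3600          # = 5.08e9 bits
p = p_any_bit_error(1e-12, bits_per_hour)       # BER 1e-12 is illustrative
print(f"P(any bit error in an hour): {p:.2%}")  # roughly half a percent
```

Even a full hour of audio would most likely pass with not a single flipped bit, which supports the point that bit errors cannot plausibly explain perceived cable differences.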

3) Common-mode noise coupling to the analog circuits is a plausible factor; however, transformer-coupled ethernet interfaces should effectively relegate such noise coupling to ultrasonic frequencies. So, although audio-frequency common-mode noise coupling to the analog circuits via the ethernet interface is itself probably inaudible in a direct sense, the potential for intermodulation downconversion effects makes it not impossible.
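The downconversion possibility mentioned above can be illustrated numerically. This sketch (tone frequencies and the square-law coefficient are assumed, illustrative values) passes two ultrasonic tones through a weakly nonlinear stage and recovers the in-band difference product:

```python
import math

def nonlinear(v, a2=0.01):
    """Ideal pass-through plus a small square-law term (a2 is illustrative)."""
    return v + a2 * v * v

FS = 1_000_000              # simulation sample rate
F1, F2 = 100_000, 110_000   # two assumed ultrasonic common-mode tones
N = 10_000                  # 10 ms: a whole number of cycles of every tone

out = [nonlinear(math.sin(2 * math.pi * F1 * n / FS) +
                 math.sin(2 * math.pi * F2 * n / FS))
       for n in range(N)]

# Correlate against the 10 kHz difference frequency: the square-law term
# lands an audible-band product there even though both inputs are ultrasonic.
fd = F2 - F1
amp = (2.0 / N) * sum(out[n] * math.cos(2 * math.pi * fd * n / FS)
                      for n in range(N))
print(f"10 kHz difference-tone amplitude: {amp:.4f}")  # ~0.01, i.e. ~a2
```

Two signals that are individually inaudible produce a 10 kHz component whose level tracks the nonlinearity coefficient.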

2) This leaves jitter. There are at least two sub-mechanisms for interface-induced jitter. One is common-mode noise corruption of the clock recovery (or clock generation) circuits via the interface. Clock jitter is an analog phenomenon. The other is impedance mismatching across the digital interface, its connectors and its board terminations. Impedance mismatching should only be a major concern if the interface carries a clock recovery reference signal, such as S/PDIF does. I don't know anything about the transmission protocol for digital audio via ethernet. I would assume that the data-flow-control master can be at the DAC, such as with asynchronous USB, but that's just my assumption. Perhaps someone knowledgeable on this will clarify.

In case the digital audio transmission protocol over ethernet contains a reference clock recovery signal, the following impedance matching concerns apply. Ethernet commonly utilizes Cat5 twisted-pair, which apparently is typically specified with a characteristic impedance of around 100 ohms +/- 15%, which is quite a variation for applications requiring an impedance matched line, such as those carrying a jitter sensitive clock signal. That variation is just for the cable impedance; I've no idea of the characteristic impedance of Cat5's modular RJ-45 type connectors, which should match that of the cable. An exact match is unlikely, since the cable's impedance is permitted a relatively wide variation. Then there's the resistive board termination, which, while stable, would likely be incorrectly impedance matched from one set of ethernet cable and connectors to the next, simply due to the cable's permitted variation.
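That +/-15% tolerance translates directly into reflection at an impedance step via the standard formula gamma = (ZL - Z0)/(ZL + Z0). A minimal sketch with the tolerance extremes (illustrative, assuming an ideal 100 ohm termination):

```python
def reflection_coefficient(z_load, z0=100.0):
    """Fraction of an incident wave reflected at an impedance step."""
    return (z_load - z0) / (z_load + z0)

# Cable at the tolerance extremes driving a nominal 100 ohm termination:
for z in (85.0, 100.0, 115.0):
    print(f"{z:5.0f} ohm cable -> gamma = {reflection_coefficient(z):+.3f}")
```

So a worst-case in-spec cable can reflect roughly 7-8% of an incident edge, which is the kind of cable-to-cable variation that matters only if a recovered clock rides on the link.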

The mechanisms necessary to cause jitter related artifacts which might vary between different ethernet cables and their connectors seem to be present, whether that interface has the data source as flow-controller (via impedance mismatching, and via the source-located clock generation circuit), or has the DAC as flow-controller (via common-mode noise corruption of a DAC-located clock generation circuit).
 
I presume you accept that noise floor modulation & its spectral makeup are audible, but I would ask you to explicitly state whether you do or not?
Depending on level and makeup, noise modulation can be hugely audible. I have heard it countless times in both music-independent and music-dependent versions. Indeed I earlier provided some experiments to see if they exist in Mike's system. Examples include NTSC audio tracks, old cassette tape decks, cheap computer DACs, etc.

If you are asking if it is audible in high-performance DACs that our membership here is likely to use the answer is no, I have not ever heard it in such systems. Any such DAC would immediately lose its credentials for being anything worthy of high-end.

Have you heard it and are you saying swapping Ethernet cables fixes it? If so, what were the circumstances and what did it sound like?
 
@Ken
I believe I mentioned this before as another possible mechanism, one which relates to any cable carrying differential signals:
- the balance of the differential signals needs to be maintained at the transformer, otherwise you get differential to common mode conversion, i.e. CM noise is actually generated by the transformer. Again, these are of no importance for error-free digital signal transmission but we are not dealing with that
http://www.sigcon.com/Pubs/edn/DifftoCommonMode.htm

And measured CM conversion: [image: slide_51.jpg]

Cable issues like skew can cause unbalanced signals & hence CM noise generation
[image: 4218Fig01.jpg]
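The skew mechanism above can be sketched numerically: the common mode of a pair is simply (V+ + V-)/2, which cancels exactly only when the legs are perfectly complementary. The leg frequency and skew figures below are assumed for illustration, not taken from any measurement:

```python
import math

def common_mode(vp, vn):
    """Common-mode component of a differential pair."""
    return 0.5 * (vp + vn)

F = 125e6       # assumed signalling frequency component on the pair
TAU = 100e-12   # assumed 100 ps of intra-pair skew on the inverted leg

# Sample one full period; with TAU = 0 the legs cancel exactly and the
# peak common-mode residue is zero.
peak_cm = max(abs(common_mode(math.sin(2 * math.pi * F * t),
                              -math.sin(2 * math.pi * F * (t - TAU))))
              for t in (i * 8e-12 for i in range(1000)))
print(f"peak residual CM: {peak_cm * 1000:.1f} mV per volt of signal")  # ~39 mV
```

Tens of millivolts of common mode appear from a skew of only 100 ps, even with an ideal transformer downstream.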
 
Hi Ken. Thanks for the summary. While the potential for impacting the output of a DAC using an Ethernet interface exists as you note, my sense is that in practice, when we are talking about cables, it is so far below the other factors that matter that we won't be able to pin anything on cables. I am hesitant to post measurements and such on this forum but I will make an exception and post this test that Archimago performed: http://archimago.blogspot.com/2015/02/measurements-ethernet-cables-and-audio.html

As you see, he found no difference in any test he performed, including jitter, when he changed the nature of the Ethernet cabling significantly.

That said, I plan to do some of this testing myself. I just bought an AudioQuest Ethernet cable and plan to do more extensive testing than Archimago did.

Thanks again.
 
jkeny:

I know that you're aware of all the following. I'm just stating it for the benefit of those who may not be familiar with the issue.

Common-mode to difference-mode conversion is why great pains have been taken in the best input transformers to match the parasitic capacitance presented on each input lead. The problem isn't really the parasitic capacitance, it's that the capacitances aren't exactly matched, which then serves to increasingly imbalance an otherwise balanced interface as common-mode noise frequency increases. Any impedance imbalance inherently causes some of any common-mode noise to convert to difference-mode noise and be amplified by the succeeding stages. Of course, a perfect matching of transformer parasitics isn't possible; efforts to do so can only push the resulting degradation up in frequency, where it will have less impact on the interfaced system.
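The frequency dependence described above can be sketched with a simple two-leg divider model; the source impedance and the 10% capacitance mismatch are assumed, illustrative values, not data for any real transformer:

```python
import math

def cm_to_dm_fraction(f, r_source=50.0, c1=100e-12, c2=110e-12):
    """Fraction of a common-mode voltage converted to difference mode by
    mismatched shunt capacitance on the two legs (all values assumed)."""
    def leg(c):
        zc = 1.0 / (2j * math.pi * f * c)   # capacitor impedance at f
        return zc / (zc + r_source)         # divider against source impedance
    return abs(leg(c1) - leg(c2))

# Conversion worsens as common-mode frequency rises:
for f in (1e3, 100e3, 10e6):
    print(f"{f:>10.0f} Hz: CM-to-DM fraction = {cm_to_dm_fraction(f):.2e}")
```

The conversion is negligible at audio frequencies but grows by orders of magnitude into the RF range, which is exactly why pushing the imbalance up in frequency helps.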
 
Hi, Amir,

Thanks for the link to that interesting page. Particularly interesting, of course, is the J-Test jitter chart featuring superimposed jitter spectrum graphs. The measured results were quite impressive. My initial observations are that the excellent results appear to indicate that the Ethernet protocol utilized DAC-side flow-control and master clock generation, which is ideal and should effectively eliminate any interface impedance mismatch induced jitter. However, the test does not appear to exclude ground-loop type noise as a jitter causing mechanism - unless a purposeful common-mode noise injection was included in the test, which I didn't note was mentioned. Ground-loop noise (which is one form of common-mode noise) depends on a number of factors, such as the quality of the local power distribution system and the presence of interference producing devices, such as motor appliances and appliances utilizing switch mode power supplies. Ground-loop noise also depends on the design-specific susceptibility of the powered devices being interfaced.
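For readers unfamiliar with the J-Test referenced above: it is commonly described as a sine at Fs/4 plus a one-LSB square wave at Fs/192, which at a 48 kHz rate gives the 12 kHz tone seen in those graphs. A sketch of such a stimulus (my own construction for illustration, not Archimago's code):

```python
import math

FS = 48_000        # assumed sample rate, putting the tone at 12 kHz
LSB = 1.0 / 2**15  # one 16-bit LSB

def jtest_sample(n):
    """One sample of a J-Test-style stimulus: an Fs/4 sine plus a
    one-LSB square wave at Fs/192 (250 Hz at this sample rate)."""
    tone = 0.5 * math.sin(2 * math.pi * (FS / 4) * n / FS)
    phase = math.sin(2 * math.pi * (FS / 192) * n / FS)
    return tone + (LSB if phase >= 0 else -LSB)

samples = [jtest_sample(n) for n in range(192)]  # one cycle of the square wave
```

The low-rate LSB toggle is the stress component: it exercises data-dependent jitter mechanisms, and its sidebands around the 12 kHz tone are what the superimposed spectra are meant to reveal.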
 

Sure, Ken, thanks. Readers should also be made aware that it's not just the matching of windings in the transformer that is crucial & can lead to the creation of CM noise: cabling differences can also cause different levels of unbalanced signals which, when presented to even a perfect transformer, will likewise lead to the creation of CM noise. Cable issues such as twist topology, shield structure, grounding topology, shield-current-induced noise, etc. are all factors that can be in play.

My experience is with USB cables, among which I can confirm there are audible differences between some, when the replay system is of reasonably good quality. I haven't direct experience of the audible differences between different ethernet cabling but I see no electrical reason why it should be immune to the same CM noise issues that are audibly evident on USB transmission. The very fact that transformers are involved in ethernet brings pluses & minuses to the equation - pluses in so far as ground is isolated & negatives regarding the CM noise issues above. But I can confirm that in my experiments with USB, ground was not where the noise issue was; it was on the differential signal wires - isolation removed it but something else was still present (or was introduced) on these signal wires that USB signal regeneration removed. I suspect that it was skew of the differential signals which, when being handled by the final USB differential receiver, resulted in CM noise, much the same as would be the case with an ethernet transformer.

Of course Archimago (always reminds me of Mr Magoo) is as blind as Amir & his simplistic measurements will always show no difference, that's what his internet persona is about - quelle surprise
 
Ken, Magoo's jitter measurements are typical of the (lack of) care that he invests in his measurements. Has he characterised the jitter of the device he is using for measurement? Is it an order of magnitude below the jitter he is trying to measure? I mean, are we relying on such rookie, not even technician-level, agenda-driven measurements as evidence?

Magoo's comment "As you can see, there is some normal variability in the noise floor around the 12kHz primary frequency but otherwise, nothing sticks out. There's some low-level jitter around 12kHz, some of which I'm sure related to the E-MU device itself rather than just the Transporter." shows the sort of "nothing to see here" attitude typical of such attempts - if it isn't gross & hitting one in the face, it's of no consequence. What variability in the noise floor is seen around 12kHz? Is it cable specific? Is it repeatable? Of course we can't see the variability because of the gross scale used on the frequency x-axis. He has no interest in following this up - not even a technician's level of interest.

Commenting on them in any serious manner, other than to point out their flaws, is granting them a status undeserved by such blundering. Citing them as examples of measurements says more about the citer than he may wish revealed.
 
...There's some low-level jitter around 12kHz, some of which I'm sure related to the E-MU device itself rather than just the Transporter." Shows the sort of "nothing to see here" attitude typical of such attempts - if it isn't gross & hitting one in the face, it's of no consequence...

Yes, the above faulty assumption seems to be all too common among audio engineers. If some instrumented readout or display result doesn't offend the observer's eye, then it's assumed that the result also shouldn't offend the ear. Such a conclusion is not only patently faulty for an audio device, but illogical as well. I've encountered that attitude before in debates on another website (I'm not referring to anyone here), usually presented as proof of some preferred technical point. That person often derisively accused others of technical 'handwaving' when they failed to prove some technical contention via words alone, but apparently couldn't see that his misuse of an instrument reading essentially made him guilty of the same. I suppose the instrument made him feel more empowered, though no less faulty, in his technical handwaving.
 

Yes, I know of whom you speak ;)
 
Sure, Ken, thanks. Readers should also be made aware that it's not just the matching of windings in the transformer that is crucial & can lead to the creation of CM noise: cabling differences can also cause different levels of unbalanced signals which, when presented to even a perfect transformer, will likewise lead to the creation of CM noise. Cable issues such as twist topology, shield structure, grounding topology, shield-current-induced noise, etc. are all factors that can be in play.
You say these things as if you have such data, but earlier I asked if you had any measurement data for Ethernet cables and you said no. Are you changing that position, or is this stuff you are throwing out there with no basis to verify it for the topic at hand?
 
My experience is with USB cables, among which I can confirm there are audible differences between some, when the replay system is of reasonably good quality. I haven't direct experience of the audible differences between different ethernet cabling but I see no electrical reason why it should be immune to the same CM noise issues that are audibly evident on USB transmission. The very fact that transformers are involved in ethernet brings pluses & minuses to the equation - pluses in so far as ground is isolated & negatives regarding the CM noise issues above. But I can confirm that in my experiments with USB, ground was not where the noise issue was; it was on the differential signal wires - isolation removed it but something else was still present (or was introduced) on these signal wires that USB signal regeneration removed. I suspect that it was skew of the differential signals which, when being handled by the final USB differential receiver, resulted in CM noise, much the same as would be the case with an ethernet transformer.
Ethernet is both a physical interface and a boatload of software protocol above it. It always, always runs asynchronously from the audio stream. It is also always non-real-time. It is also transformer interfaced. None of this is true of USB. To equate them is to think that riding an elephant is the same as driving a car because they both move you!

I recently did a teardown of the Sonore microRendu. I suggest reading that to understand the massive difference. A dual-core CPU running Linux, with more power supplies than you can shake a stick at (including a switch-mode input supply), is used to provide support for Ethernet input. None of that is remotely necessary for USB-only interfacing.

Every audio sample likely goes through hundreds of thousands of CPU instructions before being available to be sent to the DAC. There are a ton of asynchronous and synchronous activities going on in the CPU core(s) and networking stack, none of which is the case with plain USB.
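The asynchronous, non-real-time behaviour described above amounts to a FIFO decoupling the sender's timing from the DAC's clock. A toy model (class name and behaviour purely illustrative, not any actual renderer's code):

```python
from collections import deque

class AsyncAudioBuffer:
    """Toy model of an asynchronous network-audio link: the network side
    pushes packets on the sender's schedule; the DAC side pulls samples on
    its own local clock, so sender timing never reaches the conversion clock."""
    def __init__(self):
        self.fifo = deque()

    def network_push(self, packet):
        """Bursty arrivals, timed by the sender and the network stack."""
        self.fifo.extend(packet)

    def dac_pull(self):
        """Steady consumption, timed by the DAC's master clock."""
        return self.fifo.popleft() if self.fifo else 0  # underrun -> silence

buf = AsyncAudioBuffer()
buf.network_push([1, 2, 3])
out = [buf.dac_pull() for _ in range(5)]
print(out)  # -> [1, 2, 3, 0, 0]
```

As long as the buffer never underruns, the arrival timing of the packets (and hence the cable) has no direct path to the conversion clock.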

You put all that aside and went after what the cable does?

Please spend some time to read and understand the full system here. If you don't understand networking you really have no business even entering such discussions. You are just as blind as the Wireworld person was, talking about data errors and such.

There is a lot of complexity here that goes beyond even the most addicted forum junkie. You can't latch onto a phrase like common mode noise or noise modulation and think that gets you any mileage.
 

About us

  • What’s Best Forum is THE forum for high end audio, product reviews, advice and sharing experiences on the best of everything else. This is THE place where audiophiles and audio companies discuss vintage, contemporary and new audio products, music servers, music streamers, computer audio, digital-to-analog converters, turntables, phono stages, cartridges, reel-to-reel tape machines, speakers, headphones and tube and solid-state amplification. Founded in 2010 What’s Best Forum invites intelligent and courteous people of all interests and backgrounds to describe and discuss the best of everything. From beginners to life-long hobbyists to industry professionals, we enjoy learning about new things and meeting new people, and participating in spirited debates.
