HDMI vs. Coaxial Digital Interconnects

fas42

Addicted To Best
Jan 8, 2011
NSW Australia
I must admit that I am fatiguing of your posts that look down upon all of us "who don't yet understand how audio works".
I apologise if that is how I am coming across. What I am trying to suggest is that there are factors at work, in terms of very low levels of distortion and secondary electrical effects, which people working at highly technical levels in electronics have to think about all the time, and which the general audio world ignores at its peril, so to speak ... :D Talk to an engineer at Analog Devices about how scrupulous you have to be to make a 24-bit D/A converter REALLY work properly, and you will understand how fussy one needs to be about such things.

All I have been talking about has been said by technical people many times before, going back 25 years or so; Ben Duncan in HiFi News used to pound away at these things over and over again! Yet to read what people say even now, all that understanding and thinking has been thrown in the rubbish bin, just because it is a bit hard!

I'll also say I'm selfish enough to want to have other people gain greater pleasure from listening to their own systems, through better understanding of what they can do to improve them. For that too, I am very sorry ...

Okay, sorry, rant off ...:):)

Frank
 

amirm

Banned
Apr 2, 2010
Seattle, WA
If the designer CLAIMED that he had designed such an amp, that is, 2000 watts and zero distortion, then it would be a failure.
You are right. I used a bad analogy. It was bad because the Federal Trade Commission regulates what companies can say about the power ratings of amps. Sadly, there is no regulation over empty claims of zero jitter. :D

I have seen countless DACs claim they have no jitter or ridiculously low jitter. Yet they never back that with a measurement, nor do independent reviews show that to be the case.

But the process of creating a jitter-free clock from jittery data in is a straightforward design exercise, executable with standard design techniques, and in fact the purveyors of many DACs claim the ability to deal with high levels of jitter.
Straightforward? I consider PLL design to be the most challenging job in circuit design. It is part science, part black magic to get right. I have had to nearly fire engineers who, after six months and three spins of their designs, couldn't get the PLL right to capture bits properly from an SDI video connection over long cables. The tendency is to take cookbook diagrams from the IC company and use them, but that never performs. You need excellent analog design skills, including the esoteric effects of PCB layout, power supply, coupling, etc.
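To make the feedback loop Amir is describing concrete, here is a toy second-order digital PLL in Python: a phase detector, a proportional-plus-integral loop filter, and a numerically controlled oscillator tracking a slightly off-nominal, jittery clock. This is a hedged illustration only; the function name `pll_track` and the gains `kp`/`ki` are invented for the sketch, and a real silicon PLL is an analog circuit, not twenty lines of arithmetic.

```python
import math
import random

def pll_track(phases, f_nominal, fs, kp=0.02, ki=1e-4):
    """Toy second-order digital PLL: follow the phase of an incoming
    clock (given as measured phase samples in radians) and recover
    its frequency.  Returns the per-sample frequency estimates."""
    theta = 0.0                              # NCO phase estimate (rad)
    omega = 2 * math.pi * f_nominal / fs     # start at the nominal rate
    history = []
    for meas in phases:
        # Phase detector: wrapped error between input and estimate
        err = math.atan2(math.sin(meas - theta), math.cos(meas - theta))
        omega += ki * err                    # integral path tracks frequency
        theta += omega + kp * err            # NCO advances by the estimate
        history.append(omega)
    return history

# Simulate a 1000.5 Hz clock observed at 48 kHz with some phase jitter
fs, f_true = 48000.0, 1000.5
rng = random.Random(0)
true_inc = 2 * math.pi * f_true / fs
phases = [n * true_inc + rng.gauss(0, 0.02) for n in range(48000)]

omegas = pll_track(phases, f_nominal=1000.0, fs=fs)
# Average the settled estimates to read out the recovered frequency
f_est = sum(omegas[-1000:]) / 1000 * fs / (2 * math.pi)
print(f"recovered: {f_est:.2f} Hz")
```

Even this trivial loop shows the trade-off Amir alludes to: widen the loop (larger gains) and it locks faster but lets more input jitter through; narrow it and the recovered clock is cleaner but lock is slower and more fragile.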

And again, there is no shortage of "claims." There is a shortage of proof.

To repeat, Naim have solved the problem very nicely by sidestepping it: the DAC watches the data coming in and selects, from a number of extremely stable internal clocks, the one that best matches the incoming signal. The jitter is then only as low as the quality of quartz clock the manufacturer is willing to pay for ...

Frank
Have they really? Here is the web page for that DAC: http://www.naimaudio.com/hifi-product-type/583. Please tell me why there is not one measurement proving that. If they have achieved zero jitter, they should be shouting it from the mountaintop with measurements to match. We are not talking about subjective things that can't be measured.

I spent an hour looking for measurements. Sadly, most of the reviews are subjective in nature and lack measurements. I did find one with some data: http://www.naimaudio.com/userfiles/modules/product/reviews/pdf/dac_hifi-world_jan2010.pdf

" Distortion levels were low at higher music levels but rose above
the expected value of 0.22% at -60dB, measuring a high 0.53% with a 16bit
signal. With a 24bit signal this figure should drop to below 0.1% or lower, but
with the DAC it remained a stubbornly high 0.38%, so the DAC isn’t especially
linear in itself.


Output was a normal 2.3V and EIAJ dynamic range a modest 106dB due to
the mediocre performance at -60dB. The Naim DAC should sound smooth
enough. It isn’t a low distortion design however and is unimpressive here. "

Well, guess what? Jitter increases noise floor or harmonic distortion. So while we don't know for sure the cause of the high level of distortion, we certainly have some arrows pointing to less than perfect execution. If these things are so easy to do, then why the above results?

There are many ways jitter can get introduced into the DAC. Even if you separate the clocks completely, as they claim, jitter can still bleed in from the power rails, EMI, RF, circuit leakage, etc. This is why I said good PLLs are hard to design. Heck, even power supply ripple can create jitter independent of anything else.

Also note that if Naim can't match the rate to its internal clocks, it resorts to resampling the audio (interpolation). That will arguably degrade the audio more than jitter.
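The resampling fallback Amir mentions can be illustrated with the crudest possible rate converter: linear interpolation between neighbouring samples. This is a hedged sketch only; `resample_linear` is a hypothetical helper, and real DACs use far better polyphase/sinc filters. The point is simply that the output samples are computed, not original.

```python
def resample_linear(samples, ratio):
    """Convert sample rate by `ratio` (out_rate / in_rate) using
    linear interpolation between neighbouring input samples.
    A crude stand-in for the polyphase filters real DACs use."""
    out = []
    n_out = int((len(samples) - 1) * ratio) + 1
    for m in range(n_out):
        pos = m / ratio              # fractional position in the input
        i = int(pos)
        frac = pos - i
        if i + 1 < len(samples):
            # Exact only for locally linear signals; real music
            # picks up interpolation error (distortion) here.
            out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
        else:
            out.append(samples[i])
    return out

ramp = [float(n) for n in range(10)]      # a linear test signal
up = resample_linear(ramp, 2.0)           # 10 samples -> 19 samples
print(up[:5])                             # → [0.0, 0.5, 1.0, 1.5, 2.0]
```

A ramp resamples perfectly because it is linear; anything with curvature between samples does not, which is why interpolation is a real (if usually small) source of error.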

So no, nothing about this is simple. Unless you have a lot of gray hair and really, really understand high-performance PLL, analog and audio design, you are likely to get this more wrong than right.
 

amirm

Banned
Apr 2, 2010
Seattle, WA
Here is some more. A quick search on the Stereophile web site shows these measurements on the Esoteric D-07 DAC with two PLLs: http://www.stereophile.com/content/esoteric-d-07-da-processor-measurements

Keep in mind that anything much more than 250ps of jitter compromises the low-order bits of a 16-bit sample at 20 kHz:

"I assessed the Esoteric D-07's rejection of word-clock jitter in many different configurations: TosLink, AES/EBU, or USB inputs; in single- or dual-PLL receiver modes; and with 2Fs, 4Fs, or DSD upsampling. Upsampling to DSD always gave the lowest measured jitter level, with AES/EBU better than TosLink and USB about the same as TosLink. Surprisingly, the dual-PLL mode gave only a small reduction in jitter sidebands, though it did affect the spectrum (see later).

Fig.14 shows the D-07's output spectrum decoding a 16-bit J-Test signal from the Audio Precision SYS2722 via 15' of plastic TosLink cable, with no upsampling and PLL1. Strong sidebands can be seen at ±120Hz (power-supply–related) and ±229.5Hz (data-related; all other data-related sidebands are at the test signal's residual level). The Miller Analyzer calculated the level of the sidebands to be equivalent to a high 1340 picoseconds (ps) peak–peak of jitter, which improved to 1065ps p–p with PLL2. Upsampling to 2Fs reduced the jitter further, to 900ps; 4Fs gave a slightly worse performance, 985ps, and DSD gave a reduction to 600ps. A similar trend could be seen with AES/EBU data: 1340ps with no upsampling, 705ps with 2Fs upsampling, 695ps with 4Fs, and 646ps with DSD.



Fig.15 shows that switching in the second PLL changes the level of the data-related sidebands but not the supply-related spuriae. But it also adds a broad hump of low-frequency random jitter, which spreads and increases the level of the noise floor either side of the central spike that represents the 11.025kHz tone. In this, the D-07's PLL2 mode very much resembles the Meridian 518 processor I reviewed in January 1996 (see the grayed-out trace in fig.1).



Note the bolded section. In theory, the second PLL should be all goodness, but it is not; it is adding errors of its own. Did I say PLL design was hard? I thought I did :D. Here is what Esoteric says about the DAC and jitter:

"and a 2nd PLL effectively reduces the jitter for digital devices...."
 

RBFC

WBF Founding Member
Apr 20, 2010
Albuquerque, NM
www.fightingconcepts.com
Here's what Steve Nugent (of Empirical Audio fame) has to say about SPDIF cable length:

http://www.positive-feedback.com/Issue14/spdif.htm

The write-up for the Van den Hul digital SPDIF cables also mentions signal reflections. Other articles I have read go as far as suggesting a custom length for each transport/DAC combination to minimize reflections. Stealth Audio suggests different digital cable architectures for different lengths to assure consistent 75 Ohm performance.

It simply seems that there is a mountain of information supporting the existence of cable-length effects on DAC performance.

Lee
 

fas42

Addicted To Best
Jan 8, 2011
NSW Australia
There are many ways jitter can get introduced into the DAC. Even if you separate the clocks completely, as they claim, jitter can still bleed in from the power rails, EMI, RF, circuit leakage, etc. This is why I said good PLLs are hard to design. Heck, even power supply ripple can create jitter independent of anything else.
That, in one sentence, perfectly expresses everything I have been rabbiting on about in this forum. Yes, every one of those things is affecting the audio signal, distorting it, expressed here by the term jitter, but the very same things are happening in the phono stage, the preamp, the power amp, even within the reel-to-reel! Elsewhere it's just not called jitter; it's called glare, haze, hash, noise, lack of musicality, on and on it goes. Low level distortion!!

The Naim can't claim zero jitter; at best the residual would be directly related to the phase noise of the crystal, as a major contributor. The fact that there is still distortion is because it is still a box of electronics, as you say; everything interacts, everything affects everything, and the best you can do is minimise the MOST significant problems as well as you can for the money spent.
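The link between crystal phase noise and residual jitter can be made quantitative: RMS jitter comes from integrating the single-sideband phase-noise curve L(f), as tj = sqrt(2·∫10^(L(f)/10) df) / (2π·f0). A sketch with made-up but plausible phase-noise segments for a hypothetical 11.2896 MHz audio master clock (the function name and the dBc/Hz figures are illustrative, not any vendor's data):

```python
import math

def rms_jitter_s(f0_hz, bands):
    """RMS jitter from single-sideband phase noise.
    `bands` = [(f_lo, f_hi, L_dBc_per_Hz), ...], approximating the
    phase-noise plot as flat segments."""
    integral = sum((hi - lo) * 10 ** (l_db / 10) for lo, hi, l_db in bands)
    return math.sqrt(2 * integral) / (2 * math.pi * f0_hz)

# Hypothetical 11.2896 MHz (256 x 44.1 kHz) master clock:
bands = [(100, 1_000, -120),        # close-in noise
         (1_000, 10_000, -135),
         (10_000, 100_000, -145)]   # broadband floor
tj = rms_jitter_s(11_289_600, bands)
print(f"{tj * 1e12:.2f} ps RMS")
```

With numbers in this range the crystal itself contributes well under a picosecond, which is exactly Frank's point: the crystal is rarely the limit; the rest of the box is.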

Frank
 

RBFC

WBF Founding Member
Apr 20, 2010
Albuquerque, NM
www.fightingconcepts.com
The fact that there is still distortion is because it is still a box of electronics, as you say; everything interacts, everything affects everything, and the best you can do is minimise the MOST significant problems as well as you can for the money spent.

Frank

So, if I can reduce an audible distortion in my digital chain by merely lengthening an inexpensive cable, how does that contradict what you are saying here? Rather than trying to out-design extremely competent audio engineers, does it not make sense to start with tasks that reduce distortion with minimal intrusion into the circuitry?

Lee
 

fas42

Addicted To Best
Jan 8, 2011
NSW Australia
In theory, the second PLL should be all goodness, but it is not; it is adding errors of its own. Did I say PLL design was hard? I thought I did
Yes, there are never any guarantees that it has been done right. Though Tim might have a thing or two to say about how well a piece of decent pro gear would do that sort of thing ... :)

No, I won't pretend to fully comprehend the graphs, but I do note the levels at which these anomalies occur: -100dB, -95dB, -115dB. Hmmm, where was that article about the $200k valve amps everyone was raving about, with wideband distortion at -60dB, rising to -40dB on peaks ... :)

Frank
 

DonH50

Member Sponsor & WBF Technical Expert
Jun 22, 2010
Monument, CO
Lee, that write-up does a decent job of explaining the cable issue, at least to me. It correctly correlates risetime with cable length and reflections, and explains why a reflection returning at the wrong time matters. The only thing not covered is the impact when translated to the DAC, after clock recovery, which should minimize (but not eliminate) the effects.

Frank, I am not sure where you are coming from, nor of your level of technical expertise. I simply don't have additional time to devote to this, and suspect I will never answer to your satisfaction if the answer does not agree with your perception. I am likely simply not good enough at explanations via the 'net; at least with colleagues and grad students I get a whiteboard. I suggest you check out the Jitter 101 and 102A/B/C threads in the technical forum; there is a lot more detail there. For PLLs, there are any number of textbooks available with lots of pictures (and a few equations ;) ). Amir has also provided quite a bit of information and references, as have others. If you already know this decades-old stuff, perhaps you could contribute some explanation and pictures of your own?

BTW, I have friends at ADI, and some have been coworkers through the years. After over a quarter of a century designing high-speed data converters, I guess I'm too old and slow for you, as I do not consider the clock circuit "easy", even when the requirements at audio are in the ps or tens-of-ps range instead of the few to tens of fs I've had to deal with now and then. My experience is more in line with Amir's: it takes science, skill, and a bit of magic to make a high-quality clock circuit. Process technologies evolve, requirements change, and the addition of large digital signal processors adds design considerations and challenges unheard of 25 years ago. Not to mention that there weren't many, if any, 24-bit DACs around... There are a legion of ways jitter can get into the system.
 

fas42

Addicted To Best
Jan 8, 2011
NSW Australia
So, if I can reduce an audible distortion in my digital chain by merely lengthening an inexpensive cable, how does that contradict what you are saying here? Rather than trying to out-design extremely competent audio engineers, does it not make sense to start with tasks that reduce distortion with minimal intrusion into the circuitry?
Lee, I agree with you there 100%. If you can do mods, tweaks, and enhancements which compensate for problems, no matter how minor, and which improve the sound, then yes, go for it as much as you can! Part of what I am trying to say is that the design engineers, for all sorts of reasons such as cost cutting and design philosophies, will do things which are not optimum, and you obviously should get around those if you can!

The only thing is, sometimes I feel the mods start to get a bit silly, a $2,000 cable on a $500 DAC, that type of thing, and the mods can introduce another level of instability: a good example is a cable that subtly distorts when vibrated by the sound waves, say.

Frank
 

garylkoh

WBF Technical Expert (Speakers & Audio Equipment)
Sep 6, 2010
Seattle, WA
www.genesisloudspeakers.com
Don and Amir, thank you for taking the effort and time to explain. While I don't pretend to understand 80% of this discussion, at least now I know what I don't understand. Because of the imperfection of the interface, it takes a longer digital cable to give the interface a chance to work better. Is that a reasonable characterization of the problem? You were both far more comprehensible than two engineers from competing companies who tried to explain to me why their DAC didn't work perfectly with the other's.
 

fas42

Addicted To Best
Jan 8, 2011
NSW Australia
Don, I am most certainly not claiming my knowledge to be anything more than what I have picked up from the readily available literature. My point is that the "normal" jitter, that is, the jitter due to the signal coming from the transport and passing through the cable up to the point where it plugs into the DAC, can be, and has been, eliminated by a number of techniques; as another example, the Genesis Digital Lens years ago did this using a huge buffer.

Jitter can then be re-introduced inside the DAC by poor implementation and other design deficiencies. Where this all started was in debating what cable is good enough to do the job; my comment is that the worst cable should do the job IF the DAC circuitry were done properly ...

Frank
 

fas42

Addicted To Best
Jan 8, 2011
NSW Australia
at least now I know what I don't understand
You made me laugh, Gary, I needed that ...

it takes a longer digital cable to give the interface a chance to work better. Is that a reasonable characterization of this problem?
No, not at all. The DAC needs to tune in and lock on to the signal, the same concept as a car radio seeking a station. No more and no less than that. If the transmitter is a lousy one, like those old, terrible shortwave overseas broadcasts you struggled to listen to, then the poor old DAC has to wobble around electronically trying to keep up with the frequency changes. If it can't, no sound; if it has to struggle like crazy to do so, this will often mangle the sound as its circuitry bounces around keeping up. Hope that helps ...

Frank
 

RBFC

WBF Founding Member
Apr 20, 2010
Albuquerque, NM
www.fightingconcepts.com
This whole discussion began with my comment that HDMI, a notoriously high-jitter interface (according to many comments and manufacturers), sounded better as the link to my Krell's DACs than a 1m coaxial digital cable. It was suggested that the short length of the coaxial cable might be contributing to its comparatively poor sound. In order to rule out cable length as a causative factor, as well as confirm that the cable was not "damaged", etc., I elected to borrow a few digital cables and try them out.

Then, as can often be the case, the discussion took a wonderful turn as jitter in digital cable interfaces became the topic. Amir provided evidence that even very-high-end transports and DACs still struggle with the various forms of jitter. Apparently, no manufacturer has actually completely conquered the bugaboo of jitter, else everyone would be using that implementation in their digital products. Does that summarize things reasonably well?

I felt it would be good to review just how we got to this point in the discussion. :)

Lee
 

Orb

New Member
Sep 8, 2010
Hooray, I found a nice, readable article that ties together what we have been saying :)
It covers, in a nice simple way, the clocking and data-format aspect I was going on about, to give a feel for how the 1s and 0s have meaning, and it also shows the boundary effect of jitter (and, critically, the D-A converter stage).

If it does not seem to make sense, please take the time to read it, as it is pretty good and covers both of our discussions on digital.
http://www.soundonsound.com/sos/feb07/articles/interfacing.htm

Although it is a shame it does not mention impedance mismatch (e.g. 75 ohm slightly out, which can be a problem with RCA connectors or short cable lengths).
Thanks
Orb
 

Orb

New Member
Sep 8, 2010
Here is the best engineering paper I can find that explains S/PDIF clocking and data very well, including showing rise and fall times; please scroll down to page 31, "Stereo audio codecs with SPDIF interface".
It shows nicely, with diagrams, just how those 1s and 0s are structured for the two channels (important to remember that each channel is separate; this caused some arguments on the SP forum, as they can be used independently for timing tests).
The document takes a little time to fully load:
http://www.scribd.com/doc/12964494/SPDIF

Thanks
Orb
 

Orb

New Member
Sep 8, 2010
This whole discussion began with my comment that HDMI, a notoriously high-jitter interface (according to many comments and manufacturers), sounded better as the link to my Krell's DACs than a 1m coaxial digital cable. It was suggested that the short length of the coaxial cable might be contributing to its comparatively poor sound. In order to rule out cable length as a causative factor, as well as confirm that the cable was not "damaged", etc., I elected to borrow a few digital cables and try them out.

Then, as can often be the case, the discussion took a wonderful turn as jitter in digital cable interfaces became the topic. Amir provided evidence that even very-high-end transports and DACs still struggle with the various forms of jitter. Apparently, no manufacturer has actually completely conquered the bugaboo of jitter, else everyone would be using that implementation in their digital products. Does that summarize things reasonably well?

I felt it would be good to review just how we got to this point in the discussion. :)

Lee

Because we are geeks and audio/digital engineering is fun :)

But seriously, all the recent banter helps readers build up a good understanding of audio and how it is streamed and clocked.
It helps give some meaning to how those 1s and 0s are sent, and how they relate to cables, S/PDIF, and clocking jitter.

Cheers
Orb
 

DonH50

Member Sponsor & WBF Technical Expert
Jun 22, 2010
Monument, CO
Don, I am most certainly not claiming my knowledge to be anything more than what I have picked up from the readily available literature. My point is that the "normal" jitter, that is, the jitter due to the signal coming from the transport and passing through the cable up to the point where it plugs into the DAC, can be, and has been, eliminated by a number of techniques; as another example, the Genesis Digital Lens years ago did this using a huge buffer.

Jitter can then be re-introduced inside the DAC by poor implementation and other design deficiencies. Where this all started was in debating what cable is good enough to do the job; my comment is that the worst cable should do the job IF the DAC circuitry were done properly ...

Frank

Can be, yes, up to a point, but many manufacturers do not take those steps. Why? Because it is hard, takes a lot of R&D (and lab time) to implement, and of course is costly both to develop and to produce.

The cable can corrupt the data stream in a number of ways (attenuation, dispersion, impedance mismatch, poor shielding, etc.) so I would not say the worst cable would work... The other issue, reflections, can be a problem even if the cable is perfect. In fact, a perfect cable would allow ALL the reflections to bounce around, potentially making that problem worse.

Making a good high-frequency source and load is a challenge, even when the frequencies are not that high (maybe tens of MHz for a coax link, but possibly 100 MHz+ for a video link; I do not know the A/V data rates off the top of my head). A mismatch that looks small to an RF engineer can cause large jitter at the DAC.

There is no simple answer to this issue...

I did find an old Analog Devices app note -- search their website for AN-756, "Sampled Systems and the Effects of Clock Phase Noise and Jitter". Brad's a good guy...
 

DonH50

Member Sponsor & WBF Technical Expert
Jun 22, 2010
Monument, CO
Don and Amir, thank you for taking the effort and time to explain. While I don't pretend to understand 80% of this discussion, at least now I know what I don't understand. Because of the imperfection of the interface, it takes a longer digital cable to give the interface a chance to work better. Is that a reasonable characterization of the problem? You were both far more comprehensible than two engineers from competing companies who tried to explain to me why their DAC didn't work perfectly with the other's.

Well, no... The problem with this subject is that, in trying to be "right", we are continually heading deeper into the swamp!

Let me hit just two points and try to bring some closure to the cable part. Note that this is simplified, so all you other engineers out there please ignore my lack of rigor.

1. A signal is launched from the source with impedance Zo into a cable X feet (meters, whatever) long and then hits the load. It takes a little time for the signal to travel through the cable from source to load. Assume the cable is lossless and impedance Zo ("perfect").

a. If the source, cable, and load are all perfectly matched, i.e. all have impedance Zo over all significant frequencies, then ALL the signal from the source goes into the load and all is well.

b. If the load's impedance is not Zo, then a voltage (and current) divider is formed. Part of the signal goes into the load as it should, but the rest (the return signal) bounces back toward the source. At the source, the return signal now modulates the outgoing signal. Remember, it took some time for the signal to reach the load, and the same time for the returning "error" signal to get back to the source. The source is still spitting out data, of course. If the return hits the source during the middle of the high (1) or low (0) period of the data, then (unless it is very large!) nothing really happens; a tiny glitch will not cause any problems in the middle of the bit. BUT, if the return hits while the bit is changing, then it can move the edge a little and change the zero-crossing, and thus the clock circuit on the other end sees a slightly different period than it should. That's where the jitter comes from.

It is impossible to perfectly match everything, of course, so there will always be some small reflections going back and forth.

Now, hopefully it is clear why changing the cable's length might help (or hurt). Changing the length changes the time at which that return signal from the load hits back at the source and corrupts the next outgoing bit. If the cable's length is such that the return happens when the signal isn't moving, no harm. If it hits when the signal is changing state (on an edge), it can effectively move the edge around, and we have trouble!
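Don's timing argument, and the Steve Nugent article Lee linked earlier, can be put into rough numbers: the reflected fraction at the load is Γ = (ZL − Z0)/(ZL + Z0), and the round trip takes 2·length/velocity. The assumptions below (velocity factor 0.66, a 25 ns transmitter risetime, a 44.1 kHz S/PDIF stream) are illustrative values for the sketch, not figures from either source:

```python
def reflection_coeff(z_load, z0=75.0):
    """Fraction of the incident voltage reflected at the load:
    gamma = (ZL - Z0) / (ZL + Z0)."""
    return (z_load - z0) / (z_load + z0)

def round_trip_ns(length_m, velocity_factor=0.66):
    """Time for the launched edge to reach the load and bounce back."""
    c = 299_792_458.0
    return 2 * length_m / (velocity_factor * c) * 1e9

# A nominally 75 ohm RCA jack often presents closer to ~50 ohms:
print(f"gamma into 50 ohms: {reflection_coeff(50.0):+.2f}")  # 20% bounces back

# S/PDIF at 44.1 kHz: 2 ch x 32 bits x 44,100 = 2.8224 Mbit/s;
# biphase-mark coding makes the shortest cell half a bit period:
shortest_cell_ns = 1e9 / (2 * 2.8224e6)     # ~177 ns
risetime_ns = 25.0                           # assumed transmitter edge

for length in (0.5, 1.0, 1.5, 3.0):
    rt = round_trip_ns(length)
    when = "during" if rt < risetime_ns else "after"
    print(f"{length:4.1f} m: reflection returns {rt:5.1f} ns later ({when} the edge)")
```

With these assumed numbers, a 1 m cable's reflection returns about 10 ns after launch, still inside a slow edge, while a 3 m cable pushes the return past it, which is the arithmetic behind the "longer cable can sound better" claim.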

2. A longer cable can do a lot of harm, but also some good, due to various non-ideal effects. Loss (attenuation), dispersion (smearing edges), noise coupling, and impedance mismatches are among the harmful things. However, a longer cable's slight loss in bandwidth may actually help by reducing the edge speed at the load, which in turn reduces the magnitude and bandwidth of the return signals.

Insert big grey box here...

HTH! - Don
 

amirm

Banned
Apr 2, 2010
Seattle, WA
That's a wonderful summary, Don. Can you please put a copy in the technical library?

BTW guys, if you are using RCA jacks, as most consumer gear does for S/PDIF, then you are essentially assured of an impedance mismatch, as getting 75 ohms there is near impossible. What you need is a BNC connector, like the one fitted to the back of the Berkeley Alpha DAC.

Unfortunately, using adapters to get there is also bad. So ideally, your source and destination both have BNCs.
 

amirm

Banned
Apr 2, 2010
Seattle, WA
I did a search and ran into a fun set of oscilloscope captures of the S/PDIF outputs of a few CD players. Any notion that the signal is "ones and zeros", and that the problem is "easy to solve" at the receiver, should be dispelled :D

[images: S/PDIF output waveforms from several CD players]
