Quoting Ethan's earlier post:

"This is the key. All sorts of properties that exist at radio frequencies, such as skin effect and VSWR, are irrelevant at audio frequencies. Yes, an impedance mismatch at connection points causes reflections and electrical standing waves. At 100 MHz this is an important consideration for maximizing power transfer. But it doesn't matter at audio frequencies, or even at the 2x audio frequencies used for digital signals. I've connected audio gear via S/PDIF many times using whatever random RCA cables I had lying around, and it never made any difference."
--Ethan

Funny that this discussion has come up, as for the last couple of days I have been working on characterizing S/PDIF cables. Here are the results of using a generic RCA cable vs. a Transparent audio interconnect cable (a NORMAL audio cable, NOT an S/PDIF cable) which has a black box that filters high frequencies. Here is the impact on the S/PDIF waveform:
The nice-looking square wave in blue is the normal coax; the green trace is the reduced-bandwidth audio cable. Clearly we see the high frequencies taken away. Here is the numerical difference in jitter:
Normal coax: 435 picoseconds
Reduced-bandwidth Transparent cable: 5,000 picoseconds
So we have better than a 10:1 increase in jitter when the cable is designed around audio frequencies. For 16-bit audio, we like to see under 500 picoseconds, by the way.
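For context on the 500 ps figure, the standard textbook bound on clock jitter requires the slope error on a full-scale sine to stay below one LSB; this is a sketch of that arithmetic (the function name is mine, and the post's 500 ps number is presumably a looser, listening-based criterion than this worst-case 20 kHz bound):

```python
import math

# Worst-case amplitude error from clock jitter on a full-scale sine is
# set by the maximum slope: error ~ 2*pi*f * A * t_j.  Keeping that
# error under 1 LSB of an N-bit converter gives:
#   t_j < 1 / (2*pi * f * 2^N)

def jitter_bound_ps(f_hz: float, bits: int) -> float:
    """Max jitter (ps) keeping the slope error under 1 LSB at f_hz."""
    return 1e12 / (2 * math.pi * f_hz * 2**bits)

print(round(jitter_bound_ps(20_000, 16), 1))  # ~121.4 ps at 20 kHz
```

At lower signal frequencies the bound relaxes proportionally, which is why a few hundred picoseconds is often quoted as a practical target for 16-bit playback.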
We can see why in this frequency response measurement, done two ways: once with my generator set to a low source impedance of 20 ohms, and again set to a high impedance of 600 ohms:
We see that the response is flat with 20 ohms for both cables. But at 600 ohms, the Transparent cable shows a drop of about 0.2 dB at 40 kHz, which matches your criterion of 2X audio bandwidth. Yet we see that by not maintaining that level at higher frequencies, jitter is sharply increased.
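To put a number on how early that roll-off bites, here is a sketch that models the cable's filter as a simple first-order low-pass (an assumption on my part; the actual network in the black box is unknown) and back-computes the corner frequency from the measured 0.2 dB droop at 40 kHz:

```python
import math

# For a first-order low-pass: |H(f)|^2 = 1 / (1 + (f/fc)^2)
# Given a measured droop (in dB, positive number) at frequency f,
# solve for the corner:  fc = f / sqrt(10^(droop/10) - 1)

def corner_from_droop(f_hz: float, droop_db: float) -> float:
    """Corner frequency (Hz) of a 1st-order LPF with droop_db at f_hz."""
    return f_hz / math.sqrt(10 ** (droop_db / 10) - 1)

fc = corner_from_droop(40_000, 0.2)
print(round(fc))  # ~184 kHz corner under this model
```

A corner near 184 kHz looks harmless on an audio sweep, but it sits far below the megahertz-range content of an S/PDIF signal, so the edges get slowed dramatically.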
So no, 2X audio bandwidth is not even close to being enough for an S/PDIF cable. Yes, your receiver is resilient against jitter and will extract the audio samples, and you will hear sound. But the cable-induced jitter is very high and will likely show up in the analog output of the DAC.
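The arithmetic behind "not even close" can be sketched directly from the S/PDIF frame structure (two 32-bit subframes per sample, biphase-mark coding with up to two transitions per bit):

```python
# S/PDIF line-rate arithmetic at CD sample rate.
fs = 44_100                    # sample rate, Hz
bit_rate = fs * 2 * 32         # 2 subframes x 32 bits per sample
shortest_pulse_rate = bit_rate * 2  # biphase-mark: up to 2 transitions/bit

print(bit_rate)                # 2822400  (2.8224 Mbit/s)
print(shortest_pulse_rate)     # 5644800  (5.6448 MHz)
print(shortest_pulse_rate / 40_000)  # ~141x the "2x audio" bandwidth
```

Even the fundamental of the shortest pulses sits around 140 times above the 40 kHz "2x audio" figure, and preserving sharp, low-jitter edges needs bandwidth well beyond that fundamental.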