You know, my overall take on the article is this: these fellows did a huge amount of work organizing everything the way they did, and they deserve a round of applause for their efforts.
However, the computer audio systems they described seemed rather, well, mediocre with respect to the likely S/PDIF phase noise and jitter.
Using an onboard S/PDIF driver and a 25 ft cable seems like a disaster in the making. Why would anyone interested in critical listening use motherboard-based S/PDIF? The jitter is possibly 1000 ps or more (Atkinson mentioned that a Mac Mini he tested had something like 1300 ps). Motherboard S/PDIF is in the "kazoo" category. And they said that USB sounded worse than that?
Now, on many mid-range DACs, sure, the USB input sounds worse than the S/PDIF input, but their S/PDIF configuration seems like completely the wrong way to go. To me, the poor sound quality from a high-phase-noise S/PDIF source would pretty much rule out making subtle evaluations. My ears and body go into "tense" mode when the jitter is much over 300 ps; 3 ps is much, much better.
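For anyone who wants to put rough numbers on those jitter figures: the usual back-of-the-envelope formula for jitter-limited SNR on a full-scale sine is SNR = -20*log10(2*pi*f*tj). Here's a quick Python sketch of my own (not from the article; the 10 kHz test frequency is just an illustrative choice, and the formula assumes random jitter, which is an idealization):

    import math

    def jitter_limited_snr_db(signal_freq_hz, rms_jitter_s):
        # Textbook jitter-limited SNR for a full-scale sine:
        #   SNR = -20 * log10(2 * pi * f * t_jitter)
        # Assumes random (uncorrelated) jitter; real S/PDIF jitter is often
        # signal-correlated, so treat this as a rough upper bound only.
        return -20.0 * math.log10(2.0 * math.pi * signal_freq_hz * rms_jitter_s)

    # The jitter figures discussed above, evaluated at a 10 kHz test tone
    # (the 10 kHz choice is mine, purely for illustration).
    cases = [
        ("motherboard S/PDIF (~1000 ps)", 1000),
        ("Mac Mini per Atkinson (~1300 ps)", 1300),
        ("my 'tense' threshold (~300 ps)", 300),
        ("a good clock (~3 ps)", 3),
    ]
    for label, jitter_ps in cases:
        snr = jitter_limited_snr_db(10_000, jitter_ps * 1e-12)
        print(f"{label:34s} jitter-limited SNR = {snr:5.1f} dB")

That works out to roughly 84 dB at 1000 ps, 82 dB at 1300 ps, 95 dB at 300 ps, and 135 dB at 3 ps. For scale, 16-bit quantization noise sits near 98 dB, so around 1000 ps you can't even do justice to a CD at high frequencies, while 3 ps leaves headroom well past 20 bits.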
What were they thinking?
To me, computer audio is about file-based music sources. Do all the computer-to-audio signal format conversion outside the computer.