Bit-perfect USB audio is an oxymoron.

Assuming you are referring to this article: http://www.audiomisc.co.uk/Linux/Sound3/TimeForChange.html

Jim Lesurf compares the performance of the DacMagic (isochronous USB with adaptive-mode synchronization) with the Halide Bridge (isochronous USB with asynchronous-mode synchronization).
It is not clear to me what this has to do with error detection (CRC, implemented in hardware), error correction (not applicable to isochronous USB, as it is a quasi-real-time stream), or bulk mode (not used by either product).
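To make the error-detection point concrete: isochronous USB packets do carry a hardware CRC, so corruption can be detected, but unlike bulk mode a bad packet is never retransmitted. Below is a minimal sketch of a USB-style reflected CRC-16 (polynomial x^16+x^15+x^2+1, i.e. 0x8005, reflected form 0xA001); the init/final-complement conventions of the actual USB spec are simplified here, so treat it as an illustration of the detection step, not a verbatim USB implementation.

```python
def crc16_usb_style(data: bytes) -> int:
    """Reflected CRC-16 with polynomial 0x8005 (reflected: 0xA001),
    init 0xFFFF. A simplified sketch of the style of CRC used on USB
    data packets; the USB spec's final-complement step is omitted."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

packet = bytes(range(32))
corrupted = bytearray(packet)
corrupted[7] ^= 0x04      # flip one bit "in transit"
# The receiver can detect and drop/flag the damaged packet...
assert crc16_usb_style(packet) != crc16_usb_style(bytes(corrupted))
# ...but in isochronous mode it is never resent: the sample is simply lost.
```

The point is that detection without retransmission keeps the stream real-time at the cost of the "guaranteed delivery" that bulk mode provides.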

The point of this thread is bit-perfect USB audio streaming. I am pointing to the Jim Lesurf article as an example of a setup that comes across as bit perfect and yet suffers massive jitter (larger than the examples Amir has provided in the past with HDMI).
That is my point.
As you point out, and I agree, it has no error correction nor data guarantee, hence it is not bit perfect, and yet no one identified that issue with that type of USB audio stream setup.
This would apply to more than the DacMagic, as it was only more recently that manufacturers moved to async setups à la Gordon Rankin etc., or in that example Halide.
This is also compounded by computer drivers (depending upon the OS and whether the manufacturer supplies their own).

Even with an async design, jitter and/or errors can still be introduced when considering both ends, and possibly the driver as well; again, the context is specifically audio streaming.

So it comes back to: how does one actually analyse whether we have "bit perfect" USB audio streaming, or the influence of jitter on the decoded signal/data?
The loopback test removes async designs (which, as I mentioned earlier, are a compromise) and uses bulk mode transfer (which, as you point out, is not used by audio streaming products or designs).
It also removes from the test the audio application and the drivers relating to the USB DAC (plus its internal receiver chip), along with the end audio device, which can interact in ways that exacerbate errors/jitter.

Maybe it is a bad assumption on my part that no solutions use bulk transfer mode; if it is used, I assume it is by a rare few.
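The comparison step of a loopback test is itself trivial once the stream can be captured: play a known buffer out, record it back, and compare sample for sample. A hypothetical sketch (the actual capture path, e.g. a hardware loopback or recording device, is assumed; here it is just simulated with a copy):

```python
import math

def make_test_signal(n=48000, fs=48000, f=997):
    """997 Hz sine quantized to 16-bit PCM (a common test-tone choice)."""
    return [round(32767 * 0.5 * math.sin(2 * math.pi * f * t / fs))
            for t in range(n)]

def is_bit_perfect(sent, received):
    """Bit-perfect means every sample word is identical: no resampling,
    no volume scaling, no dither, no dropped or corrupted samples."""
    return len(sent) == len(received) and sent == received

sent = make_test_signal()
received = list(sent)        # stand-in for the captured loopback data
assert is_bit_perfect(sent, received)

received[12345] ^= 1         # a single flipped LSB breaks bit-perfection
assert not is_bit_perfect(sent, received)
```

Note that this only verifies the data path; it says nothing about jitter, which is a timing property invisible to a sample-value comparison.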

Edit:
I see that you touched on part of the end-to-end jitter/error consideration for Async with the quote from Jkenny.
Cheers
Orb
 
Just to add.
The reason the J-test is a poor way to measure these days is that it does not simulate or model jitter; it actually stimulates jitter, using a worst-case scenario that is specific to traditional interface-plus-DAC combinations (specifically AES/EBU and S/PDIF).

Cheers
Orb

Not sure that I follow. What jitter test is more appropriate and where do we find published results? Or, are you saying the test simply does not reveal the jitter levels of Asynch USB that are "well known" (?) to exist and to be audible and significant, in spite of the vanishingly low levels actually measured by the accepted standard of the J-test?
 
The J-test (I wouldn’t be surprised if the J stands for Julian Dunn) was invented with SPDIF in mind.
The test signal is sent at ¼ Fs.
In the case of USB or any other “PC” bus there is no relation between the rate of the bus and the sample rate of the audio. Hence one might wonder whether a test designed for SPDIF (bus rate is sample rate x 2) is apt for other busses as well.
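For reference, the classic Dunn J-test stimulus is a sine at Fs/4 with the LSB toggled as a square wave at Fs/192, chosen to excite data-correlated jitter in an S/PDIF-style link. A rough sketch of generating such a signal (amplitude and phase conventions vary between implementations, so the specifics here are illustrative):

```python
import math

def j_test(n_samples, bits=16):
    """Sketch of a Dunn-style J-test stimulus: a sine at Fs/4 with the
    LSB toggled as a square wave at Fs/192 (period of 192 samples).
    Exact amplitude/phase conventions differ between implementations."""
    full_scale = 2 ** (bits - 1) - 1
    out = []
    for n in range(n_samples):
        tone = round(0.5 * full_scale * math.sin(2 * math.pi * n / 4))  # Fs/4
        lsb = (n // 96) % 2        # square wave, period 192 samples = Fs/192
        out.append(tone ^ lsb)     # toggle the least significant bit
    return out

sig = j_test(384)
# The Fs/4 tone repeats every 4 samples; the LSB pattern every 192 samples.
```

The signal's spectral lines sit at frequencies tied to the S/PDIF frame structure, which is exactly why its "worst case" character does not map directly onto a packetized bus like USB.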
 

Yeah.
The square wave you mention, Vincent, really exacerbates jitter with SPDIF into the DAC, which is why a more appropriate jitter simulation model and stimulator is required for network/USB/etc. interfaces.
It can show some usable information, but it is far from ideal.
Cheers
Orb
 
Ok. On jitter, the J-test might not be the ultimate torture test for USB, since it uses signals at frequencies related to SPDIF, not USB transmission. I still have difficulty seeing how jitter would arise in Asynch USB, properly implemented of course, where the data is streamed from the receiving buffer in the DAC on to the internal d-a, both under the control of the DAC master clock. Yes, there might still be jitter internal to the DAC in that process, but that is not a USB transmission issue. Input packets via USB have either been received into the input buffer on time or they have not; if not, we do not have a proper implementation. And, agreed, there is no error retransmission over the wire to intermittently slow some packets down, since transmission is not in bulk mode with error correction.


The receiving of input packets by the DAC also happens at a much higher data rate than the rate at which data is streamed out of the buffer into the d-a process. And additional packets to refill the buffers can be requested by the DAC in timely fashion, making sure there is always data for further processing and there are no buffer underflows or overflows.

It just does not seem all that complex to keep out of trouble with Asynch USB and effectively eliminate jitter in the transmission process from Computer to DAC. I suppose it is theoretically possible to have a problem on the Computer side with some computers where resource contention causes packet transmission to be delayed, hence not able to get the data to the DAC in time, leading to buffer underflows and audio dropouts. I have never heard that phenomenon, myself. I have loaded my PC with many additional tasks up to 100% CPU utilization, and I subjectively heard absolutely no difference vs. the normal very low CPU load imposed by my software player. I do, of course, use the USB 2 bus exclusively for audio to the DAC to avoid any possible contention there.

On bit perfection, I have seen no explicit tests for it, but I believe it can confidently be assumed to be an insignificant issue based on other tests. These include tests for audio distortion at the DAC analog output, such as those done by Archimago. Corrupted bits would certainly give rise to obviously increased distortion levels, since bit corruption would strike randomly anywhere in the signal words, from MSB to LSB, or even in control or header information. Yes, we do not have the "guaranteed bit perfection" that bulk mode transfer would offer, but it appears sufficient nonetheless for error-free playback in most computer audio setups, such that no one, to my knowledge, has ever seen a measurement of any kind that would even suggest bit errors occurring.
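The argument that corrupted bits would show up in distortion measurements is easy to make quantitative: a flipped MSB in a 16-bit word is a full-scale error at that sample, while even a flipped LSB is only about 90 dB down, right around an ideal 16-bit noise floor. A back-of-envelope sketch (illustrative arithmetic, not a real measurement):

```python
import math

BITS = 16

# Ideal 16-bit quantization error floor: SNR ~ 6.02*bits + 1.76 dB,
# so the error floor sits near -98 dB relative to full scale.
quant_floor_db = -(6.02 * BITS + 1.76)

# Flipping the MSB changes one sample by 2^(BITS-1) = 32768,
# i.e. a full-scale (0 dBFS) error at that instant.
msb_error_db = 20 * math.log10(2 ** (BITS - 1) / 2 ** (BITS - 1))

# Even a lone flipped LSB is an error of 1/32768, about -90 dBFS:
lsb_error_db = 20 * math.log10(1 / 2 ** (BITS - 1))

assert msb_error_db == 0.0      # MSB corruption: full-scale spike
assert round(lsb_error_db) == -90
assert quant_floor_db < lsb_error_db
# Random bit errors across word positions would therefore stand well
# above the noise floor of any competent distortion/noise measurement.
```

So if bit errors were occurring at any meaningful rate, measurements like Archimago's would be expected to show them.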

In my view, we have close enough to true bit perfection even if we do not have "guaranteed bit perfection" via the error detection and correction of USB bulk mode.
 
Just to say, properly implemented S/PDIF also has jitter as low as USB; as you say, it is all about the implementation, which IMO needs to consider the chain end-to-end and across all the various layers: physical, logical/session, driver, and application.
Jitter can appear at the DAC as sidebands influenced by its power supply, or from something else that compounds the issue, say compromised galvanic isolation, other products on the USB bus, or how the source computer (or other similar source) is designed and implemented from a motherboard perspective.

Anyway, I agree with all you said (including the points I snipped); it just seems that various manufacturers (or models from a manufacturer) do struggle to achieve SOTA measurements, even with the less-than-ideal jitter measurements available for USB.
Vincent, thanks for the new link as well; it looks interesting and I will follow up on that thread.

Cheers
Orb
 
