One thing is obvious: the signal sent to a DAC travels over the cable as a 100% analog wave.
But can software affect this wave?
An explanation might be the way the data is processed: is it sent in bursts, or is it throttled? Bursts might induce periodic jitter, while throttling might produce a constant jitter level.
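To make that distinction concrete, here's a quick sketch (Python/NumPy, with made-up numbers - a 48 kHz clock, 1 ns RMS timing error, a 1 kHz burst rate - none of it measured from any real player or DAC) of why the two patterns would look different in a jitter spectrum: periodic jitter piles up into discrete tones, while random jitter of the same RMS just lifts the noise floor.

```python
# Hypothetical illustration only - not a measurement of any real DAC or player.
# Compare the spectrum of a jittered sample clock under two assumed delivery
# patterns: periodic jitter (bursty transfers) vs. random jitter of the same
# RMS level (steady/throttled transfers).
import numpy as np

fs = 48_000            # nominal sample rate, Hz (assumed)
n = 1 << 16            # number of clock edges to simulate
t_ideal = np.arange(n) / fs

rms_jitter = 1e-9      # 1 ns RMS timing error (assumed, for illustration)

# Periodic jitter: timing error wobbles at 1 kHz, e.g. a burst every 1 ms
periodic = np.sqrt(2) * rms_jitter * np.sin(2 * np.pi * 1_000 * t_ideal)

# Random jitter: white timing error with the same RMS
rng = np.random.default_rng(0)
random_j = rng.normal(0.0, rms_jitter, n)

def jitter_spectrum(jitter):
    """Magnitude spectrum of the timing-error sequence (arbitrary scale)."""
    win = np.hanning(len(jitter))
    spec = np.abs(np.fft.rfft(jitter * win))
    freqs = np.fft.rfftfreq(len(jitter), 1 / fs)
    return freqs, spec

for name, j in [("periodic (bursty)", periodic), ("random (throttled)", random_j)]:
    freqs, spec = jitter_spectrum(j)
    peak_bin = np.argmax(spec[1:]) + 1            # skip the DC bin
    crest = spec[peak_bin] / np.median(spec[1:])  # how tone-like the spectrum is
    print(f"{name:20s} RMS = {j.std():.2e} s, "
          f"peak near {freqs[peak_bin]:.0f} Hz, peak/median = {crest:.0f}x")
```

Run on real captured clock-edge timings instead of this synthetic data, the same comparison would show whether software A and software B actually shift energy between discrete tones and the noise floor.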
However, if this is true I want to see measurements, e.g. the eye pattern when playing software A versus software B.
Personally I don’t mind any claim of improved sound quality from any piece of software, but if it really does something I would like to see at least a measurable difference.
One issue with USB is that it sends regular bursts of info like the start-of-frame (SOF) packet - "The SOF packet consisting of an 11-bit frame number is sent by the host every 1ms ± 500ns on a full speed bus or every 125 µs ± 0.0625 µs on a high speed bus". If the timing of this shifts or is variable, it could elicit a different & variable reaction from the USB receiver & translate into a different & varying jitter or noise spectrum.

Making the PC end as solid & stable as possible, without undue processing, could be one factor in ameliorating this variation. It might not be the low level of jitter that we notice but the variation in jitter - that's one reason why I say that the measurements we currently run either aren't capable of picking up these issues, or we aren't directing them at the correct target.
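To show what I mean by the variation in jitter rather than just its level, here's a rough sketch (again Python/NumPy, with a synthetic capture standing in for real bus-analyser SOF timestamps - I'm not claiming any particular tool produces this data) that checks the intervals against the full-speed tolerance quoted above and then watches whether the spread itself wanders over time.

```python
# Rough sketch with assumed inputs: given SOF arrival timestamps (hypothetical
# capture), check the intervals against the full-speed spec (1 ms +/- 500 ns)
# and see whether the jitter level itself drifts from block to block.
import numpy as np

def sof_interval_report(timestamps_s, nominal=1e-3, tolerance=500e-9, block=1000):
    """timestamps_s: array of SOF arrival times in seconds (hypothetical capture)."""
    intervals = np.diff(np.asarray(timestamps_s, dtype=float))
    out_of_spec = np.abs(intervals - nominal) > tolerance
    print(f"mean interval : {intervals.mean()*1e3:.6f} ms")
    print(f"peak deviation: {np.abs(intervals - nominal).max()*1e9:.1f} ns")
    print(f"out of spec   : {out_of_spec.sum()} of {len(intervals)}")
    # The interesting part: does the jitter *level* itself wander over time?
    n_blocks = len(intervals) // block
    rms_per_block = [
        np.std(intervals[i*block:(i+1)*block]) * 1e9 for i in range(n_blocks)
    ]
    if rms_per_block:
        print(f"per-block RMS jitter: {min(rms_per_block):.1f} .. {max(rms_per_block):.1f} ns")

# Example with synthetic data standing in for a real capture:
rng = np.random.default_rng(1)
fake = np.cumsum(1e-3 + rng.normal(0, 50e-9, 60_000))  # ~60 s of SOFs, ~50 ns jitter
sof_interval_report(fake)
```

If two players really did load the PC differently, it's that per-block figure - the jitter of the jitter, so to speak - that I'd expect to move, not the average.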