Sorry, again we are talking about nulling in the digital domain, i.e. the bits match. It says nothing about any other noise or distortion that might be riding on the digital signal. This could be very different between the two files, and they would still null in the digital domain.

> What I've said is, if, in the digital domain, I can null two files against each other, 100% to the sample, that *suggests* to me that what I'm feeding my DAC is identical in both instances.

If the noise is different between the two files, then I think this expectation is very optimistic.

> I would expect them to be treated identically by the DAC chip, audio-wise, jitter-wise and any other way I can imagine.

Not if the above scenario is a possibility.

> Ergo, I would expect the output from the DAC chip to the analog stages and, ultimately, the output from the box to my power amps, to be identical too.
> To be clear, I'm not saying this IS the case and I'm not saying it is a Universal Truth (please add reverb to those words ;-}). I'm saying this is my understanding and expectation, flawed though they may be in the face of any new evidence to say otherwise.
>
> Not at all the same (at least to me) as "bits is bits", because the latter doesn't take into account timing between those bits during decoding, and perhaps a host of other as yet unquantified factors. I've certainly heard differences, from very subtle to quite pronounced, in comparing CD pressings from different replication facilities against each other and against the master used to create them. The data, once extracted to computer, have consistently (i.e. 100% of the time, over several thousand examples in the course of close to three decades now) proven to be 100%, bit-for-bit identical. Of course, except in the case when one is in the presence of a fax machine, we don't listen to data.
>
> Best regards,
> Barry
> www.soundkeeperrecordings.com
> www.barrydiamentaudio.com
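For what it's worth, the digital-domain null test being discussed here amounts to a sample-by-sample comparison of the two files' PCM payloads. A minimal sketch using Python's standard `wave` module (the WAV container and the `null_test` name are my own choices for illustration):

```python
import wave

def read_samples(path):
    """Return the raw PCM payload of a WAV file, header stripped."""
    with wave.open(path, "rb") as w:
        return w.readframes(w.getnframes())

def null_test(path_a, path_b):
    """True if the two files carry bit-for-bit identical sample data.

    Note what a pass does and does not mean: the stored bits match,
    but this says nothing about noise, waveform shape, or timing on
    the electrical signal that later carries those bits to the DAC.
    """
    return read_samples(path_a) == read_samples(path_b)
```

Two pressings that pass this test carry the same data; any audible difference would then have to come from somewhere in the playback chain rather than from the samples themselves.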
Interestingly, how would you expect jitter to show up in a null test of a digital signal? Any ideas?
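The point behind that question can be made concrete: jitter lives in the timing of the interface clock, not in the stored sample values, so a data-domain null test is blind to it by construction. A toy sketch with made-up numbers (the 48 kHz rate and the 1-200 ns jitter range are arbitrary assumptions):

```python
import random

# Identical sample data in two "files"; only the output clocks differ.
# Jitter perturbs *when* each sample is clocked out of the transport,
# not *what* the stored values are.
data_a = [random.randint(-32768, 32767) for _ in range(48)]
data_b = list(data_a)                     # the other file: same bits

period_ns = 1_000_000_000 // 48_000       # nominal period at 48 kHz
clock_a = [i * period_ns for i in range(len(data_a))]

# Add a guaranteed-nonzero timing error of 1-200 ns to every clock edge.
jitter = [random.choice((-1, 1)) * random.randint(1, 200) for _ in clock_a]
clock_b = [t + j for t, j in zip(clock_a, jitter)]

# The digital-domain null test subtracts the data and gets all zeros...
print(all(x - y == 0 for x, y in zip(data_a, data_b)))   # True
# ...while the emission times, where jitter lives, all differ.
print(any(t == u for t, u in zip(clock_a, clock_b)))     # False
```

In other words, the null test answers "are the bits the same?" and nothing more; timing errors would only become visible by measuring the DAC's analog output or the clock itself.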
Edit: I see this thread has progressed a bit. One thing I would like to tentatively add is that, in my opinion, it's not just noise that will cause audible issues: I'm convinced that the waveform of the digital signal also has an influence on the sound. I have done an experiment comparing a USB cable against no USB cable at all, and many people now confirm that the no-cable configuration sounds far better. Yes, blind tests were conducted. My tentative conclusion, based on follow-on tests, is that the digital waveform matters. Why? Maybe, just as the extra processing required for lossless decoding gives rise to noise, a less-than-ideal waveform demands some extra processing effort from the USB receiver chip, which in turn causes noise? It's just a possibility; don't take it literally.