It would also raise the question of why a (sort of) single-blind test rather than the more credible double-blind or ABX. I think Ethan has already stated that all you need is one person to pass a DBT or ABX to back your claim, and he would change his opinion.
I don't think he has asked for ABX. As I noted, he uses AB tests to prove his own points, so it would be odd to suddenly require a higher standard from the other person.
As for wanting that one person, he has that: me! Have him cross-examine me all he wants to get comfortable that I knew what I was doing. Only then does he get to dismiss my results.
On my side, I have gotten him to say that he has heard the very distortions he says don't exist when going from 16 to 24 bits. His defense is that he had to turn up the volume to hear them. I am fine with that. As long as he agrees that the distortion is indeed there, and can be heard at elevated levels, then we are a hell of a long way from "you can't hear jitter no matter what." If you want to turn this into a double-blind test, I could force him to use the very clip he used in that test, turn up the volume as he did, and I would win!
After all, we know that he could hear the difference.
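To put rough numbers on why turning up the volume matters: the textbook SNR of an ideal N-bit quantizer is about 6.02N + 1.76 dB, so the 16-bit noise floor sits roughly 48 dB above the 24-bit one. A quick back-of-the-envelope sketch (my own arithmetic, not figures from Ethan's test):

```python
# Ideal quantization noise floor by bit depth, using the standard
# full-scale-sine approximation: SNR ≈ 6.02*N + 1.76 dB.
def ideal_snr_db(bits: int) -> float:
    return 6.02 * bits + 1.76

for bits in (16, 24):
    print(f"{bits}-bit: noise floor ≈ -{ideal_snr_db(bits):.1f} dBFS")
# 16-bit: noise floor ≈ -98.1 dBFS
# 24-bit: noise floor ≈ -146.2 dBFS
```

At normal playback levels both floors are far below audibility; crank the gain enough and the 16-bit floor comes up into hearing range long before the 24-bit one does.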
Now if you want to tell me that he likes to cook the tests by using loud material so that I can't turn up the volume as he did in his own test, then we don't need to do that either. I have already conceded that there are a million ways jitter is not audible and that digital audio fidelity is not about loud signals. What he has to prove is that jitter is never audible. If he indeed heard something when he turned up the volume, then what better evidence do we need?
Oh yes, proving that what he heard was due to jitter. I don't actually know if it was, since I can't measure his gear. But I would declare victory anyway, since it doesn't matter why the fidelity was not there. His own test proves that accuracy does matter and that digital is not perfect in all of its forms. That is something he would argue against just as much as he argues against jitter.
Indeed, in this discussion we are trying to figure out whether it matters if jitter reduces effective resolution below 16 bits. If he has found a difference between 16 and 24 bits, surely he has succeeded in hearing a much more difficult impairment than the one we are talking about here!
It is a subject for another test, but I suspect the test he really ran above was 16 bits versus lower than 16 bits. Most "24-bit" converters are not linear above 16 bits. Run them in 16-bit mode, and they wind up only being good to 14 bits. So I shouldn't be unfair to him and make that last point.

BTW, he would only know this if he had seen measurements of his DAC's linearity, as I have for all the devices I have tested. I am always disappointed when the advocates of objective testing do away with measurements because it is too hard, yet demand that others go through a lot of work to prove their point of view. Either you believe in objectivity or you don't. Start with measurements, and then move on to listening tests, which tend to be subjective at some level.