I have spent dozens (if not hundreds) of hours comparing different digital formats and have found that I can often tell differences between high resolution formats and 44.1 kHz versions thereof, despite controlling for every variable that I can think of.
Tony, I don't doubt your observations in the least.
The question, however, after 4 decades of digital audio is still the same: is
a) the application of the Nyquist theorem in the CD format, leading to the frequency cut-off of the format, somehow intrinsically flawed, or
b) does the difference in performance between hi-rez and CD lie in the current practical technical implementations?
If an audiophile in 1985 had heard the musical resolution of current top-level CD playback, they would have been hugely surprised, or rather shocked, by the actual potential of the CD format, which was not at all apparent at that time with the implementations (recording and playback) available back then. Also, it is well known and agreed upon by audiophiles that CD on top-level playback still sounds better than hi-rez on second-tier playback, which in turn sounds better than CD on that same playback equipment. In other words, it is all about implementation, and hi-rez is apparently more easily implementable to a satisfying degree than CD is. This is not surprising given that the CD format operates right on the edge of acceptable parameters, while hi-rez simply has more room to spare in its practical implementation -- that greater ease of audibly proper implementation would obviously be an argument in favor of hi-rez; the theoretical arguments appear more debatable.
Perhaps there is indeed a clear audible ceiling to the resolution and transparency of CD playback, compared to hi-rez, that we simply cannot get beyond, but after 4 decades of digital audio this remains an open question: apparently, ever better CD playback still reveals more and more resolution from that format than was hitherto thought possible, or simply not heard. A ceiling apparently has not yet been reached, even though audible ceilings to the transparency of the CD format have been declared many times in the past.
Certainly, one might argue, as Amir does, that the bit depth of CD -- its dynamic range -- is not sufficient, which is a completely different issue from the 44.1 kHz sampling rate. Yet it remains an open question how much this matters in practice, given that, according to some engineers who have posted numbers here, the practical dynamic range of any playback system so far does not exceed roughly 80 dB. Certainly, manipulating the signal at 16-bit depth through every step up to the final master is problematic, because you lose resolution at each step, but these days nobody does that anymore. The real question is how much resolution is actually lost, for all practical purposes, when the bit depth is reduced to 16 bits for the final CD after the signal has been processed at higher bit depth up to that point.
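To put a rough number on what the 16-bit reduction costs, here is a minimal sketch (mine, not from the thread) that quantizes a near-full-scale tone to 16 bits without dither and measures the resulting error level; real mastering would add dither, which trades this deterministic error for a benign noise floor at a similar level:

```python
import numpy as np

fs = 44100
t = np.arange(fs) / fs
x = 0.9 * np.sin(2 * np.pi * 1000 * t)  # near-full-scale 1 kHz tone, 1 second

# Quantize to 16-bit by rounding to the nearest step (no dither)
q16 = np.round(x * 32767) / 32767
err = x - q16

# RMS level of the quantization error relative to full scale
err_rms_dbfs = 20 * np.log10(np.sqrt(np.mean(err ** 2)))
print(f"16-bit quantization error: {err_rms_dbfs:.1f} dBFS")
```

The error floor lands around -100 dBFS, i.e. some 20 dB below the ~80 dB practical dynamic range figure quoted above, which is the crux of the "does 16 bits matter in practice" question.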
In practice, a signal that is, for all practical purposes, band limited to frequencies below (say) 20000 Hz can be reproduced (for all practical purposes) by sampling the signal at 44100 Hz. For example, just last week I took an 88200/24 recording (of an orchestra including trumpets, cymbals and violins) and band limited it so that no energy above 18 kHz exceeded -150 dBFS. This became my reference signal. I then converted the reference signal to 44100 Hz and resampled it back to 88200 Hz. The result was a signal that agreed with the reference, at every sample, subject to a worst-case error of 2 least significant bits out of 24 bits. (This was due to roundoff error in converting the high-precision values to 24 bits.) The average difference between the two signals was -150 dBFS. If I had wanted, I could have changed the parameters of my experiment and gotten better results. I did this experiment using commercial software (iZotope RX 4 Advanced) and the settings I normally use when converting a high-resolution studio master back to the 44.1 kHz sampling rate for release on CD.
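The round trip described above can be illustrated with freely available tools. The poster used iZotope RX 4; the sketch below (my own illustration, not the poster's actual procedure) uses scipy's FFT-based resampler on a synthetic, periodic, band-limited signal, for which the 88.2 kHz -> 44.1 kHz -> 88.2 kHz round trip is essentially exact:

```python
import numpy as np
from scipy.signal import resample

fs_hi, fs_cd = 88200, 44100
n = fs_hi  # exactly one second, so integer-frequency tones are periodic
t = np.arange(n) / fs_hi

# Band-limited reference: 20 tones at random integer frequencies below 18 kHz
rng = np.random.default_rng(0)
freqs = rng.integers(100, 18000, 20)
phases = rng.uniform(0, 2 * np.pi, 20)
ref = sum(np.sin(2 * np.pi * f * t + p) for f, p in zip(freqs, phases))
ref /= np.max(np.abs(ref))  # normalize to full scale

# Round trip through the CD sampling rate (FFT-based resampling)
cd = resample(ref, fs_cd)
back = resample(cd, n)

err = ref - back
err_db = 20 * np.log10(np.sqrt(np.mean(err ** 2)) + 1e-300)
print(f"round-trip error: {err_db:.0f} dBFS")
```

Because all signal energy sits below the 22.05 kHz Nyquist limit of the CD rate, the difference signal is down at floating-point roundoff level, far below the -150 dBFS figure reported above; a real converter's FIR filters and 24-bit rounding account for the poster's slightly larger residual.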
I could not hear a difference between the reference file and the file that had been passed through the 44.1 kHz sampling rate. I took the error signal (the difference signal) and listened to it. It was still silent after I boosted it by 60 dB. Only after I boosted it by 100 dB was there an obvious difference, in the form of white noise. So yes, in practice, audio signals that have no energy above 18 kHz can be converted to 44.1 kHz without any audible loss (at least to most people's ears).
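The "null test" performed here -- subtracting the processed file from the reference and measuring what remains -- can be sketched as follows (the helper name and demo signal are my own; the demo quantizes a tone to 24 bits, which lands the null near the -150 dBFS average reported above):

```python
import numpy as np

def null_test(ref, test, bits=24):
    """Stats of the difference ('null') signal: peak error in LSBs
    at the given bit depth, and RMS level in dBFS."""
    err = ref - test
    lsb = 2.0 ** -(bits - 1)  # one least-significant-bit step for [-1, 1) audio
    peak_lsb = np.max(np.abs(err)) / lsb
    rms_dbfs = 20 * np.log10(np.sqrt(np.mean(err ** 2)) + 1e-300)
    return peak_lsb, rms_dbfs

# Demo: rounding a signal to 24 bits leaves a null near -150 dBFS
t = np.arange(88200) / 88200
ref = 0.9 * np.sin(2 * np.pi * 1000 * t)
test = np.round(ref * 2 ** 23) / 2 ** 23
peak, rms = null_test(ref, test)
print(f"peak error: {peak:.2f} LSB, RMS: {rms:.1f} dBFS")
```

A null around -150 dBFS boosted by 60 dB still sits near -90 dBFS, below the noise floor of most rooms and playback chains, which is consistent with it remaining inaudible until a far larger boost is applied.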
If, for example, an original 88/24 recording sounded better without bandlimiting than the same file bandlimited to 20 kHz (here you went down to 18 kHz), then there may be technical reasons for the degradation of sound during bandlimiting other than the frequency limitation itself. Looking to the more extended frequency response of hi-rez for an answer -- whether to the theoretical limits of CD playback or to the inferior sound of current implementations of the CD format versus hi-rez -- is in my view questionable. There is simply not sufficient, generally accepted, scientific evidence that our hearing, or any musical perception for that matter, extends beyond 20 kHz. Again, extraordinary claims require extraordinary evidence.