Not impossible, perhaps, but unlikely. Perhaps some true computer experts can address this? Hard drive information is read into a buffer before being accessed by a program. In the case of Jplay (or several other good music players), this data is then read from the buffer into memory, and only then converted to PCM (from the computer's memory, not the hard drive). That's not even taking into account the fact that the hard drive can read a second's worth of digital audio data in a tiny fraction of a second, with error checking to make sure it is read correctly.
I think this has been mostly addressed but I will walk us through it.
The process of creating files involves segmenting the data and storing it one block at a time. Block sizes vary, but a common size is 4096 bytes, which holds 2048 PCM samples (since each sample is 16 bits).
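To put numbers on that, here is a quick sketch (the 4096-byte block and 16-bit sample sizes above are the assumptions; nothing here is how any real ripper is coded):

```python
# Sketch: how a ripper's output gets chopped into filesystem blocks.
BLOCK_SIZE = 4096        # bytes per filesystem block (assumption from the text)
BYTES_PER_SAMPLE = 2     # 16-bit PCM

def block_count(num_samples: int) -> int:
    """Number of filesystem blocks needed to store the samples."""
    total_bytes = num_samples * BYTES_PER_SAMPLE
    return -(-total_bytes // BLOCK_SIZE)  # ceiling division

# One second of CD audio: 44,100 samples per channel x 2 channels
samples_per_second = 44_100 * 2
print(BLOCK_SIZE // BYTES_PER_SAMPLE)   # 2048 samples fit in one block
print(block_count(samples_per_second))  # 44 blocks per second of stereo audio
```

So a typical four-minute track occupies on the order of ten thousand blocks, each of which the OS has to place somewhere.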
In the process of ripping the CD, the OS is asked to find a free block on disk for each 4K chunk above. In the abstract, the OS is free to pick any random free block it wants. Doing so hurts performance, so there is usually an optimization step that tries to find the free block closest to the one previously used, so as to make the file as contiguous as possible. But no assurance is provided. When people benchmark hard disks and such, they always start with a freshly formatted drive to make the file allocation more predictable. Even that is not a safe bet, as background activity can cause files to be written that create fragmentation.
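A toy model of that "nearest free block" heuristic shows how fragmentation creeps in. Everything here is illustrative; no real filesystem allocator works exactly like this:

```python
# Toy allocator: prefer the free block closest to the previously used one.
def allocate(free_blocks: set, previous: int) -> int:
    """Pick the free block nearest to `previous` (ties go to the lower block)."""
    chosen = min(free_blocks, key=lambda b: (abs(b - previous), b))
    free_blocks.discard(chosen)
    return chosen

# Fragmented free list: background activity already claimed blocks 101-103.
free = {100, 104, 105, 200, 201}
placement = []
prev = 99
for _ in range(4):              # store a 4-block file
    prev = allocate(free, prev)
    placement.append(prev)
print(placement)  # [100, 104, 105, 200] -- the file ends up in three pieces
```

Rip the same CD twice on a live system and the free list will differ between runs, so the two bit-identical files land in different physical spots.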
So it is safe to say that the block allocations of two identical rips will differ on the hard disk of a normal system such as the one used by the authors.
At some time later, we try to play those files. The media player will allocate memory for the content and ask the operating system to read that chunk of the file for it. The OS, as I believe was mentioned, does that but also reads ahead, anticipating future requests for the same file. It does this because the drive keeps spinning, and if you don't read the blocks sequentially, you wind up "missing a revolution," waiting for the block to come back around under the stationary drive head.
How much the OS pre-reads is unpredictable; it depends, for example, on how much free memory the system has.
Once the data is read into the OS buffer, it is copied to the application buffer, which in turn passes it on to the audio driver (S/PDIF in this case).
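That chain of copies can be sketched like this. The `PageCache` class is a stand-in for the OS page cache and read-ahead, not a real OS interface, and the sizes are made up:

```python
# Sketch of the copy chain: "disk" -> OS page cache -> application buffer.
import io

class PageCache:
    """Caches whole read-ahead windows so repeat reads touch no 'disk'."""
    def __init__(self, disk: bytes, readahead: int = 128 * 1024):
        self.disk, self.readahead = disk, readahead
        self.cache = {}
        self.disk_reads = 0

    def read(self, offset: int, size: int) -> bytes:
        window = offset // self.readahead
        if window not in self.cache:          # cache miss: go to the platter
            self.disk_reads += 1
            start = window * self.readahead
            self.cache[window] = self.disk[start:start + self.readahead]
        start_in_window = offset - window * self.readahead
        return self.cache[window][start_in_window:start_in_window + size]

disk = bytes(range(256)) * 4096               # ~1 MB of fake audio data
cache = PageCache(disk)
app_buffer = io.BytesIO()
for off in range(0, len(disk), 4096):         # player asks for 4K at a time
    app_buffer.write(cache.read(off, 4096))   # OS copies into the app buffer

print(cache.disk_reads)  # 8 "disk" reads served 256 application requests
```

The point: because of read-ahead, the number of physical disk operations is far smaller than the number of requests the player makes, and when each one happens is up to the OS.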
In a multiple-trial blind test, the second read of the file will most likely find all the blocks already in the OS buffers, and no disk activity occurs. But the copying between buffers goes on. The disk drive will continue to spin but perform no other operation *if* there is nothing else going on in the system. Alas, there is always a ton going on in the system. You would be horrified at how many disk reads and writes occur in your system even when you are not doing a thing! There are apps you can download that show you this.
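The first point — the second trial hitting no disk at all — falls out of caching directly. A self-contained sketch with illustrative numbers:

```python
# Why trial two touches no disk: the OS buffer already holds every block.
cache = {}
disk_reads = 0

def os_read(block: int) -> bytes:
    """Return a block, going to 'disk' only if it is not cached."""
    global disk_reads
    if block not in cache:        # first trial: must touch the disk
        disk_reads += 1
        cache[block] = b"\x00" * 4096
    return cache[block]           # second trial: pure memory copy

for trial in range(2):            # two playbacks of the same 256-block "file"
    for block in range(256):
        os_read(block)

print(disk_reads)  # 256, not 512: the second playback came entirely from cache
```

So across trials of a blind test, the disk-location variable may simply vanish after the first playback — one more way the system's behavior is not under the tester's control.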
What the above means is that the system's behavior in this regard is rather chaotic and unpredictable.
The notion that a digital file sounds different from another relies on two aspects:
1. The timing of digital samples has changed. The authors used the S/PDIF interface on the PC motherboard. Read another way, the lowest-quality digital interface they could find. So we can assume that if timing is going to change, it will.
How would timing change? Because the clock circuit powering the S/PDIF interface is under constant attack from all the PC activity. Every drive, CPU, or GPU operation is likely to create pulses on the power line, in addition to leaking RF and the like onto the S/PDIF clock. Such variations could range from no difference whatsoever to some difference. Without measurements, we are in the dark as to where we stand.
Timing could also change if the S/PDIF waveform changes under load.
2. Electrical coupling. If you connect two devices using S/PDIF, you now have a common ground between the PC and your audio gear. This means that noise and such can bleed into the DAC and cause loss of accuracy, ground loops, etc.
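For #1, a standard back-of-envelope approximation relates sampling clock jitter to a best-case SNR for a full-scale sine: SNR ≈ -20·log10(2π·f·t_j), where t_j is the RMS jitter. The jitter figures below are examples I picked to show the scale, not measurements of any particular interface:

```python
# Rough scale of jitter damage, using the standard full-scale-sine approximation.
import math

def jitter_limited_snr_db(signal_hz: float, jitter_s: float) -> float:
    """Best-case SNR (dB) for a full-scale sine sampled with RMS jitter jitter_s."""
    return -20 * math.log10(2 * math.pi * signal_hz * jitter_s)

# A 10 kHz tone with 1 ns of jitter vs 100 ps of jitter (example values)
print(round(jitter_limited_snr_db(10_000, 1e-9), 1))    # 84.0 dB
print(round(jitter_limited_snr_db(10_000, 100e-12), 1)) # 104.0 dB
```

In other words, nanosecond-class timing errors are enough to eat into 16-bit performance, which is why the clock quality of a motherboard S/PDIF output matters at all.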
So the theory of why something could make digital audio sound different when fed from identical PCM samples is sound. What is not sound is the claim that by doing X, you reduce its impact. The OS, as explained, is a random and unpredictable animal. It is entirely possible, for example, that when I don't read from the disk drive, I create more jitter or noise because the CPU is processing the data faster than if it idled between disk reads. On the other hand, maybe those X-millisecond delays to read a block from disk create jitter at that frequency, and that jitter is more audible than the random jitter created by the CPU.
Importantly, as I mentioned, if we are down to the level where identical files located in different spots on the hard disk make a noticeable difference, then all bets are off. You can count on the songs in one album having varying levels of fidelity even if you ripped them with the app you thought was the best!
And let's dispense with a clear untruth: there is no way you should prefer one ripping application over another when they both create identical bits! The ripping app's job is to create the bits, not to play them. If both apps create the same bits, their outputs are identical. If those outputs sound different because they are located in different parts of the hard disk, that has nothing to do with which one is the better ripper. It is simply chance that put the bits in different places, and song after song you are going to experience the same variation anyway.
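Confirming that two rips really are bit-identical takes a few lines — just hash the files. The paths below are hypothetical; for WAV files from two different rippers you would hash the audio data chunk if the headers differ, or the whole file when they don't:

```python
# Verify two rips are bit-identical by comparing SHA-256 digests.
import hashlib

def file_sha256(path: str) -> str:
    """Stream a file through SHA-256 in 64 KB chunks and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage:
# if file_sha256("rip_app_a/track01.wav") == file_sha256("rip_app_b/track01.wav"):
#     print("bit-identical: any audible difference is not the ripper's doing")
```

If the digests match, the two apps produced the same bits, and the "which ripper sounds better" question is settled before anything is played.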
This whole issue is such a bizarre rabbit hole that I just like to avoid it altogether by using a digital audio interface that deals with both #1 and #2 above. Once there, there is no worry about these factors and you have a reliable, performant system. The opposite approach is to use the on-board S/PDIF interface and then spend good money trying to get good performance out of it. Why not start with a much cleaner signal that is immune to PC randomness?