Could you briefly explain to this luddite how an analogue signal that is converted into computer language then converted again back into an analogue signal will be more “accurate” than an analogue signal being directly cut to lacquer? How is that measured?
The absolute accuracy is not really open for discussion. A signal is preserved orders of magnitude better going through a digital process than an analogue one. Consider the following:
Digital:
sound waves -> voltage at mic membrane -> preamps and manipulation -> high resolution ADC -> bit perfect storage -> high resolution DAC -> voltage at preamp input to feed your system.
Analog:
sound waves -> voltage at mic membrane -> preamps and manipulation -> to storage:
a) tape: voltage induces magnetic encoding in a magnetizable support. Timing is encoded by a motor pulling the tape through the place where encoding happens (the head)
b) direct-cut vinyl: voltage induces magnetic fluctuations that drive a hard cutting needle across a metal or lacquer support. Timing information is again entrusted to motor speed. Encoding also applies an EQ curve to the signal (usually RIAA) via active or passive elements
c) tape to vinyl: combine the two processes above, plus a tape playback step in between
-> storage directly in a physical medium -> to read:
a) tape: feed the tape again, run the motor and read the magnetic fluctuations on the tape to induce a voltage at the read head -> correct for whatever curves were used, via passive or active electronics
b) vinyl: run the motor and use a stylus to generate a voltage by shaking a magnet or coil, amplify that by orders of magnitude (40 to 80 dB?), then re-apply equalization via passive or active components to undo the curve applied at cutting (the RIAA curve, sketched just below) -> voltage at preamp input to feed your system.
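To make that EQ leg concrete, here is a minimal Python sketch of the standard RIAA playback (de-emphasis) curve using the usual 3180 µs / 318 µs / 75 µs time constants; the cutting lathe applies the inverse. It is only meant to show how much correction the playback side has to undo, not anyone's actual implementation.

```python
import math

# Standard RIAA time constants (seconds). These define the playback
# (de-emphasis) curve; cutting applies the inverse (pre-emphasis).
T1, T2, T3 = 3180e-6, 318e-6, 75e-6

def riaa_playback_db(freq_hz, ref_hz=1000.0):
    """Playback gain in dB at freq_hz, normalized to 0 dB at ref_hz."""
    def mag(f):
        # |H(j*2*pi*f)| for H(s) = (1 + s*T2) / ((1 + s*T1) * (1 + s*T3))
        w = 2 * math.pi * f
        return math.hypot(1.0, w * T2) / (math.hypot(1.0, w * T1) * math.hypot(1.0, w * T3))
    return 20 * math.log10(mag(freq_hz) / mag(ref_hz))

for f in (20, 100, 1000, 10000, 20000):
    print(f"{f:6d} Hz: {riaa_playback_db(f):+6.1f} dB")
# Roughly +20 dB of bass boost and -20 dB of treble cut relative to 1 kHz,
# i.e. about 40 dB of correction the phono stage has to apply accurately.
```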
It should be clear that the digital step is infinitely more accurate: there are no moving parts, very few parts at all, and they are all quite quantifiable, trackable and easy to account for.
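As a toy illustration of what 'bit perfect storage' means in that chain (a minimal sketch, not any particular studio workflow): you can checksum the samples before and after a storage round trip and prove, bit for bit, that nothing changed. There is no analog equivalent of that check.

```python
import hashlib
import os
import struct
import tempfile

# A second's worth of fake 24-bit samples (arbitrary pseudo-random values,
# packed as signed 32-bit integers for simplicity).
samples = [((i * 2654435761) % (1 << 24)) - (1 << 23) for i in range(48000)]
payload = struct.pack(f"<{len(samples)}i", *samples)

# Write to disk and read back: the 'storage' leg of the digital chain.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(payload)
    path = f.name
with open(path, "rb") as f:
    restored = f.read()
os.unlink(path)

# Identical checksums mean bit-perfect: the stored signal is provably unchanged.
print(hashlib.sha256(payload).hexdigest() == hashlib.sha256(restored).hexdigest())
```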
Every part of the analog process is difficult and adds errors: the signal goes through an incredible number of magnetic reconstructions, speed relies on motors all the way, everything has inertia (stylus, platters, bearings,...). There is no such thing as a 'pure' analog recording any more than a 'pure' digital recording. The signal is tortured all the way from the mic to the final medium, and then tortured back. It is rebuilt so many times on the other side of a transformer, equalized in lossy processes and then equalized back again on playback, that it is absolutely comparable to a (haphazard) digitization process, just in analog media. There is nothing magical about it.
ADC is effectively transparent. DAC can be made effectively transparent as well. This is trivial to measure and quantify, so don't be confused by people who say otherwise. Digital processes run the world, and their resolution is not something we should trivialize with anecdote. 24 bits gives you just short of 17 million discrete intervals to categorize something.
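To put a rough number on 'effectively transparent', here is a small Python sketch (pure standard library; the 997 Hz tone and 48 kHz rate are arbitrary choices for the example) that quantizes a full-scale sine to 24 bits and measures the error that step alone adds. The textbook figure is 6.02 dB per bit plus 1.76 dB, roughly 146 dB for 24 bits, far below what any lacquer or tape noise floor can touch.

```python
import math

BITS = 24
LEVELS = 1 << BITS            # 16,777,216 discrete intervals
STEP = 2.0 / LEVELS           # quantization step for a full-scale +/-1.0 signal

# One second of a full-scale 997 Hz sine at 48 kHz (997 is coprime with 48000,
# so the sample values don't repeat), quantized to the 24-bit grid and rebuilt.
N, FS, F = 48000, 48000, 997
signal = [math.sin(2 * math.pi * F * n / FS) for n in range(N)]
quantized = [round(s / STEP) * STEP for s in signal]

# Signal-to-quantization-noise ratio of the round trip.
sig_power = sum(s * s for s in signal) / N
err_power = sum((s - q) ** 2 for s, q in zip(signal, quantized)) / N
print(f"{LEVELS} levels, SNR ~ {10 * math.log10(sig_power / err_power):.1f} dB")
# Expect a value near the theoretical 6.02 * 24 + 1.76 ~ 146 dB.
```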
Just for a bit of fun: those 24 bits, how do they compare to the potential of vinyl? The smallest groove is about 0.04 mm, so half of that divided into the 2^24 steps 24 bits give you is ~1.2e-12 m. That's the feature size, in meters, you'd need to resolve to reproduce 24 bits on a record. So do we get it? A small molecule is about ~1e-9 m. We're off by roughly three orders of magnitude, in favor of a simple 24-bit recording. It is, at minimum, hundreds of times more resolving than analog. This is the reason people digitize tape to DSD and 'it sounds the same'. The tape signal literally fits within the digital signal, with headroom to spare, and that is only possible with the ubiquitous high-quality ADCs and DACs we have.
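The same back-of-the-envelope calculation, written out as a few lines of Python (the 0.04 mm groove width is the rough figure from above, not a measured spec):

```python
# Back-of-the-envelope: physical size of one 24-bit step in a vinyl groove.
GROOVE_WIDTH_M = 0.04e-3           # ~0.04 mm, rough figure for the smallest groove
HALF_GROOVE_M = GROOVE_WIDTH_M / 2
LEVELS_24BIT = 2 ** 24             # 16,777,216 discrete intervals
SMALL_MOLECULE_M = 1e-9            # ~1 nm, a small molecule

step = HALF_GROOVE_M / LEVELS_24BIT
print(f"one 24-bit step in the groove: {step:.1e} m")            # ~1.2e-12 m
print(f"small molecule / step: {SMALL_MOLECULE_M / step:.0f}x")  # ~840x, roughly three orders of magnitude
```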
So, digital is more accurate, in absolute terms.
Now here comes the kicker, after all this blabber:
Does it matter? Are we even asking the right questions?
Is absolute accuracy relevant or is there something else?
It is more relevant if we are considering just 'information'. But what about 'music', and more generally, sound?
Remember, this all starts at the microphone and ends in our ears (assuming acoustic music, of course). That's important: they are the gatekeepers for all of this. A mic is a deeply flawed device: quite frequency-selective, saturates quickly, directional, has internal resonances and so on, and those flaws are always there, regardless of whether the downstream chain is digital or analog. Our ears carry eons of evolutionary pressure to also be very selective, typically at the expense of linearity in most processes. There are a number of psychoacoustic cues as to why we prefer one type of euphonic presentation over another, why a given type of noise renders details more present and trackable, and so on. Some of these things are natively (by coincidence) present in analog chains, and that seems to be a reason why they score high in preference, even when trying to discount the cultural and ritualistic effects innate to using the medium. So it could be that analog is more accurate to the music under certain conditions.
As a user of both, I generally don't share that opinion. I can get digital to sound analogue but not the other way around, and that tells me something. I certainly entertain these ideas and they inform my design choices. But I'm not sure we are debating a meritorious problem, at least in its current typical form. There is nothing inherent in either process that elevates it above the other in relative terms. There is too much music recorded natively in each process to disregard, and I have to agree with Mike: above a certainly diffuse but nonetheless real threshold of quality, the differences are difficult to pin down, highly release-dependent, and quickly left at the door while you just enjoy the outcome.