Hello tima,
Thanks for your reply…!
Firstly, please let me say my post is, as you say, mere speculation on my part, fuelled by fascination that vinyl still continues to offer me a near-transcendent musical experience, despite the many epitaphs written for it since the advent of CD (and vinyl’s objectively inferior performance relative to digital). Anyone who’s seeking to move vinyl replay forward is worthy of my interest, and I appreciate that Monaco have focused their attention on rotational speed via rigorous implementation. My post is really part of a broader observation, not specifically directed at the Monaco, though given the 2.0 has been able to quantify its performance advantage in such explicit terms, it seems as good an example as any, and perhaps the best.
Two contextual caveats upfront: I read very few reviews these days, and confess to initially skimming yours, though I have re-read it since. Also, as I say, I’ve heard neither the Monaco 2.0 nor the 1.5, only the 1.0, and that just once, in a system I was unfamiliar with. So perhaps these comments below are best understood as neither a critique of your review nor a criticism of the 2.0, simply curiosity expressed as a work-in-progress of observations apropos vinyl replay in general.
As perhaps we might agree, with vinyl we are never dealing with absolute rotational speed, only ever relative rotational speed (the turntable and lathe individually, and relative to one another). Only if all mastering lathes rotated at the same precise rotational speed would absolute rotational speed matter in our turntables, because only then would we have a single absolute reference to attempt to emulate. However, we do not and never will have such a reference, given the divergence of lathes from one another and the variance with which each lathe departs from absolute rotational speed in both speed accuracy and consistency.
So while I certainly cannot say that your observations are incorrect (nor am I attempting to), I think it could still be true to say that since turntables and lathes will never identically match in terms of absolute rotational speed, the presence of the variables matters less than their distribution. If I were to play you a record in which I had made one hundred audible scratches, spaced evenly at one-second intervals, you would likely notice them far more than if I had placed them randomly across the entire playing surface with no discernible pattern. When a given deviation’s distribution is non-random we seem to flag it in ways we do not when the deviation is random(ised).
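If it helps, the scratch analogy can be put in signal terms with a toy simulation (entirely my own sketch, with arbitrary numbers, not a model of any particular turntable or of the Monaco): take two speed-deviation signals of identical RMS magnitude, one periodic and one random. The periodic one piles all its energy into a single spectral line, exactly the kind of regular feature we seem to flag, while the random one spreads the same energy thinly across the whole spectrum.

```python
import numpy as np

# Two deviation signals with identical RMS magnitude: one periodic
# (e.g. a once-per-revolution error) and one random. The periodic error
# concentrates its energy in a single frequency bin; the random error
# spreads the same energy across the entire spectrum.
rng = np.random.default_rng(0)
n = 4096
t = np.arange(n)

periodic = np.sin(2 * np.pi * t / 64)                # fixed-rate deviation
random_dev = rng.standard_normal(n)
random_dev *= np.std(periodic) / np.std(random_dev)  # match RMS levels

for name, dev in [("periodic", periodic), ("random", random_dev)]:
    spectrum = np.abs(np.fft.rfft(dev)) ** 2
    peak_share = spectrum.max() / spectrum.sum()
    print(f"{name}: {peak_share:.1%} of energy in the single largest bin")
```

Same total "error", radically different distribution: the periodic case puts essentially all of its energy in one bin, the random case well under one percent.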
In complex systems (and I would argue the turntable, and especially the hi-fi system as a whole, is exactly that), how a given variable is distributed matters more than the presence of that variable. In other words, how a turntable achieves its speed stability can often have a far greater influence on our perceived enjoyment or non-enjoyment than the exact speed stability it achieves, and in a fundamentally different way. Again, though this is not aimed at the Monaco specifically, I think perhaps we can all point to turntables in which the implementation of the drive topology conveys significant benefits in perceived enjoyment despite the absence of absolute speed stability, and vice versa.
Nevertheless, that there continue to be devotees of belt, idler and direct-drive turntables suggests the how of the platter turning is fundamental to our perception of music, not just because of its implications for speed stability per se, but, as your review suggests, because music is always pitch and amplitude over time, and the three are always modulating. Timing errors will therefore always impact the way pitch and amplitude are conveyed. My hypothesis (and it is nothing more than that) is that it’s the distribution of those errors that differentiates our perception of belt versus idler versus direct drive, given that all forms of rotational mechanism will have inherent degrees of speed instability.
Given we can perhaps acknowledge that all lathes also share this variation, my thinking is that if turntables and lathes are indeed complex systems of non-linearities, what matters most is not that those non-linearities exist in both the lathe and the turntable, but that, so long as they are distributed in a benign (stochastic) manner, our ear/brain mechanism is able to accommodate them, and may in fact gain unexpected benefits in signal detection despite the presence of the noise (see link in my previous post).
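The "unexpected benefits in signal detection" idea is usually discussed under the name stochastic resonance, and a minimal sketch of it is easy to run (all numbers below are arbitrary illustrative choices of mine, not drawn from any hearing model): a sine wave too weak to cross a detector’s threshold is invisible on its own, a moderate dose of random noise lets it poke through so the detector’s output tracks the signal, and too much noise buries it again.

```python
import numpy as np

# A sub-threshold sine wave passed through a hard threshold detector.
# With negligible noise the detector never fires; with moderate noise
# the firing pattern correlates with the hidden signal; with heavy
# noise the correlation collapses again.
rng = np.random.default_rng(1)
t = np.linspace(0, 20, 20000)
signal = 0.4 * np.sin(2 * np.pi * t)   # peak 0.4, below the threshold
threshold = 1.0

def detector_correlation(noise_level):
    noisy = signal + noise_level * rng.standard_normal(signal.size)
    fired = (noisy > threshold).astype(float)
    if fired.std() == 0:               # detector never fired at all
        return 0.0
    return np.corrcoef(fired, signal)[0, 1]

for sigma in (0.01, 0.5, 5.0):
    print(f"noise {sigma}: correlation {detector_correlation(sigma):.2f}")
```

The correlation peaks at the intermediate noise level, which is the whole counter-intuitive point: a benignly distributed disturbance can aid, rather than degrade, detection of a weak signal.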
There is of course an ongoing debate (and rightly so, in my view) about how directly what can be measured correlates with what can be perceived. However, as many of us have discovered, often to our lament, a component that produces vanishingly low distortion, noise and output-impedance measurements does not always confer a direct benefit on listener involvement. In fact, in some cases, and even taking into account the fact that we all have our preferences and biases, it may do the opposite.
In complex systems, higher-order effects matter. Even when a component objectively matches an ideal of linearity, there will still be many who may not prefer it (although I completely accept there will be just as many who might), not because it’s demonstrably/objectively linear (a first-order effect), but because its linearity comes with second- and third-order effects that cannot be predicted ahead of time (and certainly not in isolation).
Yes, if it cannot be observed then it’s fair to suggest it may not be worth observing. Yet complex systems, and especially a dynamic, high-order and interdependently complex signal played back via a dynamic, high-order and interdependently complex mechanism in which interactions matter more than single independent actions (1), often lead us to mistake absence of evidence for evidence of absence. It’s only later, once time has allowed us to peek beneath the first-order effects of our discoveries, that we’re able to observe any second- and third-order effects. Indeed, the problem in dealing with complex systems is that second- and third-order effects are generally masked by first-order ones. But just because they are not observable now does not mean they may not become observable in the future.
And while I can accept that Ockham made a valuable observation, it holds most true when applied to simple systems, where parsimony is a virtue. In complex systems, in which variables interact and generate second- and third-order effects that cannot be predicted ahead of time, parsimony is likely only to lead to a false dichotomy built on first-order effects (2).
In any case, no more of my hypothesising will change either the real-world performance of the Monaco or your perception of it, evaluated via your own ears. I’m grateful you’ve shared your thoughts with me, and taken the time to respond in such a generous manner.
Take care, tima.
853guy
—
(1) I persist in my belief that music and the recording/hi-fi mechanism we use to play it back are both complex systems built on simple principles that can be defined and studied in isolation (pitch, amplitude, time; acoustical energy, electrical energy, acoustical energy). When those simple principles are brought together, however, they interact dynamically in ways that often defy those same principles as observed individually and statically, because a dynamic, high-order and interdependently complex signal played back via a dynamic, high-order and interdependently complex mechanism will produce variables and non-linearities that the individual constituent parts in and of themselves can never fully predict ahead of time.
(2) “Height determines weight” may have been a heuristic Ockham would have approved of. This would have been especially true in the 14th Century, when nutrition was based on simple foods in limited portions and people generally engaged in moderate energy expenditure. However, in the 21st Century, with the addition of unlimited choice, cheap and easily accessible sugars, chemical farming and chemical “foods”, hereditary disorders, limited energy expenditure and, often, unlimited portion sizes, “height determines weight” has little to no utility. The complexity of our modern diet and the interaction of a far greater number of variables renders a parsimonious heuristic redundant.
--
EDIT: It's come to my attention that, since I tend to type faster than I think, I may have inadvertently conflated speed accuracy with peak-deviation distribution throughout my two posts. In summary, then, my thoughts are as follows:
All lathes will vary in speed accuracy relative to one another.
All lathes will vary in the distribution of their peak deviations one to another.
All turntables will vary in speed accuracy relative to one another.
All turntables will vary in the distribution of their peak deviations one to another.
Therefore, all turntables will vary in speed accuracy and the distribution of their peak deviations relative to the lathe’s degree of speed accuracy and the distribution of peak deviations a given record was mastered on.
Given that even a turntable with very high levels of speed accuracy cannot make up for the peak deviations of the lathe, nor its own, what will matter more is how the peak deviations are distributed in both the lathe and the turntable. A lathe that produces a master running at 33.4 rpm will of course be best served by a turntable also running at 33.4 rpm, despite the fact that neither is strictly speed accurate. Speed accuracy in and of itself therefore matters less than the peak deviations and how they are distributed. A random/stochastic distribution of peak deviations will therefore perhaps be the best way to realise performance from a turntable, given the non-linear nature of all lathes mastering records.
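To put a number on the hypothetical 33.4 rpm lathe above (a back-of-envelope check of my own): a master cut at 33.4 rpm and then played back at the nominal 33⅓ rpm is reproduced slightly slow, so every pitch lands flat by a fixed ratio. Expressed in cents, that constant offset comes out at about three and a half cents:

```python
import math

# A master cut on a lathe running fast (33.4 rpm) and played back at
# the nominal 33 1/3 rpm is reproduced slow, so all pitches are flat
# by a constant ratio. Cents = 1200 * log2(speed ratio).
nominal = 100.0 / 3.0   # 33.333... rpm
lathe = 33.4            # hypothetical fast-running lathe

cents_flat = 1200 * math.log2(lathe / nominal)
print(f"constant pitch offset: {cents_flat:.2f} cents")  # about 3.46 cents
```

A constant offset like this merely transposes everything by a small fixed amount, which is precisely why the distribution of deviations around the mean speed, rather than the mean itself, carries the perceptual weight.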
Apologies for any confusion I may have caused.