With all due respect, statements like the above can come across as arrogant and absolute despite numerous opinions to the contrary. Therein lies the "flash point" for many, myself included. For better or worse, similar to what EW did in the past on this forum. Best.
I'm just trying to state the facts as simply as I can. We all play recordings, the vast majority of which are made without the aid of exotic cables. It's been posited they might sound better if exotic cables are used. Since I might seem too emphatic (which might brand me as unreliable), my recommendation is to obtain properly working studio gear of the era prior to the existence of exotic cables, make sure it's refurbished, and then use it to see if you can hear cable differences.
The fact Ralph says all tube rectifiers sound the same (when used within their limits and with the same voltage drop) is also ridiculous.
Have you tried to verify that, or is it ridiculous by declaration? I've never said you can't hear differences between rectifiers, just as I'm not saying here that you can't hear differences between cables. To verify the rectifier thing you would have to note what you hear and then measure the Voltage drop and see if I'm right. Tube testers can help you winnow out how a strong rectifier of a given brand and era can sound better than one of the exact same type that isn't as strong. I know we all like to assume it's magic, but it really isn't; it's just Ohm's Law, which made all of this possible. For the cable thing, to verify that I'm wrong you would use the technique I mentioned in the post just above.
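If it helps to see the arithmetic, here's a little sketch of the Ohm's Law point. The current and resistance figures are made up for illustration, not measurements of any particular rectifier, and the tube is treated as a simple effective resistance for clarity:

# Sketch of the rectifier Voltage drop as plain Ohm's Law.
# All numbers are assumptions for illustration only.

def rectifier_drop(load_current_a, tube_resistance_ohms):
    """Ohm's Law: the Voltage lost across the rectifier is I * R."""
    return load_current_a * tube_resistance_ohms

load_current = 0.15     # amps drawn by the amplifier (assumed)
strong_tube_r = 110.0   # ohms, effective resistance of a strong sample (assumed)
weak_tube_r = 200.0     # ohms, a weaker sample of the same type (assumed)
raw_supply = 450.0      # Volts before the rectifier drop (assumed)

for r in (strong_tube_r, weak_tube_r):
    drop = rectifier_drop(load_current, r)
    print(f"drop {drop:.1f} V -> B+ {raw_supply - drop:.1f} V")

The weaker tube drops more Voltage, so the amp runs at a lower operating point; that shift is what you hear, and it's exactly what a Voltmeter will show you.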
I thought that Ralph was talking about ICs, and in particular the balanced AES specification, not power cables and speaker cables.
That is correct. But since it got brought up I'll state my position on those.
You can measure power cables easily to show why they sound different. Usually a Digital Voltmeter is all that's needed: measure the Voltage drop along the length of the cord. When people tell me (like Ethan Winer) that power cords don't make a difference, I usually ask them if they have measurements to back up their claim. You do this by measuring the output power, output impedance and distortion of an amplifier that might be used with the cables. The more Voltage drop across the cable, the lower the output power, the higher the output impedance and the higher the distortion. Not voodoo, and not rocket science either (unless you're really old school).
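Here's a rough sketch of that measurement and its consequence. The cord resistance and current draw are assumptions for illustration, and the last line is only a first approximation; how much power a real amp actually loses depends on its power supply design:

# Rough sketch of how cord Voltage drop relates to available amplifier power.
# All figures are assumed for illustration.

mains_v = 120.0      # nominal line Voltage
cord_r = 0.2         # ohms, total round-trip resistance of a marginal cord (assumed)
amp_current = 8.0    # amps drawn at full output (assumed)

drop = amp_current * cord_r              # Ohm's Law: V = I * R
v_at_amp = mains_v - drop
# To a first approximation, maximum output power tracks the square of the supply Voltage:
power_fraction = (v_at_amp / mains_v) ** 2

print(f"Drop across the cord: {drop:.2f} V")
print(f"Voltage at the amp:   {v_at_amp:.2f} V")
print(f"Available power:      {power_fraction * 100:.1f}% of nominal")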
Speaker cables vary in sound according to materials, construction (geometry), the equivalent gauge and their characteristic impedance, the latter of which is rarely anywhere near speaker impedances; and if it were, no speaker is made that would properly terminate it. Length makes a difference too, as it magnifies the weaknesses of the particular cable. This is particularly noticeable with tube equipment, which has a higher output impedance; the speaker cable can thus mess with the damping factor exerted on the loudspeaker.
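To show how the cable resistance interacts with output impedance, here's a small sketch. The speaker impedance, cable resistance and output impedances are assumed values, not taken from any specific amp or cable:

# Sketch of how speaker cable resistance eats into damping factor.
# Values are assumed for illustration.

def damping_factor(speaker_z, amp_z_out, cable_r):
    """Damping factor as seen at the speaker terminals."""
    return speaker_z / (amp_z_out + cable_r)

speaker_z = 8.0    # ohms, nominal speaker impedance (assumed)
cable_r = 0.3      # ohms, a longish run of modest-gauge cable (assumed)

print(damping_factor(speaker_z, amp_z_out=0.05, cable_r=cable_r))  # solid state: about 23
print(damping_factor(speaker_z, amp_z_out=1.0,  cable_r=cable_r))  # tube amp:    about 6

With a tube amp the source impedance is already significant, so the added cable resistance lands the damping factor in a range where the speaker's response can audibly change.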
In the real world we hear differences between XLR cables and nobody can describe them with common measurements. Even 100% identical cables made of different materials (silver or copper) have a different sound.
A low output impedance preamp plus a high input impedance power amp decreases the current and changes the equation toward simple voltage transfer, but even with voltage transfer the sound of different cables is not equal.
If a balanced interconnect is used properly it's a power transfer, not a Voltage transfer. That's why dBm is used instead of Voltage. And I'm not denying you can hear differences between balanced interconnects; I've heard them myself and said so earlier.
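For reference, here's a minimal sketch of what dBm means; it is a power level referenced to 1 milliwatt, and the 600 ohm load used below is the classic studio/broadcast termination, assumed here for illustration:

import math

# dBm is power referenced to 1 mW; into the traditional 600 ohm termination,
# 0 dBm works out to about 0.775 V RMS.

def dbm_from_voltage(v_rms, load_ohms=600.0):
    power_w = v_rms ** 2 / load_ohms       # P = V^2 / R
    return 10.0 * math.log10(power_w / 0.001)

print(dbm_from_voltage(0.775))   # ~0 dBm
print(dbm_from_voltage(1.228))   # ~+4 dBm, a common pro line level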
I've written this elsewhere but I'll repeat this anecdote again to put this more clearly:
Back in 1989 we did a test using four 30-foot interconnects. The 'control' was an old pair of studio cables. The other cables were Purist, Kimber and Esoteric. For the test we had two methods of driving the interconnects. The first was a balanced passive volume control. For it to work it had to reference ground, so although it was balanced it in no way supported any of the balanced standards. Using it to drive the cables we could hear differences between them easily. The Purist came out on top. The control cable made the system sound broken, really bad.
Then the passive balanced control was replaced by an active line stage that supported AES48 and also low impedance operation. The same musical cuts were used. The audible differences between the cables vanished. More detail could be made out; the bass had more impact and so on. IOW all the cables sounded better than the best had earlier.
I've repeated this test with a variety of older studio gear like Ampex, an RCA microphone preamp (MI-11241) and my Neumann U67s. The results don't vary.
This causes me to conclude that conversations like this one occur because there is a lot of balanced equipment now that does not support the balanced line tenets of operation. Most of that seems to be 'high end audio' equipment, although there is a fair amount of semi-pro recording gear that is like that too.
Again, how to think of balanced lines is like this: it's an exotic cable system where the equipment driving the cables and receiving the signal does the heavy lifting by forcing the cables to be truly neutral and sound right. To do this the balanced standards have to be supported. All the cable need be is a twisted pair within a shield and it will work. Sometimes you don't even need the shield.
Single-ended cables are the opposite: there's no termination standard other than that the connectors all fit one another. The cable does the heavy lifting, so construction and materials are paramount to how they sound.
I hope what comes across from this post is that I'm not a cable denier at all. Instead, I'm trying to explain why we hear what we do.