Not Ralph here, but I also find the topic interesting. Ralph, I know you have spoken many times about this. If you are feeling generous, would you remind me (or link to a prior discussion) in a bit more detail about the rise in distortion into decreasing impedance? For example, is it mostly IMD, etc.?
This is interesting to me because many makers of class D amps seem to encourage lower-impedance loads (especially in pro products), since their amps 'make more power' into those loads.
Thanks in advance...
Non-switching amps, at least, are very sensitive to being loaded. If you measure the unloaded output voltage of practically any amp, you would be amazed at how low the distortion is. Once current starts flowing, though, the picture changes, often dramatically. It is mainly due to non-linearities in the transfer characteristics of the output devices, but even the voltage amplification stages may be sensitive to the output current.
Look at Stereophile measurements: every time the load halves, distortion approximately doubles, irrespective of class A/B or the number of output devices. All else being equal, an 8 ohm speaker causes less amplifier harmonic distortion than a 4 ohm one.
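That rule of thumb can be put into a few lines of code. This is just a sketch of the scaling, not a model of any particular amplifier; the 0.005% reference THD at 8 ohms is a made-up number chosen for illustration:

```python
import math

def estimated_thd(load_ohms, ref_thd=0.005, ref_ohms=8.0):
    """Estimated THD (%) assuming distortion doubles per halving of load.

    ref_thd and ref_ohms are hypothetical reference values, not
    measurements of a real amplifier.
    """
    halvings = math.log2(ref_ohms / load_ohms)
    return ref_thd * 2 ** halvings

# Halving the load from 8 to 4 to 2 ohms doubles the estimate each time.
for z in (8, 4, 2):
    print(f"{z} ohms: ~{estimated_thd(z):.3f}% THD")
```

So a speaker whose impedance dips to 2 ohms would, by this rough rule, draw about four times the distortion the same amp produces into 8 ohms.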
And IMD always goes hand in hand with harmonic distortion, since both are products of the same nonlinearity.
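To illustrate that connection: pass a two-tone signal through any nonlinear transfer curve and you get harmonic products and intermodulation products at the same time, from the same terms. A minimal sketch (the tone frequencies and the distortion coefficients here are arbitrary, not taken from any real amplifier):

```python
import numpy as np

fs = 48000                      # 1 second at 48 kHz -> 1 Hz FFT bins
t = np.arange(fs) / fs
f1, f2 = 1000, 1300             # arbitrary in-band two-tone pair
x = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

# A mildly nonlinear "amplifier": small 2nd- and 3rd-order terms
y = x + 0.01 * x**2 + 0.001 * x**3

spectrum = np.abs(np.fft.rfft(y)) / len(y)

def level(f_hz):
    # With 1 Hz bin spacing, the bin index equals the frequency in Hz
    return spectrum[int(round(f_hz))]

# The 2nd harmonic (2*f1) and the intermod products (f2-f1, f1+f2)
# all appear together, generated by the same x**2 term.
for f in (2 * f1, f2 - f1, f1 + f2):
    print(f"{f} Hz: {level(f):.5f}")
```

The point of the sketch: you cannot have the squared term produce a second harmonic without it also producing sum and difference tones, which is why an amp that measures poorly on THD into low impedances will show IMD there too.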