Is ABX Finally Obsolete?

Status
Not open for further replies.
(...) But you start off your sentence with the word "Sorry," which I presume (I could be in error here) means that you disagreed with my statement, i.e., that ABX is more than just a development tool. As such, your statement is also regrettable, as it is patently false. Mischaracterizations and stereotypes muddy the waters, which is precisely the case here.


In good faith, I wrote that I considered ABX a development tool. If you disagree (and maybe you are correct), please say so and support your affirmation in a way that can be discussed, so that I (and others) can learn something. IMHO, the style of accusation used here, employing nice words while denigrating others, only muddies discussions and adds nothing to the debate.

Do you know of any other publicized use of ABX other than codec development?
 
Micro, it has been used to conduct general population as well as individual specific testing to determine the probability of the existence of an audible difference between all kinds of gear, be it transports, DACs, amps, or cables, and perhaps less obvious subject matter such as the existence of an audible difference between 1st, 2nd and 3rd generation tape.

Google is your friend here. A two-second search of "abx test results" turned up this ABX test for amps. Perhaps Arny can join in the conversation, inasmuch as he invented the ABX comparator.
 
Gregadd -- The shortcomings of ABX are well documented after almost forty years. Perhaps we can list them in an attempt to design a better test.

Do you have an example in which the shortcomings of ABX have been "well-documented" by a research scientist? A statistician? A research assistant? A secretary at a research firm? Because while I know the shortcomings of ABX have been repeatedly, redundantly and ineffectively aired by Audiophiles (who talk about the shortcomings of all measurement and testing methodologies that fail to support their subjective conclusions), I haven't run across a well-documented discrediting of ABX by someone who actually has any credentials to back up their opinion. I've read of flawed/failed studies; I've read of good/well-designed and executed ones; but I haven't run across anyone seriously questioning the efficacy of the concept. I'd like to read that.

Tim
 
In good faith, I wrote that I considered ABX a development tool. If you disagree (and maybe you are correct), please say so and support your affirmation in a way that can be discussed, so that I (and others) can learn something. IMHO, the style of accusation used here, employing nice words while denigrating others, only muddies discussions and adds nothing to the debate.

Do you know of any other publicized use of ABX other than codec development?

I know of several, and every one of them is like a red cape in front of a bull on an Audiophile forum.

Tim
 
Do you know of any other publicized use of ABX other than codec development?
Often the transparency of codecs is tested using MUSHRA.
The MUSHRA approach is recommended when there are obvious differences between the codecs and the original, but small differences between the codecs tested.
It is an ITU-R standard (Recommendation BS.1534).
 
Micro, it has been used to conduct general population as well as individual specific testing to determine the probability of the existence of an audible difference between all kinds of gear, be it transports, DACs, amps, or cables, and perhaps less obvious subject matter such as the existence of an audible difference between 1st, 2nd and 3rd generation tape.

Google is your friend here. A two-second search of "abx test results" turned up this ABX test for amps. Perhaps Arny can join in the conversation, inasmuch as he invented the ABX comparator.

"It has been used to " does not validate the method.

BTW, not in two seconds using Google, but yesterday I used a library search engine and found this abstract (I cannot access the full paper):

AES E-Library
Statistical Analysis of ABX Results Using Signal Detection Theory

ABX tests have been around for decades and provide a simple, intuitive means to determine if there is an audible difference between two audio signals. Unfortunately, however, the results of proper statistical analyses are rarely published along with the results of the ABX test. The interpretation of the results may critically depend on a proper statistical analysis. In this paper, a very successful analysis method known as signal detection theory is presented in a way that is easy to apply to ABX tests. This method is contrasted with other statistical techniques to demonstrate the benefits of this approach.

Authors: Boley, Jon; Lester, Michael
Affiliations: LSB Audio, Lafayette, IN, USA; Shure Incorporated, Niles, IL, USA
AES Convention: 127 (October 2009), Paper Number: 7826
Subject: Audio Perception

None of the tests on the site you refer to seems to supply this type of information.
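For readers curious what a signal-detection-theory analysis looks like in practice, here is a minimal sketch. It is not the method from the Boley/Lester paper (which I cannot access either); it is the classic yes/no d′ formula, with a standard log-linear correction, and the trial counts are invented for illustration:

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Classic yes/no signal-detection d': z(hit rate) - z(false-alarm rate).
    Adding 0.5 to each cell (a log-linear correction) keeps the z-scores
    finite when a rate would otherwise be exactly 0 or 1."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf  # standard normal quantile function
    return z(hit_rate) - z(fa_rate)

# Hypothetical 50-trial session: 20 hits, 5 misses, 5 false alarms,
# 20 correct rejections -> sensitivity clearly above chance (d' ~ 1.6).
print(round(d_prime(20, 5, 5, 20), 2))
```

The appeal of d′ over a bare "x out of n correct" score is that it separates the listener's sensitivity from their response bias, which is presumably part of what the paper argues.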
 
I know of several, and every one of them is like a red cape in front of a bull on an Audiophile forum.

Tim

Tim,
Anyone living in or coming from a country where there is tourada or corrida de toros (bullfight) will tell you that bulls are colorblind ...

BTW, if by chance you are dressed in red and the bull chooses to attack you in the middle of a crowd of people dressed in green, the only possible scientific explanation should be expectation bias. :rolleyes:
 
Tim,
Anyone living in or coming from a country where there is tourada or corrida de toros (bullfight) will tell you that bulls are colorblind ...

BTW, if by chance you are dressed in red and the bull chooses to attack you in the middle of a crowd of people dressed in green, the only possible scientific explanation should be expectation bias. :rolleyes:

Or the fact that I have been deemed to be an "objectivist." A nod is as good as a wink to a double-blind bull.

Tim
 
Do you have an example in which the shortcoming of ABX have been "well-documented" by a research scientist? A statistician? Research assistant? A secretary at a research firm? Because while I know the shortcomings of ABX have been repeatedly, redundantly and ineffectively aired by Audiophiles (who talk about the shortcomings of all measurement and testing methodologies that fail to support their subjective conclusions), I haven't run across a well-documented discrediting of ABX by someone who actually has any credentials to back up their opinion. I've read of flawed/failed studies; I've read of good/well-designed and executed ones, but I haven't run across anyone serious seriously questioning the efficacy of the concept. I'd like to read that.

Tim

My good friend Tim. ABX is and always has been pseudo-science. I have posted a crude example of a true double-blind protocol. Ken referred me to a basic statistical analysis (statistical power). Perhaps I overstated my case. Because ABX is not science, it does not require science to refute it. For example, many have argued that it relies too heavily on memory, or that short-term exposure to an unfamiliar system, music and environment also affects reliability.
typed without glasses
 
Or the fact that I have been deemed to be an "objectivist." A nod is as good as a wink to a double-blind bull.

Tim

Wait a sec. Bulls, double-blind bulls. A Nod And A Wink? You must be talking about camels, of course. A reference like that does not pass these prog rock oriented eyes unseen:

[album cover image]
 
"It has been used to " does not validate the method.
You're moving the goalposts. You stated ABX was essentially a development tool. I pointed out the error in that statement; it is unequivocally used for more than that. You countered it was used for codec development. I explained how ABX was/is used to determine the probability of the existence of an audible difference between gear, not just codecs. I provided you with a link which showed just that. And, indeed, your own Google search provided corroboration of precisely that which I stated.

Now you're asking for studies to show the validity of the methodology. Can you do our readers and me a favor and let me know when the goalposts are firmly planted in the ground?
 
My good friend Tim. ABX is and always has been pseudo-science. I have posted a crude example of a true double-blind protocol. Ken referred me to a basic statistical analysis (statistical power). Perhaps I overstated my case. Because ABX is not science, it does not require science to refute it. For example, many have argued that it relies too heavily on memory, or that short-term exposure to an unfamiliar system, music and environment also affects reliability.
typed without glasses

Hearsay. Unattributed. You can do better than this, Greg. Give me a non-audiophile case against the scientific efficacy of a properly conducted ABX test. Just one.

Tim
 
Wait a sec. Bulls, double-blind bulls. A Nod And A Wink? You must be talking about camels, of course. A reference like that does not pass these prog rock oriented eyes unseen:

[album cover image]

I had a different album in mind, Ron...

[Album cover: Faces, "A Nod Is as Good as a Wink... to a Blind Horse"]


Tim
 
Hearsay. Unattributed. You can do better than this, Greg. Give me a non-audiophile case against the scientific efficacy of a properly conducted ABX test. Just one.

Tim

I could, but the question is: would anybody be any more receptive than they were in the past?
 
You're moving the goalposts. You stated ABX was essentially a development tool. I pointed out the error in that statement; it is unequivocally used for more than that. You countered it was used for codec development. I explained how ABX was/is used to determine the probability of the existence of an audible difference between gear, not just codecs. I provided you with a link which showed just that. And, indeed, your own Google search provided corroboration of precisely that which I stated.

Now you're asking for studies to show the validity of the methodology. Can you do our readers and me a favor and let me know when the goalposts are firmly planted in the ground?

Happily, our readers do not need my favors, as I am not a specialist in football, goalposts, or masonry, nor in ABX tests.

The methodology has been in my scope since my first posts about ABX, which moved here from another thread - please read them again.

"Tim, what would be your conclusions if an ABX test gets all the answers wrong?"

"It seems to me that ABX is essentially a tool for development and does not pretend to be what some people expect: a scientific oracle that establishes the universal threshold of audibility for tweaks. It was developed by industry because they needed a fast and economical test with a certain degree of confidence, but I am sure they must use other ways to complement this test; no one would risk millions on a 95% confidence test.

Let us suppose that a golden-ear audiophile or cable developer can get a 25/25 score on an ABX test of cables. Will it universally prove that cables can make a difference?"

Unless we understand what is behind the ABX test, it is not possible to answer these questions.
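The 95%-confidence figure and the 25/25 score in the quoted post can be made concrete. Under the null hypothesis that the listener is guessing, each ABX trial is a fair coin flip, so the exact binomial tail gives the chance of the observed score. A minimal sketch (standard library only; the function name is mine, not from any ABX tool):

```python
from math import comb

def abx_p_value(correct, trials):
    """Exact one-sided binomial p-value: the probability of scoring at
    least `correct` out of `trials` by guessing alone (p = 0.5 per trial)."""
    tail = sum(comb(trials, k) for k in range(correct, trials + 1))
    return tail / 2 ** trials

# A perfect 25/25 run is essentially impossible by guessing:
p_perfect = abx_p_value(25, 25)   # exactly 1 / 2**25
# while 18/25 already falls under the conventional 5% threshold.
p_18 = abx_p_value(18, 25)
```

Note what this does and does not say: a tiny p-value makes the guessing hypothesis untenable for that listener in that session; it says nothing, by itself, about a universal audibility threshold.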
 
 
I could, but the question is: would anybody be any more receptive than they were in the past?

That depends on the case presented. Make your case, counselor.

Tim
 