I would parse the situation this way:
1. Objective macro level fidelity. At this level, it is easy to prove that JPlay makes no difference: the bits come out exactly the same way they do with JRiver. Calling it a hoax in this school of thought is defensible, albeit in bad taste.
2. Objective micro level fidelity. This is where we look at the timing of the bits, noise radiating from the PC, etc., bleeding into the DAC. It might seem that we can put value behind JPlay in this model of the universe. But I have a hard time accepting that this validates the performance of JPlay. There is no way an app on a PC can control everything the PC does. The operating system is in charge, and there is so much asynchronous activity going on outside the control of the playback app that no argument, in my opinion, can easily stick that there is an improvement. For all we know, changing how things work may make it worse, not better! So on this front, I think JRiver may be 90% right, if not 100%.
3. Subjective fidelity. Anything goes here. The problem is what Frantz says: if we don't believe in DBT (double-blind testing), then there is no proof one way or the other. So in this case we could say JRiver is 50% wrong and 50% right.
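To make the #1 (macro level) claim concrete: it is testable. Capture the digital output of each player into a file and compare the raw PCM bits. Here is a minimal Python sketch of that comparison; the capture file names are hypothetical, and it assumes you already recorded loopback captures of the same track from each player at the same format.

```python
import hashlib
import wave

def pcm_fingerprint(path):
    """Return a SHA-256 hash of the raw PCM frames in a WAV capture.

    Only the audio frames are hashed, so header/metadata differences
    between the two captures are ignored; matching hashes mean the
    players delivered bit-identical audio at the macro level.
    """
    with wave.open(path, "rb") as w:
        return hashlib.sha256(w.readframes(w.getnframes())).hexdigest()

# Hypothetical capture files from the two players:
# pcm_fingerprint("capture_jriver.wav") == pcm_fingerprint("capture_jplay.wav")
```

If the two fingerprints match, the #1 argument is settled by definition; any remaining debate has to live in #2 or #3.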
As far as I can tell, the JRiver folks live in the land of #1, so in that regard it is perfectly justifiable for them to say what they said, since countless other people hold the same view of the world. Even if they traveled to #2, I believe the weight of evidence is on their side. No one has objectively shown that the PC acts differently at the output of the DAC when the playback pipeline is changed to JPlay.
Note that this is not the same as the analog vs. digital debate. In that argument, a huge number of parameters change. Here, the bits themselves cannot change, and the system remains digital in both cases. So in this regard, I don't leave as much room for the #3 argument as I would in other cases.