This is a question I get all the time. It's an understandable question.
I will cover it in a few ways.
A good analogy is digital audio, something most people here know well. Sure, DSD256 can sound great. 192/24 can sound great. But is it always the case that it sounds better? Then there is the question of the hardware. The hardware for digital playback REALLY matters, a lot. The server, the DAC, the digital cables, the switch, the clocking. It's easy to argue that the hardware is at least as important to the sound as the audio format and bitrate. Digital audio hardware has had 30 years of hardcore evolution where each part that makes up a DAC has been perfected. Along the way it became clear, for example, that jitter mattered. A high end DAC internally looks more like a lab grade piece of Tektronix or Agilent test equipment. The digital audio hardware has been refined over decades by listening and measuring.
It turns out that streaming and video systems have had hardly any of this type of engineering evolution. HDMI based systems are nearly untouched by proper engineering. HDMI circuits in all gear are simply copy-pasted from "reference designs" in the chip maker's datasheet. These "reference designs" are the cheapest way the system is known to pass HDMI tests. Open up any HDMI device, look at the datasheet for the HDMI chip, and you will find a copy-paste from the chip maker. If a manufacturer deviates from this, HDMI certification becomes VASTLY more expensive and takes huge amounts of time. So even super high end devices copy-paste a very cheap HDMI circuit in.
This has led to a situation where, since the dawn of HDMI, no one has even tried to improve it. In fact no one even looked to see whether improvements had any effect on the picture or sound. Some people in the business have played with this, though. Datasat, for example. There were some very quiet mods made to their HDMI boards, and it was very much a hush-hush thing because it would break HDMI certification rules.
As it turns out... HDMI seriously sucks. Its "reference designs" are pretty much abhorrent. While the resultant signal does pass tests, it produces a really nasty audio/video experience. Pretty much no one, including me, knew that. Then one day I modded my Oppo, going really deep into the HDMI system. The picture popped off the screen and the sound was just STUNNING... a HUGE jump. At that point I knew HDMI SUCKED.
Like you, I looked at these streaming services as garbage because of the low bitrate. I tried them all. They all equally sucked. Then one day a client forced me to mod an AppleTV. It was immediately an OMFG experience. The sound went from total garbage to a detailed image with depth and nuance, and musicality jumped through the roof. The difference was stunning. Then I noticed the picture... Whoa...
I spent months playing with all this. I applied every bit of the hardware model used in high end digital audio to an AppleTV. I jumped into the HDMI with $100,000 test equipment. I spent time working to dejitter things like the CPU, the RAM, and the bus that feeds the HDMI chip. I cleaned up the whole platform by huge margins. EVERYTHING I did I could see and hear. It turns out an HDMI platform is just as esoteric as a high end DAC/server/switch... Along the way I also ran into a number of things about modern digital electronics that really rocked my engineering world and that no one else seems to have noticed. While I won't say what one of the key findings was, let's just say PWM switching regulators have an issue no one seems to have caught. I have discussed this with some trusted engineers in this industry and they say I should patent it.
The result was so crazy, I decided to make and sell the units.
So to get back to your question... I think cleaning up the HDMI and the entire hardware platform is more important than the bitrate alone.
OK... So... Bitrate. A still image takes almost no bitrate. It's when you have motion that you need bitrate. Consumer video uses variable bitrate. This means when there is little motion, the bitrate is REALLY low. The more motion, the higher the bitrate. If you hit the bitrate cap, then the areas in motion lose resolution, which is not that visible to the eye/brain.
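If it helps to see the idea in code, here is a toy sketch. This is NOT how any real encoder (HEVC, AV1, etc.) allocates bits internally, and the numbers and function names are made up; it just illustrates the point above: quiet scenes ask for very few bits, motion asks for a lot, and a hard cap forces the quality loss to land exactly where the motion is.

```python
# Toy illustration of the variable-bitrate idea described above.
# Not a real encoder model; the numbers are arbitrary and only show the
# allocate-by-motion-then-clamp concept.

def allocate_bits(motion_scores, base_kbps=2000, per_motion_kbps=400, cap_kbps=25000):
    """motion_scores: one rough 0-100 'amount of motion' value per second of video."""
    allocation = []
    for motion in motion_scores:
        wanted = base_kbps + per_motion_kbps * motion   # more motion -> more bits wanted
        granted = min(wanted, cap_kbps)                 # the stream's cap clamps it
        allocation.append((wanted, granted, granted < wanted))
    return allocation

if __name__ == "__main__":
    # A mostly static talking-head scene, then a fast action scene.
    scene = [2, 3, 2, 40, 85, 90, 60, 5]
    for sec, (wanted, granted, starved) in enumerate(allocate_bits(scene)):
        note = "  <- cap hit, detail in the moving areas gets dropped" if starved else ""
        print(f"second {sec}: wanted {wanted} kbps, sent {granted} kbps{note}")
```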
Bitrate is only as good as the material being transmitted. For example, you can have a DSD256 file of a recording, but that does not mean it's a good recording that makes use of it. A lot comes into play when you're talking picture quality, not just bitrate.
The AppleTV can do 600 Mbps streams from a local NAS. I have a lot of test material at these high rates. I also have post production clients using the ATVX and playing high speed streams and files. Sure, high bitrate looks better, mostly when things are in motion, but the quality of the material is, most of the time, way more important than the bitrate.
So... The results people have shared here on this thread, doing A/B/C/D/E comparisons of all manner of devices vs. the ATVX, are that, OK, on the right material, in some scenes, a UHD disc or a Kscape might look and sound better. BUT the HDMI and hardware improvements make this a very fuzzy line.
Blu-ray is dying. Physical media is history. Sadly. Kscape is expensive, and its content is limited and expensive too. An AppleTV already has EVERYTHING ever made on it, on some app someplace. The hardware improvements put it right there with the Oppo and Kscape. For me, I still have my Oppo and a Kscape system, but neither is hooked up. The Apple ecosystem is all I need. I have a single source component. My display is calibrated to my single source. For me it's an ideal solution.
I use it now for so many things. Right now I am playing the SiriusXM app and have it playing yacht music, hahaha. And it sounds surprisingly good for SiriusXM.
There is a reviewer working on a standalone review of it as a streaming audio solution. The multichannel audio guys love it too.
So... Bitrate is not the thing to focus on, just like in audio. It's the hardware that really matters.