I know Apple tends to support computer and phone hardware for a long time after release. I don't know if they have any similar commitment to the Apple TV product line, but does anyone have any insight into a likely reasonable lifetime for the underlying Apple TV model in the AppleTVX?
I'm quite tempted to order one if it's not likely to be phased out in two years. (That said, unless video/audio codecs change or 8K becomes a necessity, I don't see anything in the near future that suggests it'll become obsolete due to the hardware.)
Hi Mike. You also sent me a PM and asked more questions and I will also answer those here as they are quite valid and important.
WHO KNOWS what Apple may do! No one knows that. No illusions. In fact, Apple *might* come out with a better ATV. This last one was ****, but who knows. If they come out with one that I can mod to produce a better picture than the A2169, well, I might go that route. However, my guess is the A2169 is going to be the best one they ever make. I think they may leave the market, but that's just a guess.
Will the A2169 work in 3-4 years? They still support the ATV that only does HD, and that is 5 years old. Will some app come along that requires a newer ATV? Hmmm, doubtful, but who knows. The first-gen 4K box from 3 years ago still appears to be fully supported. I don't need better picture or sound, speaking for myself.
Who knows. To quote Yoda: "Hard to see, the future is."
My plan right now is to buy up all the new A2169s I can and then transition to refurbishing, offering a full warranty. My bet is that the A2169 will become known as the best streaming platform ever made, and I want to be the guy who makes the best ones. Like the Oppos with all the aftermarket mods and refurb work. I am planning on doing the A2169 for as long as they exist and are useful.
I don't need any more features, as I think there are none of interest to add. If 8K ever comes out, the CDNs (content delivery networks) used by the streaming providers have NO plans to increase streaming speeds, so 8K would just end up more compressed than 4K because it would have to fit down the same size pipe. They don't even do good Mbps for 4K. In fact, HD (2K) can look really good since it's less compressed.
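To put rough numbers on the "same size pipe" point, here is a quick back-of-envelope sketch. The 25 Mbps stream rate and 24 fps are purely illustrative assumptions on my part, not any provider's actual figures:

    # Rough bits-per-pixel budget at a fixed streaming bitrate (illustrative numbers only)
    STREAM_MBPS = 25   # assumed fixed CDN pipe, not a real provider figure
    FPS = 24           # typical film frame rate

    resolutions = {
        "HD (1080p)": 1920 * 1080,
        "4K (2160p)": 3840 * 2160,
        "8K (4320p)": 7680 * 4320,
    }

    for name, pixels in resolutions.items():
        bits_per_pixel = (STREAM_MBPS * 1_000_000) / (pixels * FPS)
        print(f"{name}: {bits_per_pixel:.3f} bits/pixel for the encoder to spend")

    # 8K has 4x the pixels of 4K, so at the same Mbps each pixel gets
    # roughly a quarter of the bits, i.e. heavier compression.

Same pipe, four times the pixels going from 4K to 8K, so each pixel gets about a quarter of the data. That is the "more compressed" part.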
"ATVX is better than Kaleidescape, which makes absolutely no sense to me... but if it's true, great!"
This is the #1 question I get, but the device varies. Sometimes it's the Oppo, sometimes the Nvidia; I even have post-production guys questioning ATVX vs a post-production workstation.
All these devices, and MANY more, use HDMI. HDMI >>SUCKS<<. ALL of them use the "reference design" from the HDMI chip manufacturer, so all of them suck just as bad. While bitrate (Mbps) DOES matter, it turns out the weak link was never the bit rate; it was HDMI. Sure, a post-production workstation can chew through 500 Mbps of 4:4:4, but then it dumps through a horrendous HDMI pipe and what emerges is sad. This is doubly true for sound. Sound is just jittered into submission by HDMI, so whatever you start with is nearly meaningless. It's like looking through frosted glass: you can sort of tell the picture on the other side is better, but it all kind of looks the same and fuzzy. BUT remove the frosted glass and OMG. So bitrate improves the quality of the picture, but it's limited/obscured by the frosted glass. I removed the frosted glass.
It's also not just HDMI, it turns out. The HDMI chip is fed from the CPU. The CPU is tied, timing-wise, to RAM, the bus, and its SSD, so all of that can jitter and stammer and stagger around during decoding. What gets fed to the HDMI chip is already a mess before it leaves the CPU.
Getting the data from the Ethernet produces interrupts in the CPU to fetch the data and buffer it in, so cleaning up the Ethernet chip also somehow matters.
I radically clean all that up. No one else I am aware of does this. Interestingly, the brain can perceive the jitter in all these systems. Who knew.
Then there are the evil PWM voltage regulators feeding everything. Truly awful devices that spew RF and have all manner of weirdness and noise spectra. No one does these even remotely correctly. SURE, they work in standard designs, but, again, the reference design for these devices falls way short.
So: the K-Scape, Oppo, Roku, Nvidia, and post workstations ALL suffer from a list of issues that degrade picture and sound. While bit rate can matter, fixing all of the above is FAR more impactful than bitrate. In fact, once that's fixed, bitrate can REALLY matter; you can then actually see it have impact. The Adobe app Frame.io doing post-production material at high bit rates is shocking through an ATVX.
So YES, a large number of my clients have the devices above, usually more than 2 of them. I have post-production clients too. They A/B/C/D them, and the ATVX is almost always the winner on every movie/show/clip, despite the vast differences in bitrate, because I fix HDMI and all the decoding jitter.
So while it seems insane that 10-20 Mbps streaming can look, and sound, stunning, it does indeed look and sound jaw-dropping, and I have 170+ units in the field now, plus lots of experience myself, all confirming it. It's also not subtle; it's more shock and awe.
I FULLY understand, though, that this seems insane. A streaming service looking/sounding reference level? I would have laughed at the suggestion 2 years ago. But I was shocked to find I was wrong. This is the whole reason I am making them. The entire industry needs a wake-up call.
HDMI SUCKS