Got into a conversation with Joe Kane of ISF, who's pushing for a 10-bit, 4:2:2 color-subsampled '4K' disc/streaming standard. Streaming bandwidth issues aside, he feels that if he doesn't succeed in getting this on the table and accepted now, before the standard is set, whatever new format takes hold will be qualitatively just a fraction of what it should be. Apparently the manufacturers, etc. are discussing 8-bit and something like 4:2:0.
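For a rough sense of what is at stake, here is a back-of-the-envelope sketch (my own arithmetic, not from the post above) of the raw, pre-compression bits per pixel each combination implies:

```python
def bits_per_pixel(bit_depth, chroma_fraction):
    """Raw bits per pixel: one luma sample plus two chroma channels,
    each kept at chroma_fraction of the luma sample count.
    4:2:0 -> 1/4, 4:2:2 -> 1/2, 4:4:4 -> 1."""
    return bit_depth * (1 + 2 * chroma_fraction)

print(bits_per_pixel(8, 1/4))   # 8-bit 4:2:0  -> 12.0 bits/pixel
print(bits_per_pixel(10, 1/2))  # 10-bit 4:2:2 -> 20.0 bits/pixel
```

So the proposed 10-bit 4:2:2 carries about 67% more raw data per pixel than 8-bit 4:2:0, before the encoder sees any of it.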
I missed seeing Joe on this trip. Alas, I think there is zero chance the format will be anything but 4:2:0, 8-bit when it comes. Joe is right, of course. We should make it 10-bit because that is the production resolution (if not higher). Correct conversion to 8 bits requires dither, which makes compression harder. So 10-bit can actually take the same bandwidth as 8-bit! But folks don't understand that and will most likely go with 8 bits again. Higher color sampling will take more bandwidth and will also require more CPU cycles/RAM/power in the decoders. And studios may want to make sure the specs don't approach digital cinema. For these reasons I don't think we are going to get anything more than a resolution bump. Even there, I don't expect 4K to come anytime soon in volume. There will be many incompatible, sampler-type services (e.g. what Sony is doing by bundling a server with their 84-inch UltraHD set), but nothing representing the mass market is on the roadmap yet.
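To illustrate the dither point, here is a minimal sketch (my own, not from the post) of converting 10-bit values to 8-bit by plain truncation versus with triangular (TPDF) dither. Truncation produces banding on smooth gradients; dither trades that banding for fine random noise, which looks better to the eye but is harder for an encoder to compress away:

```python
import random

def to8bit_truncate(v10):
    # Simple right-shift: four input codes collapse into one output
    # code, which shows up as visible banding on smooth gradients.
    return v10 >> 2

def to8bit_dithered(v10):
    # TPDF dither before quantizing: banding is replaced by fine
    # random noise, which compresses less efficiently.
    d = random.random() - random.random()  # triangular PDF in (-1, 1)
    return max(0, min(255, round(v10 / 4 + d)))

# A smooth 10-bit ramp: truncation steps once every four samples.
ramp = list(range(100, 110))
print([to8bit_truncate(v) for v in ramp])
# -> [25, 25, 25, 25, 26, 26, 26, 26, 27, 27]
```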
Sure. From the beginning of time there has been a battle between the theatrical division of the studios and their customers (i.e. theater chains) on one side, and the home video group that brings us rental and for-purchase content for home viewing on the other. If it were up to the commercial theaters, there would be no home video market. Now that there is, folks hang their hat on spec differences, the idea being that if you want the best, it is only available in the theater and not at home.
Digital Cinema uses a spec called DCI (Digital Cinema Initiatives). DCI supports 2K (almost the same as 1080p) and 4K. Most movies today are presented in 2K in theaters, so even they have not moved up to 4K! Default bit depth is 12 bits (versus 8 for consumer formats). The color space/gamut is deeper, and compression is JPEG 2000. On the latter, each frame is compressed individually, which means no error accumulates from frame to frame. Consumer formats typically compress one full frame and then predict from it for up to half a second (a 12-frame group on a 24 fps movie). The advantage of the consumer approach is that its data rate is considerably less than DCI's; the disadvantage is slightly lower quality.
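A quick per-frame budget comparison makes the data-rate gap concrete. The figures below are the commonly cited maximum rates (250 Mbit/s for DCI, 40 Mbit/s for Blu-ray video), used here as rough illustrations rather than typical averages:

```python
# Illustrative ceilings, not measured averages.
DCI_MBPS = 250     # DCI JPEG 2000 maximum data rate
BLURAY_MBPS = 40   # Blu-ray maximum video data rate
FPS = 24

dci_bits_per_frame = DCI_MBPS * 1_000_000 / FPS
bd_bits_per_frame = BLURAY_MBPS * 1_000_000 / FPS

# DCI spends its budget intra-coding every frame; Blu-ray stretches a
# much smaller budget by predicting most frames from a reference frame.
print(f"DCI:     ~{dci_bits_per_frame / 1e6:.1f} Mbit per frame")
print(f"Blu-ray: ~{bd_bits_per_frame / 1e6:.1f} Mbit per frame")
```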
The other major differentiation is time. Movies are held back on purpose for a while before they are released for home delivery. I expect and hope that this differentiation eventually goes away. I don't go to commercial theaters anymore but would love to see first-run movies at home. As you can imagine, the theater owners put up an even bigger fuss over this than over the specs.
Thanks for taking the time to clarify this, Amir. I don't really see what the big deal is regarding specs, though. I personally have yet to go to a theater and experience a better picture than we have at the house, albeit on a smaller screen. You would think [or at least I would] that the specs, given the end result to the consumer, wouldn't even be an issue worthy of consideration. Unless it's something I would like to see before the Blu-ray or DVD release, we don't go to the theater, because the visual [and audio, in our particular household] surpasses what can be experienced in the theater.
-- This is very sad, in a way, for the progress of technology for the benefit of all the world's humanity.
...Because money is the direct brake pedal that stops most folks from looking ahead at what's coming in order to avoid accidents, evolve, and move on to even more important matters for all societies and our planet.
Ok, let's stay on terra firma; even the public cinema theater venues don't have access to the very best.
And most of them (the vast majority worldwide) don't even maintain their venues properly, sound-wise and visuals-wise.
The elite (some Hollywood hotshots, some movie directors, and high-ranking politicians too) are the ones with full access to the very best. Fair enough for them, but what about us?
Ok, I don't see our world changing anytime soon, but I can talk about it, and perhaps, just maybe, make a small but real difference. ...And without selling my soul to the devil (the power of money). :b
Also, could you comment upon what some other industry experts have said about HQ, namely that like 3D, it's another marketing ploy, since it only makes a difference on sets >80 inches?
Indirectly yes. If the consumer content gets out, they don't want it to be their "masters."
HQ? Do you mean UltraHD? If so, yes, it is a function of viewing distance. Even 80 inches doesn't do it if you sit where people normally sit. As I mentioned, you need to sit close enough to see the actual structure of the display pixels. Walk up to your display and get closer and closer until you see the pixels. If that is a fraction of where you sit now, the display has to be far larger than your current one for the extra resolution to make a difference.
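That walk-up test can be approximated numerically. A small sketch (my own, assuming the common rule of thumb that 20/20 acuity resolves about one arcminute) computes the farthest distance at which a display's pixel structure is still resolvable:

```python
import math

def max_distance_to_resolve(diag_in, horiz_px, aspect=16 / 9):
    """Distance (inches) at which one pixel subtends one arcminute,
    a common approximation of 20/20 visual acuity."""
    width_in = diag_in * aspect / math.sqrt(aspect**2 + 1)
    pixel_pitch = width_in / horiz_px
    return pixel_pitch / math.tan(math.radians(1 / 60))

d = max_distance_to_resolve(80, 3840)
print(f"80-inch UltraHD: pixels resolvable only within ~{d / 12:.1f} ft")
# -> roughly 5.2 ft
```

By this estimate, even an 80-inch UltraHD set only shows its extra resolution within roughly five feet, far closer than typical seating.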
Amir, I am very interested in Harman's Quantum Logic Surround. Were there any demos of it at the show? Also, what do you think of QLS as a new surround sound format; is it a real advance for audio?
No, there was no demonstration of it. I think QLS is a major advance in the presentation of audio in the home. It is not a new format, because it extracts the audio elements from existing sources (that is indeed its claim to fame). I am not at liberty to say more about its availability. Suffice it to say, I hold strong hope that it will materialize in a product in our future.
You mean JBL above, not JVC. There is really one company now that puts out products under different brands as appropriate. On the Lexicon processor, they decided to take the Bryston processor and work with them to improve it. That is currently being marketed under the JBL brand.
-- For audio, you would think that UltraHR (Ultra High Resolution) would be more appropriate.
For 4K video, UltraHD (Ultra High Definition) is theirs now, as they've been using the term for at least the last 15 years.
I remember reading the term in The Perfect Vision magazine (The Absolute Sound's sister publication back then) a very long time ago.
More than ten years ago they were already talking about Ultra High Definition picture (video) with 8K resolution, and even 16K.
I've been running at (when I bother to set it up) 10.5 ft plus, and 1080p plus (messing with resolution encode/decode parameters and their effect on human perception, etc.) since about... 1998.
CRT projectors can be an essential tool for forming a 'true' understanding of what compression/decompression/encoding/formats, etc. do to imagery.