Current and emerging TV technologies can display a color gamut far beyond that available in consumer content. Yet movie studios are not delivering content that takes advantage of these new displays, and the world seems to be rushing headlong towards lower-quality streaming as the preferred delivery method. Surely there’s something wrong with this picture?
Mark Anderson | HomeToys.com
TV Display Technology
If we look at today’s major TV technologies, there’s no doubt that Plasma and LED Backlit LCD lead the field in terms of color quality. Emerging standards such as 4K, OLED and QDEF (we'll be covering QDEF next month) are pushing the capabilities of displays even further. Yet all content is still authored to a Dickensian standard known as Rec. 709, which dates back to the early nineties (and is actually based on standards from the sixties). In contrast, digital movies for the theater are authored to the DCI P3 standard, which has a significantly larger gamut than Rec. 709 (more on this later).
Many colors found in real life fall outside the boundaries of the Rec. 709 gamut, so why don’t the movie studios choose a larger gamut (such as P3) for consumer content? Before we attempt to answer this, we’ll take a closer look at color gamut and gamut mapping.
Understanding Color Gamut
For a device, the color gamut represents the largest range of colors that can be displayed. For content, it denotes the minimum and maximum color range of that content. So if I have a display whose gamut is smaller than the gamut of the content, clipping will occur. Typically, this manifests itself as washed-out highlights, blocky dark areas and duller colors.
In the literature, gamut is frequently represented in two dimensions using the CIE chromaticity diagram and CIE xyY coordinates. The horseshoe-shaped boundary represents the human visible spectrum.
In the above diagram, x and y determine chromaticity (or hue and saturation), e.g. the redness of red or the blueness of blue. The Y coordinate (note the difference between "y" and "Y"), which is not shown as we only have 2 dimensions, is luminance (or brightness). As Y decreases, all the colors get darker and the outer bounds of the shape shrink.
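The relationship between the three coordinates can be sketched in a few lines of code. This is a minimal illustration of how the xy chromaticity values are derived from CIE XYZ tristimulus values; the sample XYZ value used is that of the sRGB/Rec. 709 red primary.

```python
def xyz_to_xyY(X, Y, Z):
    """Project a CIE XYZ color onto the 2D chromaticity diagram.

    x and y carry hue/saturation; Y (capital) carries luminance
    and is exactly what the flat 2D diagram discards.
    """
    total = X + Y + Z
    if total == 0:
        return (0.0, 0.0, 0.0)  # black has no defined chromaticity
    return (X / total, Y / total, Y)

# XYZ of the sRGB/Rec. 709 red primary
x, y, Y = xyz_to_xyY(0.4124, 0.2126, 0.0193)
print(round(x, 3), round(y, 3))  # the familiar (0.640, 0.330) red corner
```

Note that luminance Y passes through unchanged: two colors can share the same (x, y) point on the diagram while differing enormously in brightness, which is why the 2D plot can’t show black-level differences between displays.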
The other reason to use a two-dimensional plot is that it is very difficult to compare even two color spaces (never mind four) in three dimensions without an interactive 3D model that can be freely rotated. For this reason we’ll stick to the conventional 2D representation, where the primaries are shown at full luminance.
Because the third dimension is not shown, these diagrams can’t show the difference between two gamuts at different luminance values. (The diagram above is actually a cross section at a luminance of 100%). For example, my Pioneer Kuro plasma and an OLED display have much deeper blacks (and much purer colors) than LCD displays. The only way we’ll see this is to view the gamut in 3D or take cross sections at varying luminance values.
The diagram below is an approximation of the gamut of four color spaces. (Some of the values had to be visually interpreted from manufacturers’ diagrams, but they are close enough to illustrate the point.)
The black dotted line represents the gamut of Rec. 709. This is the standard for HDTV and Blu-ray content. If we compare that to the DCI P3 gamut (white dashed line), it’s clear that there are a considerable number of colors that can be displayed in DCI P3 vs. Rec. 709. DCI P3 is the Digital Cinema Initiative's standard for authoring of content. We can see from this that the same movie viewed in a movie theater (authored for DCI P3) will have a much broader range of colors than a Blu-ray (authored for Rec. 709) on even the best home theater display (regardless of its display technology).
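We can put a rough number on that difference using the published xy chromaticity coordinates of the Rec. 709 and DCI P3 primaries and the shoelace formula for the area of the triangle each set spans. (This compares areas on the 2D chromaticity diagram only, so it carries the same luminance caveat discussed above.)

```python
def triangle_area(p1, p2, p3):
    """Area of a triangle from its vertices (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Published xy chromaticities of the R, G, B primaries
rec709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
dci_p3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

a709 = triangle_area(*rec709)
ap3 = triangle_area(*dci_p3)
print(f"DCI P3 covers about {ap3 / a709:.0%} of the Rec. 709 area")
```

Run this and the P3 triangle comes out roughly a third larger than Rec. 709’s on the chromaticity diagram — most of the extra area sitting in the saturated reds and greens.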
The converse of this is one of the reasons why content for consumers is not authored for DCI P3. If we took a movie authored for DCI P3 (white dashed line) and played it back on a CCFL LCD display (solid white line), a massive amount of clipping would take place unless some gamut mapping were carried out. This is because the display simply cannot represent the range of colors in the content.
What is Gamut Mapping?
Gamut mapping is, at its simplest, a mathematical process that maps all the colors in the larger gamut (the content) to colors in the smaller gamut (the display), or vice versa. Unfortunately, gamut mapping and color space conversion are part science and part craft, because simply shifting all the colors to the new gamut could render a dreadful result.
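To see why the purely mathematical approach disappoints, here is a toy sketch (not a production algorithm) of the two naive options when content exceeds the display gamut: hard clipping, which crushes all out-of-range colors against the gamut boundary, and uniform scaling, which preserves channel ratios but desaturates everything. Channel values are normalized so 1.0 is the display’s maximum.

```python
def clip_to_gamut(rgb):
    """Hard-clip each channel into the display's [0, 1] range."""
    return tuple(min(max(c, 0.0), 1.0) for c in rgb)

def scale_to_gamut(rgb):
    """Uniformly scale the whole color down until it fits."""
    peak = max(rgb)
    return tuple(c / peak for c in rgb) if peak > 1.0 else rgb

saturated_red = (1.25, 0.10, 0.05)    # content value beyond the display gamut
print(clip_to_gamut(saturated_red))   # red crushed to 1.0; channel ratios change
print(scale_to_gamut(saturated_red))  # all channels divided by 1.25; ratios preserved
```

Clipping distorts the relationships between channels (subtly shifting hue and flattening gradients), while scaling dims and desaturates colors that were never out of range in the first place. Real gamut mapping is the craft of trading off between these failure modes.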
When discussing anything to do with color transformations, it’s useful to understand a little about a particular aspect of the psychology of color: memory colors. These are things that we instinctively recognize and instinctively expect to be specific values. When they are wrong, we spot them instantly. Examples include green grass, blue sky, Caucasian flesh tones, etc. So whenever we do anything with color, we need to make sure that certain things do not shift too far from the norm.
So back to gamut mapping, let’s assume I’ve just bought a new OLED display (bank account -$10,000) and I have a Blu-ray disc (authored to Rec.709). My display has a much larger gamut than the content (except in blue areas):
As we can see in my hypothetical OLED, the capabilities in the red and green areas are way beyond Rec.709 content. Let’s further assume that my OLED manufacturer has a mode called “vivid” that simply maps the gamut of Rec.709 to the maximum gamut of the display with a simple mathematical transform. I’ve chosen an arbitrary point (1) to represent a normal Caucasian subject’s flesh tone. (Depending on the quality and calibration of your monitor, it may look like an absurd color, but it’s just for illustration.) My OLED’s vivid mode is going to apply gamut mapping and shift all the reds towards a more saturated value, so that it can take advantage of all these wonderfully saturated colors. Now this may be great for scenes containing flowers, Scarlet Macaws and red Ferraris, but it could end up shifting my flesh tone to location 2, which would make my subject look like they had sunburn. This is one of the reasons simple mathematics cannot be used for gamut mapping in the real world.
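A hypothetical sketch of what such a naive “vivid” mode does: stretch every color’s saturation uniformly toward the display’s wider gamut. The flesh-tone value and the 1.3× stretch factor here are made up purely for illustration.

```python
import colorsys

def vivid(rgb, stretch=1.3):
    """Naively boost saturation toward the display's gamut edge."""
    h, l, s = colorsys.rgb_to_hls(*rgb)
    return colorsys.hls_to_rgb(h, l, min(s * stretch, 1.0))

flesh = (0.87, 0.64, 0.52)  # an illustrative flesh-tone value
print(tuple(round(c, 2) for c in vivid(flesh)))
```

The stretched result pushes red up and pulls green and blue down: exactly the point-1-to-point-2 shift described above. The Macaws get more spectacular, but so does the sunburn — the transform has no idea which pixels are memory colors.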
Moths to a Flame
Unfortunately, many displays attempt to do just that with “vivid” mode. (It should really be called “Crayola” or better yet “DO NOT USE”.) Consumer electronics vendors and retailers love this vivid mode, as it makes the sets look nice and bright in the store, and much of the buying public are like moths to a flame: they are attracted to the brightest model they see. In a retail store, with bright lights and suitable content, these displays can look pretty good, but the moment the set gets home, it needs to be switched to “movie” or some other suitable mode (and ideally, calibrated).
On some of the higher end sets, manufacturers attempt to use adaptive algorithms. For example, Runco have a mode on their projectors to expand the gamut. These algorithms (rarely successfully) attempt to recognize specific content such as flesh tones and keep these within an acceptable tolerance, whilst making my flowers, Scarlet Macaws and red Ferraris look wonderfully saturated.
Movie studios and post production houses have to manage this problem in reverse. They are recording on film or on digital cameras whose gamuts equal or exceed DCI P3, so how do they compress this down to the gamut of Rec. 709 for Blu-ray, DVD and OTA content? Answer: they have experts (cinematographers and colorists) who not only understand the process in detail, but also work with the director to preserve his creative intent, which is ultimately the most important aspect of the conversion.
So now we understand a bit about color gamut and the standards used in the industry today, we can move on to trends in viewing.
There’s no doubt that streaming media is growing dramatically. A few months ago Netflix was reported to be the source of 32.7% of all internet traffic in North America. Don’t get me wrong, I love streaming: for chick flicks, a documentary, or a run-of-the-mill TV episode, it’s great. It does frustrate the heck out of me when I see compression artifacts or audio sync issues, but in general it’s not bad for certain kinds of content and great for convenience.
Vudu’s HDX (my go-to streaming provider) has been the forerunner in 1080p streaming and other providers are beginning to catch up. (Netflix now supports it and devices that support Netflix's 1080p streams are beginning to roll out.) None of them provide lossless audio such as Dolby TrueHD and DTS MasterAudio. So, on the one hand we have display (and receiver) technology coming on in leaps and bounds, but much of the growth in content provision is at a level of quality that doesn’t even match DVD.
This is one of my pet peeves about “HD” streaming. Just because it’s 1080i (or 1080p) does not necessarily make it HD. It might satisfy the resolution requirements, but the factors that really affect video quality are bit rate and compression. If those 1080 lines are not 1080 lines of quality, it’s not HD in my book. It’s the same with digital cameras: resolution is not the primary factor that determines quality. For years I had a three-megapixel Canon EOS D30. Friends were going out buying 6MP and 12MP point-and-shoot versions and laughing at my lowly 3MP. As you might imagine, the quality was nowhere near the same, because their CCD sensors weren’t a patch on the Canon CMOS sensor and my pro lenses cost three times the price of their entire camera. Unfortunately, people get way too hung up on resolution, whether it be still images or video.
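A back-of-the-envelope calculation shows why the bit rate matters more than the “1080p” label. The 6 Mbps streaming figure below is a typical value assumed for illustration; Blu-ray video peaks around 40 Mbps.

```python
def bits_per_pixel(mbps, width, height, fps):
    """Average bits of encoded data available per pixel per frame."""
    return mbps * 1_000_000 / (width * height * fps)

bluray = bits_per_pixel(40, 1920, 1080, 24)  # Blu-ray peak video rate
stream = bits_per_pixel(6, 1920, 1080, 24)   # a typical "1080p" stream
print(f"Blu-ray: {bluray:.2f} bits/pixel, 1080p stream: {stream:.2f} bits/pixel")
```

Same resolution, same frame rate, but the stream has well under a fifth of the data per pixel to describe the picture — the encoder has to throw the rest away, and that’s where the artifacts come from. (Codec efficiency differs too, so this is a rough comparison, but the gap is real.)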
No doubt, streaming will get better. Bandwidth to the home is increasing with services such as Verizon’s FIOS and Google’s gigabit to the home. That said, the entire pipe, from content providers’ data farms, through internet backbone providers, to the local distribution points needs to increase too. Eventually, we may get to Rec.709 with lossless audio, but that only gets us to where Blu-ray is today.
Display Technology Implications for Hollywood
For Hollywood it’s a bit of a double-edged sword. On the one hand their profits from streaming are growing massively (as their cost is negligible for each title), but on the other, the quality of the content arriving at the consumer has gone downhill compared to Blu-ray.
Providing higher color fidelity (e.g. DCI P3) would be quite a money spinner for the studios. Think about it: they could sell the entire Star Wars series all over again, this time in “enhanced color”. Would it jeopardize ticket sales at the box office? I doubt it. I rarely go to the movie theater these days, preferring to watch Blu-ray on my schedule, in my own Home Theater. Sure, there’s an occasional blockbuster that has to be seen at the movies (until I get my Runco projector and 120” screen).
What about Deep Color?
It still amazes me how many consumer electronic devices and HDMI cables are out there that support Deep Color (part of HDMI 1.3), yet there isn’t a single piece of content that uses it. Deep Color in and of itself does not do anything for the gamut of the content; it is merely a specification that allows the number of bits per color to be increased from 8 to a maximum of 16, i.e. it increases the precision, but not the gamut.
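The precision-versus-gamut distinction is easy to show numerically: raising the bit depth doesn’t move the endpoints (black and the display’s peak red stay exactly where they were), it just shrinks the steps between them, which is what reduces visible banding in gradients.

```python
def step_count(bits):
    """Distinct levels per color channel at a given bit depth."""
    return 2 ** bits

for bits in (8, 10, 12, 16):
    levels = step_count(bits)
    print(f"{bits}-bit: {levels:>6} levels per channel, "
          f"smallest step = 1/{levels - 1} of the channel's range")
```

An 8-bit channel has 256 levels; 16-bit Deep Color has 65,536. The range being subdivided — the gamut — is unchanged, which is why Deep Color alone can’t deliver those P3 colors.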
So, Am I Wasting my Money Buying a Better TV?
In most cases: no. I’ll be buying an OLED or QDEF panel as soon as they’re affordable and reliable enough. As I mentioned earlier, deeper blacks and purer colors all result in a much higher fidelity picture than the TV in most homes today. Certainly anyone with an early LCD panel (using CCFL) will notice a remarkable improvement.
So, come on Hollywood, it’s time to step up and provide us with some content worthy of our devices.
Mark Anderson is Managing Editor of HomeToys.com and AVSystemsMag.com. He will be covering everything related to residential and commercial AV, Automation and Digital Signage.
He is a long-time home theater enthusiast and lives on the bleeding edge of Home Automation.