What is HDR for TVs, and why should you care?

Geoffrey Morrison

HDR, or high dynamic range, is poised to be the next big thing in TVs.

We’ve been talking about it for several years, but finally a few HDR-compatible TVs are on the market, the first HDR TV shows and movies are available to stream, and later this year or in early 2016, Ultra HD Blu-ray discs with HDR titles will likely appear, not to mention the new players to play them.

Is this new technology worth the hype? In two words: yes, hopefully. I am pretty jaded when it comes to new TV tech, and I’m really excited about HDR. And I’m not the only one.

What is high dynamic range?

The two most important factors in how a TV looks are contrast ratio, or how bright and dark the TV can get, and color accuracy, which is basically how closely colors on the screen resemble real life (or whatever palette the director intends). That's not just my conclusion from years of TV testing; it's backed up by pretty much every multiviewer TV test and by years of research from industry experts like the Imaging Science Foundation and Joe Kane.

If you put two TVs side by side, one with a better contrast ratio and more accurate color and the other with just higher resolution (more pixels), pretty much every viewer will pick the one with the better contrast and color. It will look more natural, “pop” more, and just seem more “real,” despite having lower resolution. In other words, a 1080p TV with excellent contrast and color beats a 4K TV with average contrast and color every time.
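To put rough numbers on that first factor, contrast ratio is simply peak brightness divided by black level. The toy calculation below uses made-up luminance figures (not measurements of any particular TV) just to show how the math works:

```python
# Toy illustration of contrast ratio: peak brightness divided by black level.
# The nit (cd/m^2) figures below are hypothetical examples, not real measurements.

def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Return the contrast ratio as a single number (20000 means 20,000:1)."""
    return peak_nits / black_nits

modest_tv = contrast_ratio(peak_nits=400.0, black_nits=0.10)      # 4,000:1
hdr_class_tv = contrast_ratio(peak_nits=1000.0, black_nits=0.05)  # 20,000:1

print(f"Modest TV:    {modest_tv:,.0f}:1")
print(f"HDR-class TV: {hdr_class_tv:,.0f}:1")
```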

HDR expands the range of both contrast and color significantly. Bright parts of the image can get much brighter, so the image seems to have more “depth.” Colors get expanded to show more bright blues, greens, reds and everything in between.

Wide color gamut (WCG) is coming along for the ride with HDR, and it brings even more colors to the table, colors that TVs up to this point have been unable to reproduce: the reds of a stop sign or fire truck, the deep violet of an eggplant, even the green of many street signs. You may never have noticed that these colors weren't exactly how they look in real life, but you sure will now. WCG will bring these colors and millions more to your eyeballs.

Photo HDR isn’t TV HDR

One of the most important things to know about HDR TVs is that TV HDR is not the same as photo HDR. Every article I’ve written about HDR has comments from people complaining about the hyper-realistic look common with HDR photography. These are two very different things that, unfortunately and confusingly, just happen to share the same name. Like football and football.

I wrote an entire article about the difference, but the main takeaway is that HDR for TVs is not a picture-degrading gimmick (akin to the soap opera effect). It is definitely not that.

TV HDR: Expanding the TV’s contrast ratio and color palette to offer a more realistic, natural image than what’s possible with today’s HDTVs.

Photo HDR: Combining multiple images with different exposures to create a single image that mimics a greater dynamic range.

Photo HDR: Taking two or more images (left and center) and combining them to show some aspects of both (right).
Geoffrey Morrison
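For the curious, here's a rough sketch of what photo HDR does under the hood. It's a drastically simplified stand-in for real exposure-fusion and tone-mapping algorithms, and the file names are hypothetical, but it shows the basic idea: combine a dark and a bright exposure of the same scene into one ordinary image.

```python
# Toy sketch of photo-style HDR (exposure blending) using NumPy and Pillow.
# Real HDR photography uses more sophisticated exposure fusion and tone mapping;
# this simply weights and averages two exposures of the same scene.
# The file names are hypothetical placeholders.
import numpy as np
from PIL import Image

under = np.asarray(Image.open("underexposed.jpg"), dtype=np.float32)  # keeps highlight detail
over = np.asarray(Image.open("overexposed.jpg"), dtype=np.float32)    # keeps shadow detail

# Weight each pixel toward whichever exposure is closer to a well-exposed mid-tone.
w_under = 1.0 - np.abs(under / 255.0 - 0.5)
w_over = 1.0 - np.abs(over / 255.0 - 0.5)
blended = (under * w_under + over * w_over) / (w_under + w_over)

# The result is still an ordinary 8-bit, standard dynamic range image;
# it only mimics a wider dynamic range, which is exactly the point below.
Image.fromarray(blended.astype(np.uint8)).save("photo_hdr_result.jpg")
```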

HDR for TVs aims to show you a more realistic image, one with more contrast, brightness and color than before.

An HDR photo isn’t “high dynamic range” in this sense. The image doesn’t have the dynamic range possible in true HDR. It’s still a standard dynamic range image; it just has some additional information in it thanks to the additional exposures.

A TV HDR image won’t look different the way a photo HDR image does. It merely looks better.

I hate to belabor the point, but because the two processes share the same name, this understanding is really the first, biggest hurdle HDR faces. Those with an open mind might seek out HDR to find out what it is and be blown away by a demo (and the demos are amazing). Those convinced HDR isn't worth their time won't ever bother to see the demo, and will poison the well (so to speak).

How does it work?

There are two parts of the HDR system: the TV and the source.

The first part, the TV, is actually the easier part. To be HDR-compatible, the TV needs to be able to produce more light than a normal TV in certain areas of the image. This is basically just like local dimming, but with an even greater range between the brightest and darkest parts of the image.

Tied in with HDR is wide color gamut, or WCG. For years, TVs have been capable of a greater range of colors than what's possible in Blu-ray or HD downloads. The problem is, you don't really want the TV just creating those colors willy-nilly. It's best left to the director to decide how they want the colors of their movie or TV show to look, not a TV whose color-expanding process might have been designed in a few days 6,000 miles from Hollywood. More on this in a moment.

Of course, making TVs brighter and more colorful costs money, and some HDR TVs will deliver better picture quality than others. Just because a TV is HDR-compatible doesn’t necessarily mean it’s going to outperform non-HDR TVs. The only thing the HDR label really means is that the TV will be able to display HDR movies and TV shows.

The content is the hard part. To truly look good, the HDR TV needs HDR content, which today is almost nonexistent. TV shows and movies with 4K resolution are rare enough, but HDR TV shows and movies are even rarer. Only Amazon has released any so far, and it consists of just a couple of shows and a handful of Sony films. Netflix has said it will offer HDR before the end of 2015, but no other content provider has yet announced HDR for the home.

Another source of HDR will be physical discs. The Ultra HD "4K" Blu-ray specification allows the discs to carry HDR versions, with Dolby Vision as an option. The trick is, how many 4K BD titles will we see with HDR, especially at the beginning? One hint is how easy (read: cheap) it is to create an HDR version of a movie. It turns out we have some clues about that.

HDR content (the key)

When a movie or TV show is created, the director and cinematographer work with a colorist to give the program the right look. Take the muted, cold color tones of Winterfell in "Game of Thrones" versus the richness and warmth in King's Landing; even if you've been living in a cave without HBO or the Internet, you get the idea.

For movies, the team is able to use the wide palette of the Digital Cinema P3 color space to create gorgeous teals, oranges and violets.

But then comes the time to make these movies work on TV. In order to do that, that team essentially “dumbs down” the image, removing dynamic range and limiting the color. They get it to look the way they want, given the confines of the HDTV system, and that limited version is what you get on Blu-ray or a download.
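To put a number on how much wider the cinema palette is, the sketch below compares the HDTV (Rec. 709) and Digital Cinema P3 gamuts by the area of their triangles on the CIE 1931 chromaticity diagram, using the standard published primary coordinates. Triangle area is a crude proxy for "how many colors," but it makes the gap concrete:

```python
# Compare the Rec. 709 (HDTV) and DCI-P3 (digital cinema) color gamuts by the
# area of their triangles on the CIE 1931 xy chromaticity diagram.
# Bigger triangle = more reproducible colors (a rough but useful proxy).

def triangle_area(red, green, blue):
    (x1, y1), (x2, y2), (x3, y3) = red, green, blue
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

# Standard (x, y) chromaticity coordinates of the red, green and blue primaries.
rec709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
dci_p3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

area_709 = triangle_area(*rec709)
area_p3 = triangle_area(*dci_p3)
print(f"Rec. 709 gamut area: {area_709:.4f}")
print(f"DCI-P3 gamut area:   {area_p3:.4f}")
print(f"P3 covers about {area_p3 / area_709:.2f}x the xy area of Rec. 709")
```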

If your TV is set to the Movie or Cinema mode, this is approximately what you’ll get at home. If you’re in the Vivid or Dynamic mode, the TV will then exaggerate the colors as it sees fit. It’s creating something that isn’t there, because at the mastering stage, the director and her team had to take that all out. Is the “Vivid” version close to what they saw, or what was in the theater? Doubtful, and there’s no way to know since it’s your TV’s creation.

Thanks to the additional storage and transmission capacities of 4K BD and streaming video from Amazon, Netflix and others, additional data, called metadata, can be added to the signal. It tells HDR/WCG TVs exactly how the image should look: exactly what deeper colors to show, and exactly how bright a given highlight, reflection, star, sun, explosion or whatever should be. It can even adjust picture settings or put the TV in a certain picture mode automatically. This is a huge advancement in how we're able to see images on TVs.
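To make "metadata" a little more concrete, here's a simplified sketch of the kind of static values such a signal can describe to the TV, loosely modeled on the mastering-display and content light-level information used in HDR delivery. The field names and numbers are illustrative examples, not taken from any spec:

```python
# Simplified, illustrative sketch of static HDR metadata a source might describe
# to the TV (loosely modeled on mastering-display color volume and content
# light-level data). Field names and values are examples, not an exact spec.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class HdrStaticMetadata:
    red_primary: Tuple[float, float]      # (x, y) chromaticity of the mastering display's red
    green_primary: Tuple[float, float]
    blue_primary: Tuple[float, float]
    white_point: Tuple[float, float]
    max_mastering_luminance_nits: float   # peak brightness the content was graded against
    min_mastering_luminance_nits: float   # black level of the mastering display
    max_content_light_level_nits: int     # brightest single pixel in the program
    max_frame_average_light_nits: int     # brightest frame-average level in the program

# Hypothetical values for a movie graded on a 1,000-nit, P3-capable display:
example = HdrStaticMetadata(
    red_primary=(0.680, 0.320),
    green_primary=(0.265, 0.690),
    blue_primary=(0.150, 0.060),
    white_point=(0.3127, 0.3290),
    max_mastering_luminance_nits=1000.0,
    min_mastering_luminance_nits=0.005,
    max_content_light_level_nits=950,
    max_frame_average_light_nits=400,
)
print(example)
```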

Technicolor’s Intelligent Tone Mapping is a tool for content creators to more easily (as in, more affordably) create HDR content. I’ve seen it in action, and the results are very promising. This is a good thing, as it means creating HDR versions of movies and shows isn’t labor-intensive. If it took tons of time, and time equals money, we’d never get any HDR content. This is just one example of the process.

What about cables and connectors?

You won’t need new cables for HDR. Current High-Speed HDMI cables can carry HDR. The source device (a 4K Blu-ray player, say) and the TV must support HDMI 2.0a to transmit the metadata, however. If you have a receiver and want to use it for switching, it will need to be HDMI 2.0a as well.

The good news is that many HDMI 2.0 devices released this year are getting a firmware update to HDMI 2.0a. Unlike the transition from HDMI 1.4 to 2.0, there don't seem to be hardware limitations in going from 2.0 to 2.0a. If you bought an HDMI 2.0 device this year and plan to upgrade your gear for HDR, it's best to check with the manufacturer whether that device will get the update.

Bottom line

Most experts I’ve spoken to, on both the content side and the TV side, are excited about HDR and WCG. 4K by itself never got anyone in those camps that excited. The common refrain was “More pixels are cool, but better pixels would be amazing.”

Though breathlessly claimed as the next-generation TV evolution, 4K was anything but. Now, with HDR and WCG, we’re looking at the promised evolution, and it should be a brighter and more colorful one.

Got a question for Geoff? First, check out all the other articles he’s written on topics such as why all HDMI cables are the same, LED LCD vs. OLED vs. Plasma, why 4K TVs aren’t worth it and more. Still have a question? Send him an email! He won’t tell you what TV to buy, but he might use your letter in a future article. You can also send him a message on Twitter @TechWriterGeoff or Google+.

