By Yoel Zanger
Does a higher resolution guarantee the best image quality, or does better contrast and brightness? And can today’s limited bandwidth handle all that data?
A controversy is brewing between 4K and HDR. For the past few years, 4K has been the poster child for visual innovation in TVs. Now, High Dynamic Range (HDR) is quickly rising in popularity as a buzzword among industry experts. The two types of quality enhancement, both associated with the emerging UHD standard, have recently been used as marketing trump cards by industry players: Netflix currently offers more 4K content, while Amazon started by offering HDR content.
It is hard to say which will win the marketing buzz war. Probably both will eventually become part of the UHD standard, together with other UHD features such as WCG (Wide Color Gamut) and HFR (Higher Frame Rates). Standardization is complicated by the fact that OEMs aren’t completely on board yet with 4K, and current TVs are mostly not HDR-ready.
The two “non-standards” have their respective problems and benefits for visual quality. While TV OEMs and streaming video producers lean on buzzwords to market products, many consumers are in the dark about what it really means for them.
So, let’s clear up some of the confusion and suggest an outcome for the 4K and HDR wars that could be a win for everyone—consumers, content providers, and consumer electronic device OEMs alike.
Quality Matters with 4K and HDR
4K refers to the horizontal pixel count, and theoretically improves resolution by increasing the number of dots on a screen for a higher quality image.
The usual assumption is that higher resolution is better, and TV manufacturers are making large 4K screens affordable. But does a large TV make sense for the limited space in a tiny San Francisco or New York City apartment? For the human eye to fully appreciate the difference in pixel counts, you have to be within a certain distance of the screen. Most small apartments offer only 6 to 8 feet between the couch and the TV. At that distance, a 65-inch screen would be far too large for the space; you would have to sit over 9 feet away for the optimal viewing experience.
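The viewing-distance math above can be sketched out with a standard rule of thumb: a 20/20 eye resolves detail down to about one arcminute, so a pixel stops being visible once it subtends less than that angle. A minimal sketch, assuming a 16:9 panel and that acuity figure (the function name and parameters are illustrative, not from the article):

```python
import math

def max_resolving_distance_ft(diagonal_in, horizontal_px, aspect=(16, 9)):
    """Farthest distance (in feet) at which an eye with ~1 arcminute
    of acuity can still distinguish adjacent pixels on the screen."""
    w, h = aspect
    # Screen width from the diagonal and aspect ratio.
    width_in = diagonal_in * w / math.hypot(w, h)
    pixel_pitch_in = width_in / horizontal_px
    # A pixel remains resolvable while it subtends >= 1 arcminute.
    distance_in = pixel_pitch_in / math.tan(math.radians(1 / 60))
    return distance_in / 12

print(round(max_resolving_distance_ft(65, 1920), 1))  # 1080p on a 65": ~8.5 ft
print(round(max_resolving_distance_ft(65, 3840), 1))  # 4K on a 65":    ~4.2 ft
```

Under these assumptions, a viewer more than about 8.5 feet from a 65-inch screen can no longer resolve individual 1080p pixels, which is why the extra pixels of 4K pay off only at closer seating distances.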
The other issue is that more pixels boost the bandwidth requirements for streaming video. There are technologies for TV sets that improve streaming content capabilities over existing bandwidth speeds, but they have yet to be adopted as a standard by all TV manufacturers and OTT services.
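The pixel-count jump behind that bandwidth pressure is easy to quantify: a UHD frame carries four times the pixels of a 1080p frame, so even with efficient compression there is simply more data to move. A quick back-of-the-envelope check:

```python
# Raw pixel counts per frame: UHD "4K" vs. full HD 1080p.
hd_pixels = 1920 * 1080    # 2,073,600 pixels
uhd_pixels = 3840 * 2160   # 8,294,400 pixels

# UHD carries 4x the pixels of 1080p -- the root of its bandwidth demands.
print(uhd_pixels / hd_pixels)  # 4.0
```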
Recently, Netflix backed HDR as more important than 4K for higher-quality content, despite Amazon having beaten it to the punch by pitching HDR first. HDR emphasizes strong contrast between light and dark parts of an image, producing a more eye-popping and clear picture without the inconsistency caused by viewer distance from the screen. Essentially, if you can take a standard image, make the reds, greens, and blues brighter or darker, and then emphasize shadows or highlights, you can make an image look less flat.
Many photographers use this technique to improve photo contrast and color, with results that can almost look 3D. Pixel luminosity is measured in nits, with one nit roughly equivalent to the brightness of a single candle spread over one square meter. The majority of today’s TVs peak at around 100 nits, while HDR-capable sets top out at about 1,000 nits. This means most TVs aren’t ready to handle HDR content.
HDR and more nits put a strain on internet bandwidth as well, although not nearly as much as 4K. Thus, if you want to see the brightest white clouds around Lando Calrissian’s Cloud City from the Star Wars trilogy, or the depth of contrast in a high-speed action movie, you’re going to need more bandwidth.
While HDR could be used on any TV that is bright enough, creating an HDR world is difficult because it requires TV manufacturers and content producers to be on the same page.
Another factor is what’s in it for the technology companies leading UHD development. Today, vendors such as Dolby, Technicolor, and Philips have their own competing intellectual property for HDR and ways to improve image quality and streaming speeds. Codec companies like Dolby and Technicolor will continue to push HDR because there is no real intellectual property to be had in 4K (it’s hard to patent adding more pixels), whereas HDR technology requires licensing. Of course, having both 4K and HDR content would be the goal.