High dynamic range, known as HDR, once considered merely a creative tool for high-tech photographers, is wending its way into mainstream media. No company has placed more bets on this technology than Dolby Laboratories, which is on course to redefine its brand as a powerhouse video technology provider. And nowhere were these two facts more evident than at the NAB convention this year, beginning with the kickoff session of the SMPTE Technology Summit for Cinema (TSC) weekend.
There is more to HDR than simply making an image brighter. If one simply pumps more light through a display, everything becomes brighter, highlights and shadows alike. For HDR to work correctly, highlights must be brighter, but blacks must be darker. View a black slide on a screen with the room lights off, and one sees a grayish screen – not black. Turn up the screen brightness, and the gray screen becomes a lighter shade of gray – anything but black. Increase only the screen brightness, and the image simply looks washed out. The challenge for effective HDR displays is to produce deeper blacks and brighter highlights at the same time. This requires more than a brighter screen: it requires deep blacks and very high contrast.
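The point can be captured with a bit of arithmetic on contrast ratios. The luminance figures below are illustrative assumptions for a generic display, not measurements of any particular product:

```python
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """Sequential contrast: peak white luminance over deepest black."""
    return white_nits / black_nits

# A typical SDR display: 100-nit white, 0.1-nit black -> 1,000:1
sdr = contrast_ratio(100, 0.1)

# Doubling the backlight alone lifts the blacks too: 200-nit white,
# 0.2-nit black -> still 1,000:1, just a brighter wash
brighter = contrast_ratio(200, 0.2)

# True HDR: brighter highlights AND deeper blacks, e.g. a 4000-nit
# white with a 0.005-nit black -> 800,000:1
hdr = contrast_ratio(4000, 0.005)

print(f"SDR {sdr:,.0f}:1, brighter SDR {brighter:,.0f}:1, HDR {hdr:,.0f}:1")
```

Note that the "brighter" display gains nothing in contrast: both ends of the range scale together, which is exactly the washed-out effect described above.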
According to company insiders, HDR has been an area of intense investment for Dolby for the past 10 years. Dolby took the wraps off its 4000 nit EDR (Dolby’s term for HDR) monitor last December, capable not only of 10x or greater light levels than an ordinary monitor, but also of very deep blacks. Much has been written about this monitor online for those interested in learning more. While not a catalog item, Dolby has been busy placing samples of the monitor with studios and researchers to encourage exploration of HDR imagery. This clever move on Dolby’s part has spurred a great deal of research and interest, leading to three presentations on HDR at the TSC event: one from Dolby, the others from researchers in Europe, one of which studied the light levels required to achieve audience approval. To summarize, there was no contention as to whether viewers prefer HDR images over non-HDR: HDR has proven that it can be a very compelling format. The reports focused on the level of brightness needed to achieve strong differentiation with audiences, and on the need for a wide color gamut with HDR.
It appears that maximum benefits are achieved in large screen environments, i.e., cinema, at 3000 nits of luminance. 3000 nits is certainly lower than 4000 nits, but far higher than the standardized 14 ft-L of cinema. Converting units, 14 ft-L equals 48 nits, meaning that 3000 nits requires over 62 times the light currently employed in cinemas. Research also reveals that blacks in cinema need to be deeper than in home entertainment, which one would expect given that home entertainment ambient light levels are higher.
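The unit conversion behind those numbers is simple enough to sketch (1 ft-L ≈ 3.426 nits, i.e., cd/m²):

```python
# Conversion factor: 1 foot-lambert = 1/pi cd/ft^2, approx. 3.426 cd/m^2 (nits)
FTL_TO_NITS = 3.426

def ftl_to_nits(ftl: float) -> float:
    """Convert foot-lamberts to nits (cd/m^2)."""
    return ftl * FTL_TO_NITS

cinema_white = ftl_to_nits(14)   # the standardized 14 ft-L cinema white
target_hdr = 3000                # nits, the level cited in the TSC research

print(f"14 ft-L = {cinema_white:.0f} nits")
print(f"3000 nits = {target_hdr / cinema_white:.1f}x current cinema luminance")
```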
Seeking results based on achievable numbers, but not reported at TSC, Dolby has privately conducted tests of projected large screen images in the 100–200 nits range. These aren’t ordinary digital cinema projectors, but ones modified for very high contrast using Dolby’s intellectual property. Reports on effectiveness vary. The concern heard among distributors is that while HDR pictures projected at limited (but higher) luminance levels unquestionably look better, they don’t look as good as what’s possible on home entertainment systems. Which home entertainment systems, you might ask? The answer: those set to enter the market at the end of this year under the Dolby Vision™ license.
Assuming HDR receives an enthusiastic reception in high-end home entertainment, it follows that HDR will find its way into the cinema. This has led to talk about how to distribute HDR content to cinemas. (The subject was even raised in a recent ISDCF meeting – it’s becoming a popular topic these days in Hollywood hallways.) Dolby suggests that the 12-bit picture used today in digital cinema distributions should be enough. But one must assume that this would require the use of a Dolby patent: Dolby was granted no fewer than four patents for HDR compression in 2013. Notably, one Dolby patent discloses that the entire dynamic range of human vision can be captured in 16 bits using non-patentable logarithmic encoding.
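A back-of-envelope check makes the 16-bit claim plausible. One can estimate how many logarithmically spaced code values are needed to span human vision without visible banding; the visual range and just-noticeable step below are my own rough assumptions, not figures from any Dolby patent:

```python
import math

# Assumptions (illustrative): human vision spans roughly 1e-6 to 1e8 nits
# (14 orders of magnitude), and adjacent codes must differ by less than
# about a 1% Weber fraction for the steps to remain invisible.
L_MIN, L_MAX = 1e-6, 1e8   # nits, assumed limits of human vision
WEBER = 0.01               # assumed just-noticeable relative luminance step

# With log spacing, each code multiplies luminance by (1 + WEBER)
codes_needed = math.log(L_MAX / L_MIN) / math.log(1 + WEBER)
bits_needed = math.ceil(math.log2(codes_needed))

print(f"{codes_needed:,.0f} codes -> {bits_needed} bits")
```

Under these assumptions only a few thousand code values are needed, so 16 bits carries the full range with room to spare – consistent with both the 16-bit disclosure and Dolby's suggestion that 12 bits may suffice in practice.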
If HDR content cannot be played on normal projection systems, then there is no need to retain “normal” media blocks in HDR systems. Certainly, the additional cost of a more capable media block will be insignificant when compared to the >$500K capital expenditure needed for an HDR projector. And a more capable media block will be needed. The Academy of Motion Picture Arts and Sciences is at a very early stage of studying the capture and distribution of HDR content. The first reaction heard is that the 250 Mb/s minimum bit rate in the DCI specification is woefully inadequate, as are the 500 Mb/s media blocks now available on the market. If forced to change the media block, then why stop there? HDR offers an opportunity to clean up the digital cinema package, applying everything learned over the past 10 years in a v2.0 version of SMPTE DCP. Or, cynically, we could do nothing and simply revise the Interop spec with a 16-bit picture, muddling our way forward as the industry has been doing for the past 10 years. Dolby is on a remarkable if not stunning track with its efforts in HDR, but it will take more vision than Dolby Vision to do the right thing in digital cinema.
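As a footnote, the bit-rate concern raised above is easy to quantify. The 4K container dimensions are standard; the frame rate and the comparison are illustrative assumptions on my part:

```python
# Rough arithmetic on why HDR distribution strains today's media blocks.
WIDTH, HEIGHT, COMPONENTS = 4096, 2160, 3   # 4K DCI container, X'Y'Z'
FPS = 24                                     # assumed 2D frame rate

def raw_rate_mbps(bits_per_component: int) -> float:
    """Uncompressed picture rate in Mb/s."""
    return WIDTH * HEIGHT * COMPONENTS * bits_per_component * FPS / 1e6

raw_12 = raw_rate_mbps(12)   # today's 12-bit picture
raw_16 = raw_rate_mbps(16)   # a hypothetical 16-bit HDR picture

# At 250 Mb/s, a 12-bit 4K/24 picture already implies roughly 30:1
# compression; a 16-bit picture adds a third more raw data on top.
print(f"12-bit raw: {raw_12:,.0f} Mb/s ({raw_12 / 250:.0f}:1 at 250 Mb/s)")
print(f"16-bit raw: {raw_16:,.0f} Mb/s")
```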