DCI posted two new documents toward a cinema HDR specification in November, titled “DRAFT Direct View Display D-Cinema Addendum” and “DRAFT High Dynamic Range D-Cinema Addendum”. They represent a major improvement over DCI’s previous draft, which I wrote about last September in “DCI Has Lost Its Way.” In contrast, the new documents are well-written, detailed, explanatory, and responsive. But questions remain and cost matters, leaving plenty of room for improvement.
The new documents provide useful new information, including a darker environment for reference viewing than called for in SMPTE RP431-2, and the use of the “PQ” (Perceptual Quantization) method for encoding 12-bit color as defined in SMPTE ST2084. Details of an HDR DCP (digital cinema package, the distribution format) are also included. Importantly, the minimum display black level was raised from .001 nits to .005 nits, in line with existing HDR-capable direct view LED cinema displays. (A nit, or candela per square meter, is a unit of luminance of visible light.)
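For readers curious how PQ maps light to code values, the transfer function is fully specified in ST 2084 and simple to sketch. Below is an illustrative Python version of the inverse EOTF (absolute luminance in, normalized code value out) using the published constants; the 12-bit quantization helper is my own framing for illustration, not part of the standard.

```python
# Sketch of the SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance
# in nits -> normalized code value in [0, 1]. Constants are the
# published ST 2084 values.
def pq_encode(nits: float) -> float:
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875
    y = nits / 10000.0       # PQ is defined up to 10,000 nits
    ym = y ** m1
    return ((c1 + c2 * ym) / (1 + c3 * ym)) ** m2

def pq_code_12bit(nits: float) -> int:
    # Illustrative quantization to a 12-bit integer code value.
    return round(pq_encode(nits) * 4095)
```

Because PQ spans luminance up to 10,000 nits, a display range of .005 to 500 nits occupies only a portion of the available 12-bit codes.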
What’s not clear is DCI’s goal with HDR and direct view LED. Are these documents a statement that Hollywood studios are ready to promote cinema HDR? Or are they a protective measure, intended to get in front of others who might otherwise promote it? Since studios have vowed not to pay for equipment, was any effort made to involve those who will pay? Is image quality a concern? What is clear is that DCI is inventing a new format and placing new requirements on workflows and exhibitors, the details of which need more polish. Several areas stand out, which I discuss below.
There is significant disparity between direct view LED displays and projectors in the price paid for higher resolution. While a 4K projector may cost 20% more than a 2K projector, a 4K direct view LED display will cost approximately 400% that of a 2K version. In direct view LED, cost is driven by the number of pixels, not the size of the display. Given the same resolution, a large display will cost about as much as a small one, with the difference largely driven by the cost of the support frames and differently sized panel enclosures.
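The 400% figure follows directly from pixel count, as this quick arithmetic sketch using the standard DCI container resolutions shows:

```python
# DCI container resolutions: 4K carries exactly 4x the pixels of 2K.
# In direct view LED, where each pixel is a physical emitter, cost
# scales with pixel count rather than screen size.
pixels_2k = 2048 * 1080
pixels_4k = 4096 * 2160
print(pixels_4k / pixels_2k)  # 4.0
```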
How virtuous of DCI to require exhibitors to buy 4K displays at 400% the cost of 2K displays, while studios, wink wink, only intend to distribute 2K content. This can be seen in DCI’s HDR frame rate table, which calls for frame rates of 24, 48, 60, 96, and 120 for 2K HDR, while listing only 24 fps for 4K HDR. (fps = frames per second.) In the next section, I explain why 24 fps is of little value for 4K HDR, making DCI’s 4K frame rate column little more than a pacifier.
DCI’s preference for 4K could be driven by concern over the “screen door effect,” caused by the visible space between rows and columns of pixels. This effect is mitigated in projectors by defocusing the image a fraction of a pixel such that no space is visible between rows and columns. While there is nothing to defocus with direct view LED displays, a higher density of pixels can help mitigate the effect. It shouldn’t be overlooked, however, that good design can also mitigate the effect. Sony’s Crystal LED technology is one such example, where each pixel shines through a micro-lens that fills the gap between pixels. Granted, the lens doesn’t come for free, but it is unlikely to increase the cost of the display by 400%. If the screen door effect is the concern, then DCI should address it by prescribing the need to mitigate it, instead of attempting to solve it in the most expensive manner possible.
The dynamic range prescribed by DCI ranges from .005 nits (first step of black) to 500 nits (peak white). Peak white is a subjective number in HDR. Peak white represents the maximum luminance for highlights, which by their nature can be significantly brighter than the midtones of the image. The numbers for peak white called for by other organizations vary widely. VESA, in its DisplayHDR™ criteria, classifies four different peak white levels, from 400 nits to 1000 nits. ITU Rec. 2100 calls for a reference viewing environment delivering peak luminance of 1000 nits or greater. The UHD Alliance recommends “more than 1000 nits.” Dolby, a leader in HDR development, offers its Pulsar reference monitor capable of 4000 nits peak. It may be surprising to learn that DCI’s prescription for color encoding using PQ (Perceptual Quantization), as originally defined by Dolby and standardized by SMPTE, is defined up to 10,000 nits.
The point is that there is no hard and fast definition for peak luminance in HDR. Blacks, however, have a firm footing at the other end of the scale. It is detail in blacks that most creatives would argue for, and DCI’s first step of black of .005 nits caters to that need. So it is the choice of peak white that determines the dynamic range of the display, and it is dynamic range that can impact the availability and cost of silicon in the display.
The electrical method used to drive LEDs in a display is called pulse-width modulation (PWM). The number of bits required of the PWM driver is determined by the dynamic range of the display. A dynamic range of .005 nits to 500 nits is a contrast ratio of 100,000:1, requiring up to 17 bits-per-primary color to drive each pixel. In practice, this calls for 18-bit PWM drivers. Notably, if peak white is reduced to 300 nits, the contrast ratio drops to 60,000:1, and the PWM driver requirement reduces to 16 bits. Most direct view LED display manufacturers will say that 16-bit PWM drivers are readily available, while 18-bit drivers are not. Custom silicon will, at best, increase the cost of the display; more likely, it will limit competition.
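The bit-depth arithmetic above can be sketched as follows. This is a simplified model that treats PWM steps as linear and takes the first step above black as the smallest level the driver must resolve:

```python
import math

def pwm_bits(peak_nits: float, first_step_nits: float) -> int:
    # Bits needed to resolve the contrast ratio with linear PWM steps:
    # ceil(log2(peak / first step above black)).
    return math.ceil(math.log2(peak_nits / first_step_nits))

print(pwm_bits(500, 0.005))  # 17 bits for a 100,000:1 range
print(pwm_bits(300, 0.005))  # 16 bits for a 60,000:1 range
```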
Coincidentally, Samsung recommends 300 nits for peak white in distribution. It’s a figure born of experience with cinema-sized LED displays. For Samsung, it’s not a display limitation, as Samsung cinema displays are capable of 500 nits, and Samsung engineers verify the use of 18 bits per color when driving pixels. Samsung, of course, has an engineering team to design its own semiconductors. Most display companies don’t have that luxury. But it’s worth listening to Samsung’s advice: 300 nits as a minimum figure for peak white may be just what the industry needs.
The one area where DCI’s documents are conspicuously silent is compressed image bit rate. One can think of compressed bit rate as the inverse of compression ratio. The higher the compression ratio, the lower the image quality. Inversely, the higher the compressed bit rate, the higher the image quality. Since 2005, DCI has required a maximum compressed bit rate of 250 Mb/s. It may have been a challenging number to achieve in 2005, but today, even Netflix requires studios to deliver compressed bit rates at least three times that figure. Ask anyone engaged in DCP picture compression what they would change with HDR, and the answer will be an increase in compressed bit rate. Presumably, DCI holds cinema back because higher bit rates translate to higher distribution costs. Higher bit rates produce larger files, which require larger disk drives, increased load time, and more time to distribute over satellite (the major means of digital fulfillment in the US).
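To put the distribution-cost concern in perspective, here is a back-of-envelope sketch of picture file size at a constant bit rate. The 120-minute runtime and the tripled bit rate are my own illustrative assumptions, and audio and packaging overhead are ignored:

```python
def picture_size_gb(bitrate_mbps: float, runtime_min: float) -> float:
    # Constant-bit-rate picture essence size: bits = rate * seconds,
    # then convert bits to gigabytes.
    return bitrate_mbps * 1e6 * runtime_min * 60 / 8 / 1e9

print(picture_size_gb(250, 120))  # 225.0 GB at DCI's current cap
print(picture_size_gb(750, 120))  # 675.0 GB at triple the cap
```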
The bit rate problem is exacerbated by HDR’s 24 fps problem. It is well-known that motion in high dynamic range will reveal frame rate judder at 24 fps. I would be remiss not to point out alternative solutions for HDR at 24 fps that perform a moving average on select objects, such as RealD’s TrueMotion™ technology, artistically blurring the object in select frames to avoid the perception of judder. But with HDR, higher frame rates are inevitable, as evidenced by DCI’s introduction of 2K 60 fps and 120 fps in its HDR document. It is wrong for DCI to introduce higher frame rates without increasing the bit rate.
An increase in compressed bit rate requires a change in the media block. There is no time like now to take this step. It would complete the definition of DCI’s new cinema HDR format: DCI’s new HDR display specification, coupled with DCI’s new HDR DCP specification, and complemented by a new HDR media block specification.
There is an undercurrent affecting much of what is taking place with cinema HDR. Studios are in a bind with new cinema formats. The impact to the bottom line when introducing a new format is unlikely to stop at the production level. Exhibitors are burdened with the cost of upgrades or new equipment to properly show the format, and are unlikely to let the studio off the hook. The first line of attack will be the split of box office. This is the very problem that plagued the digital transition in its early days. As studios would say then, and it surely is no different now, once the box office split is renegotiated, it is unlikely to revert. The problem was solved in the digital transition with the introduction of the virtual print subsidy, which had no impact on box office splits.
That was then, and this is now. There will be no more subsidies, and studios have no intent to renegotiate box office splits. HDR will only take off in cinema when exhibitors are motivated to make it happen. For that to occur, HDR equipment costs need to be much lower, or proof must exist that HDR is additive to box office, or both.
The cost of equipment is the factor that DCI has not taken into account. Competition will be needed to drive costs down. The big ticket item, of course, is to eliminate DCI’s effective requirement for 4K displays, at 400% the cost of a 2K display. If the screen door effect is an issue, then state the problem and call for better engineering to make the display acceptable. More competition will also be encouraged by reducing the minimum peak white to 300 nits. And if exhibitors are to spend their money on new equipment, the equipment should deliver high quality pictures by employing a higher compressed bit rate. The time to get these specs right is now, because there is also a cost to delaying the rollout of HDR cinema while consumers experience better images at home. Here’s hoping that DCI sees the light.
Also published in Digital Cinema Report as DCI Must Do More on HDR.