The Challenges and Benefits of Working With HDR
HDR is an exciting development for the AV industry. It carries brightness and color information across a much wider range than standard dynamic range (SDR). Even when HDR content is shown on lower-resolution displays, viewers tend to pick the HDR version as the better picture. But while the pictures speak for themselves, much of the information in circulation is incomplete and confusing.
In video, “high resolution” is frequently used as shorthand for “better.” But resolution is not the most important factor for the human eye when judging image quality. Color and contrast are what make a picture really stand out, what make it seem more “real.” HDR delivers this by increasing both the contrast range (dynamic range) of every pixel and the color range (gamut) each pixel can reproduce. (HDR refers both to the technology used to capture moving images and to the technology used to display them.)
HDR is already a standard feature of cinema and TV cameras; the industry now has to develop the signal transfer infrastructure to handle HDR transmissions.
One of the primary issues systems integrators need to consider is bandwidth. Handling HDR data requires more than 11 Gbps (up to 18 Gbps), so only AV equipment compatible with the HDMI 2.0 specification has enough headroom. Even at those rates, cable runs are limited to just a few feet. And only equipment meeting the more recent HDMI 2.0a or 2.0b specifications supports HDR metadata, making the correct choice of source, display, and every component in between absolutely critical. AV equipment will also need to support HDCP 2.2, the content-protection scheme used on most current HDR material.
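To see why HDR pushes right up against the HDMI 2.0 ceiling, the bandwidth arithmetic can be sketched out. The snippet below uses the standard CTA-861 total raster (4400 × 2250 pixels including blanking) for 4K at 60 Hz and HDMI's 8b/10b TMDS encoding overhead; it is an illustration of the math, not a cable-qualification tool.

```python
# Rough sketch of the HDMI 2.0 bandwidth math for a 4K60 signal.
# Timing figures are the standard CTA-861 values for 3840x2160 @ 60 Hz.

TMDS_OVERHEAD = 10 / 8  # HDMI's 8b/10b TMDS encoding adds 25% on the wire

def hdmi_data_rate_gbps(h_total, v_total, fps, bits_per_channel, channels=3):
    """Total TMDS data rate in Gbps, including blanking and encoding overhead."""
    pixel_clock = h_total * v_total * fps  # pixels transmitted per second
    return pixel_clock * channels * bits_per_channel * TMDS_OVERHEAD / 1e9

# 4K60, 8-bit 4:4:4 (total raster 4400x2250 including blanking intervals)
sdr = hdmi_data_rate_gbps(4400, 2250, 60, 8)
# 4K60, 10-bit 4:4:4 -- the minimum color depth for HDR10
hdr = hdmi_data_rate_gbps(4400, 2250, 60, 10)

print(f"4K60  8-bit 4:4:4: {sdr:.2f} Gbps")  # ~17.82 Gbps, at the 18 Gbps limit
print(f"4K60 10-bit 4:4:4: {hdr:.2f} Gbps")  # ~22.28 Gbps, over the limit
```

The numbers show why 4K60 HDR over HDMI 2.0 typically falls back to 4:2:0 or 4:2:2 chroma subsampling: full 4:4:4 at 10 bits per channel would exceed the 18 Gbps maximum.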
HDR ‘Format Wars’
Another challenge facing integrators is that HDR is not a universal format; there are currently four standards. The two best established are the proprietary Dolby Vision and the more open HDR10. Dolby Vision supports peak brightness up to 10,000 nits (with a current mastering target of 4,000 nits) and 12-bit color depth; HDR10 supports up to 4,000 nits (with a current target of 1,000 nits) and 10-bit color depth. In short, Dolby Vision currently offers slightly, and potentially considerably, higher video quality.
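The bit-depth difference between the two formats is easy to quantify: each extra bit per channel doubles the number of gradations, and the total color count is the cube of the per-channel levels. A quick sketch of that arithmetic:

```python
# Levels per color channel and total representable colors at a given bit depth.

def color_stats(bits_per_channel):
    levels = 2 ** bits_per_channel  # gradations per R, G, or B channel
    total = levels ** 3             # all R x G x B combinations
    return levels, total

hdr10_levels, hdr10_colors = color_stats(10)  # HDR10: 10-bit
dv_levels, dv_colors = color_stats(12)        # Dolby Vision: 12-bit

print(f"HDR10 (10-bit):        {hdr10_levels} levels/channel, "
      f"{hdr10_colors:,} colors")   # 1024 levels, ~1.07 billion colors
print(f"Dolby Vision (12-bit): {dv_levels} levels/channel, "
      f"{dv_colors:,} colors")      # 4096 levels, ~68.7 billion colors
```

Those extra gradations matter most in smooth ramps of brightness, such as skies, where fewer levels can show visible banding.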
Alongside Dolby Vision and HDR10 are the newer Hybrid Log-Gamma (HLG), developed mainly as an HDR format for live video, and Advanced HDR, aimed at broadcast media and at upscaling SDR video.
Currently, quality HDR content is still fairly thin on the ground, making the increased cost of hardware and cabling harder to justify. HDR media players, Blu-ray players, and gaming consoles are helping to bridge the gap, but it will take quality streaming HDR content to really get purchasers embracing the medium.
The biggest drawback to HDR really is that, once you see it, you won’t want to go back.