Although Ultra HD TV has not yet quite supplanted HDTV as the norm, another buzzword has taken over – HDR, short for High Dynamic Range. There's a lot of confusion in the media and the market about HDR, first because the term is used for two quite different technologies, and second because there is by now an almost bewildering number of formats, along with allegations of a format war. So what's going on in this area in terms of technologies and standards? Before looking into that, let's take a step back and look at what HDR video is and what its benefit is.
Since 2013, Ultra HD or UHD has emerged as a major new consumer TV development. UHD, often also referred to as '4K', has a resolution of 3,840 x 2,160 – twice the horizontal and twice the vertical resolution of 1080p HDTV, so four times the pixels. UHD has been pushed above all by TV manufacturers looking for new ways to entice consumers to buy new TV sets. To appreciate the increased resolution of UHD, one needs a larger screen or a shorter viewing distance, but it suits the trend towards ever larger TV screens.
While sales of UHD TV sets are taking off briskly, the rest of the value chain isn't following quite as fast. Many involved feel the increased spatial resolution alone is not enough to justify the required investments in production equipment. Several other technologies promising further enhanced video are around the corner, however:
High Dynamic Range or HDR
Deep Color Resolution: 10 or 12 bits per subpixel
Wide Color Gamut or WCG
High Frame Rate or HFR: 100 or 120 frames per second (fps)
As for audio, a transition from conventional (matrixed or discrete) surround sound to object-based audio is envisaged for the next generation of TV.
Of these technologies, the first three are the most relevant right now. They are also interrelated.
So what does HDR do? Although it’s using rather different techniques, HDR video is often likened to HDR photography as their aims are similar: to capture and reproduce scenes with a greater dynamic range than traditional technology can, in order to offer a more true-to-life experience. With HDR, more detail is visible in images that would otherwise look either overexposed, showing too little detail in bright areas, or underexposed, showing too little detail in dark areas. The exposure bracketing technique used in photography and referred to as “HDR” results in an image that’s still SDR, otherwise it would not be viewable on a standard phone/camera display. True HDR can only be viewed on an HDR display. Be warned that just about any comparison between SDR and HDR you’ll see online or in printed media is simulated (the SDR image is artificially ‘dumbed down’) because you can’t show the difference between SDR and HDR on an SDR display any more than you can show the difference between black & white and color TV on a black & white TV. To add to the confusion, some camera makers have started applying the photography technique to video.
HDR video is typically combined with a feature called Wide Color Gamut or WCG. Traditional HDTVs use a color space referred to as Rec.709, defined for the first generations of HDTVs, which used CRT displays. Before that, television used a color space called Rec.601. Current flat panel display technologies like LCD and OLED can produce a far wider range of colors and greater luminance, measured in 'nits'. A nit is a unit of luminance, equal to one candela per square meter (cd/m²). To accommodate this greater color gamut, the Rec.2020 color space was defined. No commercial display can fully cover this new color space but it provides room for growth. The current state of the art of color gamut for displays in the market is a color space called DCI-P3, which is smaller than Rec.2020 but substantially larger than Rec.709.
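As a rough, illustrative comparison of these gamuts, one can compute the area of the triangle each standard's primaries span in the CIE 1931 xy chromaticity diagram. The coordinates below are those published in the respective specifications; note that xy triangle area is only a crude proxy for perceived gamut size.

```python
# Compare color gamut sizes via the triangles the primaries span
# in the CIE 1931 xy chromaticity diagram (shoelace formula).
# Chromaticity coordinates as published in the respective specs.

def triangle_area(points):
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

primaries = {
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

ref = triangle_area(primaries["Rec.709"])
for name, pts in primaries.items():
    area = triangle_area(pts)
    print(f"{name}: {area / ref:.2f}x the Rec.709 area")
```

The output reflects the ordering described above: DCI-P3 sits between Rec.709 and Rec.2020, with Rec.2020 spanning nearly twice the xy area of Rec.709.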
To avoid color banding issues that could otherwise occur with this greater color gamut, HDR/WCG video typically uses a greater sampling resolution of 10 or 12 bits per subpixel (R, G and B) instead of the conventional 8 bits, so 30 or 36 bits per pixel rather than 24.
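The arithmetic is simple enough to sketch: stretching a wider luminance and color range over the same number of 8-bit code values makes each quantization step coarser, which is what shows up as banding.

```python
# Code values per subpixel at each bit depth. The wider the
# luminance/color range those steps have to cover, the coarser
# each step becomes – visible as banding at 8 bits with HDR/WCG.
for bits in (8, 10, 12):
    steps = 2 ** bits
    print(f"{bits}-bit: {steps:>5} steps per subpixel, "
          f"{steps ** 3:>14,} colors per pixel")
```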
Color/luminance volume: BT.2020 (10,000 nits) versus BT.709 (100 nits); Yxy - Image credit: Sony
The problem with HDR isn’t so much on the capture side nor on the rendering side – current professional digital cameras can handle a greater dynamic range and current displays can produce a greater contrast than the content chain in between can handle. It’s the standards for encoding, storage, transmission and everything else that needs to happen in between that are too constrained to support HDR.
So what is being done about this? A lot, in fact. Let’s look at the technologies first. A handful of organizations have proposed technologies for describing HDR signals for capture, storage, transmission and reproduction. They are Dolby, SMPTE, Samsung, Technicolor, Philips, and BBC together with NHK.
HDR10 is currently the most commonly accepted HDR video format. You may be surprised that the ‘PQ’ technology it’s based on was developed by Dolby. It’s an ‘open’ standard for which no royalties are payable.
HDR10 Media Profile
EOTF: SMPTE ST 2084
Color sub-sampling: 4:2:0 (for compressed video sources)
Bit depth: 10 bit
Color primaries: ITU-R BT.2020
Metadata: SMPTE ST 2086, MaxFall (Maximum Frame Average Light Level), MaxCLL (Maximum Content Light Level)
Referenced by:
1. Ultra HD Blu-ray spec (Blu-ray Disc Association)
2. HDR-compatible display spec (CTA; formerly CEA)
Dolby’s HDR technology is branded Dolby Vision. One of the key elements of Dolby Vision is the Perceptual Quantizer EOTF which has been standardized by SMPTE as ST 2084 (see box: SMPTE HDR Standards) and mandated by the Blu-ray Disc Association for the Ultra HD Blu-ray format. The SMPTE ST 2084 format can actually contain more picture information than TVs today can display but because the information is there, as manufacturers build better TVs the content has the potential to look better when the new, improved display technologies come to market. Dolby Vision and HDR10 use the same SMPTE 2084 standard making it easy for studios and content producers to master once and deliver to either HDR10 or, with the addition of dynamic metadata, Dolby Vision. The dynamic metadata is not an absolute necessity, but using it guarantees the best results when played back on a Dolby Vision-enabled TV. HDR10 uses static metadata which ensures it will still look good – far better than Standard Dynamic Range (SDR). Even using no metadata at all, SMPTE 2084 can work at an acceptable level just as other proposed EOTFs without metadata do.
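For the technically curious, the PQ curve standardized as ST 2084 is compact enough to sketch in a few lines. The constants below are those published in the standard; the EOTF maps a normalized code value to absolute luminance of up to 10,000 nits.

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF: maps a normalized code
# value e in [0, 1] to absolute display luminance in nits (cd/m²).
# Constants as published in ST 2084 / ITU-R BT.2100.

m1 = 2610 / 16384        # ≈ 0.1593
m2 = 2523 / 4096 * 128   # ≈ 78.8438
c1 = 3424 / 4096         # ≈ 0.8359
c2 = 2413 / 4096 * 32    # ≈ 18.8516
c3 = 2392 / 4096 * 32    # = 18.6875

def pq_eotf(e, peak=10000.0):
    """Display luminance in nits for normalized PQ code value e."""
    ep = e ** (1 / m2)
    y = (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)
    return peak * y

print(pq_eotf(0.0))  # → 0.0 nits
print(pq_eotf(1.0))  # → 10000.0 nits
```

Unlike gamma curves, which are relative to each display's peak brightness, PQ code values denote absolute luminance: 1.0 always means 10,000 nits, whether or not the display can actually produce it – which is why the format can carry more picture information than today's TVs can show.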
High-level overview of Dolby Vision dual-layer transmission for OTT VOD; other schematics apply for OTT live, broadcast, etc. Image credit: Dolby Labs Dolby Vision white paper
Dolby uses 12-bit color depth for cinematic Dolby Vision content to avoid any noticeable banding but the format is actually agnostic to different color depths and works with 10-bit video as well. In fact, Dolby recommends 10-bit color depth for broadcast. On Ultra HD Blu-ray, Dolby Vision uses 12-bit color.
Several studios have begun releasing movies on Ultra HD Blu-ray with Dolby Vision. Disney, for some time the last hold-out, has recently also begun releasing UHD Blu-rays. Sadly the first two movies come with HDR10 only but the company says it’s committed to Dolby Vision.
Dolby Vision is available on select movies on streaming services Vudu, Netflix and Amazon Video, with the right TV set and streaming device or app.
For more information about Dolby Vision, check this FAQ.
Samsung is one of the companies that has not adopted Dolby Vision. Instead it has developed its own method for adding dynamic metadata to HDR10, which it calls HDR10+. This format is standardized as SMPTE ST2094-40. Samsung has made it an open standard for which no royalty is required. In other words, other manufacturers and content providers are free to use it. Amazon Video was the first to support it. As of IFA 2017, Panasonic and 20th Century Fox have joined, and together with Samsung formed the HDR10+ Alliance.
HDR10+ is currently not one of the optional HDR formats recognized by Ultra HD Blu-ray but the companies are aiming to get it added to the disc standard. Samsung also has an SDR-to-HDR up-conversion technology that, very confusingly, it calls HDR+. A name like SHDR might have been better.
Technicolor and Philips
Technicolor has developed two HDR technologies. The first takes a 10-bit HDR video signal from a camera and delivers a video signal that is compatible with SDR as well as HDR displays. The extra information that is needed for the HDR rendering is encoded in such a way that it builds on top of the 8-bit SDR signal but SDR devices simply ignore the extra data.
Image credit: Technicolor
The second technology is called Intelligent Tone Management and offers a method to ‘upscale’ SDR material to HDR, using the extra dynamic range that current-day capture devices can provide but traditional encoding cannot handle, and providing enhanced color grading tools to colorists. While it remains to be seen how effective and acceptable the results are going to be, this technique has the potential to greatly expand the amount of available HDR content.
The commercial name for this feature is Advanced HDR, and LG’s 2017 UHD TV sets incorporate it.
At CES 2016, Technicolor and Philips announced they would merge their HDR technologies. The resulting standard is called Prime Single or SL-HDR1 (SL for Single-Layer).
Hybrid Log Gamma (HLG)
A single signal that delivers SDR to legacy TV sets (HD or UHD) and HDR to the new crop of TVs is also the objective of a technology that BBC's R&D department and Japan's public broadcaster NHK are developing together: Hybrid Log Gamma or HLG. HLG's premise is an attractive one: one video signal that renders SDR on legacy displays and HDR on displays that can handle it. HLG, BBC and NHK say, is compatible with existing 10-bit production workflows and can be distributed using a single HEVC Main 10 Profile bitstream.
Depending on whom you ask, HLG is either the best thing since sliced bread or a clever compromise that accommodates SDR as well as HDR displays but gives suboptimal results on both. The Hybrid Log Gamma name refers to the fact that the OETF is a hybrid, applying a conventional gamma curve to the low-light signals and a logarithmic curve to the high tones.
Hybrid Log Gamma and SDR OETFs; image credit: T. Borer and A. Cotton, BBC R&D
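That hybrid is straightforward to express in code. Below is a sketch of the HLG OETF using the reference constants from ITU-R BT.2100, where E is scene linear light normalized to [0, 1]:

```python
import math

# Sketch of the HLG OETF (ITU-R BT.2100): a square-root (gamma)
# segment for the lower part of the range and a logarithmic
# segment for the highlights. E is scene linear light in [0, 1].

a = 0.17883277
b = 1 - 4 * a                  # 0.28466892
c = 0.5 - a * math.log(4 * a)  # ≈ 0.55991073

def hlg_oetf(E):
    if E <= 1 / 12:
        return math.sqrt(3 * E)          # conventional gamma part
    return a * math.log(12 * E - b) + c  # logarithmic part

print(hlg_oetf(1 / 12))  # ≈ 0.5, the crossover between segments
print(hlg_oetf(1.0))     # ≈ 1.0 at peak scene light
```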
OETF: function that maps scene luminance to digital code value; used in HDR camera;
EOTF: function that maps digital code value to displayed luminance; used in HDR display;
OOTF: function that maps scene luminance to displayed luminance; a function of the OETF and EOTF in a chain. Because of the non-linear nature of both OETF and EOTF, the chain’s OOTF also has a non-linear character.
Image credit: T. Borer and A. Cotton, BBC R&D
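In code, the relationship between the three functions is simply composition. A toy example with a plain gamma OETF/EOTF pair (illustrative only – these are not the actual BT.2100 curves) shows why the end-to-end OOTF comes out non-linear:

```python
# Toy illustration of OETF/EOTF/OOTF composition. The power-law
# curves here are placeholders, not the actual BT.2100 transfer
# functions.

def oetf(scene_light):    # camera: scene light -> code value
    return scene_light ** (1 / 2.2)

def eotf(code_value):     # display: code value -> display light
    return code_value ** 2.4

def ootf(scene_light):    # end-to-end: scene light -> display light
    return eotf(oetf(scene_light))

# Because the exponents don't cancel (2.4 / 2.2 != 1), the
# resulting OOTF is itself non-linear:
print(ootf(0.5))  # != 0.5
```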
The EOTF for Mastering Reference Displays, conceived by Dolby and standardized by SMPTE as ST 2084, is 'display-referred'. With this approach, the OOTF is part of the OETF, requiring implicit or explicit metadata.
Hybrid Log Gamma (HLG), proposed by BBC and NHK, is a 'scene-referred' system which means the OOTF is part of the EOTF. HLG does not require mastering metadata so the signal is display-independent and can be displayed unprocessed on an SDR screen.
The reasoning is simple: bandwidth is scarce, especially for terrestrial broadcasting but also for satellite and even cable, so transmitting the signal twice in parallel, in SDR and HDR, is not an attractive option. In fact, most broadcasters are far more interested in adding HDR to 1080p HD channels than in launching UHD channels, for exactly the same reason. Adding HDR is estimated to consume at most 20% extra bandwidth, whereas a UHD channel gobbles up the bandwidth of four HD channels. It's probably no coincidence that HLG has been developed by two broadcast companies with a long history of investment in R&D. Note however that the claimed backwards compatibility of HLG with SDR displays only applies to displays working with the Rec.2020 color space, i.e. Wide Color Gamut. This excludes the very first UHD TV sets from around 2014, but very few of those were sold, as they were extremely expensive – upward of $20,000.
ARIB, the Japanese organization that's the equivalent of DVB in Europe and ATSC in North America, has standardized on HLG for UHD HDR broadcasts.
The BBC has held a small-scale trial with UHD HLG content made available via iPlayer but this trial, which used a fragment from BBC’s nature documentary Planet Earth II, was limited to Panasonic TV sets. Note that Ultra HD Blu-ray does not support HLG, so the documentary released on disc by BBC Worldwide uses HDR10 instead.
The DVB Project meanwhile has announced that UHD-I phase 2 will actually include a profile that adds HDR to 1080p HD video – a move advocated by Ericsson and supported by many broadcasters. It's not likely many CE manufacturers will start producing HDTVs with HDR, however. Such innovations typically end up only in the UHD TV category, where the growth is and where any innovation other than cost reduction takes place. Sony has said their 2017 HDTVs will offer HDR, but it appears the only source that will work with it is PlayStation 4 games.
This means consumers will need an HDR UHD TV to watch HD broadcasts with HDR. Owners of such TV sets will be confronted with a mixture of qualities – plain HD, HD with HDR, plain UHD and UHD with HDR (and WCG) – much in the same way HDTV owners may watch a mix of SD and HD television, only with more variations.
The SMPTE is one of the foremost standardization bodies active in developing official standards for the proposed HDR technologies. See box ‘SMPTE HDR standards’.
SMPTE HDR Standards
ST 2084:2014 - High Dynamic Range EOTF of Mastering Reference Displays
defines 'display referred' EOTF curve with absolute luminance values based on human visual model
called Perceptual Quantizer (PQ)
ST 2086:2014 - Mastering Display Color Volume Metadata supporting High Luminance and Wide Color Gamut images
specifies mastering display primaries, white point and min/max luminance
ST 2094-2:2017 - Content-dependent Metadata for Color Volume Transformation of High-Luminance and Wide Color Gamut images
specifies dynamic metadata used in the color volume transformation of source content mastered with HDR and/or WCG imagery, when such content is rendered for presentation on a display having a smaller color volume
ST2094-10: Dolby (Parametric Tone Mapping)
ST2094-20: Philips (Parameter-based Color Volume Reconstruction)
ST2094-30: Technicolor (Reference-based Color Volume Remapping)
ST2094-40: Samsung (Scene-based Color Volume Mapping)
The UHD Alliance mostly revolves around Hollywood movie studios and is focused on content creation and playback (guidelines for CE devices, branding and consumer experience). The UHDA has announced a set of norms for displays, content and distribution to deliver UHD with HDR, and an associated logo program. The norm is called 'Ultra HD Premium' (see box). Is it a standard? Arguably, yes. Does it put an end to any potential confusion over different HDR technologies? Not quite – while the norm guarantees a certain level of dynamic range it does not specify any particular HDR technology, so all options are still open.
UHD Alliance ‘Ultra HD Premium’ definition
Color Bit Depth
Devices, distribution and content master: minimum 10-bit signal depth
Color Palette (Wide Color Gamut)
Devices: signal input BT.2020 color representation; display reproduction more than 90% of DCI-P3 color space
Distribution and content master: BT.2020 color representation
High Dynamic Range
Devices: SMPTE ST 2084 EOTF, plus a combination of peak brightness and black level: either more than 1,000 nits peak brightness and less than 0.05 nits black level, or more than 540 nits peak brightness and less than 0.0005 nits black level
Content master: SMPTE ST 2084 EOTF; mastering displays recommended to exceed 1,000 nits peak brightness, with less than 0.03 nits black level and a minimum of DCI-P3 color space
Distribution: SMPTE ST 2084 EOTF
One other such body is the Blu-ray Disc Association (BDA). Although physical media have been losing some popularity with consumers lately, few people are blessed with a broadband connection fast enough to handle proper Ultra HD video streaming, with or without HDR. Netflix requires at least 15 Mbps sustained average bitrate for UHD watching but recommends at least 25 Mbps. The new Ultra HD Blu-ray standard meanwhile offers up to 128 Mbps peak bit rate. Of course one can compress Ultra HD signals further, but the resulting quality loss would defeat the entire purpose of Ultra High Definition.
Ultra HD Blu-ray may be somewhat late to market, with some SVoD streaming services having beaten it there, but the BDA deserves praise for not rushing the new standard to launch without HDR support. Had it done that, the format might very well have been declared dead on arrival. The complication, of course, was that there was no single agreed-upon standard for HDR yet. The BDA has settled on the HDR10 Media Profile (see box) as mandatory for players and discs, with Dolby Vision and Philips' HDR format optional for players as well as discs. Sometimes HDR10 is mistakenly referred to as 'BDA-HDR'.
The UHD Alliance has also started certifying Ultra HD Blu-ray players as ‘Ultra HD Premium’-compliant. They have not yet disclosed what the requirements for players are but it seems likely that any player that meets the disc standard’s specifications automatically qualifies; however actually obtaining the label surely involves taking a license from the UHDA and possibly paying a certain fee per unit.
While HDR standards for video streaming and especially for broadcast TV are still under discussion, the standards for Ultra HD Blu-ray have been settled. The format offers three options:
HDR10 is mandatory for every Ultra HD Blu-ray player to support, and any disc that uses HDR must at least offer HDR10. The other two formats are optional. Samsung, Panasonic and Fox have indicated they’re working to get HDR10+ added to the Ultra HD Blu-ray standard as an optional format.
Dolby Vision discs have just come to market, starting with Despicable Me 1 & 2, and their launch coincides with that of the first Dolby Vision players: Oppo’s UDP-203 and 205. Like Oppo, LG has issued a firmware update to add Dolby Vision to its UP970 UHD BD player, and Philips has announced Dolby Vision support for their Ultra HD Blu-ray players as well.
The Philips HDR standard has not yet gained support in Ultra HD Blu-ray discs or players. HLG is not part of Ultra HD Blu-ray’s specification, not even as optional format, and there doesn’t seem to be any effort going on to achieve this, like there is for HDR10+.
Most Ultra HD Blu-rays released up to now offer HDR, but not all do – it's not mandatory. Most Hollywood movie studios are much more interested in the HDR aspect of movies than in the 4K resolution, and for good reasons. First of all, the extent to which you can appreciate 4K resolution depends on the size of your screen and the viewing distance. If your screen is not very large or you're sitting relatively far away, the picture quality difference with 1080p HD becomes negligible. The benefit of HDR, however, can be appreciated under any circumstances, regardless of screen size and viewing distance. Secondly, most Hollywood movies these days are shot in mere '2K' resolution, comparable to HD. Even when 4K or higher-resolution cameras are used, the CGI (Computer-Generated Imagery) is typically done in 2K, because the graphics rendering is so computing-intensive and costly. Consequently, such films are finished in 2K resolution and the 'Digital Intermediate' file which is used for deriving all versions for distribution has to be upscaled from 2K to 4K for Ultra HD Blu-ray.
Does this mean the movie looks no better on Ultra HD Blu-ray than on regular Blu-ray Disc? No, first of all because of HDR, but even if a movie doesn’t use HDR, professional studio-grade upscaling will give better results than real-time upscaling done inside a TV set. Moreover, if the source material was 4K, upscaling from 2K will give better results than with material that was captured only in 2K. In a similar fashion, SDR that’s derived from HDR content looks better than material that was captured in SDR originally.
Some may worry that for 4K HDR movies to get released on Ultra HD Blu-ray we're going to depend on new movies being shot as such, but that's not the case. Analog film can capture a much greater dynamic range than SDR video, and its resolution may exceed that of 2K digital video. This means there's a vast library of catalog movie titles that can be restored in 4K resolution and given the HDR treatment.
A common misperception is that HDR movies are mastered to a brighter level than SDR movies but that’s not the case. Although HDR video can contain higher brightness, the average brightness level will generally be the same as with SDR. It’s just that the highlights are brighter and the blacks darker (and therefore usually more natural). Most HDR movies are best watched in a dark or dimly lit room – not in bright daylight.
It would have been nice if the UHD Alliance had reserved the ‘Ultra HD Premium’ label exclusively for movies shot in digital 4K or derived from 4K scans of analog film, thus excluding titles upscaled from 2K DI’s and making it easier for consumers to tell what they’re dealing with, but alas.
Ultra HD Forum
The Ultra HD Forum, not to be confused with the Alliance, meanwhile focuses on the end-to-end content delivery chain including production workflow and distribution infrastructure. The organization issues guidelines that broadcasters, streaming providers and other parties can use to deploy Ultra HD services. Several phases are distinguished. Phase A, originally started in 2016 but updated since then and now at v1.4, stipulates the following:
Ultra HD Forum – Phase A
Resolution: 1080p* or 2160p
Dynamic range: SDR, PQ, HLG
Frame rates (fps): 24, 25, 30, 50, 60
Video codec: HEVC, Main 10, Level 5.1
Audio: stereo, 5.1 or channel-based immersive audio
Audio codecs: AC-3, EAC-3, HE-AAC, AAC-LC
Broadcast security: AES or DVB-CSA3, using a minimum key size of 128 bits
IPTV/OTT security: AES as defined by DVB
Distribution: Broadcast (TS), Multicast (IP), Unicast Live & On Demand (DASH ISO BMFF)
Captions/Subtitles Coding (in/out formats): CTA-608/708, ETSI 300 743, ETSI 300 472, SCTE-27, IMSC1
With the guidelines for UHD Phase A, the Ultra HD Forum has introduced a couple of new terms, including PQ10 and HLG10.
UHD Phase A definitions
PQ10: PQ (SMPTE ST 2084)
HLG10: Hybrid Log Gamma (as per ITU-R BT.2100)
PQ10 is basically HDR10 without metadata.
Ultra HD Forum Phase B will likely include some of the following technologies:
More HDR formats such as Technicolor, Dolby Vision, China HDR or others;
HFR – 1080p120 and 1440p120 are MPEG level 5.1 and can fit on an HDMI 2.0x link; 2160p120 is MPEG level 5.2 and requires HDMI 2.1;
Dynamic metadata – all the SMPTE ST 2094 flavors including HDR10+;
Object-based Audio – Dolby Atmos, DTS:X, MPEG-H;
New codecs – VP9, AVS2.0, AV1;
A Phase C may be deemed necessary at a later stage.
In broadcasting we’ve got ATSC in North America defining how UHD and HDR should be broadcast over the air with the upcoming ATSC 3.0 standard (also used in South Korea) and transmitted via cable. Here, the SCTE comes into play as well. Japan has the ARIB (see above) and for most of the rest of the world, including Europe, there’s the DVB Project, part of the EBU, specifying how UHD and HDR should fit into the DVB standards that govern terrestrial, satellite and cable distribution.
Now the main contribution to HDR for broadcasts, HLG, does not come from these bodies but from broadcasters BBC and NHK. In Japan, HLG broadcasts already take place. In the west, TravelXP 4K so far is the only channel transmitting in UHD with HLG HDR.
The European Telecommunications Standards Institute (ETSI) has launched an Industry Specification Group (ISG) “to work on a standardized solution to define a scalable and flexible decoding system for consumer electronics devices from UltraHD TVs to smartphones” looking at UHD, HDR and WCG. Founding members include telcos BT and Telefónica. The former already operates a UHD IPTV service. We have yet to hear back from this initiative.
In the meantime, ITU has ratified a new standard called Rec.2100 (or BT.2100), that builds on Rec.2020 color space. Added to the specification are two methods for doing HDR – PQ (the basis of HDR10 and Dolby Vision) and HLG. Three resolutions are specified: HD, UHD-I (‘4K’) and UHD-2 (‘8K’). As for frame rates, all the usual ones between 24 and 120 fps are specified (including fractional rates). Note that Rec.2100 is about production and program exchange – not about distribution to end users.
Rec.2100 also specifies viewing conditions for mastering / color grading HDR:
Rec.2100 - viewing condition specification
Background and surround: neutral grey at D65
Brightness of background and surround: ≤ 5 cd/m²
Ambient light: avoid light falling on the screen
Viewing distance: 3.2 picture heights for the 1920×1080 format; 1.6 to 3.2 picture heights for the 3840×2160 format; 0.8 to 3.2 picture heights for the 7680×4320 format
Peak luminance of display: ≥ 1,000 nits
Minimum luminance of display (black level): ≤ 0.005 cd/m²
Then there are CTA (Consumer Technology Association, formerly known as CEA) in the US and DigitalEurope dealing with guidelines and certification programs for consumer products. What specifications does a product have to support to qualify for 'Ultra HD' branding? Both have formulated answers to that question. It has not been a coordinated effort but fortunately they turn out to almost agree on the specs. Agreeing on a single logo proved a bridge too far, sadly.
The CTA has also issued guidelines for HDR. DigitalEurope hasn’t yet. It’d be great for consumers, retailers and manufacturers alike if the two organizations could agree on a definition as well as a logo this time.
CTA definition of HDR-compatible:
A TV, monitor or projector may be referred to as an HDR Compatible Display if it meets the following minimum attributes:
1. Includes at least one interface that supports HDR signaling as defined in CEA-861-F, as extended by CEA-861.3.
2. Receives and processes static HDR metadata compliant with CEA-861.3 for uncompressed video.
3. Receives and processes HDR10 Media Profile* from IP, HDMI or other video delivery sources. Additionally, other media profiles may be supported.
4. Applies an appropriate Electro-Optical Transfer Function (EOTF) before rendering the image.
As if all of this weren’t enough there are about six more HDR standards: OpenEXR, HDRi data, Logluv TIFF, RGBE Image Format, JPEG-HDR, HDR Rendering (HDRR), not to mention various Sony S-Log camera OETFs.
What are consumers, broadcasters, TV manufacturers, technology developers and standardization bodies to do right now?
Would-be UHD TV buyers and other consumers could be forgiven for thinking there’s a HDR format war going on but that’s misleading and dangerous for reasons explained here.
HDR video format support, September 2017
For broadcasters, HLG is the most practical and pragmatic choice in the short term, though companies intent on delivering a premium experience may opt for Dolby Vision.
For OTT streaming service providers, the safest bet is to simply offer all (four) HDR video formats supported by the major TV brands. For the TV makers themselves universal HDR support is also the safest strategy. Hopefully, the updated Ultra HD Blu-ray standard will offer the option to have HDR10, Dolby Vision and HDR10+ all on the same disc. That’s up to the BDA to work out.
For all parties involved in technology development and standardization, my advice would be as follows. It’s inevitable we’re going to see a mixture of TV sets with varying capabilities in the market – SDR HDTVs, SDR UHD TVs and HDR UHD TVs, and that’s not even taking into consideration near-future extensions like HFR.
Simply ignoring some of these segments would be a very unwise choice: cutting off SDR UHD TVs from a steady flow of UHD content for instance would alienate the early adopters who bought into UHD TV early on as well as consumers who are doing so now with an entry-level UHD TV model. The CE industry needs to cherish the early adopters. It’s bad enough that those Brits who bought a UHD TV in 2014 cannot enjoy BT Sport’s Ultra HD service today because the associated set-top box requires HDCP 2.2 which their TV doesn’t support.
For broadcasters, it is not realistic to cater to each of these segments with separate TV channels. Even if the workflows can be combined, no operator wants to spend the bandwidth to transmit the same channel in SDR HD and HDR HD, plus potentially SDR UHD and HDR UHD.
Having separate channels for HD and UHD is inevitable but for HDR to succeed it’s essential for everyone in the production and delivery chain that the HDR signal be an extension to the broadcast SDR signal.
Innovations like Ultra HD resolution, High Dynamic Range, Wide Color Gamut and High Frame Rate will not come all at once with a big bang but (apart from HDR and WCG which go together) one at a time, leading to a fragmented installed base. This is why compatibility and ‘graceful degradation’ are so important: it’s impossible to cater to all segments individually.
What is needed now is alignment and clarity in this apparent chaos of SDOs (Standards Defining Organizations). Let’s group them along the value chain:
SDOs (Standards Defining Organizations)
MPEG, VCEG
ATSC, SCTE, EBU/DVB, ARIB, SARFT
BDA, DECE (UV), MovieLabs
CTA, DigitalEurope, JEITA
Within each segment, the SDOs need to align because having different standards for the same thing is counterproductive. It may be fine to have different standards applied, for instance if broadcasting uses a different HDR format than packaged media; after all, they have differing requirements. Along the chain, HDR standards do not need to be identical but they have to be compatible. Hopefully organizations like the Ultra HD Forum can facilitate and coordinate this between the segments of the chain.
If the various standardization organizations can figure out what HDR flavor to use in which case and agree on this, the future is looking very bright indeed.