With its deep roots in Hollywood and content production, Sony will in 2024 embark on a quest to establish 4000 nits as the next standard for HDR peak brightness, up from 1000 nits.
Since the introduction of the first HDR-capable TVs, UHD Blu-ray players and streaming content in 2015, HDR content, including movies and series, has been standardized around a peak brightness of 1000 nits for home entertainment.
While there have been occasional releases in 4000 nits (The Great Gatsby, The Angry Birds Movie, Storks, In the Heart of the Sea) or even 10000 nits (Gran Turismo 7), content creators more often aim for peak brightness levels lower than 1000 nits. It is important to note that these figures refer specifically to peak brightness, representing the brightest part of the picture in select scenes only. This metric tells us little about the average brightness level of the movie, which typically remains consistent with levels seen before the advent of HDR – also known as SDR.
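The distinction between peak and average brightness can be made concrete with a small calculation. Below is a minimal, hypothetical Python sketch (the frame data is made up purely for illustration) that computes the single brightest pixel across frames, akin to the MaxCLL metadata value, and the highest frame-average luminance, akin to MaxFALL:

```python
import numpy as np

def brightness_stats(frames_nits):
    """frames_nits: list of 2D arrays holding per-pixel luminance in nits.
    Returns the brightest single pixel across all frames (akin to MaxCLL)
    and the highest frame-average luminance (akin to MaxFALL).
    Illustrative only, not the exact metadata definitions."""
    peak = max(float(frame.max()) for frame in frames_nits)
    max_frame_average = max(float(frame.mean()) for frame in frames_nits)
    return peak, max_frame_average

# A mostly dark 4K frame (5 nits) with one tiny 1000-nit highlight:
frame = np.full((2160, 3840), 5.0)
frame[0:10, 0:10] = 1000.0
print(brightness_stats([frame]))  # peak ~1000 nits, frame average barely above 5 nits
```

A single small highlight pushes the peak to 1000 nits while leaving the frame average almost unchanged, which is exactly why peak brightness says so little about how bright a movie feels overall.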
Sony's focus is specifically on building an ecosystem for 4000 nits peak brightness.
Gran Turismo 7 is one of the few titles graded to 10000 nits peak brightness. Most other titles conform to 1000 nits or less. Photo: FlatpanelsHD
The problem
There are a number of challenges involved with going beyond 1000 nits for peak brightness. Few 4K TVs in living rooms go beyond 1000 nits, even if advertised, but the TV is only the last link of the chain.
As for content grading, Hollywood studios predominantly rely on 1000-nit reference monitors like Sony's RGB OLED (BVM-X300). While there was once the liquid-cooled Dolby Pulsar, capable of reaching 4000 nits, it came with serious compromises and is no longer available.
Additionally, there is the issue of entrenched habits among filmmakers and color graders. Many still prefer the older, more subdued aesthetic, which is ultimately a creative decision.
The HDR video chain. Sony wants an even bigger pipe. Photo: Murideo
The cameras, however, are ready to go beyond 1000 nits. Footage captured on the latest digital Hollywood cameras, such as Sony's own Venice, can be resolved to peak brightness levels exceeding 4000 nits. Moreover, older movies shot on film can be rescanned at higher resolution and with wider dynamic range than when they were originally released, which is why so many older films are being rereleased in 4K HDR in recent years.
The video standards are ready. HDR10, HDR10+ and Dolby Vision are all based on the same PQ (Perceptual Quantizer) transfer function, developed by Dolby and standardized by SMPTE and ITU. The PQ function replaces the gamma curve of SDR content and defines luminance levels ranging from 0.0001 nits to 10000 nits, although displaying the full dynamic range without banding requires 12-bit rather than the current 10-bit capability of consumer displays. From the outset, HDR standards were designed to accommodate not only higher luminance but also more colors.
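As a rough illustration of the PQ curve described above, here is a small Python sketch of the SMPTE ST 2084 inverse EOTF, which maps absolute luminance in nits to a normalized signal value, plus a simple full-range quantization to integer code values. It is a simplified, illustrative calculation, not production code:

```python
def pq_inverse_eotf(nits):
    """SMPTE ST 2084 (PQ) inverse EOTF: luminance in nits -> normalized signal (0..1)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = min(max(nits, 0.0), 10000.0) / 10000.0  # PQ is defined up to 10000 nits
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

def to_code_value(signal, bits=10):
    """Quantize the normalized signal to a full-range integer code value."""
    return round(signal * (2 ** bits - 1))

# 1000 nits lands at roughly 0.75 of the signal range and 4000 nits at roughly 0.90,
# so even the brightest highlights occupy only the top portion of the curve.
for nits in (100, 1000, 4000, 10000):
    print(nits, round(pq_inverse_eotf(nits), 4), to_code_value(pq_inverse_eotf(nits)))
```

With 10 bits, the entire 0 to 10000 nits range has to fit into 1024 steps, which is why 12-bit precision is needed to cover the full range without visible banding.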
Distribution channels are also ready – except for broadcast TV. UHD Blu-ray and streaming services already support HDR10, HDR10+, and Dolby Vision. The latest game consoles, such as PS5 and Xbox Series X, support HDR gaming in HDR10 and Dolby Vision, with HDR10+ gaming emerging as a viable third option, starting on PC.
In other words, the weakest links of the chain in achieving higher peak brightness at home today are consumer TV displays and the hardware production tools, such as reference monitors, as well as the methods and habits of Hollywood creators.
Two other challenges may be even harder to resolve: why 24fps is not enough for HDR movies, and cinema projection. The latter increasingly appears to impede progress in video technology development. For the sake of brevity, we will not delve further into these issues in this article.
Sony's vision for 4000 nits
Sony aims to change the status quo by pushing both Hollywood and TV hardware beyond the 1000-nit barrier, establishing 4000 nits peak brightness as the new reference standard.
With its extensive presence across various sectors, Sony is uniquely positioned to drive change in filmmaking and content production. The company owns Sony Pictures Entertainment and its many subsidiaries, operates multiple video game studios, manufactures a significant number of the video cameras used in Hollywood production, and produces the majority of studio reference monitors utilized in grading suites. And of course, Sony produces some of the best TVs around.
Sony Pictures even rents out its production lots and suites to outside studios or partners, so its influence reaches beyond its own titles.
But how will Sony accomplish this? The company has laid out a two-pronged approach:
Rolling out the new 4000-nit reference monitor, BVM-HX3110, for Hollywood color grading studios. This dual-layer LCD monitor with LED zone dimming delivers a peak brightness of 4000 nits (10% window) and 1000 nits in full-screen (100% window).
Launching the high-end Bravia 9 miniLED LCD TV. While Sony does not promote it as capable of 4000 nits, it positions the Bravia 9 as the first step towards achieving the milestone. More information on Sony's 2024 TVs can be found here.
Sony is positioning its BVM-HX3110 reference monitor and Bravia 9 consumer TV as the first steps on the journey towards 4000 nits HDR
Sony acknowledges that it is facing an uphill battle, particularly with studios and creatives. As previously reported by FlatpanelsHD, color grading studio Company 3, responsible for color grading 80% of major Hollywood movies and series, still predominantly uses Sony's 1000-nit RGB OLED (BVM-X300) as their reference master monitor. Company 3 acquired all remaining stock after Sony discontinued the monitor.
During the launch at the Sony Pictures lot in Culver City, Los Angeles, Sony confirmed this piece of information to FlatpanelsHD, and admitted that the 1000-nit dual-layer LCD (BVM-HX310) isn't really selling, despite replacing the 1000-nit RGB OLED (BVM-X300). As a result, the price of BVM-HX310 has been reduced following the launch of the 4000-nit dual-layer LCD with LED dimming zones (BVM-HX3110).
Some people in Hollywood are starting to see the potential, though.
- "I was blown away with 4000 nits. It's incredible to see how far it has come in just a few years. I am excited to work with it on our new film. We have daylight exteriors and that seems to be where HDR shines in a way that feels very natural," Joseph Kosinski, an American film director known for Top Gun: Maverick, Tron: Legacy and Oblivion, said during a panel debate at Sony Pictures Studios attended by FlatpanelsHD, referring to Sony's new 4000-nit reference monitor.
Kosinski's new project is the upcoming untitled F1 film starring Brad Pitt.
Director Joseph Kosinski (left) and Director of Photography Claudio Miranda (middle) discussing film techniques during a panel debate at Sony Pictures, Culver City. Photo: Sony
Meanwhile, some movie studios are expressing interest in using the A95L QD-OLED TV for color grading, according to Sony, due to its pixel-level luminance and color control as well as its wider color gamut. Sony can provide custom firmware to Hollywood studios to disable panel ABL (dimming), but doing so voids the warranty. According to FlatpanelsHD's testing, Sony's A95L QD-OLED reaches almost 1500 nits peak brightness. Additionally, Samsung Display, in partnership with Flanders Scientific, is developing a smaller 31.5-inch QD-OLED reference monitor (XMP310). However, this monitor is limited to 1000 nits peak brightness, as it is built on Samsung Display's QD-OLED monitor panel, which does not reach the same brightness levels as the QD-OLED TV panel.
So, how have Hollywood and the gaming industry even managed to release content graded at 2000, 4000 or even 10000 nits? Some titles were graded on the now-discontinued 4000-nit Dolby Pulsar, while in other cases it has simply involved guesswork – clearly not a good approach.
Grading anything at 10000 nits is not really feasible today, but do you recall Sony's 10000-nit 8K prototype from 2018? Sony confirmed to FlatpanelsHD that it is still operational in its Tokyo office. While it may have appeared elegant from the front at CES 2018, there is a massive module behind it. This display was used to grade Gran Turismo 7, making the game one of the few pieces of content actually graded at 10000 nits. Only when you have a 10000-nit TV at home in the future will you experience Gran Turismo 7 as intended!
Sony's 10000-nit prototype display was used to color grade Gran Turismo 7. Photo: FlatpanelsHD, 2018
As for TVs, the new Sony Bravia 9 miniLED LCD is not expected to hit 4000 nits, but we are getting closer. OLED TV panel manufacturers claim 3000 nits this year, although it remains to be seen what the consumer TVs actually achieve. 4000-nit OLED panels are expected in 2025-2026 at the earliest. More high-brightness TVs will emerge, but it will take many years for the installed base in homes to reach critical mass, so consider it a long journey.
The good news, however, is that a movie graded at 4000 nits can be reproduced on any HDR-capable TV, just like Gran Turismo 7 can be played in HDR on any HDR-capable TV. The crucial difference lies in how well it is reproduced, which ultimately depends on the TV's display hardware and the TV's implementation of tone-mapping/clipping.
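How well a 4000-nit grade survives on a less capable TV comes down to tone mapping. The sketch below contrasts hard clipping with a simple soft roll-off above a knee point; it is a generic, illustrative Python example and does not represent Sony's or any other manufacturer's actual tone-mapping algorithm (real implementations typically use perceptual curves such as BT.2390 rather than the linear compression used here):

```python
def clip(nits, display_peak):
    """Hard clipping: any detail above the display's peak luminance is lost."""
    return min(nits, display_peak)

def soft_rolloff(nits, display_peak, content_peak=4000.0, knee=0.75):
    """Track the source up to a knee point, then compress the remaining
    highlight range of the content into the headroom above the knee."""
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits
    t = min((nits - knee_nits) / (content_peak - knee_nits), 1.0)
    return knee_nits + t * (display_peak - knee_nits)

# On a 1000-nit TV, a 4000-nit highlight is either clipped at 1000 nits
# or compressed so that some of its gradation is preserved below 1000 nits.
for scene_nits in (500, 1500, 4000):
    print(scene_nits, clip(scene_nits, 1000), round(soft_rolloff(scene_nits, 1000), 1))
```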
Why 4000 nits?
But why aim for 4000 nits instead of stopping at 2000 nits first? Will 4000 nits be too bright and force us to squint or even feel eye fatigue after watching a full movie?
These are valid concerns, but it is important to remember that human perception of light is logarithmic, not linear. So while 4000 nits may sound significantly brighter than 1000 nits, it is in practice not that much brighter, and how you perceive it depends on several factors: your surroundings (nighttime, evening, daytime), the contrast of the picture, whether it is color or white peak brightness, the size of the bright object on the screen (small star or big sun?), and the duration of exposure. However, in a completely dark cinema environment, 4000 nits in anything but very tiny segments of the picture, like a twinkling star, could indeed be too bright.
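To put rough numbers on the logarithmic argument, here is a tiny, hedged illustration in Python using base-10 logarithms as a stand-in for perceived lightness (a simplification in the spirit of the Weber-Fechner law, not a vision-science model): the step from 1000 to 4000 nits spans only about 0.6 of a decade, noticeably less than the full decade between 100 and 1000 nits.

```python
import math

# Approximate perceptual "distance" as the logarithm of luminance (illustrative only).
for nits in (100, 1000, 4000, 10000):
    print(nits, round(math.log10(nits), 2))  # prints 2.0, 3.0, 3.6, 4.0
```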
A more pertinent question is probably whether 4000 nits actually provides a meaningful picture improvement over, say, 1000 nits.
FlatpanelsHD has had the opportunity to experience Sony's 4000-nit BVM-HX3110 twice: first in Tokyo, Japan, compared side-by-side to the 1000-nit dual-layer LCD (BVM-HX310) and the 1000-nit RGB OLED (BVM-X300), and later in Los Angeles, where it was compared only to the 1000-nit BVM-HX310.
Left: Dual-layer LCD (BVM-HX310). Center: Dual-layer LCD with LED zone dimming (BVM-HX3110). Right: RGB OLED (BVM-X300). Photo: FlatpanelsHD
During our first demo in Tokyo, we witnessed moments of brilliance where 4000 nits made the sky or sun appear significantly more realistic and detailed, as the monitor did not clip the brightest details. However, the RGB OLED still seemed to have more sparkle, or more lifelike reflections, in certain scenes, such as a car's paint or other reflective surfaces.
The demonstration we had in Los Angeles at Sony Pictures, featuring a special version of the movie Alpha mastered at 4000 nits (the consumer version was mastered at 1000 nits, according to Sony), was particularly compelling. Here, the sun and sky appeared incredibly vibrant and realistic on the 4000-nit monitor, whereas the 1000-nit reference monitor looked muted and lifeless in comparison. It served as a persuasive showcase of the benefits of 4000 nits peak brightness, one likely to win over most viewers, and it is unfortunate that such demonstrations are not more widely available, as they could sway some skeptics. Due to copyright restrictions, we were not permitted to take pictures or videos of the 4000-nit Alpha demo, so the images embedded here are from our Japan demo.
Sony's BVM-HX3110 reference monitor can resolve brightness detail up to 4000 nits, for instance in this particular scene. Photo: FlatpanelsHD
Studies conducted by SMPTE (link) and Dolby have reached similar conclusions. There is a noticeable difference to the viewer between 1000 nits, 4000 nits, and even 10000 nits.
It is worth noting that Sony is not alone on this journey. You may also have heard LG and Panasonic share similar narratives about "Hollywood to your home", partnerships with Company 3, or how LG OLED displays are used by Disney as client reference displays in movie studios.
A bigger toolbox
It is important to emphasize that 4000 nits is not in itself superior to 1000 nits or even 500 nits. Judging the visual quality of a movie by its peak brightness level, as some have attempted in the past, is a deeply flawed approach.
Also read: We need to talk about HDR
Instead, it is about giving content creators a broader array of tools – a bigger toolbox – and advancing video technology. It is up to content creators to decide whether and how to utilize these tools, whether they want to leverage high brightness and true blacks, vibrant colors, or high frame rates. The combination of these elements can result in something remarkable – perhaps even new types of movie experiences or genres. Conversely, the absence of specific picture elements, such as color or high brightness, can also contribute to artistic expression.
Specifically for brightness, it is also widely recommended to use high brightness sparingly for the most impactful effect, limiting it to small segments of the picture. Fully realizing this requires pixel-level dimming on TVs, i.e. OLED, QD-OLED, or possibly microLED or nanoLED in the future. If everything in the picture is bright, nothing is truly bright; your eye's pupil will simply contract, as it does in bright outdoor settings.
Nonetheless, it is intriguing to observe Sony acknowledging that home entertainment is reshaping Hollywood. The company's new slogan, "Cinema is Coming Home," rings true, particularly in the realm of content distribution and the disruptive impact of streaming on the industry. However, the sentiment does not quite hold from a technology standpoint, as cinema now lags considerably behind consumer display technology.
The step to 4000 nits is inevitable; it is the continuation of an amazing journey that started in 2015 with the launch of HDR – a video format more impactful than 8K – driven by advancements in TV display technology.