While resolution, color and dynamic range have advanced significantly, frame rate has remained stuck at 24 frames per second. This is a problem, not least for creative intent, that Hollywood is finally starting to acknowledge.
Many still regard cinema as the ultimate way to experience a movie, and in some ways, it still is. How wonderful it is to forget your worries – and your smartphone – sit back, and be transported into another world on the big screen with glorious sound.
However, there is a growing disconnect. Cinema, for over a century, has been shaped by the capabilities, or rather limitations, of the movie projector. For example, the classic DCI (Digital Cinema Initiative) specification requires a brightness level of 14 foot-lamberts (fL), equivalent to 48 nits. This level is not some kind of golden number but simply what was practically achievable – with projectors – at the time. It has since stuck, shaping the look of movies for decades.
In 2018, the DCI group started work to define a cinema standard for HDR. The final specification sets a black level of 0.005 nits and peak brightness of 300 nits. This is not possible with current projectors. Instead, it requires direct-view LED walls, but such installations in theaters remain very limited.
Similarly, technical limitations are the reason why Hollywood settled on a frame rate of 24 frames per second in the 1920s. It was an economical way to synchronize sound with picture when shooting on film, enabling the advent of sound film. The standard has stuck, and practically all movies since have been produced at 24fps. Some have even argued that 24fps is a prerequisite for the cinematic look – but is it really?
This illustration captures the current situation, comparing TV display capabilities with standards for movie authoring.
Not creative intent

A movie presented at a 24fps frame rate with 14 fL (48 nits) brightness works well enough in the cinema – a very dark room with a dim projector and no support for HDR (High Dynamic Range).
However, try watching the same movie at home, mastered by the film studio in HDR, on modern display technology such as OLED or QD-OLED (or one of the few LED cinema walls out there), and you may now notice excessive judder that can be highly distracting or even eye-straining. As discussed for years in FlatpanelsHD's TV reviews, HDR and some bright SDR movies can exhibit an even worse 'stroboscopic effect' in bright or high-contrast scenes when viewed on an OLED TV. We no longer have 3D in our TVs, but 3D can also make the judder more apparent.
Here is a still example of what judder looks like:
Now, it might be tempting to lay the blame on your display. TV makers, resolve this issue! And they have tried with motion smoothing, which is a significant compromise that does not take into account the filmmaker's intended motion look. My plasma TV did not have such motion issues! No, because it was not much brighter than a movie projector. The real challenge here is that OLED's pixel response time is exceptionally fast and its brightness relatively high, exposing what 24fps motion actually looks like, amplified somewhat by the sample-and-hold driving method of an OLED panel.
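The physics behind this is simple: on a sample-and-hold display, each frame stays on screen for the full frame period, so when your eye tracks a moving object, the image smears across the retina in proportion to the hold time. The pan speed below is illustrative, but the relationship itself is standard display-motion analysis:

```python
# On a sample-and-hold display (like OLED), each frame is held for the
# full frame period. An eye tracking a moving object therefore perceives
# smear proportional to (pan speed) x (hold time). The 960 px/s figure
# is an arbitrary example of a brisk pan, not a measured value.

def hold_time_ms(fps: float) -> float:
    """Time each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

def perceived_smear_px(pan_speed_px_per_s: float, fps: float) -> float:
    """Approximate retinal smear, in pixels, during eye tracking."""
    return pan_speed_px_per_s * hold_time_ms(fps) / 1000.0

print(perceived_smear_px(960, 24))   # 24fps: ~40 px of smear per frame
print(perceived_smear_px(960, 120))  # 120fps: ~8 px
```

The faster the panel's pixel response, the more cleanly this smear (and the underlying 24fps cadence) is rendered – which is exactly why OLED exposes motion artifacts that a slower display masked.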
As video technology has advanced, people are starting to notice these issues and question ingrained beliefs. This level of judder simply does not align with how creators envision their movies and series to be experienced. Cinematic? Yes. Excessive judder? Certainly not.
Is cinema holding back advancement?

For the first time in history, cinema is lagging behind, leapfrogged by TV display technology on all parameters except size. So, why do movies and especially TV series, which will never reach the big screen, continue to be produced based on decades-old beliefs, shaped by the technical limitations of the projector?
This question is one that FlatpanelsHD often poses to industry professionals working at the intersection of filmmaking and TV displays: Is cinema holding back advancements in video technology? Is the projector? And what is being done behind the scenes to solve the issue?
Most recently, we directed this question to Sony's executives and engineers in Tokyo, Japan, because they are involved in the entire process from the camera through the mastering monitor to TV display. Sony did not provide a direct answer, but they did share something interesting with us, which we will report on later.
Before CES 2024, I also reached out to a company actively engaged in these discussions and the relevant technology, Pixelworks, to hear if they were available to meet for a discussion. Fortunately, they were in Las Vegas at the time, though not at the actual convention center. I still decided to go.
Why 24fps is not enough

Pixelworks explained the problem in clear terms. A judder level deemed acceptable at 48 nits (cinema) can become unacceptable at higher brightness levels, as illustrated in this graph:
The graph even explains why we occasionally observe this issue outside of the HDR realm. As you can see, judder becomes unacceptable during some panning shots even before a display reaches 100 nits – SDR territory (and most people adjust their TV to significantly higher brightness than 100 nits for SDR viewing). Interestingly, this threshold coincides with EDR (Enhanced Dynamic Range), such as the Dolby Cinema laser projection systems (31 fL, or 106 nits).
Dolby Labs has highlighted the same issue in the past and has published a research paper on a 'luminance-aware model of judder perception' (link).
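The intuition behind such a luminance-aware threshold can be sketched with a toy model. To be clear: the constants and the logarithmic shape below are purely illustrative choices of ours, not taken from Dolby's paper or Pixelworks' data – only the 48-nit cinema reference point comes from the article:

```python
import math

# Purely illustrative toy model: perceived judder for a fixed 24fps pan
# grows roughly with the log of display luminance. The 48-nit reference
# (cinema) is from the article; the scaling and the acceptability
# threshold are hypothetical, chosen only to mirror the graph's shape.

def perceived_judder(nits: float, base_judder: float = 1.0) -> float:
    """Hypothetical judder score relative to cinema brightness (48 nits)."""
    return base_judder * (1 + math.log2(nits / 48))

THRESHOLD = 2.0  # hypothetical "unacceptable" level

for nits in (48, 106, 300, 1000):
    score = perceived_judder(nits)
    verdict = "unacceptable" if score > THRESHOLD else "acceptable"
    print(f"{nits:>5} nits: judder score {score:.2f} ({verdict})")
```

Even this crude model reproduces the article's point: the same 24fps pan that is fine at 48 nits crosses the line around EDR brightness and gets steadily worse toward HDR levels.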
A proposed solution in Hollywood has been to shoot at a higher frame rate of 48fps, 60fps, or even 120fps. However, as demonstrated by the theatrical HFR releases of The Hobbit and Billy Lynn's Long Halftime Walk, this can negatively affect the cinematic look, making the content look overly realistic – the infamous 'soap opera effect'.
To better understand the challenge, Pixelworks developed a Motion Appearance Model, akin to a Color Appearance Model, taking into account the various factors. They identified a motion blur / shutter equivalence relationship, as depicted below.
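The motion blur side of that equivalence follows directly from camera shutter math: per-frame exposure time is the shutter angle divided by 360 degrees, times the frame period. This is standard cinematography arithmetic, not Pixelworks' model, but it shows why higher frame rates can preserve the familiar amount of blur:

```python
# Standard shutter-angle math: exposure (motion blur) time per frame.
# A 180-degree shutter exposes for half the frame period.

def exposure_time_ms(fps: float, shutter_angle_deg: float) -> float:
    """Per-frame exposure time in milliseconds."""
    return (shutter_angle_deg / 360.0) * (1000.0 / fps)

# Classic cinema look: 24fps with a 180-degree shutter -> 1/48 s of blur.
print(exposure_time_ms(24, 180))

# Shooting 48fps with a 360-degree (fully open) shutter yields the same
# 1/48 s of per-frame blur, so the blur character is preserved while
# the doubled frame rate reduces judder.
print(exposure_time_ms(48, 360))
```

This equivalence is one reason a 48fps capture can be graded back toward a 24p look, or kept smooth, without losing the expected amount of motion blur.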
With this knowledge, they devised a system called TrueCut Motion that for the first time enables filmmakers to fine-tune judder (0–360, where 0 is typical 24p judder and 360 is no perceivable judder), motion blur (0–360, where 0 is no extra motion blur), and speed, all in post-production, scene-by-scene.
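To make the judder dial concrete, here is our own simplified sketch of the concept – emphatically not TrueCut's actual algorithm. In a 48fps container, a dial from 0 to 360 can blend each in-between frame from a repeat of the previous frame (full 24p-style judder) toward the native in-between frame (no judder):

```python
# Illustrative sketch of a tunable judder dial in a 48fps container.
# This is a conceptual toy, not TrueCut Motion's actual processing.
# Frames are represented as floats (e.g. an object's screen position).

def grade_judder(frames_48fps: list[float], judder_dial: float) -> list[float]:
    """judder_dial: 0..360, where 0 keeps full 24p-style judder
    (each even frame is simply repeated) and 360 keeps native
    48fps motion (no perceivable judder)."""
    a = judder_dial / 360.0
    out = []
    for i, frame in enumerate(frames_48fps):
        if i % 2 == 0:
            out.append(frame)                       # original 24p cadence frame
        else:
            held = frames_48fps[i - 1]              # frame repeat = 24p judder
            out.append((1 - a) * held + a * frame)  # blend toward true 48fps
    return out

positions = [0.0, 1.0, 2.0, 3.0]            # an object moving at constant speed
print(grade_judder(positions, 0))            # [0.0, 0.0, 2.0, 2.0] - steppy, 24p-like
print(grade_judder(positions, 360))          # [0.0, 1.0, 2.0, 3.0] - smooth, native 48fps
```

The key point the sketch captures is that judder becomes a continuous, gradable parameter per scene rather than an all-or-nothing choice between 24fps and HFR.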
They call it 'motion grading,' drawing a parallel to 'color grading', a process that has become integral to the production of every movie and series to ensure the right look in terms of color and dynamic range.
The demos will convince you

As you may recall, James Cameron's Avatar: The Way of Water was theatrically released in late 2022 in 48fps HFR and 3D, doubling the normal frame rate of 24fps. For the first time, audiences widely accepted High Frame Rate (HFR), unlike earlier attempts. Avatar: The Way of Water was motion graded with TrueCut Motion.
During our meeting with Pixelworks, FlatpanelsHD saw the difference demonstrated using popular movies and demo clips, with three versions of the same scene: one with the typical excessive 24p judder, one in pristine 48fps (complete with the soap opera effect), and one HFR version motion graded with TrueCut. The TrueCut-graded examples were very convincing, with smoother motion devoid of excessive judder and without the soap opera effect. We encourage every skeptical movie lover to watch these examples whenever they get the opportunity.
We saw a scene from Avatar: The Way of Water, some movies that we are not allowed to talk about publicly at this time, and even The Hobbit motion graded with TrueCut and reviewed by Peter Jackson! Anecdotally, Pixelworks said that Hollywood filmmakers in general respond very positively when shown their demos.
Unfortunately, we were not allowed to take photos or videos during our discussion, but this illustration may help you understand what we saw:
The tools work with any frame rate (24–120fps and higher), be it 2D or 3D. Often, with a 24fps source the goal is to refine the motion look, exemplified by the HFR re-releases of Avatar (the first one) and Titanic.
Think of it as a larger frame rate container, much like how the HDR format serves as a larger container for color and luminance. It provides filmmakers with a bigger toolbox for heightened creative freedom. The decision on how to leverage these tools rests entirely in their hands.
A turning point for HFR movies?

Avatar: The Way of Water seems like a pivotal moment for HFR, at least to us in the West. In China, Pegasus premiered as early as 2019 as the first 48fps TrueCut release, and several HFR movies have followed.
This week, Hollywood took another significant step, with two major film studios, Disney and Universal, unveiling HFR initiatives. Disney has inked a multi-year deal to bring HFR movies to home viewers for the first time, and this is just the beginning. Apple and Universal's Argylle will debut in theaters globally in HFR. In both cases, the studios are partnering with Pixelworks for motion grading using TrueCut.
So, is HFR solved? We are witnessing advancements in cinema, but with the evolution toward EDR (Dolby Cinema) and HDR (direct-view LED walls), Hollywood will likely need to push beyond 48fps in the future.
Yet, the more crucial step, at least from FlatpanelsHD's perspective, lies in the booming home entertainment market – now many times bigger than cinema. Early signs of an HFR movie ecosystem are emerging, led by Apple, Disney, and Pixelworks. However, at the launch this week, HFR content is limited to Apple Vision Pro – the headset. Pixelworks tells us that discussions are underway with major TV makers regarding a certification program to ensure native playback of TrueCut Motion titles, but this process will take more time.
2024 marks the beginning, so stay tuned for more news.