A chance to talk to FlatpanelsHD's reviewers.
By Kuschelmonschter
#12436
What happens if you switch output to some HDR display mode (e.g. "4K HDR 24Hz") and play some Dolby Vision content? Does HDR mean HDR10 in this context and is the Dolby Vision metadata somehow converted to HDR10? Doesn't sound like a good idea to convert dynamic metadata to one static HDR10 representation. I think I didn't get that part...
By melvin
#12443
Thanks for this excellent, detailed analysis with real measurements.

I think so far the ATV5 is the least bad UHD streamer, in large measure simply because of the price, quality, and selection of iTunes UHD titles.

But they REALLY need to add an option for native output. It doesn't have to be the default. But for many of us it would greatly improve the quality of the experience.

Even after they get the mapping from Rec.709 to Rec.2020 and SDR to HDR correct, there are many, many displays that won't show it correctly. You point out how power consumption greatly increases, which is no small thing.

But there are other problems. Even the cheapest TVs on the market can be calibrated to show SDR/709 content correctly. But even many expensive UHD TVs still lack the controls necessary to calibrate HDR/2020 accurately. And many of them make compromises like maxing out the backlight, the contrast setting and the color setting to try to get the most impressive torch-mode experience when fed an HDR signal.

Should those TVs do better? Of course!

But we live in the real world. Most TVs won't do better for years, so even if Apple correctly maps SDR to HDR, the end user won't see the right stuff on most TVs. Instead, they will see inaccurate color, raised black levels, incorrect gamma, and experience greatly increased power consumption.

I understand why Apple has done it, and yes it's pretty clever and forward looking. But it's one of the main things that makes this device far from perfect.

The solution? In the settings menu, there should be an "expert" mode that lets a user choose to output native signals. Heck, even my UHD Blu-ray player, and my regular Blu-ray player before that (ten years ago!), would let me choose such an option.

For now, the workaround is to manually adjust the output for each content type. Now THAT is a seriously ugly solution, but it's the only thing that keeps me from returning my ATV5 to the store.

I sure hope they are going to allow native output at some point. It doesn't need to be the default. It could be hidden in a special super-user sub-menu. It could be caveated with lots of warnings. But it would solve the very real problems of assuming they can map content correctly and that displays can display it correctly.
By melvin
#12444
Kuschelmonschter wrote:What happens if you switch output to some HDR display mode (e.g. "4K HDR 24Hz") and play some Dolby Vision content? Does HDR mean HDR10 in this context and is the Dolby Vision metadata somehow converted to HDR10? Doesn't sound like a good idea to convert dynamic metadata to one static HDR10 representation. I think I didn't get that part...
They use DV Profile 5, which contains a base HDR layer and DV metadata, so they don't convert the DV content to HDR10. Rather, they convert the HDR base layer to HDR10. And for the most part, it looks good.

(Far better than SDR content converted to HDR, which won't be solved with better tone mapping, since so many displays have compromises in their HDR rendering that don't trouble their SDR rendering.)
By chrisheinonen
#12446
On the 4K HDR output mode chart, I'm assuming you're using "Use Measured White Point" in the CalMAN options? If so, is it possible to get the chart after changing that to "Use 195 cd/m2" so it exactly matches the output of the 4K SDR Output Mode chart? I have to assume the dE2000 levels would be far higher than they are now, but it would also let us know how much they are increasing the relative brightness by using HDR for SDR content.
By Torben Rasmussen
#12448
The HDR chart isn't measured in the HDR workflow. It is measured in "SI Advanced", since you should expect the graph to conform to SDR. The point of creating the chart was to determine how wrong things would get if you tried to view SDR using the HDR profile, which required measuring things as if it was in fact SDR.

It should be noted that the HDR profile hasn't been calibrated to the same extent as the SDR profile, which is partly due to not being able to pass our HDR calibration patterns through the ATV 4K. None of the apps tested could play back our 4K HDR content. I found some slight differences in greyscale rendition between e.g. USB and ATV 4K on the Eclipse TV, so the reference and the SDR 4K calibration curves were actually obtained using slightly different settings on the same picture profile on the TV.

Bottom line: You shouldn't pay too much attention to the absolute offset in greyscale, but merely note that the rendition of color and greyscale levels isn't mapped as we had hoped.
By Rasmus Larsen
#12449
Kuschelmonschter wrote:What happens if you switch output to some HDR display mode (e.g. "4K HDR 24Hz") and play some Dolby Vision content? Does HDR mean HDR10 in this context and is the Dolby Vision metadata somehow converted to HDR10? Doesn't sound like a good idea to convert dynamic metadata to one static HDR10 representation. I think I didn't get that part...
"HDR" in iTunes means HDR10. iTunes only shows the label for the highest-quality format but when it shows "Dolby Vision" there is also a HDR10 base layeravailable on the adaptive streaming ladder (as well as 720p SDR, 1080p SDR etc.)

So there is no conversion between HDR10 and Dolby Vision. If you have Apple TV 4K set to Dolby Vision and try to play something in iTunes that is available only in HDR10, the HDR10 is the base layer in the Dolby Vision container. HDR10 and Dolby Vision are both based on the same basics (PQ curve, Rec.2020).
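
To make the ladder idea concrete, here is a rough Python sketch of how a player might pick between renditions in an HLS master playlist. The playlist snippet and the selection logic are illustrative assumptions only, not Apple's actual manifests or tvOS code; the point is that SDR, HDR10 and Dolby Vision variants sit side by side on the same ladder, distinguished by the VIDEO-RANGE attribute and the codec string.

```python
# Hypothetical sketch: picking a rendition from an HLS master playlist.
# The playlist text, URIs and selection policy are illustrative assumptions.
import re

MASTER_PLAYLIST = """\
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080,CODECS="avc1.640028",VIDEO-RANGE=SDR
1080p_sdr.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=18000000,RESOLUTION=3840x2160,CODECS="hvc1.2.4.L150.B0",VIDEO-RANGE=PQ
2160p_hdr10.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=20000000,RESOLUTION=3840x2160,CODECS="dvh1.05.06",VIDEO-RANGE=PQ
2160p_dovi.m3u8
"""

def parse_variants(playlist: str):
    """Yield (attributes, uri) pairs for each EXT-X-STREAM-INF entry."""
    lines = playlist.strip().splitlines()
    for i, line in enumerate(lines):
        if line.startswith("#EXT-X-STREAM-INF:"):
            attrs = dict(re.findall(r'([A-Z-]+)=("[^"]*"|[^,]*)', line.split(":", 1)[1]))
            yield {k: v.strip('"') for k, v in attrs.items()}, lines[i + 1]

def pick(display_supports_dv: bool, display_supports_hdr10: bool):
    """Prefer Dolby Vision, then HDR10, then SDR, all from the same ladder."""
    variants = list(parse_variants(MASTER_PLAYLIST))
    def is_dv(a): return a["CODECS"].startswith(("dvh1", "dvhe"))
    def is_pq(a): return a.get("VIDEO-RANGE") == "PQ"
    if display_supports_dv:
        chosen = [v for v in variants if is_dv(v[0])]
    elif display_supports_hdr10:
        chosen = [v for v in variants if is_pq(v[0]) and not is_dv(v[0])]
    else:
        chosen = [v for v in variants if not is_pq(v[0])]
    return chosen[0][1] if chosen else None

print(pick(display_supports_dv=False, display_supports_hdr10=True))  # -> 2160p_hdr10.m3u8
```

With a display chain that reports no Dolby Vision support, the same ladder simply resolves to the HDR10 (or SDR) variant; nothing has to be converted between the two HDR formats at that stage.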
By Kuschelmonschter
#12450
So why does an HDR10 output mode exist then when the TV supports Dolby Vision? Doesn't it make sense to set it to either SDR or Dolby Vision? What's the drawback of playing HDR10 in Dolby Vision output mode?

I always admired Apple for the simplicity of things. But what they do with display modes is beyond me...
By Torben Rasmussen
#12451
Not all DV TVs support 60 Hz DV in 4K. The LG E6 Rasmus used didn't support it, so if you want HDR at 60 Hz you can only use HDR10.
By JMGNYC
#12453
Excellent review. Although please note that Infuse playback seems somewhat darker than playing the same thing through Apple's native player, say with Plex. I think some of the difference you're seeing in "Dory" between HDR and SDR output modes may be Infuse-induced. Check it out.
By Mamaw
#12545
https://youtu.be/ipo05SYfqXw

Very informative video.
And bad news for HDR10-only TV set owners: the Apple TV is down-converting Dolby Vision movies to HDR10 with the same static metadata values regardless of the movie you are watching.
This is completely faked!
Apple needs to add an HDR10 logo in iTunes to movies that are capable of native HDR10 presentation.
I don’t want to buy an HDR movie only to watch it in fake HDR...
By Rasmus Larsen
#12546
I haven't watched Vincent's video yet, but note that HDR10 is built on the same foundations as Dolby Vision, specifically the PQ (Perceptual Quantizer) transfer function and Rec.2020. Movie studios also output the HDR10 copy from the Dolby Vision master. So you strip away the dynamic metadata and instead set an overall level for luminance through a static metadata instruction.

EDIT: OK, I see Vincent's point regarding the MaxCLL and MaxFALL now. Fair. I wouldn't call it fake. It's a bug in how the values are set via metadata, and something that they will hopefully correct soon.
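
For context, MaxCLL (maximum content light level) is the brightest single pixel anywhere in the title and MaxFALL (maximum frame-average light level) is the highest per-frame average, both in nits after decoding the PQ curve, per the CTA-861.3 static metadata definitions. A minimal sketch of how they are derived, assuming full-range 10-bit PQ code values already decoded from the video; the toy frame and helper names are purely illustrative:

```python
# Minimal sketch: decode 10-bit PQ (SMPTE ST 2084) code values to nits, then
# derive MaxCLL / MaxFALL as defined for HDR10 static metadata.
# Frame loading is out of scope; the toy frame below is just for illustration.
import numpy as np

# ST 2084 constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code, bit_depth=10):
    """PQ EOTF: map full-range non-linear code values to luminance in cd/m2."""
    e = np.asarray(code, dtype=np.float64) / (2 ** bit_depth - 1)
    p = e ** (1 / M2)
    return 10000.0 * (np.maximum(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

def content_light_levels(frames_rgb_nits):
    """MaxCLL = brightest single pixel; MaxFALL = highest frame-average level.
    A pixel's light level is max(R, G, B) in linear light (nits)."""
    max_cll = max_fall = 0.0
    for frame in frames_rgb_nits:
        levels = frame.max(axis=2)                 # max(R, G, B) per pixel
        max_cll = max(max_cll, float(levels.max()))
        max_fall = max(max_fall, float(levels.mean()))
    return max_cll, max_fall

# Toy 2x2 frame of 10-bit PQ RGB codes; code 769 is roughly 1000 nits.
codes = np.array([[[769, 769, 769], [400, 400, 400]],
                  [[400, 400, 400], [400, 400, 400]]])
print(content_light_levels([pq_to_nits(codes)]))
```

Because these numbers describe the actual content, a display that tone-maps from metadata will compress highlights too much or too little if every title reports the same fixed pair, which is presumably Vincent's concern.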
By JMGNYC
#12548
tvOS 11.2 beta was released today, which features automatic switching for both native frame rate and dynamic range, so if you enable that, SDR/HDR/DV conversion problems become a thing of the past. It's really great news IMO.
By Rasmus Larsen
#12549
Mamaw wrote:Ok I admit “completely faked” is a little bit strong :)
Note that some UHD Blu-ray releases in HDR10 also include wrong, or simply zeroed, values for MaxCLL and MaxFALL, likely because few TV manufacturers were using them in the early days of HDR. That's why TV manufacturers have started doing tone mapping based on image analysis instead of the metadata provided.
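
To illustrate that fallback, here is a rough Python sketch of analysis-based tone mapping: when MaxCLL is zero or implausible, the TV uses a peak it has measured from the frames themselves and rolls highlights off to the panel's capability. The plausibility threshold and the roll-off curve are generic illustrations, not BT.2390 or any manufacturer's actual algorithm.

```python
# Rough sketch of a tone-mapping fallback when HDR10 metadata is missing or
# zeroed. Thresholds and the roll-off curve are generic illustrations only.
import numpy as np

def pick_content_peak(maxcll_metadata, measured_peak_nits):
    """Trust MaxCLL only when it looks plausible; otherwise use a peak
    derived from analysing the frames themselves."""
    if maxcll_metadata and 100 < maxcll_metadata <= 10000:
        return float(maxcll_metadata)
    return float(measured_peak_nits)

def tonemap(lum_nits, content_peak, display_peak):
    """Simple extended-Reinhard style roll-off from content peak to panel peak.
    Near-black and mid-tones stay close to 1:1; highlights are compressed."""
    l = np.asarray(lum_nits, dtype=np.float64) / display_peak
    w = content_peak / display_peak        # input level that should land at panel peak
    mapped = l * (1 + l / (w * w)) / (1 + l)
    return np.minimum(mapped, 1.0) * display_peak

# MaxCLL of 0 (as on some discs) forces the analysis-based path.
peak = pick_content_peak(maxcll_metadata=0, measured_peak_nits=1400)
print(tonemap([10, 100, 1000, 1400], content_peak=peak, display_peak=650))
```

The design point is simply that measured values can't lie about the content the way a copied-over or null metadata field can, at the cost of the TV doing the analysis itself.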