
Bad math when Timeline Working Luminance > 1000 nits?

Posted: Sat Dec 05, 2020 1:17 pm
by Michael Tiemann
Update: I noticed the same behavior when I used RWG/Log3G10 as the timeline color space with 1000 vs. 2000 nits, and the same discrepancies when I set the Timeline Working Luminance to custom values of 1000 vs. 1001. That 1 nit makes all the difference...

I've noticed an anomaly using DaVinci Wide Gamut/Intermediate when the Timeline Working Luminance is > 1000 nits. Here are two screenshots: one shows the expected saturation/luminance values when qualifying a blue LED near the middle of the screen (Timeline Working Luminance set to 1000 nits), and the other shows unexpected values for the same LED (note that the qualifier's Luminance value sits far to the left of the scale and its Saturation far to the right).

[Screenshot: DWG-I_at_1000nits.jpg — DaVinci Wide Gamut / Intermediate at 1000 nits]

[Screenshot: DWG-I_at_2000nits.jpg — DaVinci Wide Gamut / Intermediate at 2000 nits]


I have found that trying to use masks generated from such out-of-bounds numbers, even when the key looks correct (using Highlight to examine it), produces useless garbage further down the imaging pipeline. That is to say, while the alpha matte may look correct in the qualifier node, the underlying values are not valid, so downstream nodes that consume them cannot do anything useful with them.

My bug report is not about the fact that garbage inputs lead to garbage outputs. It is that a 2000-nit working luminance should work with DWG/I, since the documentation advertises it as good up to 10,000 nits, and it does not.
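For reference, here is a minimal sketch of the DaVinci Intermediate log encoding as published in Blackmagic's DWG/Intermediate whitepaper (the constants are copied from memory, so treat the exact values as my assumption and verify against the document). With the usual convention that scene-linear 1.0 = 100 nits, it shows that 2000 nits encodes to roughly 0.83 and 10,000 nits to 1.0, so a 2000-nit working luminance should be comfortably representable in the intermediate encoding:

Code:
import math

# Constants per the "DaVinci Wide Gamut Intermediate" whitepaper
# (copied from memory -- verify against the published document).
DI_A = 0.0075
DI_B = 7.0
DI_C = 0.07329248
DI_M = 10.44426855
DI_LIN_CUT = 0.00262409

def linear_to_davinci_intermediate(x):
    """Encode scene-linear (1.0 = 100 nits) to DaVinci Intermediate log."""
    if x > DI_LIN_CUT:
        return (math.log2(x + DI_A) + DI_B) * DI_C
    return x * DI_M

for nits in (1000, 2000, 10000):
    linear = nits / 100.0  # 1.0 == 100 nits reference
    code = linear_to_davinci_intermediate(linear)
    print(f"{nits:>6} nits -> linear {linear:7.2f} -> DI code {code:.4f}")

# Approximate output:
#   1000 nits -> linear   10.00 -> DI code 0.7566
#   2000 nits -> linear   20.00 -> DI code 0.8299
#  10000 nits -> linear  100.00 -> DI code 1.0000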

This is with Resolve 17.0B Build 10 (aka 17 Beta 3).

Re: Bad math when Timeline Working Luminance > 1000 nits?

Posted: Mon Dec 21, 2020 9:34 am
by Michael Tiemann
Fixed in 17b6!