So, a few things to get out of the way first:
1) This has nothing to do with all that "QuickTime gamma bug" nonsense.
2) I know it's not a proper way to monitor for grading; this is just to match the reference monitor next to it as closely as possible.
I took a spectrophotometer (i1Pro 3) and a regular probe (i1Display), with Light Illusion ColourSpace, and measured the EOTF of a MacBook Pro 14" XDR display under different Resolve and display settings, to find out what I need to do to get a proper gamma 2.4 output on the display, i.e. matching a reference display calibrated to Rec709 / gamma 2.4. (I use a pure power law 2.4 and not BT.1886 because I am using OLEDs and the miniLED, which do 0 nit black, so it is the same as 2.4 anyway and less complex.)
I ran 48 linearly spaced full-range patches from 0 to 255 in Resolve's viewer, recorded the measured values from the display, and plotted them against gamma 2.4, which makes any incorrect EOTF obvious.
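For illustration, here is a minimal sketch of how the patch readings can be reduced to an effective gamma; the "measured" values here are stand-ins, not my actual readings:
Code:
import numpy as np

# 48 linearly spaced full-range code values, 0..255, as used for the patches
codes = np.linspace(0, 255, 48)
stimulus = codes / 255.0

# Stand-in probe readings, normalized so white = 1.0 (replace with real measurements);
# here they emulate a display tracking gamma 2.2
measured = stimulus ** 2.2

# Reference curve: a pure power-law 2.4 display
reference_24 = stimulus ** 2.4

# Per-patch effective gamma, skipping 0 and 1 to avoid log() degeneracies
mask = (stimulus > 0) & (stimulus < 1)
effective_gamma = np.log(measured[mask]) / np.log(stimulus[mask])

print("mean effective gamma:", effective_gamma.mean())            # ~2.2 for this stand-in
print("max deviation from 2.4:", np.abs(measured - reference_24).max())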
First thing I tried (I assumed this would work, but sadly it did not):
--------------------
UMDCPFV*: OFF
Display Setting: XDR Rec709/bt1886 reference mode, system gamma boost 1.22 (default)
Output Colorspace (makes no difference here because UMDCPFV is turned off): rec709 (Scene)
Resulting EOTF = 2.2
--------------------
So I thought: why? Something is not adding up. With UMDCPFV OFF, Resolve should do nothing and send the pixels "as is" to whatever the display is, in this case gamma 2.4. So I assume Apple is actually using the inverse of the default XDR profile to transform the pixels to the working space (linear / XYZ)? Not sure, but I was able to fix it by using a system gamma boost of 1.09 instead of 1.22:
Code:
2.4 / 1.961 = 1.22
2.4 / 2.2 = 1.09
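Spelled out in a small sketch (this is my reading of where the two numbers come from, so treat the comments as an assumption rather than something confirmed by Apple):
Code:
# Apple's default boost for the Rec709/BT.1886 reference mode appears to target a
# 2.4 display EOTF for content carrying the Apple 709 ("rec709A") transfer,
# whose effective power-law gamma is roughly 1.961.
default_boost = 2.4 / 1.961   # ~1.22, the reference mode default

# On the UMDCPFV OFF path, Resolve's output measures like gamma 2.2 content,
# so the boost that lands it on a 2.4 EOTF is:
custom_boost = 2.4 / 2.2      # ~1.09, the value that gave a measured 2.4

print(round(default_boost, 2), round(custom_boost, 2))   # 1.22 1.09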
This gives me a measured EOTF of 2.4:
--------------------
UMDCPFV*: OFF
Display Setting: XDR Rec709/bt1886 reference mode, system gamma boost 1.09 (custom)
Output Colorspace: whatever
Resulting EOTF = 2.4
--------------------
OK cool, so with this I would get correct viewing in Resolve, but nowhere else (like QuickTime). I would, however, be able to run color-managed workflows in Resolve.
The other way, which is how FCPX does it, is this:
--------------------
UMDCPFV*: ON
Display Setting: XDR Rec709/bt1886 reference mode, system gamma boost 1.22 (default)
Output Colorspace: rec709A
Resulting EOTF = 2.4
--------------------
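A quick sanity check on why the rec709A path lands on ~2.4, assuming the system gamma boost simply multiplies the effective gamma of the content the OS sees (my assumption, consistent with the 2.4 / 1.961 = 1.22 derivation above):
Code:
rec709a_effective_gamma = 1.961   # Apple 709 ("rec709A") transfer, effective power law
system_gamma_boost = 1.22         # reference mode default
print(rec709a_effective_gamma * system_gamma_boost)   # ~2.39, i.e. roughly gamma 2.4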
This obviously causes issues when running color managed, especially with proper external monitoring, as setting the output colorspace to 2.4 results in an EOTF of roughly 2.6 on the XDR.
So, long story short: what we need is an XDR-compatible mode, ideally one that triggers the different reference modes from inside Resolve.
This mode (in SDR) should always display the same as FCPX and QuickTime, so always "rec709A", as that ends up displaying correct Rec709 / 2.4 without being affected by my color management settings. Even for non-XDR displays it would be nice to separate the viewer colorspace from the output colorspace.
*UMDCPFV = Use Mac Display Color Profiles for Viewers