
When I use Calman or DisplayCal to create a 3D LUT, the image on my monitor has noticeably lower contrast than the same material in an exported output file.
Here's my hardware setup:
Physical output goes from a DeckLink Mini 4K via SDI to a Shogun Inferno that acts as a LUT box. From the Inferno the signal goes via HDMI to an LG 31MU97-B 4K monitor. The Shogun has no LUT loaded. The LG is set to its custom color mode, pre-adjusted for balanced RGB at 120 nits and 6500 K.
The DeckLink is set to output 4096x2160 DCI RGB 4:4:4:4, though the same results occur if it's set to YCbCr 4:2:2.
Resolve is set to Rec 709 with 4096x1716 timelines, 10-bit color, full data levels, and no LUTs applied. The color science setting seems not to matter for the patches, whether it's ACES or DaVinci.
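Since a data/video levels mismatch is the classic cause of a washed-out picture, here's the arithmetic I've been using to reason about it (just a rough sketch, assuming 10-bit codes with a 64-940 legal range; the function names are mine, not from any of the software above):

```
# Rough sketch: what legal-range (video) 10-bit codes look like if a later
# stage interprets them as full-range (data) levels, and vice versa.
# Assumes 10-bit, legal black = 64, legal white = 940, full range = 0-1023.

LEGAL_BLACK, LEGAL_WHITE = 64, 940
FULL_MAX = 1023

def legal_shown_as_full(code):
    """Normalized signal a full-range display produces from a legal-range code."""
    return code / FULL_MAX

def full_shown_as_legal(code):
    """Normalized signal a legal-range display produces from a full-range code."""
    return max(0.0, min(1.0, (code - LEGAL_BLACK) / (LEGAL_WHITE - LEGAL_BLACK)))

# Legal-range video on a display expecting full range: black is lifted to ~6%
# and white only reaches ~92% -> a flat, low-contrast picture.
print(legal_shown_as_full(LEGAL_BLACK))   # ~0.063
print(legal_shown_as_full(LEGAL_WHITE))   # ~0.919

# Full-range video on a display expecting legal range: shadows crush, highlights clip.
print(full_shown_as_legal(0))             # 0.0 (codes below 64 all clip to black)
print(full_shown_as_legal(1023))          # 1.0 (codes above 940 all clip to white)
```

The low-contrast symptom matches the first case, which is one reason I keep double-checking the levels settings at every stage of the chain.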
Both Calman and DisplayCal are set to generate 3D cube LUTs targeting Rec 709 / BT.1886, using Resolve as the patch generator. The resulting LUT is loaded into the Shogun, but the image looks the same if the Shogun is bypassed and the LUT is loaded into the 3D monitor LUT slot in Resolve's settings. The Calman and DisplayCal LUTs look identical on the LG. The image is significantly lower in contrast than an exported file played in VLC on a Dell P2715Q that was eye-matched to the LG before any LUT was applied.
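For reference, this is my understanding of the BT.1886 target both calibration packages are building the LUT against (a minimal sketch of the BT.1886 Annex 1 EOTF; the 120 nit white comes from my setup, but the 0.1 nit black level is only a placeholder, not a measured value):

```
# Minimal sketch of the ITU-R BT.1886 EOTF for a display calibrated to
# Lw = 120 nits (my target) and an assumed Lb = 0.1 nits (placeholder only).

GAMMA = 2.4

def bt1886_eotf(v, lw=120.0, lb=0.1):
    """Luminance (nits) for a normalized signal v in [0, 1] per BT.1886 Annex 1."""
    a = (lw ** (1 / GAMMA) - lb ** (1 / GAMMA)) ** GAMMA
    b = lb ** (1 / GAMMA) / (lw ** (1 / GAMMA) - lb ** (1 / GAMMA))
    return a * max(v + b, 0.0) ** GAMMA

def pure_power(v, lw=120.0, gamma=GAMMA):
    """Simple power-law display with a perfect zero black, for comparison."""
    return lw * v ** gamma

# Near black, BT.1886 with a real (non-zero) black level sits well above a
# pure 2.4 power curve, so shadows come up and the picture reads as flatter.
for v in (0.0, 0.05, 0.1, 0.2):
    print(v, round(bt1886_eotf(v), 3), round(pure_power(v), 3))
```

If the Dell/VLC reference behaves more like a pure power curve, that alone would explain some shadow lift, though it doesn't feel like it accounts for the whole difference I'm seeing.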
I've taken the Shogun out of the equation with the following test: the pre-LUT and LUT-ed images were fed from the DeckLink's HDMI output to a different HDMI input on the LG and compared to the SDI-to-HDMI path through the Shogun. The two paths looked identical.
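One more check on my list is to look at the generated .cube directly and see what it does to black and white, rather than judging it on the monitors. A quick sketch (assumes the usual Resolve-style .cube layout: a LUT_3D_SIZE header followed by one "R G B" line per node, red varying fastest, so the first data line is input black and the last is input white; the filename is hypothetical):

```
# Quick sanity check on a generated 3D LUT: where do input black and input
# white end up? Assumes a standard .cube file (LUT_3D_SIZE header, then
# one "R G B" float triple per node, red varying fastest).

def cube_endpoints(path):
    size = None
    rows = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            if line.upper().startswith("LUT_3D_SIZE"):
                size = int(line.split()[1])
                continue
            parts = line.split()
            if len(parts) == 3:
                try:
                    rows.append(tuple(float(p) for p in parts))
                except ValueError:
                    pass  # skip TITLE / other non-numeric header lines
    if size is None or len(rows) != size ** 3:
        raise ValueError("unexpected .cube layout")
    return rows[0], rows[-1]   # outputs for input (0,0,0) and (1,1,1)

black_out, white_out = cube_endpoints("calibration.cube")  # hypothetical filename
print("input black maps to:", black_out)   # well above 0 would mean lifted shadows
print("input white maps to:", white_out)   # well below 1 would mean dimmed whites
```

If the LUT itself lifts black or pulls down white by a large amount, that would point at the calibration target or the patch levels rather than at the Shogun or the monitor inputs.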
Does anyone have an idea what may be going on, or additional things to try?
David
Resolve Studio latest build
Windows 11 Pro
Decklink Mini Monitor 4k
Intel i9 7960x @ 4 GHz
Thermaltake Floe Riing 360 Water Cooler
Asus x299 Prime Deluxe
64GB 3333 Corsair Dominator ram
1 EVGA RTX 3090 XC3 Card
Areca Thunderbolt3 12 drive Raid