PeterMoretti wrote: Yes, if you remove OS, program and monitor profiles from the equation, which using an external calibrated monitor and a BOB does, then you eliminate all of those things. But it doesn't explain what was going on with the OP's setup, nor does it answer his question about why his monitor is showing such a huge difference between Rec 709 and sRGB.
Exactly. And it seems pretty clear from his setup that he's grading in the GUI. So, first the OP needs to be running SDI out to eliminate all those variables. Once he's got a signal on a reference monitor that he can trust, only then can he begin to troubleshoot what he's seeing on other screens. That's all I'm saying.
Peter Cave wrote: The BenQ monitor is expecting 16-235 video levels for the Rec 709 setting and 0-255 for the sRGB setting. These correspond to the Resolve 'Video' and 'Full' (or 'Data') output settings. Rec 709 renders from Resolve should always be set to Video levels; sRGB renders should be Data.
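For anyone unclear on what those two settings actually mean numerically, here's a minimal sketch of the 8-bit mapping between video (legal) range and full (data) range. The function names are mine for illustration, not anything from Resolve; the scaling itself is the standard 16-235 / 0-255 relationship. A mismatch in either direction is why you see lifted, washed-out blacks or clipped, crushed shadows.

```python
def full_to_video(code: int) -> int:
    """Map a full-range 8-bit value (0-255) into video/legal range (16-235)."""
    return round(16 + code * 219 / 255)

def video_to_full(code: int) -> int:
    """Map a video-range 8-bit value (16-235) back out to full range (0-255)."""
    return round((code - 16) * 255 / 219)

# Black and white points under each interpretation:
print(full_to_video(0), full_to_video(255))   # 16 235
print(video_to_full(16), video_to_full(235))  # 0 255

# If a display interprets video-range black (16) as full-range data,
# black gets shown as a dark gray -- the classic "washed out" look.
```

So if the monitor is set to the wrong expectation for the signal it's being fed, blacks sit at code 16 instead of 0 (milky) or everything below 16 and above 235 gets clipped (crushed).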
I doubt that's the issue. If you choose "Auto", virtually all video file formats exported out of Resolve will carry flags in the header of the wrapper that denote how the file's levels should be interpreted. Manually forcing "Video" or "Full" can screw that up. The colorists at MixingLight.com (who I trust on such things) recommend sticking with "Auto" unless you're rendering image sequences or intermediate files for other software or hardware that you know for sure can't or won't read the flag properly.