I have a timeline that's primarily full of 2k DPX files, scanned from film.
My Video Monitoring data levels for the timeline are set to Video. The Timeline Color Space is Rec.709 Gamma 2.4, and I calibrated my projector to Rec.709 Gamma 2.4.
After grading, I made a 2k DCI flat DCP (interop, not SMPTE) from this timeline using the easyDCP plugin. I left the Data Levels for the export set to "Auto", expecting that Resolve and EasyDCP would handle any scaling and colorspace conversions necessary.
I copied the resulting DCP to an ext2-formatted drive and today ingested it into a GDC server connected to a Christie projector. It played just fine -- but the black level appeared elevated and the image washed out. The projectionist switched the Christie from what he said was the cinema-standard "Unity RGB" setting to "RGB 10-bit 64-940", and suddenly everything looked perfect: the black levels were back to normal, and the picture was once again saturated and not overly bright. This suggests to me that the DCP I've made has video levels rather than full-range data levels, since 64-940 is the 10-bit equivalent of 16-235.
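For reference, here's my understanding of the two ranges as plain scaling arithmetic -- a sketch using the standard 10-bit numbers (black at 64, white at 940 for legal/"video" range), not anything I've verified against easyDCP's or the server's actual behavior:

```python
# Illustrative sketch of 10-bit legal ("video", 64-940) vs. full (0-1023)
# range mappings -- just the standard scaling arithmetic, nothing DCP-specific.

def full_to_legal(code: int) -> int:
    """Squeeze a full-range 10-bit code value (0-1023) into legal range (64-940)."""
    return round(64 + code * (940 - 64) / 1023)

def legal_to_full(code: int) -> int:
    """Expand a legal-range value to full range; values at or below 64 clip to black."""
    return min(1023, max(0, round((code - 64) * 1023 / (940 - 64))))

# Full-range material run through a legal->full expansion: every shadow code
# value from 0 up to 64 collapses to 0 -- the "crushed" look I see in Resolve
# when I set the clip attributes to Video levels.
# Legal-range material displayed without expansion ("Unity"): black sits at
# code 64 instead of 0 -- the elevated, washed-out look on the Christie.
```

That arithmetic at least matches what I saw in both places, which is why the two observations seem contradictory to me.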
But when I import that DCP into Resolve in the Media tab and have a look at it, it looks fine. If I set its clip attributes to video levels, it looks very crushed, with all the shadows clipping completely to black. This can be seen in both Resolve's software scopes and on my external scope. This suggests to me that the DCP is full range, not video levels.
If the DCP is full range, why did it not look right on the Christie until that projector was set to the equivalent of video levels? And if it's video levels, why does it not show up as such when brought into Resolve or any of the freely available DCP players? (It looks fine in DCP-o-Matic Player and seems to be OK in the easyDCP Player trial.)
I'm left scratching my head on this one, and my ability to figure out the solution is limited by my access to a DCP server and projector.
I could send out the DCP for projection with a note attached saying that Christie projectors should be set to the "RGB 10-bit 64-940" color space setting. But I don't know the equivalent setting name for all the other projectors on which this film might be shown (it's a fairly small independent film that's about to do a series of press screenings). The projectionist at today's test said they typically have that projector set to "Unity RGB", but sometimes have to switch it to "RGB 10-bit 64-940", even for released studio films. He says they generally watch a couple of scenes and figure out the right setting. Is that really how this process works, or is there a standard way for me to indicate how this file should be projected? And of course, if I'm to indicate something like "video levels" or "64-940", is that really what this file is? I can't seem to verify that with any of the software I've found.
I've done a test export of a one-minute DCP with Export > Video > Advanced Settings > Data Levels explicitly set to "Full" rather than "Auto". But as far as I can tell from bringing the file into Resolve and into DCP player software, it has the same levels as the initial DCP I made with the "Auto" setting. Maybe setting this explicitly tells easyDCP to flip some flag in the MXF file so that it will be interpreted as full range by the DCP server?
Confusion reigns.
On the plus side, the image and sound were great, except for that whole "what are this file's video levels and how do I get projectors to always display it properly" issue.
Thanks for any help or information.
Resolve 15.0.08.065 (beta), macOS 10.13.5
edited to mention "interop"
Last edited by zachnfine on Tue Jul 17, 2018 1:13 am, edited 1 time in total.