Brian Millbrook wrote: What would you rate the QuickTime MP4 codec or the H.264 codec for 4K rendering as far as quality goes? I'm trying to get as close to 4K ProRes as possible, but I'm on PC at the moment.
Ugh. Terrible - MP4 or H.264 will destroy your footage. And whilst H.264 *can* be used for archival purposes, Resolve lacks the parameters necessary to export H.264 to that kind of quality.
In your position (and you're not gonna like this) I'd export either as an image sequence or as an uncompressed AVI and then compress it to DNxHR using QuickTime or some other package. It's the only current way (other than MXF/DNxHR) to get 4K out of Resolve without seriously compromising the quality. Note though that you won't be able to get QuickTime DNxHR back into Resolve (yet).
Personally, I only work with uncompressed 10-bit YUV AVIs when exporting out of Resolve now, but for HD work you can use 10-bit DNxHD.
Johan Cramer wrote: I don't think this is an issue with the codec as such, but with gamma shifts and different interpretations of colour spaces in Resolve's internal preview vs. external video playback software. Blackmagic clearly says that you need a professional grading monitor in order to judge colour.
Resolve's internal preview just shows a full-range image (0-1023), which more often than not (but not always) means studio-swing gets mapped to full-swing. It's more predictable than most software I've used in how it treats different colour spaces. Where you have to be careful is how you export footage - Resolve's 'auto' setting doesn't always get it right, so it's important to know how your destination app (and codec) handles swing. ProRes 422, for instance, is *always* YUV and should be interpreted at 'video' levels - some apps will bring it in as 'data', though, which is where gamma/colour shifts start to happen. A better grading monitor will get you better grades, but if you don't know how your codecs behave, all that effort is pointless.
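To make the studio-swing vs full-swing distinction concrete, here's a minimal sketch of the 10-bit mapping, assuming the usual reference points (studio black at code 64, studio white at code 940, full range 0-1023). The function names are mine, purely for illustration - this isn't Resolve's actual pipeline.

```python
# Illustrative 10-bit levels mapping: studio-swing luma spans 64-940,
# full-swing spans 0-1023. A decoder that picks the wrong interpretation
# applies the wrong one of these mappings (or none at all).

def studio_to_full(code: int) -> int:
    """Expand 10-bit studio-swing luma (64-940) to full-swing (0-1023)."""
    return round((code - 64) * 1023 / (940 - 64))

def full_to_studio(code: int) -> int:
    """Compress 10-bit full-swing luma (0-1023) to studio-swing (64-940)."""
    return round(code * (940 - 64) / 1023 + 64)

# Reference black and white land exactly on the opposite range's endpoints:
print(studio_to_full(64), studio_to_full(940))   # 0 1023
print(full_to_studio(0), full_to_studio(1023))   # 64 940
```

Run either mapping when the footage didn't need it (or skip it when it did) and you get exactly the lifted or crushed blacks described above.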
Added to this, Mac-originated footage (particularly ProRes) may come in at the wrong gamma on PCs (1.8 vs 2.2 - Fusion does this, Resolve doesn't) and has to be compensated for before doing any further work on the footage.
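The 1.8-vs-2.2 compensation itself is just a re-encode through a power curve. A rough sketch, assuming pure power-law gammas (real pipelines may use proper transfer functions, and the function name is mine):

```python
# Illustrative gamma compensation for the classic Mac (1.8) vs PC (2.2)
# display-gamma mismatch, on a normalized 0.0-1.0 pixel value.

def regamma(value: float, src_gamma: float = 1.8, dst_gamma: float = 2.2) -> float:
    """Linearize with the source gamma, then re-encode with the target gamma."""
    linear = value ** src_gamma
    return linear ** (1.0 / dst_gamma)

# Black and white are unaffected; everything in between shifts,
# which is exactly why a mismatch is visible in the mids.
print(regamma(0.0), regamma(1.0))  # 0.0 1.0
print(round(regamma(0.5), 3))
```

The net exponent here is 1.8/2.2 ≈ 0.82, so mid-tones come out brighter - the characteristic washed-out look of 1.8-encoded footage viewed on a 2.2 display.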
In the screengrab below you can see the original TIFF (RGB) images on the left and a Mac-created ProRes HQ version on the right - Fusion's correctly remapped from YUV to RGB but hasn't caught the gamma difference which is why the two look different. Resolve handles the gamma correctly and both sequences look identical when viewing them in Resolve.
Here's another example - on the left is the same TIFF RGB sequence, on the right is a DNxHD sequence rendered out of Resolve at 'video' levels. Fusion expects the DNxHD to be at data levels (presumably that's what the Avid codec is telling QuickTime's decoder), and consequently the footage looks wrong. It's easily fixed with an Auto Gain, but that doesn't make it any less wrong.
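What a levels mismatch actually does to the code values is easy to show. A sketch, assuming the standard 10-bit ranges (function name is mine, for illustration only):

```python
# Illustrative levels mismatch in 10-bit: a decoder that believes footage
# is at video levels (64-940) expands it to 0-1023 and clamps. Applied to
# footage that was really at data levels, every shadow code below 64 is
# crushed to 0 and every highlight above 940 clips to 1023.

def expand_video_to_data(code: int) -> int:
    """Expand assuming video levels, clamping to the legal 10-bit range."""
    return max(0, min(1023, round((code - 64) * 1023 / (940 - 64))))

# Data-levels footage wrongly expanded: shadows crush, highlights clip.
print([expand_video_to_data(c) for c in (0, 32, 64, 940, 1023)])
# -> [0, 0, 0, 1023, 1023]
```

The opposite mistake (video-levels footage passed through untouched) gives the washed-out image instead: black sits at code 64 rather than 0.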
Once again, Resolve handles the footage transparently and both clips look identical when viewed in Resolve.
The moral of this story being: just because it looks right in Resolve doesn't mean your rendered footage will look right when it gets to wherever it's going. External H.264/MP4 encoding in particular is a minefield, since some encoders expect footage to be input in a certain way (studio/full swing) - Sony Vegas *must* have footage in studio-swing otherwise it encodes incorrectly, while ffmpeg/x264 examine the video stream and internally convert RGB to YUV 4:2:0.
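For reference, the RGB-to-YUV step an encoder performs internally looks roughly like this - a sketch using the BT.709 matrix and studio-swing 8-bit quantization (the 4:2:0 chroma subsampling that follows is omitted, and the function name is mine):

```python
# Illustrative R'G'B' -> Y'CbCr conversion per BT.709, quantized to
# studio-swing 8-bit (Y' in 16-235, Cb/Cr in 16-240 centred on 128).

def rgb_to_ycbcr709(r: float, g: float, b: float):
    """Convert normalized (0.0-1.0) R'G'B' to 8-bit studio-swing Y'CbCr."""
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b   # BT.709 luma weights
    cb = (b - y) / 1.8556                       # = 2 * (1 - 0.0722)
    cr = (r - y) / 1.5748                       # = 2 * (1 - 0.2126)
    return (round(16 + 219 * y), round(128 + 224 * cb), round(128 + 224 * cr))

print(rgb_to_ycbcr709(0.0, 0.0, 0.0))  # black -> (16, 128, 128)
print(rgb_to_ycbcr709(1.0, 1.0, 1.0))  # white -> (235, 128, 128)
```

Note that reference black lands on code 16, not 0 - which is why a player that then treats the stream as full-range shows lifted blacks.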
And even assuming your footage is correctly encoded, your final video may look different when played back on a Mac or a PC, since QuickTime wants/expects gamma information to be tagged in the .mov.
One last example:
The clip on the left was rendered as an MP4 out of Vegas at studio levels and matches the source footage. The bottom clip was rendered at data (or 'computer', as Vegas calls it) levels and has crushed blacks. On the right is the (correct) studio version played back on a Mac - notice that the gamma's out.