Bug report at the end.
Inspired by Bryan's great work mapping the SSIM index of YouTube videos made from different source encodes, I thought I'd take a look at the recent additions to the h.264 encoders in Resolve v15.
Since v15.2, the Studio version of Resolve for Windows and Linux can use hardware-accelerated Nvidia and Intel Quick Sync encoders in addition to Resolve's native software encoder. In older versions of Resolve (like v12) the included encoder wasn't very good and I basically never used it, but even the software encoder seems to have improved a great deal.
I used the crowd_run_2160p50.y4m UHD clip available at https://media.xiph.org/video/derf/ as a reference. Since Resolve can't read that uncompressed format, I encoded it to lossless h.264 using -qp 0 in ffmpeg and verified that the SSIM index was 1, i.e. lossless. Using -crf 0 instead of -qp 0 gives an SSIM index slightly below 1 for some reason.
The original file is 8-bit, 4:2:0 according to VLC.
Encode the lossless reference file:

Code:
ffmpeg -i crowd_run_2160p50.y4m -c:v libx264 -qp 0 crowd_run_2160p50_reference.mp4

Verify the SSIM index against the original:

Code:
ffmpeg -i crowd_run_2160p50_reference.mp4 -i crowd_run_2160p50.y4m -lavfi ssim -f null -
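The ssim filter prints its result as a log line on stderr. A small sketch of pulling out the overall "All:" value from such a line — the sample line and its values below are made up for illustration, and the exact log format may vary between ffmpeg versions:

```shell
# Sample SSIM summary line in the style printed by ffmpeg's ssim filter
# (values are made up, not from my runs)
line='[Parsed_ssim_0 @ 0x55] SSIM Y:0.981000 (17.212294) U:0.990000 (20.000000) V:0.991000 (20.457575) All:0.984000 (17.958800)'

# Extract the overall SSIM index (the number right after "All:")
ssim=$(printf '%s\n' "$line" | sed -n 's/.*All:\([0-9.]*\).*/\1/p')
echo "$ssim"
# → 0.984000
```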
The Intel Quick Sync file was encoded on an Intel NUC with an Intel i5-4250U CPU; it's a small wonder it can even run Resolve and render UHD files with the built-in GPU. The Nvidia file was encoded on a workstation with a GeForce GTX Titan X (Maxwell architecture).
Results
Apologies for the Swedish number format with commas; just read them as decimal points.
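If you want to paste the numbers into tooling that expects decimal points, a one-liner does the conversion (a trivial sketch, using one of the values from below):

```shell
# Convert Swedish decimal commas to decimal points
printf '0,907721\n' | tr ',' '.'
# → 0.907721
```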
Note that the bit rate is in megabytes per second. The average bit rate for the "Best" quality setting using the Intel Quick Sync encoder on this clip is insanely high, though the SSIM values are high too. Compare these to the 0.88 that Bryan found when downloading the highest-quality encodes from YouTube.
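For reference, average bit rate in megabytes per second is just file size divided by duration: crowd_run_2160p50 is 500 frames at 50 fps, i.e. 10 seconds. A sketch of the arithmetic, using a made-up file size (not one of my actual encodes) and assuming 1 MB = 1,000,000 bytes:

```shell
# Hypothetical encode size in bytes; the clip runs 10 s (500 frames / 50 fps)
size_bytes=178200000
duration_s=10

# Average bit rate in megabytes per second
awk -v s="$size_bytes" -v d="$duration_s" 'BEGIN { printf "%.2f\n", s / d / 1000000 }'
# → 17.82
```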
What about macOS?
I also rendered h.264 on a 2018 Mac Mini using the "Automatic" quality setting. For some reason the macOS version of DaVinci Resolve Studio v15.2.3 doesn't have the Best, High, Medium, Low and Least settings that are available in Windows. The SSIM index came out at 0,907721 and the average bit rate was 17,82 MB/s. That's a lower average bit rate than the "Least" setting in the Windows version's native encoder, and of course the image quality suffers.
I haven't tested setting a specific max bit rate.
Nvidia h.265
I've learned that SSIM isn't necessarily a good measure of image quality for encoders that are perceptually optimized. Perhaps that explains the lower SSIM values for the Nvidia h.265 encoder. I assume this might also differ between GPU generations.
Encoder Quality setting bugs/incorrect nomenclature
As you can see in the chart above, something is wrong. For Nvidia h.265, the Best, High, Medium and Low quality settings all produce identical output, and the same goes for Nvidia h.264 Best and High. It could be that these settings are meant for constant quality, where the only difference is encoding time, but then the behavior isn't consistent.
For VP9 encodes (only available in the macOS version of Resolve), the "Best" nomenclature seems incorrect, since all the "Quality" settings produce practically the same quality. The encoder is locked in constant quality mode, and the only differences are how long the encode takes and, to a lesser extent, the file size. Best came out at 0,972852 (103,56 MB/s) and Least at 0,970022 (106,91 MB/s).
Last edited by roger.magnusson on Wed Jan 30, 2019 4:27 pm, edited 1 time in total.