Best CPU for new build

Culnara

  • Posts: 1
  • Joined: Sun Apr 07, 2024 5:22 am
  • Real Name: Stuart Kenneth Burnett

Best CPU for new build

Posted: Sun Apr 07, 2024 5:39 am

Hi folks,

I'm basically looking for some advice. I'm currently in the middle of buying parts for a new PC build and I'm getting confused (since I only have a vague idea what I'm doing) about which CPU to get.

I've had DaVinci Resolve Studio for well over a year now, and my old rig with a GTX 1080 Ti has recently started to struggle a bit.

The new build will have an ASUS ROG Strix GeForce RTX 4090 OC Edition GPU, and I have also ordered an Intel Core i9-14900KF (Raptor Lake Refresh) CPU.

This is where the issue lies: I've spoken with several members of the DaVinci Resolve Facebook group, and it's been recommended that I ditch the i9-14900KF and get the i9-14900K instead.

Could any of you kind folks give me some advice, please?

TIA

Cul

Jim Simon

  • Posts: 31017
  • Joined: Fri Dec 23, 2016 1:47 am

Re: Best CPU for new build

Posted: Mon Apr 08, 2024 8:50 pm

K doesn't matter to me, as I don't overclock. F would give you some additional decoding options, so that might be nice.
My Biases:

You NEED training.
You NEED a desktop.
You NEED a calibrated (non-computer) display.

RCModelReviews

  • Posts: 1242
  • Joined: Wed Jun 06, 2018 1:39 am
  • Real Name: Bruce Simpson

Re: Best CPU for new build

Posted: Mon Apr 08, 2024 11:10 pm

Jim Simon wrote:K doesn't matter to me, as I don't overclock. F would give you some additional decoding options, so that might be nice.

No, the 'F' version has no integrated GPU, so you lose the nice 10-bit H.265 decoding the iGPU would offer. Do not get the F version if you want to work with 10-bit long-GOP file formats.
Resolve 18.1 Studio, Fusion 9 Studio
CPU: i7 8700, OS: Windows 10, 32GB RAM, GPU: RTX 3060
I'm a refugee from Sony Vegas, slicing video for my YouTube channels.

4EvrYng

  • Posts: 676
  • Joined: Sat Feb 19, 2022 12:45 am
  • Warnings: 1
  • Real Name: Alexander Dali

Re: Best CPU for new build

Posted: Mon Apr 08, 2024 11:24 pm

RCModelReviews wrote:
Jim Simon wrote:K doesn't matter to me, as I don't overclock. F would give you some additional decoding options, so that might be nice.

No, the 'F' version has no integrated GPU, so you lose the nice 10-bit H.265 decoding the iGPU would offer. Do not get the F version if you want to work with 10-bit long-GOP file formats.

+1. If you get a version that does not have an iGPU, you will regret it sooner or later.

Marc Wielage

  • Posts: 11237
  • Joined: Fri Oct 18, 2013 2:46 am
  • Location: Hollywood, USA

Re: Best CPU for new build

Posted: Tue Apr 09, 2024 8:45 am

Just watched a great new video on this very subject:

Certified DaVinci Resolve Color Trainer • AdvancedColorTraining.com

PeterDrage

  • Posts: 42
  • Joined: Sun Jan 01, 2023 6:25 pm
  • Location: UK
  • Real Name: Peter Drage

Re: Best CPU for new build

Posted: Tue Apr 09, 2024 11:16 am

Jim Simon wrote:K doesn't matter to me, as I don't overclock. F would give you some additional decoding options, so that might be nice.


+1. Do not buy an F CPU; as others have said, it does not have an iGPU and you will regret it. Intel's media engine is far better than Nvidia's or AMD's for hardware decode.

I run a high-core-count CPU with a 4080 as my main GPU, plus a cheap Intel Arc A310 single-slot GPU that I use only for its media engine, purely for hardware decode.

It works really well and has transformed my system when working with H.265 10-bit long-GOP footage.
Resolve Studio 18.6.6 | Micro Panel | Speed Editor | DeckLink Monitor 4K

10980XE | 128GB RAM | RTX4080 | ARC A310 | 4x2TB NVMe | 72TB RAID 6 HDD | Win11 23H2

14” MacBook Pro | M3 Max 16 Core | 40 Core GPU | 64GB RAM | 1TB SSD | MacOS 14.4.1

Jim Simon

  • Posts: 31017
  • Joined: Fri Dec 23, 2016 1:47 am

Re: Best CPU for new build

Posted: Tue Apr 09, 2024 2:32 pm

RCModelReviews wrote:No, the 'F' version has no internal GPU
Oops, I got that backwards.
My Biases:

You NEED training.
You NEED a desktop.
You NEED a calibrated (non-computer) display.

4EvrYng

  • Posts: 676
  • Joined: Sat Feb 19, 2022 12:45 am
  • Warnings: 1
  • Real Name: Alexander Dali

Re: Best CPU for new build

Posted: Tue Apr 09, 2024 6:30 pm

PeterDrage wrote:I run a high core count CPU and have a 4080 for main gpu and then purchased a cheap Intel Arc A310 Single Slot GPU that I use just for its media engine only for Hardware based Decode.

Works really well and has transformed my system when using H.265 10Bit LongGOP.

As my CPU doesn't have an iGPU, I'm very interested in hearing more about this. Which card did you get, please? Are you experiencing any driver issues or conflicts due to Arc?

PeterDrage

  • Posts: 42
  • Joined: Sun Jan 01, 2023 6:25 pm
  • Location: UK
  • Real Name: Peter Drage

Re: Best CPU for new build

Posted: Tue Apr 09, 2024 6:59 pm

4EvrYng wrote:
PeterDrage wrote:I run a high core count CPU and have a 4080 for main gpu and then purchased a cheap Intel Arc A310 Single Slot GPU that I use just for its media engine only for Hardware based Decode.

Works really well and has transformed my system when using H.265 10Bit LongGOP.

As my CPU doesn't have iGPU I'm very interested in hearing more about this. Which card did you get, please? Are you experiencing any driver issues / conflicts that are due to Arc?


Hi

I purchased the Sparkle A310 ECO, a tiny single-slot GPU that draws its power from the PCIe slot alone. I am using the latest Intel drivers as of this message. I have used Arc Control to limit the TDP to 20 W, which is enough to run the media engine since the GPU cores are not used. The fans never spin up and the temperature sits at around 50-55 C even under load; the case fans provide enough airflow.

I have an 18-core (36-thread) X299 system with the card in an x8 PCIe slot. DaVinci is set to use CUDA for effects and the Intel Arc for hardware decode. This single card has probably extended the life of my system by 2-3 years, or at least until GPUs move to PCIe 5.

I have been running it for several months and have done 5 projects where all the source footage was H.265 long-GOP. Previously I would have had to create proxies, but now I can happily scrub a timeline with a 4-camera multicam at 4K 25p and it does not drop frames.

I have even found the AV1 encode and decode to be comparable to the 4080's.
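A back-of-envelope check shows why the little media engine copes with this load (a sketch only; the ~100 Mb/s per-stream figure is an assumed typical camera H.265 bitrate, not a measurement from this system):

```python
# Back-of-envelope: aggregate bitrate of a 4-camera 4K25 H.265 multicam.
# Assumption: ~100 Mb/s per stream (typical camera long-GOP rate; adjust
# for your own footage).
streams = 4
mbps_per_stream = 100          # Mb/s, assumed
total_mbps = streams * mbps_per_stream
total_mbytes = total_mbps / 8  # MB/s of compressed data to move and decode

print(f"Aggregate: {total_mbps} Mb/s = {total_mbytes:.0f} MB/s")
# The compressed data rate is tiny; the hard part is the HEVC decode
# itself, which is exactly what the Arc's media engine offloads.
```

So the bus and storage traffic is trivial; the bottleneck without hardware decode is CPU-side HEVC decoding.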

Regards

Peter
Resolve Studio 18.6.6 | Micro Panel | Speed Editor | DeckLink Monitor 4K

10980XE | 128GB RAM | RTX4080 | ARC A310 | 4x2TB NVMe | 72TB RAID 6 HDD | Win11 23H2

14” MacBook Pro | M3 Max 16 Core | 40 Core GPU | 64GB RAM | 1TB SSD | MacOS 14.4.1

4EvrYng

  • Posts: 676
  • Joined: Sat Feb 19, 2022 12:45 am
  • Warnings: 1
  • Real Name: Alexander Dali

Re: Best CPU for new build

Posted: Wed Apr 10, 2024 6:08 am

PeterDrage wrote:I purchased the Sparkle A310 ECO …

I have an 36 core X299 system with the card in an x8 PCI Slot. … This single card has probably extended the life of my system by 2 -3 years or at least until GPUs move to PCIE 5.

I have even found the AV1 Encode and Decode to be comparable to the 4080.

Thank you for the thorough reply! My system is also X299 + Cascade Lake-X based (EVGA X299 Dark + 10920X), but with an Nvidia 2080 Super, so my motivation has likewise been to extend its life by cheaply adding what I'm missing (H.265 10-bit 4:2:2 and AV1 support).

One thing I am not sure about is which model to go for: which models would work in the slots I have left, and how much memory I would need.

The slot I have free is x16 physically, so any model will fit in it; the question is whether it will work, because that slot has only 8 lanes. Intel is clear that the A310 and A380 use 8 lanes, and thus should work, but their phrasing "_UP TO_ PCIe 4.0 x16" for the A580/750/770 is throwing me off. Do you happen to know whether the A580/750/770 will work in an x16 slot that has 8 lanes, or whether the slot must have 16 lanes for them? That will determine the maximum memory I can get.

When it comes to memory, what is the highest utilization you have seen when working with all of its features?

Last but not least, I have seen warnings that Arc requires ReBAR to run decently. My motherboard does have ReBAR support, but I don't know how good it is; it was one of the last things they added before they stopped issuing BIOS updates for it. This person https://linustechtips.com/topic/1558854 ... t=16324848 claims to be using an A770 + X299 Dark with ReBAR on, but he is the only one I could find. What has been your experience, please?

PeterDrage

  • Posts: 42
  • Joined: Sun Jan 01, 2023 6:25 pm
  • Location: UK
  • Real Name: Peter Drage

Re: Best CPU for new build

Posted: Wed Apr 10, 2024 11:45 am

Hi

That is a lot of questions so here goes:-

Option 1 - Cheapest and Simplest (What I did at Home)

Go with the Sparkle A310 ECO single slot; no cabling required. It is a native x8-wired PCIe 4.0 device, but it works quite happily in an x8-wired PCIe 3.0 slot, as long as the lanes come from the CPU and not the chipset, which is how I am running it. The reduction in bandwidth does not matter, as the system will not be using it as a GPU; likewise, the amount of memory makes no real difference, because the only part of the die DaVinci will use is the media engine for encode/decode.
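For anyone weighing the x8-slot question, the raw link numbers support this (a sketch using nominal PCIe per-lane rates after line encoding; real-world throughput is somewhat lower):

```python
# Nominal PCIe bandwidth per lane (GB/s), after 128b/130b encoding
# (used by Gen3 and Gen4).
per_lane_gbs = {3: 8e9 * (128 / 130) / 8 / 1e9,    # ~0.985 GB/s per lane
                4: 16e9 * (128 / 130) / 8 / 1e9}   # ~1.969 GB/s per lane

gen3_x8 = per_lane_gbs[3] * 8    # what the A310 gets in a Gen3 x8 slot
gen4_x16 = per_lane_gbs[4] * 16  # what a full Gen4 x16 slot offers

print(f"Gen3 x8 : {gen3_x8:.1f} GB/s")
print(f"Gen4 x16: {gen4_x16:.1f} GB/s")
# Compressed 4K video streams are well under 1 GB/s, so even a Gen3 x8
# link is nowhere near a bottleneck for decode-only use.
```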

I use the 4080 for AV1 encode and CUDA/RT for GPU effects and AI, though AV1 encode also works fine on the A310. I have no monitor plugged into the A310, and its TDP is wound right down to 20-22 W.

Option 2 - Replace the 2080 Super with an Intel A770 16GB and sell the 2080 Super

This would give you all the benefits of the A310 (AV1 and H.265 10-bit long-GOP) in a single GPU. I run an A770 16GB at work, which replaced my Quadro RTX 5000 16GB (essentially a 2080 with more memory and ECC) when it failed. I saw no meaningful difference in performance for my workloads in DaVinci. Puget Systems have some great GPU comparisons, updated recently to show the gains in Intel Arc performance. If you game, I would probably stick with the 2080 Super and go with Option 1.

ReBAR

Yes, Intel Arc requires ReBAR to be performant. As long as your motherboard/BIOS supports it as an option, you should be all good. I have never tried running with it off; according to the reviews it still works, just with poor performance, but those are talking about gaming and GPU rendering.

ReBAR can be a benefit or a hindrance in games depending on the title, but I have not seen or heard of DaVinci having any issue with it.

Experience with Intel ARC

I had it delivered on day one, and things were not great; I moved it to a test system after about a week. However, Intel, to their credit, have massively improved the quality of the drivers, which were always the issue rather than the hardware. Since the middle of 2023 I have been running the A770 in my work system, where I do some occasional editing (the majority of my editing is done at home on my personal system), and it has been rock solid. The key thing is to keep the drivers up to date, as every release improves.

I have looked at both Intel and AMD consumer platforms, but there are just not the performance gains for me to move. My 10980XE has 18C/36T all running at 4.8GHz, and whilst the new platforms run at 6GHz, they have fewer cores and far fewer PCIe lanes. I have used all 48 CPU PCIe lanes for dedicated, redundant NVMe storage for boot, edit, data, and cache volumes. X299 is an undervalued and misunderstood platform, and the new HEDT platforms from Intel and AMD are way over my budget.

One of the best tweaks I made was to the file system cluster size of the volumes, optimising it for video storage. From my internal 6 x 20TB HDD archive array I am now getting 1 GB/s sustained read and write. It is actually fast enough to edit multicam ProRes 422 and BRAW from, if I need to go back and re-tweak an archived project.
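On paper the numbers back that up (a sketch; the ~492 Mb/s figure is Apple's approximate target rate for ProRes 422 at UHD 25p, and actual files vary with content):

```python
# Can a 1 GB/s array feed a 4-camera UHD 25p ProRes 422 multicam?
# Assumption: ~492 Mb/s per stream (Apple's approximate target rate for
# ProRes 422 at 3840x2160/25p; real files vary).
streams = 4
mbps_per_stream = 492
demand_mb_s = streams * mbps_per_stream / 8   # MB/s the timeline needs
array_mb_s = 1000                             # measured sustained throughput

print(f"Demand: {demand_mb_s:.0f} MB/s vs array: {array_mb_s} MB/s")
assert demand_mb_s < array_mb_s  # comfortable headroom for 4 streams
```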

Hope that helps.

Peter
Resolve Studio 18.6.6 | Micro Panel | Speed Editor | DeckLink Monitor 4K

10980XE | 128GB RAM | RTX4080 | ARC A310 | 4x2TB NVMe | 72TB RAID 6 HDD | Win11 23H2

14” MacBook Pro | M3 Max 16 Core | 40 Core GPU | 64GB RAM | 1TB SSD | MacOS 14.4.1

Klaus Schuberth

  • Posts: 8
  • Joined: Mon May 26, 2014 6:55 am

Re: Best CPU for new build

Posted: Thu Apr 11, 2024 12:55 pm

Hello,

I just built a new workstation for DaVinci Resolve Studio with:

-Intel i9-14900K
-Asus ProArt Z790-CREATOR WiFi-Mainboard
-64 GB G.Skill Ripjaws 2x32 GB DDR5 Ram 6000 MHz
-NVidia Gaming GeForce RTX 4070 Ti
-2xSamsung 980 PRO NVMe M.2 SSD
-BM Intensity Pro 4K
-BM Speed Editor
-BM DaVinci Resolve Micro Panel

and DaVinci Resolve Studio doesn't start with both the Intel processor GPU (UHD 770) and the Nvidia GPU active.
It doesn't even show the DR start screen. If I disable the Intel GPU, DaVinci starts normally. All GPUs are on the latest drivers. I read here in the forum that it is not possible to have both GPUs active. Can this be?

This is the ResolveDebug log from when DR doesn't start because the Intel GPU is active:

[0x00006d18] | Main | INFO | 2024-04-09 19:51:03,985 | Running DaVinci Resolve Studio v18.6.6.0007 (Windows/MSVC x86_64)
[0x00006d18] | Main | INFO | 2024-04-09 19:51:03,985 | BMD_BUILD_UUID 38bc271b-9970-455d-9b0c-dbd7706f3c71
[0x00006d18] | Main | INFO | 2024-04-09 19:51:03,985 | BMD_GIT_COMMIT 3f827ef0d730e66d9a676870f5237d1a0008c1ec
[0x00006d18] | GPUDetect | INFO | 2024-04-09 19:51:03,986 | Starting GPUDetect 1.2_5-a6

Any ideas?
Offline

Nick2021

  • Posts: 783
  • Joined: Thu May 13, 2021 3:19 am
  • Real Name: Nick Zentena

Re: Best CPU for new build

PostThu Apr 11, 2024 2:55 pm

No there shouldn't be a problem running both the IGPU and the GPU. You just don't want a monitor plugged into the IGPU.
Offline

Klaus Schuberth

  • Posts: 8
  • Joined: Mon May 26, 2014 6:55 am

Re: Best CPU for new build

PostThu Apr 11, 2024 4:37 pm

Thanks Nick, but no sucess.

I unplugged the monitor from the iGPU, but Davinci still doesnt start....
Offline

Klaus Schuberth

  • Posts: 8
  • Joined: Mon May 26, 2014 6:55 am

Re: Best CPU for new build

PostThu Apr 11, 2024 4:40 pm

This is from the relsove_graphics_log:

[Info] [GInstance.Create] success
Platform Windows
Mac OS Version unset
AMD Driver Version unset
Intel Driver Version unset
NVIDIA Driver Version 551.86
CPU Core Count 32
Physical Memory (MiB) 65180
[Info] [GInstance.Create] success, backend=OpenGL
[Info] [GLContext.Create] success: Name="DeviceManager", Handle=0000000000010005, ShareContext="GlobalShare", ThreadID=66F0
Requested: 4.5 Core, R(-1)G(-1)B(-1)A(-1), D(-1)S(-1), SwapBehaviour=platformDefault SwapMethod=default
Chosen: 4.5 Core, RGBA8_UNorm, D24_UNorm_S8_UInt, sRGB, SwapBehaviour=double SwapMethod=default
Surface: 4.5 Core, RGBA8_UNorm, D24_UNorm_S8_UInt, sRGB, SwapBehaviour=double SwapMethod=default
Screen: \\.\DISPLAY2, 1920x1080, DPR=1.0, Refresh~59.00, Geom=(0,0,1920,1080), Primary=yes
[Info] skipped loading of 4.6 entry points
[Info] Fusion::Graphics::GDevice::SetupShaderEnvironment shader version set to 4.5
[Info] [GLibrary.Create] success
GPUID gpu:3ba1d2cb.5150080d
Name GeForce RTX 4070 Ti
Vendor NVIDIA
Type Discrete
Physical Memory (MiB) 12281
IsMainDisplayGPU yes
Backend OpenGL
Backend Version 4.5
GLSL Version 4.5
[Info] Looking for shaders ...
[Info] Shaders found.
[Info] [GLContext.Create] success: Name="ResourceManager3D", Handle=0000000000010006, ShareContext="GlobalShare", ThreadID=66F0
Requested: 4.5 Core, R(-1)G(-1)B(-1)A(-1), D(-1)S(-1), SwapBehaviour=platformDefault SwapMethod=default
Chosen: 4.5 Core, RGBA8_UNorm, D24_UNorm_S8_UInt, sRGB, SwapBehaviour=double SwapMethod=default
Surface: 4.5 Core, RGBA8_UNorm, D24_UNorm_S8_UInt, sRGB, SwapBehaviour=double SwapMethod=default
Screen: \\.\DISPLAY2, 1920x1080, DPR=1.0, Refresh~59.00, Geom=(0,0,1920,1080), Primary=yes
[Info] [GLContext.Create] success: Name="GraphicsThread0", Handle=0000000000010007, ShareContext="GlobalShare", ThreadID=66F0
Requested: 4.5 Core, R(-1)G(-1)B(-1)A(-1), D(-1)S(-1), SwapBehaviour=platformDefault SwapMethod=default
Chosen: 4.5 Core, RGBA8_UNorm, D24_UNorm_S8_UInt, sRGB, SwapBehaviour=double SwapMethod=default
Surface: 4.5 Core, RGBA8_UNorm, D24_UNorm_S8_UInt, sRGB, SwapBehaviour=double SwapMethod=default
Screen: \\.\DISPLAY2, 1920x1080, DPR=1.0, Refresh~59.00, Geom=(0,0,1920,1080), Primary=yes
[Info] [GLContext.Destroy] Name="GraphicsThread0", ThreadID=A0A8
[Info] [GLContext.Destroy] Name="ResourceManager3D", ThreadID=66F0
[Info] [GLContext.Destroy] Name="DeviceManager", ThreadID=66F0

Message Summary:
Trace = 0 Debug = 0 Info = 13 Warning = 0 Error = 0 Fatal = 0
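Logs like the excerpts above can be triaged automatically. A minimal sketch (assuming the key/value layout shown in the quoted log; the function name here is made up for illustration):

```python
import re

def triage_graphics_log(text: str) -> dict:
    """Pull backend and driver-version lines out of a Resolve graphics
    log excerpt of the form quoted above."""
    info = {}
    m = re.search(r"backend=(\w+)", text)
    if m:
        info["backend"] = m.group(1)
    for vendor in ("AMD", "Intel", "NVIDIA"):
        m = re.search(rf"{vendor} Driver Version (\S+)", text)
        if m:
            info[f"{vendor.lower()}_driver"] = m.group(1)
    return info

# Sample lines taken from the log quoted in this thread.
sample = """\
NVIDIA Driver Version 551.86
Intel Driver Version unset
[Info] [GInstance.Create] success, backend=OpenGL
"""
print(triage_graphics_log(sample))
# An 'unset' Intel driver version alongside an active iGPU is the kind
# of mismatch worth chasing first.
```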

PeterDrage

  • Posts: 42
  • Joined: Sun Jan 01, 2023 6:25 pm
  • Location: UK
  • Real Name: Peter Drage

Re: Best CPU for new build

Posted: Thu Apr 11, 2024 4:48 pm

Using an iGPU for decode alongside a dedicated GPU is the best option. The Intel media engine in the iGPU is far superior to even a 4090's NVDEC.

As has already been said, just make sure no monitor is plugged into the motherboard, only into the 4070.

In DaVinci, set the GPU processing mode to CUDA, not OpenCL, and the iGPU to hardware decode. I run essentially the same setup, just with a dedicated Intel GPU, as I have an HEDT platform.

Puget Systems recommend Intel CPUs with iGPU for this very reason.

If DaVinci is not starting, then I would guess it is an issue with either your PC setup or your DaVinci install.

I have always found UK Tech Support very helpful when it comes to providing support for the Studio version.
Resolve Studio 18.6.6 | Micro Panel | Speed Editor | DeckLink Monitor 4K

10980XE | 128GB RAM | RTX4080 | ARC A310 | 4x2TB NVMe | 72TB RAID 6 HDD | Win11 23H2

14” MacBook Pro | M3 Max 16 Core | 40 Core GPU | 64GB RAM | 1TB SSD | MacOS 14.4.1

PeterDrage

  • Posts: 42
  • Joined: Sun Jan 01, 2023 6:25 pm
  • Location: UK
  • Real Name: Peter Drage

Re: Best CPU for new build

Posted: Thu Apr 11, 2024 4:49 pm

That log appears to show that you are running OpenGL rather than CUDA, and that there is no Intel GPU driver installed.

Is that correct?
Resolve Studio 18.6.6 | Micro Panel | Speed Editor | DeckLink Monitor 4K

10980XE | 128GB RAM | RTX4080 | ARC A310 | 4x2TB NVMe | 72TB RAID 6 HDD | Win11 23H2

14” MacBook Pro | M3 Max 16 Core | 40 Core GPU | 64GB RAM | 1TB SSD | MacOS 14.4.1

4EvrYng

  • Posts: 676
  • Joined: Sat Feb 19, 2022 12:45 am
  • Warnings: 1
  • Real Name: Alexander Dali

Re: Best CPU for new build

Posted: Sun May 19, 2024 10:27 pm

PeterDrage wrote:That is a lot of questions so here goes:-

Option 1 - Go with the Sparkle A310 ECO Single Slot

Option 2 - Replace 2080 Super with an Intel A770 16GB

I'm always very hesitant to put all my eggs in a single basket, especially when that basket still seems to be going through growing pains, and even more so when correct day-to-day functioning and stability matter. So far Nvidia has shown I can rely on it. Also, taking it out wouldn't be a quick affair, as I would have to deal with its water cooling; the same applies if Arc started misbehaving and I had to put the 2080 back in. Having Arc as an add-on would be simpler and quicker, so that is the path I will take. With that in mind, the Sparkle A310 sounds like a good compromise: I wouldn't have to worry about PCIe lane behavior (as I would with other models), it is single slot, and it is inexpensive enough not to worry about the cost.

However, I'm concerned about reports of the Sparkle's maddening fan behavior. If you don't mind, I will DM you with questions about that.

When it comes to Arc's AV1 features, I'm curious how they work in a dual-card setup where the card that can handle AV1 isn't primary and doesn't have a monitor connected. I'm guessing encoding will work fine, but what about decoding? When I try to play a YouTube AV1 video, would the web browser use the Arc to decode it and then pass the frames to the Nvidia card for display?

PeterDrage wrote:ReBAR can be a benefit / hindrance in games depending on the Game but I have not seen or heard anything about DaVinci having an issue with it.

I couldn't find much about ReBAR's impact on Arc's performance outside of games. https://www.reddit.com/r/IntelArc/comme ... resizable/ indicates it makes a significant difference even for other tasks, like encoding, and viewtopic.php?f=21&t=190594&p=1004782 indicates ReBAR _might_ improve Resolve's performance too, so naturally I would like to have it on if possible; but the latter post also indicates that the Above 4G Decoding required for ReBAR _might_ cause instability.

I don't see how Resolve itself could have an issue with ReBAR, because that interaction happens at the system level. What concerns me is whether my system will handle it correctly, because a) the manufacturer seemed to add support for it in a rush, and b) enabling ReBAR requires Above 4G Decoding, which in turn makes some other devices use the area above 4G as well; one has no control over which ones do, so if any of them is incompatible it could introduce stability or performance issues elsewhere. I guess there is only one way to find out whether that will happen to me.

PeterDrage wrote:My 10980XE has 18C/36T all running at 4.8GHz

Do you find the 10980XE's extra cores make a significant difference in Resolve? When I was making my purchase decision, Puget's reviews indicated at best 10% better performance in Resolve over the 10920X. Watching Task Manager, I perceive two CPU pain points:

1. There still seem to be good chunks of code that are single-threaded, so the only way to significantly improve their performance is a CPU with significantly better single-core performance, which upgrading to a 10980XE wouldn't bring me.

2. Node caching is where I wait most, and during it all cores are maxed out. Significantly more cores might improve that, if the code scales up well. How well does the 10980XE handle such tasks, for example when full-resolution DNxHR HQX is used for node caching of a 4K timeline?

In other words, I'm not sure whether I would benefit from upgrading the CPU to a 10980XE. If you know the answer, I would appreciate it.
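For what it's worth, the cache-write side of that question can be bounded on paper (a sketch; the data rate here is an estimate scaled linearly from DNxHD 220x, not an official Avid figure):

```python
# Rough data rate of full-resolution DNxHR HQX caching on a UHD 25p
# timeline. Assumption: DNxHD 220x is ~220 Mb/s at 1080/29.97; scale by
# 4x the pixels and 25/29.97 the frame rate. Treat as an estimate only.
mbps = 220 * 4 * (25 / 29.97)   # ~734 Mb/s
mb_per_s = mbps / 8             # ~92 MB/s
gb_per_min = mb_per_s * 60 / 1000

print(f"~{mbps:.0f} Mb/s = ~{mb_per_s:.0f} MB/s = ~{gb_per_min:.1f} GB/min of cache")
# Comfortably within NVMe write speeds, which supports the idea that
# node caching is CPU/GPU-bound rather than storage-bound.
```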

PeterDrage wrote:… whilst new platforms are running at 6GHz they have less cores and way less PCIE Lanes. I have used all 48 CPU PCIE lanes …
x299 is an undervalued and misunderstood platform. The new HEDT platform from Intel and AMD are way over by budget.

I feel the same way. IMO Intel's current desktop platform covers corporate, casual, and gamer users, but is severely limiting for power users who need a workstation with plenty of expandability. That group quickly outgrows even what X299 offers, so I kept having high hopes for the successor platform; but when, after a long wait, W790 and its CPUs finally arrived and I saw the prices, my jaw hit the floor and stayed there. At those prices they might as well not exist for me. Because of that, I am for the first time seriously considering switching to AMD once I am forced to upgrade (assuming I can afford it). In the meantime, I am trying to extend the life of my X299 setup as much as I can.

Unfortunately, my X299 board, unlike yours, has 44 CPU PCIe lanes. Which motherboard are you using, please?

PeterDrage wrote:One for the best tweaks I made was to the File System Cluster Size of the volumes, for example optimising it for Video Storage. From my Internal 6 x 20TB HDD Archive Array, I am now getting 1GB/s Read and Write sustained.

When I was trying to pick an optimal cluster size, my effort resulted in one step forward, one step back: going from 4K to 64K clusters raised sustained speeds in benchmarks, but I couldn't find any practical difference in Resolve. I'm curious which value you selected as the optimal cluster size, how you arrived at it, and whether it made a visible difference in Resolve?

Thank you again for the very thorough and helpful input; it is much appreciated!
