waltervolpatto wrote:You can try to split the effects in a different timeline and "render in place" as workaround
Hello Waltervolpatto,
Dear fellow Resolvians,
Sorry for my silence. Back on the air. Once more, thank you for trying to help or to share the pain.
I still believe DVR19.1 is fantastic software and my perpetual Christmas tree.

In the meantime, I built a new PC and did some more underwater video post-productions/tests.
1) Here are the main components of my brand-new homemade PC as of today:
- CPU = AMD Ryzen Threadripper 7970X 32-Cores,
- Motherboard = Gigabyte TRX50 Aero D,
- RAM = 192 GB = 4 x Micron 48GB DDR5-5600 RDIMM 2Rx8 CL46,
- GPU1 = ASUS GeForce RTX 4090 24GB (new one),
- GPU2 = Gigabyte GeForce RTX 3090 24GB (my old one),
- Win11 Pro Drive = SSD Samsung 990PRO 1TB NVMe PCIe 4.0 (up to 7450 MB/s),
- DVR Work Drive = SSD Samsung 990PRO 2TB NVMe PCIe 4.0 (up to 7450 MB/s),
- HDD = 2 x 16 TB.
It is working fine and at lightning speed, which is a great relief!
In other words, let's rule out hardware and power problems, even though I keep in mind a possible weakness of my "old" 3090 (GPU2), which might not withstand the saturation of its memory as well as the new 4090 (GPU1).
2) Let's face it:
My DVR19.1 renders keep on:
- saturating the memory of both GPUs,
- crashing without warning and without freezing my PC this time (thanks to Win11 Pro, to the 48 GB of GPU memory, to the 192 GB of RAM, or to the brand-new PC in general?).
DVR simply vanishes from the screen and from the running tasks after some heavy processing.
So either I am not using DVR19.1 properly, or DVR19.1 is (too) memory-hungry.
However, I have developed a strategy that seems to prevent the crashes most of the time (see below).
3) First, some empirical observations, based on what I see in the Windows Task Manager > Performance tab and on how my timeline is structured:
- DVR relies primarily on my two GPUs rather than on the CPU;
- In my configuration, and after some tests, DVR seems to treat the GPU driving the two monitors as the main GPU (here: GPU1);
- All 3D GPU processing shows up on GPU1 only;
- The memories of GPU1 and GPU2 are used simultaneously;
- The temperatures of GPU1 and GPU2 rise simultaneously during processing;
- Most of the time the CPU sits at around 10%, but it can go up to 50% and occasionally 70% (hypothesis: 1. when my Fusion macro used to level the horizon, resize/move and black-frame the clips is involved, or 2. when some serious compression/decompression is taking place, as during Color render caching with the DNxHR HQX codec);
- The use of the two GPU memories always scales up and never goes down during a render;
- The two GPU memories are released only when I quit DVR;
- The crashes seem to occur when the memory of GPU2 (the 3090) is saturated;
- My timeline is set to 4K only for the final render; it is made of clips with a lot of color management, some Fusion macros where necessary (see CPU usage above), linked together by standard cross-dissolve effects.
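As an alternative to watching the Task Manager, the same per-GPU memory figures can be polled from the command line with NVIDIA's nvidia-smi tool while a render runs. Here is a small Python sketch I could imagine using for that; it only assumes nvidia-smi is on the PATH (it ships with the NVIDIA driver), and uses its standard --query-gpu fields:

```python
# Sketch: read per-GPU memory use from nvidia-smi (assumes the NVIDIA
# driver's nvidia-smi tool is installed and on the PATH).
import subprocess

def parse_smi_csv(output: str) -> list[dict]:
    """Parse `nvidia-smi --query-gpu=name,memory.used,memory.total
    --format=csv,noheader,nounits` output into one dict per GPU."""
    gpus = []
    for line in output.strip().splitlines():
        name, used, total = [field.strip() for field in line.split(",")]
        gpus.append({"name": name,
                     "used_mib": int(used),
                     "total_mib": int(total)})
    return gpus

def poll_gpus() -> list[dict]:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=name,memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True)
    return parse_smi_csv(out)

if __name__ == "__main__":
    try:
        for gpu in poll_gpus():
            print(f"{gpu['name']}: {gpu['used_mib']} / {gpu['total_mib']} MiB")
    except FileNotFoundError:
        print("nvidia-smi not found; is the NVIDIA driver installed?")
```

Running this in a loop (or under `watch`) during a render would log exactly the "memory always scaling up, never released" pattern described above.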
4) Here is the empirical and purely intuitive strategy I currently use to prevent the Render from crashing (written for you as well as a procedure for me):
Step 1: Set Project Settings > Render cache format = DNxHR HQX => maximum compression of the cache with very little concession on quality, in order to use the power of my CPU and to decrease the volume of cache on my DVR Work Drive
Step 2: Set Playback > Render Cache = None => no cache production while working on the timeline's color management
Step 3: Set Timelines > Timeline Settings > Timeline Resolution = HD (1080p)
Step 4: Do the Color Management of the whole HD timeline
Step 5: Do not use any video transition effect yet => avoids DVR likely needing two clips in GPU memory at the same time during the main color Render caching of step 8
Step 6: Set Timelines > Timeline Settings > Timeline Resolution = Ultra HD (4K) => I duplicate my HD timeline as a backup to play safe and set the new one to Ultra HD
Step 7: Set Playback > Render Cache = Smart
Step 8: Let the timeline's main color Render caching run automatically (the red line on top of the timeline in Color must turn blue) => serious data crunching, power consumption (not ecological, unfortunately) and heating take place, hopefully without crashing

Step 9: Put the video transition effects into the timeline and set their durations, one at a time => DVR automatically updates the cache of the overlapping clips to take each new effect into account, hopefully with minimal resource (memory, compute) consumption
Step 10: Do the Render of the Ultra HD (4K) Timeline
ONLY WHEN satisfied with the rendered 4K timeline:
Step 11: Set Playback > Render Cache = None => to stop the automatic color Render caching
Step 12: Set Playback > Delete Render Cache > All => to release space on the DVR Work Drive
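A rough back-of-the-envelope calculation hints at why the switch to Ultra HD plus transitions is so memory-hungry: an uncompressed frame at UHD is four times the size of an HD one, and a cross-dissolve forces two overlapping clips to be resident at once (the motivation for steps 5 and 9). A minimal sketch, assuming a 32-bit-float RGBA working image (my guess at Resolve's internal format, used only to illustrate the scale factor):

```python
# Rough per-frame memory estimate for a 32-bit float RGBA working image.
# The 4-channel float format is an assumption about Resolve's internal
# pipeline, used only to illustrate the HD-vs-UHD scale factor.
BYTES_PER_PIXEL = 4 * 4  # RGBA x 4 bytes (32-bit float)

def frame_mib(width: int, height: int) -> float:
    """Size of one uncompressed frame in MiB."""
    return width * height * BYTES_PER_PIXEL / (1024 ** 2)

hd  = frame_mib(1920, 1080)   # ~31.6 MiB
uhd = frame_mib(3840, 2160)   # ~126.6 MiB

print(f"HD frame:  {hd:.1f} MiB")
print(f"UHD frame: {uhd:.1f} MiB")
print(f"UHD/HD ratio: {uhd / hd:.0f}x")
# A cross-dissolve needs (at least) both overlapping clips in memory,
# so the working set per cached frame roughly doubles during step 9:
print(f"Two overlapping UHD clips: {2 * uhd:.1f} MiB per frame pair")
```

With multiple frames in flight per clip, plus node buffers and caches that are apparently never freed until quitting, it is easy to see a 24 GB card filling up during a long UHD render.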
NB1: As an amateur, I really wonder how Hollywood and other professional post-producers handle their zillion-K rushes. One clip at a time? A full datacenter? Other means?
NB2: Blackmagic Design, can you improve the DaVinci Resolve GPU memory management, please?
Cheers
Jm