Rethinking How Resolve's Pages Interact 2.0


Mark Grgurev

  • Posts: 802
  • Joined: Fri Nov 03, 2017 7:22 am

Rethinking How Resolve's Pages Interact 2.0

Posted: Wed Feb 26, 2020 4:00 am

This is a little outdated. The Fusion Effects idea was implemented, but not to the extent mentioned in this post. None of the structural changes were adopted by BMD, though, so Resolve 17 is still largely a mess in its implementation of features, and its Fusion integration has not been fixed whatsoever.

Since Resolve now lets people disable pages they won't use, and even disable the page navigation entirely, Resolve users who specialize in just audio, color, compositing, or editing can treat Resolve like it's a stand-alone application... but when you try to use these pages alone, you very quickly realize how ill-suited each page is to working on its own. There are features all throughout that require you to jump to another page when you really shouldn't have to. Furthermore, I found that the way the pages interact isn't always as clear as it should be.

To try to fix this, I decided to make a diagram of DaVinci Resolve showing both feature overlap and the order of operations. This is what I came up with.

The Current

Image

Image Fairlight
Image Edit
Image Fusion
Image Color

Is the page navigation representative of the processing chain?

I think most people assume this: the changes you make in the Color page happen after the Fusion page, and its changes happen after the Edit page in the processing chain. That's not true. The order of operations interweaves between the pages in a way that's only made clear through experimentation or by finding it in the 3500+ page manual, and that overlap isn't represented in the interfaces of those pages. As a result, a colorist who is handed off an edit where the editor has already added effects and blend modes would need to switch pages to change those settings. A colorist shouldn't have to do that.

Should I use the Color page or Fusion?

Right now the Fusion and Color pages are made to feel incredibly redundant. All clips get turned into unique Fusion compositions when they're brought into the timeline. The Color page excels at grading and Fusion excels at complex compositing, but they can both do the other's job to some extent, and both do their thing with nodes. Everybody can agree that they're kind of similar but also really different; different enough that they can't be combined without getting worse at one of those jobs. Lastly, all the Fusion nodes happen before the ones on the Color page, and the official way to swap that order is to grade a clip, encapsulate it in a compound clip, and put the Fusion composition on that clip.

None of that feels well-thought-out.

Top Request: Bring X Feature From That Page To This Page

Fairlight is... a little bare. You can see in the diagram that the overlap between the Edit page and Fairlight is nearly 100%. They're two incredibly similar pages. The same applies to the Cut page, which you just have to imagine is being completely overlapped by the Edit page. All three are timeline editing pages, and despite supposedly being tailored to specific purposes, the ways in which they were made to differ seem more arbitrary than anything, as requests to bring features of one page to the others are frequent.

Split Software or the All-in-One Approach?

I've seen a bunch of requests and discussions calling to split parts of Resolve off into their own software. I've heard it about Cut, Fusion, and Fairlight, and it's usually for one of two reasons. Some believe that the extra code is making it harder to keep Resolve stable, but that doesn't really make sense since there's way more code these pages could share, and getting rid of redundant code would only make development easier. Others just think these pages don't feel like they fit well within Resolve, and that has more to do with implementation than any clash of function.

This can all be fixed.



The New

Image

Image Video Pipeline
Image Audio Pipeline

You can see that it's very different, to the point that most of the colors in the original diagram don't apply. Gone are names like Fairlight and Fusion, since they don't do a good job of conveying what the pages are for. Fairlight could still be the name of Resolve's audio engine and be used for audio products.

Some Name Changes

Fusion would become the Create page for two reasons. One of those reasons is so that the pages are named after tasks and not a brand. To new users, Fusion isn't going to tell them what the page does. Similarly, Audio would have been a more descriptive name for the Fairlight page, but since this new layout gets rid of that page, it's not as relevant. That's not to say that these brands need to be done away with. Fusion and Fairlight can still be the names of the compositing and audio engines respectively, and can be used for any Resolve hardware. The reason the Fusion page would become Create instead of Compose or Composite will be mentioned a little later.

Timeline Effects or Compositions

The question of whether or not one should use the Color page or the Fusion page becomes a question of whether Timeline Effects or Compositions work for what you want to do.

Timeline effects would be handled by the Process page. They're effects that get managed across the timeline, used to shot-match and to apply effects to groups of clips or to the timeline as a whole. It's really no different than the Color page's current role.

The Create page would specialize in making complex compositions, which would get used as assets. That's still very similar to its current usage, but instead of each clip on the timeline being a unique composition, compositions would be a unique type of clip in the timeline that links back to an original composition in the Media Pool. In other words, they'd act like clips, not effects.

By giving the two unique behaviors, the Create page loses its feeling of being the "advanced mode" of the Process page despite their other similarities. That's not to say that their similarities can't be acknowledged. By giving the Create page a full superset of the nodes available to the Process page, it would gain the same color correction tools without stepping on the Process page's toes. The result is a less hacky method of color correcting before compositing than clip nesting, and it could allow already-graded clips to carry their grades over if they get converted to a composition.

Despite compositions no longer being created in the timeline, you could still change a composition by just selecting it in the timeline and switching to the Create page, just as you do now with the Fusion page.

Broadening the Create Page

Fusion users know how incredibly powerful, flexible, and even programmable it is. When it was added to Resolve, it not only gave Resolve the ability to render compositions but immediately became the backbone of Resolve's titling system. Since then, its integration has been expanded to create custom transitions and generators. With the change to Create, it would also be able to create effects and soundscapes.

Soundscapes would be created by setting up audio emitters and microphones (effectively the audio equivalent of lights and cameras) in a 3D environment. These soundscapes could then be brought into the Edit page and used like normal sound clips, except their procedural nature would allow them to be extended to whatever length is needed without actually looping. One of the benefits of doing this in 3D is that the clips can adapt to mono, stereo, or surround, and reverb and Doppler could theoretically be created in a more physically accurate way. Think of it like a node-based version of Sound Particles.

Of all the types of assets that Fusion can currently create, effects would be most similar to transitions, but instead of applying them to cuts, they could be applied directly to clips or even used as nodes on the Process page or in other compositions. The entire purpose behind this is that there may be some complex effect that you want to apply to a bunch of clips in your timeline that doesn't actually exist in Resolve FX but is something you can make with a composition. Effects would allow you to make that effect and use it just like any other effect in Resolve. It can be thought of as a simple plugin. The node that would facilitate turning a composition into an effect is the MediaIn, just as it's used for transitions; in fact, that would be the single purpose of MediaIn nodes. Any external assets brought into a composition would use Loaders.

Composition Controls

Of course, neither of those new features of the Create page would be useful if they didn't have some controls. Right now Fusion has macros, but they can't really be updated after they've been created, and they need to be stored in specific directories that require a restart to update. The problem with the way these controls are currently implemented is that they require compositions to be packaged as macros and stored in the Generators or Titles folders. This makes them a nuisance to set up, and altering them after the fact will break controls. It also makes them poorly suited to regular effects shots and titles that you may need to use often but only within a certain project.

The Create page would add a panel for setting up controls for the asset you're creating. The benefits are that you could continue to update the composition after the controls are created, one control could drive multiple parameters of multiple nodes in a programmable way using functions, the controls could be tested while making the composition, and they wouldn't need to be stored anywhere in particular. They're kind of like controls in Houdini.
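To make the "one control driving multiple parameters through functions" idea concrete, here's a minimal sketch in Python. The node and parameter names (`Blur.strength`, etc.) are invented for illustration; none of this is a real Fusion API.

```python
# Hypothetical sketch of a composition control driving several node
# parameters through small functions. All node/parameter names invented.

def bind_intensity(control_value: float) -> dict:
    """Map a single 0..1 'intensity' control onto multiple node parameters."""
    return {
        "Blur.strength": control_value * 10.0,      # scaled up
        "Glow.gain": 0.5 + control_value * 0.5,     # offset + scaled
        "Text.opacity": 1.0 - control_value * 0.3,  # inverse influence
    }

# Dragging one slider updates three parameters at once.
params = bind_intensity(0.4)
print(params["Blur.strength"])  # 4.0
```

The point is that the mapping lives with the composition, so the comp can keep evolving after the control is published.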

On Blackmagic's end, these features could be used to replace the built-in transitions and generators that come with Resolve. On the user's end, they could create complex soundscapes that include rustling trees, crickets, frogs, a distant city, and rain, then set up controls to change the volume of each, the frequency of the ribbits, the amount of rain, or whatever, and keyframe and tweak these in the context of the edit. A VFX artist could create a composition of a shot with a billboard before the asset for the ad is determined and set up a file path control. The editor could then incorporate that shot into the edit and easily add the ad asset themselves later on.

For any programmers out there, you can think of the compositions like classes, composition controls as member variables, and composition instances as... instances.
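As a rough Python sketch of that analogy (all names invented; this is not a real Resolve API, just the class/instance idea):

```python
# The analogy: a composition is a class, its controls are member
# variables, and each timeline clip is an instance. Hypothetical names.

class LowerThirdComp:
    """A 'composition': one shared node graph with exposed controls."""
    def __init__(self, name: str, color: str = "#FFFFFF"):
        self.name = name    # composition control -> member variable
        self.color = color  # composition control -> member variable

    def render(self) -> str:
        return f"[lower third] {self.name} ({self.color})"

# Two timeline instances of the same composition, each with its own
# control values but sharing the underlying graph (the class definition).
host = LowerThirdComp("Jane Doe", "#FF8800")
guest = LowerThirdComp("John Smith")

print(host.render())   # [lower third] Jane Doe (#FF8800)
print(guest.render())  # [lower third] John Smith (#FFFFFF)
```

Editing the class (the source composition) changes every instance, while each instance keeps its own control values.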

The Media Pool

The Media Pool would be the one facet of the project that compositions and the timeline share, but they wouldn't necessarily share assets. Since compositions are assets, any pictures, video, or audio used by a composition would be self-contained within the comp. Doing this would prevent the Media Pool from being polluted with textures and FBX files that have limited to no use on the other pages. This wouldn't prevent someone from placing and reorganizing assets in the Media Pool as they want; it just wouldn't automatically add things to the Media Pool because they're used in a composition.

One Program Split Two Ways

One of the advantages of implementing the Create page this way is that, if a stand-alone version of Fusion is still deemed necessary by VFX artists, it could be cleanly, vertically sliced from Resolve, as Resolve would architecturally be Fusion with timeline extensions. Both would be able to open projects, but the standalone version wouldn't be able to see compositions in the Media Pool.

When sliced horizontally, Resolve is equal parts audio and video. The Media, Edit, and Deliver pages already have the ability to work with both audio and video. Bringing over the remaining Fairlight features like ADR to the Edit page would preserve the current Fairlight experience while turning the Fusion page into the proposed Create page would add audio to it as well.

The Edit/Sequence page

The Edit page would be an amalgamation of Cut, Edit, and Fairlight, and would become the sole sequencing page for video and audio. All recording, automation, syncing, subtitling, retiming, and transitions would be done just as they are now but without having to switch pages. That sounds like it would make the Edit page overly busy, but that's not actually the case. Fairlight and the Edit page already have a huge amount of overlap. A lot of that can be seen at a glance, but even where there appear to be differences, there are similarities.

For example, the Subtitles panel of the Edit page and the ADR panel of the Fairlight page are very close to identical in layout and intended goal. Both contain lists of lines, when they're said, and how long they're said for, and having on-screen subtitles would likely be helpful to the actors recording ADR. In some cases, the panels that exist on one page are literally just supersets of the other's.

The Cut page might seem very different from Edit, but features like the Sync Bin, Source Tape view, and Clip view would only really amount to a few icons being added next to the Thumbnail and List views of the Media Pool. The editing workflow for Sync Bins could theoretically be expanded to supersede multi-cam editing instead of existing alongside it. The mini-timeline could be made collapsible and would work as an excellent replacement for the Edit page's zoom and scroll tools. Lastly, the differences in the timeline could really be handled by a few added options in the timeline settings.

The Process Page

Image

As stated before, the Process page would work almost entirely the same as the Color page. However, with contextual palettes, audio sweetening (spectral noise reduction and masking) and routing features could be added without getting in the way of its traditional color grading features. With the addition of audio features, the name "Color" doesn't really fit anymore, which is the reason for the change to "Process".

The Process page would gain one other feature though, and that's...

Source Processing

Currently, grades are stored as part of the clip instance on the timeline. If an editor goes back to edit a scene after it's been color graded, they could potentially remove a clip only to add it back in later on, which means the colorist has to re-grade that shot. Most of the time, that would just be a slight annoyance, but it becomes a huge issue if that clip needed a lot of clean-up.

Source processing fixes that by allowing effects to be added to clips while they're in the Media Pool. There's nothing stopping you from doing a full grade in the source grade, but its primary purpose is pre-processing like noise reduction, spot removal, or base grades, which can be shared between different takes. Doing that allows you to separate them from the expressive parts of the grade or mix which, in some instances, may just require one effect applied to the whole scene.

Source processing also has a huge potential performance benefit. Both audio and video noise reduction and other clean-up can be very computationally intensive compared to reverb or RGB curves. Source processes happen at an ideal place in the pipeline to be cached or flattened into the proxy files, which would allow the editor to use clean assets without needing to worry about poor performance.
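As a loose sketch of that caching idea: heavy source processing renders once per clip and is cached, while light timeline effects run per reference. This is just the general memoization pattern in Python with invented function names, not Resolve internals.

```python
# Heavy source clean-up is rendered once per clip and cached; cheap
# timeline effects are layered on per reference. Names are illustrative.
from functools import lru_cache

@lru_cache(maxsize=None)
def render_source_fx(clip_id: str) -> str:
    # Stand-in for expensive clean-up (noise reduction, spot removal).
    return f"denoised({clip_id})"

def render_timeline_ref(clip_id: str, look: str) -> str:
    # Cheap creative effects layered on the cached source render.
    return f"{look}({render_source_fx(clip_id)})"

# Three timeline references to the same source clip...
renders = [render_timeline_ref("A001_take2", fx) for fx in ("lut", "glow", "grain")]
print(renders[0])                            # lut(denoised(A001_take2))
print(render_source_fx.cache_info().misses)  # 1 -- the heavy pass ran once
```

However many times the clip appears in the timeline, the expensive pass runs only once.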

This does present one minor problem though. If we go back to the idea of the page navigation being indicative of Resolve's order of operations, then the Process page makes sense both before and after the Edit page. To be completely accurate, it should probably appear both before and after it, with each occurrence linking to a different mode of the Process page. Ideally, you'd want to have it represent this...

Image

... but without redundancy. Of course, this is minor and really shouldn't be something to worry about. After all, in Resolve's current form, the Fairlight page also doesn't have a good place in the page navigation. Instead, it would be better to keep the Process page in the same spot the Color page currently occupies and convey source processing by adding an "Edit Source Processing" option to the Source Monitor and to the context menu when right-clicking footage in the Media Pool. This preserves the behavior where selecting it in the page navigation opens up the current clip in the timeline, while allowing you to navigate directly to the Process page's source mode from both the Edit page and the Media page.

Deliver

Phillipp Glaninger got it right. Adding the Media Pool to the Deliver page is a simple change that would allow someone to apply source processing to a bunch of clips and render them out without needing to go to the Edit page. Go to his topic and give the idea a +1.
Last edited by Mark Grgurev on Thu Mar 25, 2021 12:52 am, edited 21 times in total.

Peter Chamberlain

Blackmagic Design

  • Posts: 13944
  • Joined: Wed Aug 22, 2012 7:08 am

Re: Rethinking How Resolve's Pages Interact

Posted: Fri Feb 28, 2020 12:30 pm

Thanks for the considered feedback
DaVinci Resolve Product Manager

Mark Grgurev

  • Posts: 802
  • Joined: Fri Nov 03, 2017 7:22 am

Re: Rethinking How Resolve's Pages Interact

Posted: Fri Feb 28, 2020 10:28 pm

Peter Chamberlain wrote:Thanks for the considered feedback


Glad you read them. Just wish I noticed all the typos beforehand :-)

nixart

  • Posts: 12
  • Joined: Mon Jan 27, 2020 10:21 pm
  • Real Name: Nick Ritter

Re: Rethinking How Resolve's Pages Interact

Posted: Wed Mar 11, 2020 8:05 pm

I love this idea of having a Composition in your Media Pool that timelines reference.

My plus-one to this idea would be that once you lay an instance of the Composition into your timeline, assuming you can expose certain parameters in the Edit page, you'd be able to make a single parameter independent to that one instance.

For example, if I create a lower third animation in a Fusion comp, I could drag instances into my timeline. Currently they all look the same, and if I change one of them in a Fusion tab, it changes all of them. But there could be an additional option in a context menu to release a selected parameter, like source text or font family, so that I could change it in this one instance only in the Edit page without that change carrying over to the other instances.

Effectively this could make every composition clip a template.

A suggestion of course. This is just a workflow I could see myself using a bunch.

Mark Grgurev

  • Posts: 802
  • Joined: Fri Nov 03, 2017 7:22 am

Re: Rethinking How Resolve's Pages Interact

Posted: Wed Mar 11, 2020 9:39 pm

nixart wrote:For example, if I create a lower third animation in a Fusion comp, I could drag instances into my timeline. Currently they all look the same and if I change one of them in a Fusion tab, it changes all of them. But there's an additional option that could be in a context menu to release the selected parameter like source text or font family that I could change in this one instance only in the edit page without that change carrying over to the other instances.

Effectively this could make every composition clip a template.


That's actually what I was suggesting should be the default behavior, lol. It would be cool to have the option to gang a parameter among all instances in the timeline though.

Glad you like my suggestion :-)

Dmytro Shijan

  • Posts: 1760
  • Joined: Wed Sep 17, 2014 7:15 pm
  • Location: UA

Re: Rethinking How Resolve's Pages Interact

Posted: Wed Mar 11, 2020 10:36 pm

Great ideas and great charts!
I have also noticed many times that Fairlight is formally just a 99% copy of the Edit page, and it is 99% useless and inconvenient as a separate page in video production.
If the Fairlight legacy is so important to the developers, why not release it as a separate DAW for musicians and keep improving it?

The Color and Edit pages' relationship may sometimes be complicated, but it is really impossible (and not necessary) to fit all the current Color tools and the Timeline into one single page.

I am not too deep into Fusion, but those suggestions seem logical.

Also, a lot of people have requested combining the best parts of the Cut and Edit pages into a single Edit page.
BMMCC/BMMSC Rigs Collection https://bmmccrigs.tumblr.com
My custom made accessories for BMMCC/BMMSC https://lavky.com/radioproektor/

Mark Grgurev

  • Posts: 802
  • Joined: Fri Nov 03, 2017 7:22 am

Re: Rethinking How Resolve's Pages Interact

Posted: Thu Apr 02, 2020 8:49 pm

Dmitry Shijan wrote:If Fairlight legacy is so principal for developers, why don't release it as separate DAW for musicians and keep to improve it?


Software is malleable enough that it isn't really needed.

Dmitry Shijan wrote:Color and Edit pages relation sometimes may be complicated but it is really impossible (and not necessary) to fit all current Color tools and Timeline into one single page.


And I definitely wouldn't want them to be one page. They would still be separate pages, but their feature overlap would be slightly different.

Dmitry Shijan wrote:Also a lot of people requested to combine best parts of Cut and Edit page into single page Edit.


Myself included, lol. I really don't like the idea of the Cut page and Edit page being separate, which is why the Cut page isn't even included in my diagram lol

Mark Grgurev

  • Posts: 802
  • Joined: Fri Nov 03, 2017 7:22 am

Re: Rethinking How Resolve's Pages Interact 2.0

Posted: Sat May 23, 2020 8:47 pm

I apologize for bumping an old topic of mine, but I've come up with a new idea for how to restructure Resolve and I thought it would feel spammy if I made a new topic for it.

Dmitry Shijan wrote:I also noticed many times that Fairlight is formally just a 99% copy of Edit page and it is 99% totally useless and inconvenient as separate page in video production


Dmitry will probably be pleased to see that I've arrived at the same conclusion he did: I not only combined Edit and Cut, I also merged them with Fairlight.

Since I've updated the first post and especially since a BMD employee commented on it, I'm posting a copy of the original post here so the topic makes sense and nobody is confused.

I've been thinking about how each page interacts with the others.

Could things be improved?
Do the pages interact with each other in a way that makes sense?
Does the program do a good job of showing how these pages interact?

Since Resolve now lets people disable pages they won't use, and even disable the page navigation entirely, Resolve users who specialize in just audio, color, compositing, or editing can treat Resolve like it's a stand-alone application... but do those pages currently give those people all the functionality they may need?

What I decided to do was make a diagram of DaVinci Resolve showing both feature overlap and the order of operations to get a sense of this. This is what I came up with.

Image

Image Fairlight
Image Edit
Image Fusion
Image Color

While making this, I found some interesting behavior that could probably be better represented in Resolve's UI.

Color Page

I think most people assume the changes you make in the Color page happen after the Edit page in the processing chain, but that's not true. They interweave.

If a colorist is handed off an edit and the editor has already added effects and blend modes to a clip, the colorist shouldn't have to switch pages to edit those settings. The colorist should have a way to view the Edit page effects and the clip's blending mode so they can change, remove, or bypass them.

I've previously suggested that the Color Page should have a contextual palette area that changes depending on the node you have selected.

viewtopic.php?f=21&t=92731

If something like that were implemented, then a colorist would be able to select a Source Node and be presented with these options:

Code:
Clip Settings
   Camera Raw
   Info
   Scaling
      Input Sizing
      Edit Sizing
   Reference and Tracking
      Stabilizer
      Stabilizer


Add the Edit page's OFX stack to that and blending settings to the output node, and that would visually demonstrate the order of operations to the colorist.

This goes the other way, too. The Camera Raw settings should also be added to the Edit page's Inspector, since they're technically metadata that can be used to update a sidecar file on the hard drive. This would put the Edit page settings and the Source Node settings inline with each other and make it very clear how the two pages interact.

Fusion Page

I've long had issues with aspects of Fusion's implementation. When it was first added, I loved it but really disliked that you needed to drag a Fusion comp into the timeline to get started. By existing only in your edit space, the composition was put at risk of being deleted accidentally. That could throw away days of work and is just bad practice.

I very quickly figured out that you could drag a composition into the Media Pool for safekeeping, and that you can create a Fusion composition in the Media Pool to begin with. Regardless, it seems that a lot of people don't know that, because I still find Fusion tutorials that tell people to create compositions in the timeline.

But that's not all! Even if you do have a composition in the Media Pool, dragging it into the timeline doesn't create an instance of it; it creates a full copy. If you change the version in the Media Pool, the changes can't propagate to the clips. Compositions are the only things in Resolve that behave this way. That, and the fact that all clips in Resolve can be turned into Fusion comps, made this very difficult to show in the diagram.

Why not get rid of timeline compositions? They should work like footage. Just like any other footage, double-clicking them in the Edit page's Media Pool would allow you to preview them in the Source Viewer. Putting them on the timeline would create links to the source composition. Each instance could then be modified using controls, just like Fusion Titles.

The source composition could be changed by right-clicking the composition and choosing "Open in Fusion page", by switching to the Fusion page while an instance is selected in the timeline, or, if you're already in the Fusion page, by double-clicking the comp in the Media Pool just like you can now. Any changes you make would then apply to the other instances in the timeline, and if you want to make a copy... you can just copy the composition.

The New Layout

After these changes, that diagram would look like this.

Image

There are two things you might notice about this.

1. Where did "Source Color" and "Source Audio FX" come from?

Resolve is currently set up with the understanding that anyone who works on audio or color is working on something that's already been edited. There are a lot of situations where someone might be asked to start sweetening the audio or grading clips while the editor is working on something else.

Both of these things are technically possible in Resolve, but only if the modified clips are in timelines instead of the Media Pool. The idea here is to allow the Fairlight and Color pages to apply effects and grades to clips in the Media Pool without ever having to drag clips onto a timeline. These source grades and audio FX would be applied before any timeline changes. You can look at the source grade as your color correction, while your timeline would be for providing a color look. The source audio FX could be used for sweetening and noise reduction, while the effects in the timeline would be for mixing.

Both could also improve performance. Consider audio sweetening. You could potentially stack a lot of noise reduction and other heavy effects. Instead of rendering them every frame, a mixer could cache the source audio effects. Then every reference to the clip on the timeline would refer to the cached clip, and only the mixing effects would get rendered in real time.

2. Looks like the Edit page can do everything the Fairlight page can do

Yes. Short of a few minor differences, the Edit page can do everything the Fairlight page can. If you just used Fairlight for mixing audio for videos, this diagram might be enough to make you feel that the Fairlight page should be removed and the Edit page should do its job. The problem is that that's likely not good enough for someone who would use a DAW for music, for example.

Fairlight needs to do some things to distinguish itself from the Edit page. Audio is not my forte, so I might be totally off about what audio professionals might want, but I think there's an opportunity to bring in some code from the Color page and Fusion page.

Audio tracks don't have a stack order. It doesn't matter if a track is above or below another one; the mix will sound the same. It has that in common with 3D compositing. So maybe Fairlight could borrow some of Fusion's 3D code to allow people to pan tracks and clips in 3D space and establish environment shapes for reverb. The actual work of putting clips onto different tracks at different times would stay in the timeline, but things like bus routing could be done with nodes.
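A toy sketch of what node-based bus routing might look like, in Python. This is purely speculative illustration: `Bus` and the signal strings are invented, and the point is only that graph connections, not track order, define the mix.

```python
# Toy model of bus routing as a node graph: tracks feed buses, buses feed
# the master, and graph connections (not track order) define the mix.

class Bus:
    def __init__(self, name: str):
        self.name = name
        self.inputs = []  # tracks (strings) or other Bus nodes

    def connect(self, source) -> "Bus":
        self.inputs.append(source)
        return self

    def signal(self) -> str:
        parts = [i.signal() if isinstance(i, Bus) else str(i) for i in self.inputs]
        return f"{self.name}({' + '.join(parts)})"

dialog = Bus("DialogBus").connect("track1").connect("track2")
music = Bus("MusicBus").connect("track3")
master = Bus("Master").connect(dialog).connect(music)
print(master.signal())  # Master(DialogBus(track1 + track2) + MusicBus(track3))
```

Re-patching a track to a different bus is just reconnecting an edge in the graph, which is the appeal of doing routing with nodes.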

I understand it's not very conventional for a DAW, but they might be interesting ideas to try.

wfolta

  • Posts: 625
  • Joined: Fri May 15, 2020 1:12 pm
  • Real Name: Wayne Folta

Re: Rethinking How Resolve's Pages Interact 2.0

Posted: Mon Jun 29, 2020 3:07 pm

Wow, very nice!

EDIT: OK, I've looked more closely at the diagram and see "Clip FX" further downstream than "Create". So it looks like it's possible to apply Fusion FX to edited clips rather than entire assets. So my question would be: how does this work in practice?

For example, I create an intro and outro in Create -- super-fancy animations. Doing this independently of any editing totally makes sense. But then I'm in the middle of editing the program itself and at one point decide, "Hmmm, the voiceover says something about something in the video that's not obvious, so I'll add a callout with some text that tracks with the object in the scene." So I'd like to right-click on that clip and be thrown into Create (Fusion) with MediaIn pre-populated with the appropriate clip.

I think that's fairly straightforward, even if Create lives entirely at the front. It would create a Create composition with a MediaIn node that refers to the original asset, but is trimmed down to what's showing in the timeline. Cool.

What, then, if I edit the clip in the timeline -- say making it longer at both ends -- and then go back to Create? Would it be straightforward to have my MediaIn automatically resize as appropriate? If Create's output is pre-rendered -- i.e. MediaOut is to a file, not the timeline -- does that mean I have to go back to Create after lengthening the clip? In my particular scenario, the answer has to be "yes" in the sense that the new footage could not have been tracked if I only tracked what was originally in the clip, but there are other scenarios that aren't so dependent on a pixel-specific operation like that.

Part of the appeal of the current process -- which is full of inconsistencies and redundancies as you note -- is that Fusion lives downstream of Edit -- i.e. its MediaIn nodes come from edit -- and also upstream of Edit -- i.e. its MediaOut nodes feed into edit. Would Create appear to work the same way -- even if the underlying pipeline/flow is better organized?
Resolve Studio 17 latest, Fusion Studio 17 latest, macOS Big Sur latest, MacBook Pro 2020 64GB RAM and Radeon Pro 5600M 8GB VRAM

Mark Grgurev

  • Posts: 802
  • Joined: Fri Nov 03, 2017 7:22 am

Re: Rethinking How Resolve's Pages Interact 2.0

Posted: Mon Jun 29, 2020 7:56 pm

wfolta wrote:Wow, very nice!

Thank you!

wfolta wrote:EDIT: OK, I've looked more closely at the diagram and see "Clip FX" further downstream than "Create". So it looks like it's possible to apply Fusion FX to edited clips rather than entire assets. So my question would be: how does this work in practice?

For example, I create an intro and outro in Create -- super-fancy animations. Doing this independently of any editing totally makes sense. But then I'm in the middle of editing the program itself and at one point decide, "Hmmm, the voiceover says something about something in the video that's not obvious, so I'll add a callout with some text that tracks with the object in the scene." So I'd like to right-click on that clip and be thrown into Create (Fusion) with MediaIn pre-populated with the appropriate clip.

I think that's fairly straightforward, even if Create lives entirely at the front. It would create a Create composition with a MediaIn node that refers to the original asset, but is trimmed down to what's showing in the timeline. Cool.

What, then, if I edit the clip in the timeline -- say making it longer at both ends -- and then go back to Create? Would it be straightforward to have my MediaIn automatically resize as appropriate? If Create's output is pre-rendered -- i.e. MediaOut is to a file, not the timeline -- does that mean I have to go back to Create after lengthening the clip? In my particular scenario, the answer has to be "yes" in the sense that the new footage could not have been tracked if I only tracked what was originally in the clip, but there are other scenarios that aren't so dependent on a pixel-specific operation like that.

Ah okay. Took me a while to get what you were saying. I'm first gonna go on a bit of a tangent, then hopefully bring it back.

Right now there are two ways to turn a clip in the edit into a fusion composition.
  1. You can go into a clip's Fusion composition, bring in some other media, then connect that to the output instead. When you go back to the Edit page, the clip will show the thumbnail and file name of the original clip, but the timeline viewer will play back the new media with the original clip's audio, and the clip can be resized to the same extent as the original clip. This can also be done on a compound clip or adjustment layer.

  2. You can right-click the clip in the timeline and click "New Fusion Clip". Fusion Clips are kind of like compound clips where each track is fed into the Fusion composition as a MediaIn node. A Fusion Clip can't be extended past the length of its longest clip. By default, that will be the length of the clip as it was at the time it was turned into a Fusion Clip. By right-clicking the Fusion Clip in the Edit page, you can open it as a timeline, extend that clip, and even put different clips on the same track. Adding a new track to the Fusion Clip doesn't affect the Fusion composition, and adding a new MediaIn node in the Fusion composition doesn't seem to add that media to the Fusion Clip.

Both of these methods have some interesting potential, but they're coupled with a LOT of undesirable behavior. For example, both methods technically allow someone to have nested Fusion compositions and color grade before compositing, because they both give you ways to sort of loop Resolve's entire graphics pipeline multiple times per frame. The problem is that they're so confusing and unpredictable in how they'll be used that they can very quickly become really unstable. I encourage people to convert a clip to a Fusion Clip, then experiment with how changing the Fusion Clip affects its composition and vice versa. I did this twice while writing this post, and within two minutes Resolve started throwing up error messages and continually told me it couldn't save the project when I tried to do anything. I've never seen Resolve do this before, and that was without color grades or nested compositions, and the composition only consisted of three nodes with only two connected.

I know it seems gatekeepery to say how things should or shouldn't be done, but things shouldn't be done this way. Maybe we should look at Fusion in Resolve's current implementation as experimental, find more sane ways to implement certain behavior, and restrict other behaviors.

For example, I recently changed the Fusion composition of an Adjustment Clip and immediately saw the benefits of something like that. Let's say that you want everything on your timeline to appear to be playing on an old CRT. This is something you can do right now with a Fusion composition on an Adjustment Clip, but I don't know if this was by design or if the two features just happen to intermingle in a useful way. You could achieve the same effect if the MediaIn node had a property allowing a composition to work like an adjustment clip. Then you could make titles that are capable of applying color adjustments to the clips below them. Or what if Create (Fusion) had the ability to make Fusion effects? Then these effects could be applied to clips or adjustment layers.

I've seen someone request the ability to color grade something in the Color page before bringing it into a Fusion composition. Obviously, you can color grade in Fusion, but they wanted to do it with the Color page for whatever reason. As mentioned before, this can be done now by applying a grade to a clip and then putting it into a Compound or Fusion Clip, which wouldn't even be an option with what I'm suggesting. Considering how unstable that can potentially be, maybe that's a good thing. Maybe the better option would be to give the Process (Color) page the ability to bring Fusion Effects into the Process page just like an OFX node. Then that person could use the Color page to grade before and after the Fusion Effects node. That would get them what they wanted without cluttering the Edit page or looping the timeline.

If you want to skip the tangent...
Basically, what I'm getting at is that I can't guarantee that what I'm suggesting would be without some disadvantages, but I'm very confident that these changes would bring more advantages not only to the end user but to the development and stability of Resolve. So in regards to your specific question, things wouldn't work exactly as they do now, but I have some ideas that could be helpful in the scenario you presented.

wfolta wrote:So I'd like to right-click on that clip and be thrown into Create (Fusion) with MediaIn pre-populated with the appropriate clip.

In this case, it would be Right Click > Convert to Composition. Then it would either replace the clip on the timeline with a composition or make the composition an alternate take in the take selector. Also, MediaIn nodes wouldn't be used. MediaIn nodes are weird in that they can refer to clips in the timeline or clips in the Media Pool. Since Create would be outside of the project, it really wouldn't be aware of either. Instead, it would use Loader nodes (which would have to be made to support more than image sequences in this case) that refer to the original source file and inherit the ins and outs from the edit.

wfolta wrote:What, then, if I edit the clip in the timeline -- say making it longer at both ends -- and then go back to Create? Would it be straightforward to have my MediaIn automatically resize as appropriate?

Once a clip is converted to a composition, it works like a Fusion Clip for the editor, in that the longest the editor can make the clip is the length of the current composition. So if the VFX artist hasn't changed the length since the clip was converted to a Fusion composition, the editor will be stuck with the length it had when it became a composition. The reason for this is that the VFX artist may not have tracked an object past the extents of the original in and out points. However, if the VFX artist extended the out point, then it wouldn't extend the length of the clip in the timeline, but the editor would be allowed to extend the clip up to the new out point.
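The proposed length rule can be sketched in a few lines. This is a minimal illustration, assuming (as described above) that a converted clip behaves like a Fusion Clip whose editable extent is clamped to the composition's current out point; the function name and frame counts are made up for the example.

```python
def max_editor_out(comp_out_point: int, requested_out: int) -> int:
    """The editor can extend the clip, but never past the composition's out point."""
    return min(requested_out, comp_out_point)

# Clip converted at 120 frames; the VFX artist hasn't extended the comp yet:
print(max_editor_out(comp_out_point=120, requested_out=150))  # 120
# After the VFX artist extends the comp's out point to 150 frames:
print(max_editor_out(comp_out_point=150, requested_out=150))  # 150
```

The asymmetry matches the text: extending the comp doesn't move the clip in the timeline, it only raises the ceiling the editor can trim up to.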

wfolta wrote:If Create's output is pre-rendered -- i.e. MediaOut is to a file, not the timeline -- does that mean I have to go back to Create after lengthening the clip?

Its output wouldn't be pre-rendered. If it were pre-rendered, then it would just be a normal clip or image sequence. In this case, MediaOut would work in exactly the same way and the composition would be cached upon import, just like it is now.

wfolta wrote:Part of the appeal of the current process -- which is full of inconsistencies and redundancies as you note -- is that Fusion lives downstream of Edit -- i.e. its MediaIn nodes come from edit -- and also upstream of Edit -- i.e. its MediaOut nodes feed into edit. Would Create appear to work the same way -- even if the underlying pipeline/flow is better organized?

That's hard to answer. In some ways it would be the same; in other ways, it would be very different. One way it would be different is that when you drag a Fusion composition into the timeline now, it makes a copy instead of an instance. So if you go into the composition for that title and change it, it doesn't affect the other copies. With Create being outside of the project, you would have to make a separate composition file to do that, but what you gain is the ability to make changes to a title and have them propagate to all instances of that title in any project, and the project itself stays a lot smaller.
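The copy-versus-instance distinction is easy to show with a toy example. This is just an analogy, not Resolve's actual data model: the dict stands in for a title composition, `deepcopy` for today's drag-to-timeline behaviour, and the shared reference for the proposed Create model.

```python
import copy

template = {"text": "Lower Third", "color": "white"}

# Current behaviour: each timeline use gets an independent copy.
copied = copy.deepcopy(template)
copied["text"] = "Episode 2"
print(template["text"])   # Lower Third -- other uses are unaffected

# Proposed behaviour: timelines reference one shared composition file.
instanced = template
instanced["color"] = "red"
print(template["color"])  # red -- the change propagates to every use
```

The trade-off is exactly the one described: copies isolate changes but bloat the project, while shared instances keep the project small and let one edit update every use.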

From the perspective of being able to Right Click a composition in the timeline and click "Open in Fusion/Create" that would stay the same. Not sure if I answered this well enough.

Edit: Somehow I missed the very first thing you said and I think some of the terminology I used might be confusing given the terminology you used.

wfolta wrote:OK, I've looked more closely at the diagram and see "Clip FX" further downstream than "Create". So it looks like it's possible to apply Fusion FX to edited clips rather than entire assets. So my question would be: how does this work in practice?

Even though Fusion transitions and titles are in the Effects panel, they aren't really effects in Resolve. Titles are really just footage with editable properties, and transitions are kind of like effects that apply to the ends of clips, but Resolve doesn't really give you a way to apply a Fusion composition to a clip as if it were a Video FX.

What I called "Clip FX" in that diagram would be video FX applied to a clip on the Edit page. I think Resolve's manual calls them "Edit FX", and they're already downstream of Fusion processing.
Last edited by Mark Grgurev on Mon Jun 29, 2020 8:17 pm, edited 1 time in total.

TheBloke

  • Posts: 1905
  • Joined: Sat Nov 02, 2019 11:49 pm
  • Location: UK
  • Real Name: Tom Jobbins

Re: Rethinking How Resolve's Pages Interact 2.0

PostMon Jun 29, 2020 8:09 pm

EDIT: This reply crossed with your reply to wfolta, so I've not yet taken your reply to him into account here.

Very interesting ideas, Mark. I totally agree that the current setup is messy and confusing and involves a huge amount of duplication. Perhaps because BMD have been glueing new components on at a rate of knots and haven't really had the time to sit back and refine the process. That, and perhaps not wanting to change too much and give the feel of a completely different program that has to be re-learnt.

With regards to the Fusion split-off to Create, I agree with many of the points, especially how ephemeral Fusion compositions are in Resolve. This is something I've noticed myself and think definitely needs improving. The only indicator you have outside of the Fusion page that a composition exists in a timeline is a little Fusion composition icon on the clips concerned, and from the Media Pool there's no indication at all. Even 'Fusion Clips', which you'd think surely reference a composition, actually don't contain one; again, the comp is stored in the timeline that uses the Fusion Clip, not in the clip itself.

However, as wfolta says, I'm not sure the Create idea covers everything that Fusion can do in Resolve today. For example, my current (and first) Resolve/Fusion project uses numerous Fusion Clips, each of which contains three layers of video with multiple clips on each. The Fusion comp accesses these layers independently as MediaIn1, 2 and 3. This is vital for what I'm doing, and means that my Fusion workflow is very much integrated into the Edit side of things.

Likewise, the reason a comp is made on an individual media clip is so that comp can access the content of that clip via MediaIn. And then there's Fusion Adjustment Clips, which sees the composite of every video layer below it via a single MediaIn node.

So we could call that four different kinds of Fusion composition in Resolve:
1. Fusion Composition - no MediaIn, entirely standalone.
2. Normal timeline clip - MediaIn connected to the clip.
3. Adjustment Clip - MediaIn connected to all layers below the clip
4. Fusion Clip - one MediaIn per video layer in the Fusion Clip.

Of these four, only 1 really fits your separate Create model, if I've understood it correctly.

My current workflow involves switching back and forth between Edit and Fusion on an almost minute-by-minute basis as I sync Fusion camera movements with audio and make changes that involve both editing the Fusion comp and altering the timing of source footage. I'd therefore love better integration between Edit and Fusion - in fact, ideally what I want is the ability to have both open at once in different windows - not for these to be moved further apart, or separated entirely.

Fusion isn't just lower thirds and title cards, and one of the great things about having it in Resolve is that it becomes easier to leverage that power due to its close linking to footage.

When I was setting up this project I did a lot of the Fusion composition design work in Fusion Studio, so I didn't need to dedicate RAM to Resolve stuff I wasn't using, and because Fusion Studio has a better and more customisable UI than Resolve's Fusion page. Once the composition design was completed, I briefly considered if I should continue using Studio for the actual integration with my footage.

But to do that would have required intermediate rendering of all my Resolve Edit page stuff, pulling it into Studio, making the necessary per-scene decisions, then rendering it in full to get it back in Resolve. VFXConnect automates that process; however, it would have been nothing like as immediate as doing the work in Resolve, with Fusion fully integrated. Stuff like syncing my Fusion work to audio, and making changes that involved both Fusion alterations and changing the source footage, would have taken far longer if I'd had to render out each time. In fact they'd basically have been impossible for me as a newbie who didn't have everything planned out carefully in advance and was learning along the way.

So in summary, I definitely agree that streamlining is much needed, and that there's a huge amount of duplication that it would be wonderful to get rid of. I am baffled by the existence of a separate Cut page, and I really wish I didn't need to drop in and out of the Fairlight page to do things that could also be available on Edit. I think all of this could be improved massively, especially if BMD would let go of their rigid page-by-page approach and consider making a more fluid, flexible UI with different presets/workflows that reconfigures the UI according to the job at hand.

But on the Fusion issue, I would hate to see any change made that reduced the power of the integration with Edit. I want tighter integration and more flexibility, not less. And the issues Fusion has today - like how vague compositions are - can easily be solved in other ways, IMHO.
Resolve Studio 17.4.3 and Fusion Studio 17.4.3 on macOS 11.6.1

Hackintosh:: X299, Intel i9-10980XE, 128GB DDR4, AMD 6900XT 16GB
Monitors: 1 x 3840x2160 & 3 x 1920x1200
Disk: 2TB NVMe + 4TB RAID0 NVMe; NAS: 36TB RAID6
BMD Speed Editor

Mark Grgurev

  • Posts: 802
  • Joined: Fri Nov 03, 2017 7:22 am

Re: Rethinking How Resolve's Pages Interact 2.0

PostMon Jun 29, 2020 8:48 pm

I don't want to fully respond to your post since you said you haven't adjusted it to take into account my other post but I'll respond to some stuff that I don't imagine you changing.

TheBloke wrote:So in summary, I definitely agree that streamlining is much needed, and that there's a huge amount of duplication that it would be wonderful to get rid of. I am baffled by the existence of a separate Cut page, and I really wish I didn't need to drop in and out of the Fairlight page to do things that could also be available on Edit. I think all of this could be improved massively, especially if BMD would let go of their rigid page-by-page approach and consider making a more fluid, flexible UI with different presets/workflows that reconfigures the UI according to the job at hand.

I really dislike the Cut page. Not its features, but the fact that it's treated like a separate thing.

Same thing with Fairlight. Before the Fairlight page, audio track automation was on the Edit page, but they moved it to the Fairlight page only. The Elastic Wave feature that they added to the Fairlight page is actually the same as the retiming tools in the Edit page. Using Elastic Wave on an audio clip with linked video actually applies retiming to the video, and their retime points are bound to each other.

I actually don't want them to get rid of the page approach; I just wish they didn't use pages purely to facilitate having different layouts. For example, the Color and Fusion pages aren't just different layouts, they act completely differently. Sure, they may both have nodes, but Fusion's nodes and Color's nodes work very differently, and their approaches to playback are very different, so they make sense to me as different pages. On the other hand, Cut, Edit, and Fairlight are all sequencing pages, just with slightly different layouts.

I'd like to see them use a hybrid approach where they have a fixed amount of pages (Create, Media, Edit, Process, and Deliver) and allow the user to create and store different layouts for each page.

TheBloke wrote:But on the Fusion issue, I would hate to see any change made that reduced the power of the integration with Edit. I want tighter integration and more flexibility, not less. And the issues Fusion has today - like how vague compositions are - can easily be solved in other ways, IMHO.

I'm curious what exactly you're doing in your project. Based on what you said, it definitely sounds like you're using Edit and Fusion in very tightly integrated ways but I'm not sure of the specifics enough to know whether or not what I'm thinking would work as a good replacement for you.

What I can say, though, is that I think there are probably some things that Fusion is currently needed for that the Color page could grow to handle. Like when you talk about applying things to tracks, I immediately wonder if track-based transforms, FX, and grades would do the trick. It only took a few minutes of messing with them in Vegas Pro for me to realize how useful they could be.

wfolta

  • Posts: 625
  • Joined: Fri May 15, 2020 1:12 pm
  • Real Name: Wayne Folta

Re: Rethinking How Resolve's Pages Interact 2.0

PostMon Jun 29, 2020 9:28 pm

TheBloke wrote:So we could call that four different kinds of Fusion composition in Resolve:
1. Fusion Composition - no MediaIn, entirely standalone.
2. Normal timeline clip - MediaIn connected to the clip.
3. Adjustment Clip - MediaIn connected to all layers below the clip
4. Fusion Clip - one MediaIn per video layer in the Fusion Clip.


This. The thing I would want to avoid is making Create feel like I'm using Nuke or Natron and having to keep track myself of what clips -- and what pieces of clips -- I've exported to Nuke/Natron, which versions I've reimported, etc. The current workflow doesn't feel like a round-tripping, it feels integrated, and so Create needs to also feel that way. (And work that way, too: the pipeline should be able to optimize operations along itself in the same way that multiple nodes in Fusion can optimize into a single operation that doesn't instantiate intermediate results. At least I think Fusion nodes work that way.)

Mode 1, above, I'd use mainly for a near-standalone intro/outro motion graphic kind of thing. It could be done in a separate program, but Resolve has a Fusion mode that I've learned, so why not? I'm comfortable with essentially working entirely in Fusion on that, since it really doesn't have a tie-in to editing: it won't be edited, it doesn't use assets that the editor is using -- it's creating an asset -- and it's not meant to be edited. (And it may be reused across projects or at least timelines.)

Mode 2 is where I live a lot: park on a clip in Edit, switch to Fusion and start off with a comp that's set up with that clip. I'd be doing stuff that uses that clip and maybe generators, but no other media. (I cringe when people use this mode to key over a background, and bring the background directly into Fusion, bypassing a timeline. Maybe that's just me.)

Mode 4 is how I'd do a key: the editor designates foreground and background, and I get a comp pre-set-up in Fusion with those two assets.

Mode 3 I don't really do, but I see the distinction.

Then I think there is a functionality dimension to this. Some VFX don't depend on any asset that might be part of the editing process. (Some VFX use only generators and don't depend on any external assets at all.) Other VFX are tightly coupled to the assets as they exist in the editing process. Maybe something like:

1a) titles or motion graphics,
1b) tracking, and
2a) keying (assuming for discussion a theoretical "pure" key that doesn't also include tracking),
2b) other continuous effects

Functionality 1 is essentially about keyframes: in the case of 1a, the keyframes might be able to scale to the length of the clip -- they don't depend on pixels in any underlying content -- while in 1b the keyframes are synchronized to pixels in the underlying content. Functionality 2 doesn't have timing per se: 2a depends on underlying pixels but mostly as a snapshot in time, while 2b may be an effect that affects underlying pixels but doesn't depend on them -- say a distortion or blur.
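The 1a/1b distinction can be sketched concretely. This is a toy model, not Resolve's keyframe engine: functionality 1a keyframes stretch proportionally when the clip is lengthened, while 1b (tracking) keyframes are pinned to specific source frames, so lengthening the clip just leaves the new footage untracked. The `fade` values are illustrative.

```python
def rescale_keyframes(keys, old_len, new_len):
    """1a: keyframes scale with clip length (no dependence on pixels)."""
    return {round(frame * new_len / old_len): value for frame, value in keys.items()}

def pinned_keyframes(keys, new_len):
    """1b: tracked keyframes stay on their source frames; extra footage has no data."""
    return dict(keys)  # unchanged -- the new frames simply aren't tracked

fade = {0: 0.0, 50: 1.0, 100: 0.0}
print(rescale_keyframes(fade, 100, 200))  # {0: 0.0, 100: 1.0, 200: 0.0}
print(pinned_keyframes(fade, 200))        # {0: 0.0, 50: 1.0, 100: 0.0}
```

This is why, in the scenario Mark answered above, lengthening a clip with a tracked callout forces a trip back to the compositor, while a pure fade or title could in principle rescale automatically.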
Resolve Studio 17 latest, Fusion Studio 17 latest, macOS Big Sur latest, MacBook Pro 2020 64GB RAM and Radeon Pro 5600M 8GB VRAM

Mark Grgurev

  • Posts: 802
  • Joined: Fri Nov 03, 2017 7:22 am

Re: Rethinking How Resolve's Pages Interact 2.0

PostMon Jun 29, 2020 10:30 pm

wfolta wrote:The current workflow doesn't feel like a round-tripping, it feels integrated, and so Create needs to also feel that way.

I don't see any reason why it wouldn't feel integrated.

wfolta wrote:(And work that way, too: the pipeline should be able to optimize operations along itself in the same way that multiple nodes in Fusion can optimize into a single operation that doesn't instantiate intermediate results. At least I think Fusion nodes work that way.)

Oddly enough, I was thinking about this recently. Fusion does optimize certain things like transforms, but otherwise I think it works on intermediate results. In the last few months I was made aware of the fact that Fusion tries to sort of collapse transforms so things aren't interpolated multiple times. The reason the Flatten Transform option exists is specifically to disable that behavior at a certain point.

I think if it compiled a bunch of nodes into something like shaders, then rendering should speed up after the first frame. I don't feel like I've seen anything like that, though the Custom Tool could definitely use something like it.

One thing that should be brought up is how Fusion's current place in the pipeline is functionally redundant with Color in a lot of ways. Before Fusion was integrated, you could still do green screening, text tracking, and a number of other things using the Color page. Here's what a three-layer composition with keying and color grading looks like in the Color page.

Image

If we compare them, they have a lot of equivalent functionality.

Nodes = Nodes
Templates = PowerGrades
Merge Nodes = Layer Mixers
Keyframes Panel = Keyframes Panel

Both also give you access to the same LUT library, both can use ResolveFX, both have a Clips panel for hopping to the next clip in the timeline, and both have tools for color correction, keying, power windows, transforms, and tracking.

However, Fusion's feature set is a superset of the Color page's. From a purely functional perspective, Fusion could totally replace the Color page, but I wouldn't want to do that.

The Color page only has a few types of node: Corrector, Layer Mixer, Parallel Mixer, Alpha Mixer, Color Splitter, Color Combiner, and OpenFX. The majority of the work is done with the Corrector node, which contains 13 panels in a fixed pipeline, and ResolveFX and LUTs can be applied to them as well. All these decisions allow you to do a lot with fewer nodes. What you lose in fine-grained control over the order of operations, you gain back in playback speed and quicker access to tools.
TheBloke wrote:So we could call that four different kinds of Fusion composition in Resolve:
1. Fusion Composition - no MediaIn, entirely standalone.
2. Normal timeline clip - MediaIn connected to the clip.
3. Adjustment Clip - MediaIn connected to all layers below the clip
4. Fusion Clip - one MediaIn per video layer in the Fusion Clip.


1. We all agree that Fusion compositions that use their own unique assets don't gain anything from existing solely within a project, or more specifically, in a timeline.
2. I really can't see what advantage this would have over a standalone composition. Any new assets brought in would be self-contained, and you can't adjust things relative to the tracks beneath because you can't see them in the Fusion page. Depending on what's being done, there may not even be any advantage over doing the same thing in the Color page.

3. I feel like Fusion Adjustment Clips actually negate the need for regular Fusion comps. When you create an empty Fusion composition, you can't see any of the underlying tracks from the Fusion page. With Fusion Adjustment Clips, not only can you see those tracks, but you can decide whether or not you want to modify them. If you don't want to modify them, just delete the MediaIn node; if you do, just add nodes after the MediaIn, and you still have the option to merge other things on top.

At the moment, Resolve has an artificial limitation in place where it won't allow you to bring in external mattes on Adjustment Clips so Fusion Adjustment Clips are the only way to adjust the layer beneath while compositing something else on top within the same clip. If Resolve got rid of that limitation for the Color page then, again, this might not have any advantage over doing the same work in the Color page.

4. Fusion Clips have maybe the most integration between the Edit and Fusion pages of any of these examples. They have the benefit of allowing you to link multiple clips to a MediaIn node by putting them in series. However, it's difficult for me to see much advantage in this either. If you're doing complex green screen work, then it's not unlikely that you'd need to track from one clip to the other, and moving a clip in the timeline would misalign the tracking. The same applies to tracking text.

I feel like it would just make more sense to have Create be the out-of-project compositor that creates stand-alone clips involving a lot of tightly integrated assets, and let Process (Color) be the in-project effects compositor.

TheBloke

  • Posts: 1905
  • Joined: Sat Nov 02, 2019 11:49 pm
  • Location: UK
  • Real Name: Tom Jobbins

Re: Rethinking How Resolve's Pages Interact 2.0

PostSun Jul 05, 2020 8:54 am

Note: The following three posts started out as one mega post responding to Mark's post, but I've since edited them to separate them out a little. Apologies for the multi-essay composition!

This first post describes my current project and how I found Edit->Fusion integration essential. The second post talks in more detail about why I feel that integration is so important and powerful, and considers some alternative changes that would solve current problems (eg the lack of visibility to Fusion compositions) while also improving the power of Fusion. The third post responds to a couple of Mark's quotes, on related matters.

Mark Grgurev wrote:I'm curious what exactly you're doing in your project. Based on what you said, it definitely sounds like you're using Edit and Fusion in very tightly integrated ways but I'm not sure of the specifics enough to know whether or not what I'm thinking would work as a good replacement for you.
I am working on what started out as a Covid19 lockdown project for my mother's theatre company. Lockdown is now officially over, or at least relaxed, in the UK and I'm still not done. But that's another story..

We got together over Zoom and first rehearsed, then recorded extracts from a play they performed last year. The three actors performed in their homes in front of green screens, and recorded their audio locally via decent quality microphones and Audition or Audacity. Video was captured using basic webcams (one from an iPad, one from a Macbook Pro, one a USB Logitech C920), and I recorded the video signal at my end using OBS, capturing the full screen Zoom window (Zoom's built in screen recording being compressed as hell). I put Zoom at full-screen on a 4K monitor, captured it to a single 4K video, and then later used ffmpeg to crop out each of the three actors into their own separate video files. These ended up being 1808x1016.
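The cropping step above can be sketched with ffmpeg's `crop=w:h:x:y` video filter. This is a hypothetical reconstruction: the file names and the (x, y) offsets below are made up, since the real positions depend on where Zoom's gallery view placed each actor in the 4K capture; only the 1808x1016 output size comes from the post.

```python
def ffmpeg_crop_cmd(src, x, y, dst, w=1808, h=1016):
    """Build an ffmpeg command that crops one actor's region out of the capture.

    crop=w:h:x:y takes the output size plus the top-left offset; -an drops the
    audio track, since the actors' audio was recorded separately anyway.
    """
    return ["ffmpeg", "-i", src, "-filter:v", f"crop={w}:{h}:{x}:{y}", "-an", dst]

cmd = ffmpeg_crop_cmd("zoom_capture_4k.mkv", 56, 32, "actor1.mov")
print(" ".join(cmd))
# ffmpeg -i zoom_capture_4k.mkv -filter:v crop=1808:1016:56:32 -an actor1.mov
```

Run once per actor with that actor's offsets to get the three separate video files described above.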

In the play, the cast are imprisoned in Newgate Prison, London at the height of the Great Plague of 1665.

Rather than make the video like so many other "Zoom projects" and just have them in boxes on screen, I wanted to try and make it seem like they were really in the same room. Hence recording in front of green screens, so I could key them out and then integrate them.

Here's the layout of my Scene 1 Fusion clip (click for larger):
Image

These clips started out on a standard timeline. I synced the separate audio files to the three video files, and then made an edit of the three layers of clips. We recorded at least 3 takes of each scene (three scenes total), and in some places there were extra mini-takes: if an actor fluffed a line or there was a loud external noise, they'd stop, go back a couple of lines, and carry on. Scene 1, for example, is edited down to 13.5 minutes from over 75 minutes of source footage (for each of the three actors).

After I had my sound and video edit, I merged all the video clips into a single Fusion Clip, and moved into Fusion.

Here's my Fusion composition (click for larger):
Image

The three nodes at the exterior of the group, named "Liam", "Sorcha" and "Char" are the MediaIn nodes, taking the video layers directly from the Fusion Clip shown earlier. This leverages the unique capability of a Fusion Clip to provide multiple separate MediaIn feeds to the composition.

These MediaIn videos are first cropped back to 1808x1016, as Resolve insists on providing the MediaIn video at the project resolution of 1920x1080 (they don't get scaled because I have Scaling: Crop set in the Timeline).

Then each actor's video goes to an ImagePlane, which is part of a 3D scene. The scene looks like this:
Image

That's three ImagePlanes positioned inside another ImagePlane, which has been displaced (Displace3D) by a height map generated from the background image, which depicts a curved cell/dungeon type room. There's a spotlight and a camera. This is all connected to an OpenGL Renderer3D.

To create shadows I then need a simplified duplicate of the same scene, which is connected to a software Renderer3D. This is necessary as the OpenGL renderer can't handle transparency plus shadows, and all my actor videos are using the alpha channel (as they were keyed out of the green screen). It turned out to provide better performance to use two renderers like this and then merge them rather than just use a software renderer for everything. Though 'better performance' is still very much relative, as I only get 3-5 FPS total on my aging hardware.

The green screen keying was also done in Fusion (Clean Plate + DeltaKeyer), but to improve efficiency I did that as a separate step, rendered out to ProRes 4444 intermediate footage. Regardless of efficiency it was necessary to render this separately anyway, as I later need to do Color page adjustments on the actors after they've been keyed out and before they reach the Fusion scene. Once the Fusion scenes are done I will render them out to ProRes, then import and colour grade the results and add some titles and the like before doing a final render.

So, that's the setup. A complex Fusion 3D scene that requires three separate video inputs, with those inputs being entirely dependent on my editing in the Edit page.

When I was setting up the camera movements - throughout the 13.5 minutes of scene one there's at least 50 keyframes on each of the camera's X,Y and Z coordinates - I had to constantly go back and forth between Edit and Fusion, to ensure those camera movements were correctly lined up to the audio, and to my edit points.

For example, whenever there's a cut between two different takes I need to hide this with a camera movement or change in camera position. So I did some "noddies" and the like, and also switched the camera back and forth between showing all three actors at once, or zooming in on one or two of them.

When camera moves were done to hide cuts, I needed to carefully check where source clips started and ended, and then time the camera movements accordingly. Other times camera movements were done for artistic reasons, in which case I usually needed to refer to the audio. In both scenarios, I was using information on the Edit page to define what I did on the Fusion page.

I was therefore jumping back and forth between Edit and Fusion very often, sometimes multiple times a minute. I dearly wished that it was possible to have Edit and Fusion open at the same time, in separate windows.

I later wondered if perhaps there was some way to expose the Fusion node's controls on to the Edit page, so I would not have needed to go into Fusion nearly as often? But I haven't checked into that as yet. I did make a Fusion macro out of my composition, combining the ~40 nodes into a single macro group which exposes the parameters I might need to tweak. In addition to the camera movements I can tweak a variety of other parameters, like lighting and shadow levels, on the macro node. But still only within Fusion, currently at least.

Anyway, I hope it's clear how important it was for me that Fusion and Resolve were tightly integrated. The ability to have a Fusion Clip that can access any number of layers was absolutely vital.

How would I have done the same thing using a separated Fusion, like Fusion Studio is today? Well I suppose I would have had to render out the clips (the ones in the Fusion Clip) to intermediate ProRes, and then import with Loaders into Fusion Studio. Then I'd probably have had to add markers for all the scene cuts, so I'd have some reference in Studio as to where the edit points were (I think it's possible to export and import markers, not sure.) But then how to sync to the audio? Fusion (Studio) does have some very basic audio support, so I suppose I could import the audio too.

Then any time I realised I needed to make a change on the Edit page - which happened a number of times after I'd started the Fusion comp - I would have to re-render all that intermediate footage, or else try and make the changes twice, in both pieces of software.

It's certainly all possible, but it's far less flexible and dynamic than having a full Edit->Fusion integration. And it would have been particularly awful for me as a newbie, learning along the way and changing my mind frequently, and thus needing all the flexibility and efficiency I could get.
Last edited by TheBloke on Sun Jul 05, 2020 10:56 am, edited 3 times in total.
Resolve Studio 17.4.3 and Fusion Studio 17.4.3 on macOS 11.6.1

Hackintosh:: X299, Intel i9-10980XE, 128GB DDR4, AMD 6900XT 16GB
Monitors: 1 x 3840x2160 & 3 x 1920x1200
Disk: 2TB NVMe + 4TB RAID0 NVMe; NAS: 36TB RAID6
BMD Speed Editor

TheBloke

  • Posts: 1905
  • Joined: Sat Nov 02, 2019 11:49 pm
  • Location: UK
  • Real Name: Tom Jobbins

Re: Rethinking How Resolve's Pages Interact 2.0

PostSun Jul 05, 2020 10:16 am

In my view changing Resolve's Fusion into a separated, less-integrated Create component would be a significant retrograde step, losing a key part of what makes Resolve's Fusion page so great today.

Despite the many issues and inconsistencies that we know about and have been discussed here, I still think it's absolutely fantastic that Resolve combines an NLE, color grading, DAW and VFX into a single package at a single price, which can be as low as free.

In my view Fusion's power and appeal is greatly enhanced by its integration to Resolve. It's a major selling point for BMD that rather than using Premiere Pro and After Effects separately, you can use a single program, Resolve. PP + AE has Dynamic Link, and that seems kind of like what you're proposing for Fusion? But that's a step backward for Resolve; it's not as good as Fusion's current integration, and IMHO it would be a terrible shame if BMD went backwards on that integration instead of improving it even further.

I absolutely agree that there are some aspects that are sub par. We really need a better way of keeping track of Fusion compositions. In my view, every Fusion composition should get an entry in the Media Pool, so they're clearly visible and adjustable. Every time you open Fusion on a given clip and start editing, it should automatically make a new Fusion Composition in the Media Pool, with a new field added that indicates which clip it's attached to, and another that shows how many inputs it supports ("0" for a Fusion Composition, "1" for a clip composition and an Adjustment Clip; "Many" for a Fusion Clip comp).

This would open up great new possibilities, like being able to copy a Fusion composition from one clip to another, or to many clips at once, using the Edit + Media Pool panes, without Fusion even being open. For example, select a group of clips on Edit, highlight a Fusion composition in Media, and then right-click on the clips in Edit and choose "Set Fusion Composition from Media Pool". Similar in operation to "Conform Clip with selected clip from Media Pool". That would be a non-destructive operation, because you'd either be adding a composition to a clip that didn't already have one, or else changing which comp was used by a clip. Either way, you could easily revert to the prior state, with no risk of compositions being lost or deleted.

Therefore I think Resolve should support a many-to-one relationship between clips and comps, such that I could take a single Fusion comp and apply it to a whole bunch of clips at once. As well as being a major efficiency boost over the steps required today (open Fusion, click on every clip in turn and paste in some previously copied nodes), that would also solve a major annoyance: currently if you have a clip on which you create a Fusion composition and then you cut that clip in Edit, you end up with two separate comps, no longer linked, and any changes made to one must be manually copied to the other.

Instead, with my suggested changes, you would still have a single comp, referenced by both clips. This would be made clear in the UI because the comp would have a single name and a single Media Pool entry. If you did actually want two comps across the two clips, you could simply copy the current comp to a new clip and then assign that new comp to the second clip.
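To make the reference semantics concrete, here's a minimal sketch in plain Python (every name here is hypothetical, none of it is a real Resolve API) of the many-to-one model: clips hold references to a single comp, so reassignment is non-destructive and the two halves of a cut clip keep sharing one comp:

```python
# Hypothetical model: comps are Media Pool entries, clips only reference them.

class FusionComp:
    def __init__(self, name, inputs):
        self.name = name
        self.inputs = inputs  # 0, 1, or "Many", per the comp types above

class Clip:
    def __init__(self, name):
        self.name = name
        self.comp = None  # a reference, never a private copy

    def set_comp(self, comp):
        previous, self.comp = self.comp, comp
        return previous  # the prior assignment survives, so it's revertible

glow = FusionComp("GlowTitle", inputs=1)
left, right = Clip("shot_01"), Clip("shot_01 (cut)")

# "Set Fusion Composition from Media Pool" applied to both halves of a cut:
for clip in (left, right):
    clip.set_comp(glow)

assert left.comp is right.comp  # one comp, two clips: edit once, both update
```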

Stuff like composition versioning would then be stored against - and visible in - this Fusion Composition, and thus be clearly visible and accessible from both Media Pool and Edit. Likewise the File -> Export Fusion Composition and Import Fusion Composition menu items would move to the Media Pool, making it possible to manage your compositions from any part of Resolve, not just when you're looking at that specific comp in Fusion.

That's all off the top of my head and it can definitely be refined further. The main point as far as I'm concerned is: don't remove the wonderful integration we currently have with Fusion and the rest of Resolve. Fix the issues we have in ways that build on the integration we have today rather than diminishing it. And I don't think that's hard to do at all.

From that point, I could see all sorts of possibilities. Like expanding on the MediaIn/MediaOut concept to support compositions that reference each other. This has been a common request amongst the Fusion userbase: support for a 'live macro' where one definition can be used in multiple comps, allowing the single definition to be changed and affect all comps that use it.

Well, I could see that being straightforward to achieve in Resolve. We have Fusion Clip comps that support an arbitrary number of MediaIn inputs. So extending the static input allocation we have today (as per my list of 4 comp types in my earlier post), why not make "inputs" and "outputs" a configurable parameter on a comp?

This could be combined with the existing Macro Editor. You would take an existing comp and turn it into a "Live Comp" or "Macro Comp" or whatever name. The UI for this would be similar to the existing Macro Editor: you define the input(s) and output(s), and you expose any parameters that need to be accessible to calling comps.

Now in any other comp, you can add this Live Comp as a node. This node has the appropriate number of inputs and outputs, and shows in the Inspector the configured parameters.

So far, this is identical to a Macro. But the key difference is that this is not a static definition on disk which is copied and therefore duplicated into any comp that uses it. This is a node that references another comp in the same project, or in the cross-project Power Bins. Therefore you can go and edit that comp definition at any time, and the effects of those edits apply to comps that use it.

That's the sort of extra power and flexibility I would love to see, and which I believe is totally possible. More flexibility, not less.
Last edited by TheBloke on Sun Jul 05, 2020 11:03 am, edited 1 time in total.

TheBloke

  • Posts: 1905
  • Joined: Sat Nov 02, 2019 11:49 pm
  • Location: UK
  • Real Name: Tom Jobbins

Re: Rethinking How Resolve's Pages Interact 2.0

PostSun Jul 05, 2020 10:17 am

Mark Grgurev wrote:However, Fusion's feature set is a super set of the Color page. From a purely functional perspective, Fusion could totally replace the Color page but I wouldn't want to do that.
Mark Grgurev wrote:I can say that though, is that I think there's probably some things that Fusion is currently needed for that the Color page could probably grow to the needs of.

I think that in a perfect world, Fusion and Color would merge. As you say, there's a lot of overlap in what they do - both node based, both with trackers and qualifiers/keyers, both supporting OFX nodes, both using keyframes in a similar way, both using the same cache, and applying after the Edit page, and so on. You can do some colour grading in Fusion, and you can do some simple visual effects in Color. Combining them makes total sense to me.

I've recently started using Color for the first time, and it's fantastic. But several times I came up against a more complex task and thought "this would be easier if I could use such-and-such Fusion node here".

I don't think it will ever happen, but I think it would be awesome if it did. It would require more UI flexibility than Resolve has today, because it would require being able to bring up the appropriate panels according to which nodes were selected, and have this work seamlessly as you moved between Color and Fusion nodes. Plus Color has a lot of existing users - more than Fusion I'd think - who will expect the UI to remain very similar to what they've used for years.

I think it'd need something akin to Adobe's UI profiles/workspaces, where at the click of a preset you could change the layout of the "Color/Fusion" UI according to which of those tasks you're currently focused on, and what information is most useful to see. Like how in Photoshop the entire UI can reconfigure when you start doing 3D stuff; with of course the ability to customise those workspace profiles.

They could even maintain the separate "Color" vs "Fusion" tabs, but have them simply be UI presets that configure the interface appropriately for that task. The underlying engine would be the same, and you'd be able to access all the features in either mode, but the interface would be tailored according to the tab you're on. Actually that speaks to a general idea for future UI improvement - and solving some of the duplication you created this thread to discuss. All of the current "pages" could in effect become workspace presets. There'd then effectively be two underlying 'engines': Cut/Edit/Fairlight/Deliver (layer based), and Color/Fusion (node based). The separate tabs/pages would just configure the UI according to which task the user was focusing on, but would not preclude accessing all the features of the rest of that 'engine' via UI customisation.

Anyway, I can't see most or any of this happening. Certainly not a merge of Fusion and Color. There may well be technical hurdles arising from how Fusion and Color implement similar things differently, and regardless of that it's definitely a big and complex change that's not strictly necessary.

But I do believe it would all be possible and I think that if BMD started making Resolve today, they might well implement it like that from the start.

Mark Grgurev wrote:I immediately wonder if track-based transforms, FX, and grades would do the trick. It only took a few minutes of me messing with them in Vegas Pro to realize how useful they could be.

Yes, that's definitely a feature I would appreciate: it would be fantastic to have per-track effects, both built-in like Transform, Crop and Retime, and also OFX. But not at the expense of tight Fusion integration :)

Mark Grgurev

  • Posts: 802
  • Joined: Fri Nov 03, 2017 7:22 am

Re: Rethinking How Resolve's Pages Interact 2.0

PostSun Jul 05, 2020 5:57 pm

TheBloke wrote:Note: The following three posts started out as one mega post responding to Mark's post, but I've since edited them to separate them out a little. Apologies for the multi-essay composition!

Have you seen my posts?! I'm totally guilty of writing essays as well lol

TheBloke wrote:When I was setting up the camera movements - throughout the 13.5 minutes of scene one there's at least 50 keyframes on each of the camera's X,Y and Z coordinates - I had to constantly go back and forth between Edit and Fusion, to ensure those camera movements were correctly lined up to the audio, and to my edit points.

So the main advantage you're seeing with the current setup is that you can hide per-track edits from the Fusion view and the thing that's annoying you is that you can't see those edits from the Fusion view?

TheBloke wrote:I was therefore jumping back and forth between Edit and Fusion very often, sometimes multiple times a minute. I dearly wished that it was possible to have Edit and Fusion open at the same time, in separate windows.

I later wondered if perhaps there was some way to expose the Fusion node's controls on to the Edit page, so I would not have needed to go into Fusion nearly as often? But I haven't checked into that as yet. I did make a Fusion macro out of my composition, combining the ~40 nodes into a single macro group which exposes the parameters I might need to tweak. In addition to the camera movements I can tweak a variety of other parameters, like lighting and shadow levels, on the macro node. But still only within Fusion, currently at least.

This doesn't really feel integrated to me. It feels like you're mainly using the Edit page to make up for the fact that MediaIn nodes (ones dragged from the Media Pool) don't allow you to set multiple in and out points, and would instead require a new MediaIn for each edit.

TheBloke wrote:How would I have done the same thing using a separated Fusion, like Fusion Studio is today? Well I suppose I would have had to render out the clips (the ones in the Fusion Clip) to intermediate ProRes, and then import with Loaders into Fusion Studio. Then I'd have probably have had to add markers for all the scene cuts, so I'd have some reference in Studio as to where the edit points where (I think it's possible to export and import markers, not sure.) But then how to sync to the audio? Fusion (Studio) does have some very basic audio support, so I suppose I could import the audio too.

Then any time I realized I needed to make a change on the Edit page - which happened a number of times after I'd started the Fusion comp - I would have to re-render all that intermediate footage, or else try and make the changes twice, in both pieces of software.

It wouldn't require any more rendering than it currently does. Just because I'm proposing that the Fusion or Create tab operate separately from the project does not mean that things need to be rendered. They would still be within the same program, so you can have a live link on the timeline to an open Fusion comp, and that can be cached and changed (via exposed settings) on the Edit timeline.

When I say it isn't part of the project and that it works more like footage within the project, I mean that the .comp file doesn't exist within the project file. Right now a Fusion comp exists within the Media Pool, and when you drag it into the timeline you're making a new, identical comp that exists within the timeline. Each time you drag that comp into the timeline, the project grows by the size of that comp. If the project or timeline gets corrupted, so do the comps.

My idea would have it so that the .comp file is saved separately somewhere on your hard drive. When the comp is on a timeline, it isn't a copy. It's a clip of an asset in the Media Pool that's referencing that .comp on your hard drive, just like normal footage. Changes to that comp would then change all instances of where it's used. It would also allow you to import that comp into other projects by just dragging and dropping, or by using the Media page, instead of creating an empty Fusion Composition and then importing the comp via File > Import Fusion Composition. Also, if its defaults aren't changed or there are no settings to change, it would only need to be cached once instead of once per instance of it being used.
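A toy sketch of the difference (plain Python; the paths and names are made up, nothing here is a real Resolve API): in this model the project stores only a path to the .comp, so every timeline instance resolves to the same on-disk definition and one edit reaches them all:

```python
# Stand-in for .comp files on disk, keyed by path.
comp_store = {"/comps/intro_title.comp": {"Text1": {"text": "Hello"}}}

class TimelineClip:
    """References a comp by path, just like normal footage references media."""
    def __init__(self, comp_path):
        self.comp_path = comp_path  # the project only stores this string

    @property
    def comp(self):
        return comp_store[self.comp_path]

a = TimelineClip("/comps/intro_title.comp")
b = TimelineClip("/comps/intro_title.comp")

# Editing the on-disk comp updates every instance, in any project:
comp_store["/comps/intro_title.comp"]["Text1"]["text"] = "Scene 1"
assert a.comp["Text1"]["text"] == "Scene 1"
assert b.comp is a.comp  # same definition, so it only needs caching once
```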

TheBloke wrote:How would I have done the same thing using a separated Fusion, like Fusion Studio is today?

Just now I put together a dumbed down version of your project. Three keyed clips placed on image planes in a 3D environment with a light and, in this case, three cameras. I used the three cameras as different camera angles. Then I put them into a macro and exposed controls for the Render3D node's camera selector and the In points for all three clips.

Now when I bring that macro into the timeline, I get these settings.

Image

Changing the In points can be used to select which part of each clip is being used. Changing the camera works as a cut between each camera. I can do this via keyframes or I can just cut the clip and change the settings per-clip.

There is one problem with this, which is that it won't show me the other cameras for some reason, but it would otherwise work. Of course you could just use one camera like you currently are and expose all of its controls via the macro. Then if you need to go back to a camera shot you already used, but at a further point in time, you could copy that earlier clip and change the In points to move the performance further ahead.

I definitely think you should explore using macros. They would make more sense for what you're doing. What you're doing is both complex and not complex: it's complex in that you're compositing a 3D scene while editing, but it's simple in that there are really no interdependencies between these clips. For example, you're not tracking a background and then keeping something else in sync with that tracking data. You're just using the timeline for trimming.

TheBloke wrote:Anyway, I hope it's clear how important it was for me that Fusion and Resolve were tightly integrated. The ability to have a Fusion Clip that can access any number of layers was absolutely vital.
To be honest, it hasn't. I don't see any advantages to them. It's not like you can see and work with the edit points in the Fusion page, and the additional controls of the Edit timeline don't really offer any large advantages; they just introduce unneeded complexity to the job.

I actually originally set up my macro using a Fusion Clip and the macro wouldn't work because the MediaIns that it uses refer to tracks that only exist within the original Fusion Clip. I had to use MediaIns from the Media Pool in order for them to work. The behavior of MediaIns is really all over the place.
Last edited by Mark Grgurev on Mon Jul 06, 2020 4:16 am, edited 1 time in total.

Mark Grgurev

  • Posts: 802
  • Joined: Fri Nov 03, 2017 7:22 am

Re: Rethinking How Resolve's Pages Interact 2.0

PostSun Jul 05, 2020 6:43 pm

TheBloke wrote:In my view Fusion's power and appeal is greatly enhanced by its integration to Resolve. It's a major selling point for BMD that rather than using Premiere Pro and After Effects separately, you can use a single program, Resolve. PP + AE has Dynamic Link, and that seems kind of like what you're proposing for Fusion? But that's a step backward for Resolve; it's not as good as Fusion's current integration, and IMHO it would be a terrible shame if BMD went backwards on that integration instead of improving it even further.


Yes, it's similar but different. Unlike PP and AE, they would still be one program sharing the same memory, code, etc. Caching something on the Create page would mean it's already cached for the Edit page. It's really not a step back. The "advantages" of the current setup aren't really advantages at all.

TheBloke wrote:I absolutely agree that there are some aspects that are sub par. We really need a better way of keeping track of Fusion compositions. In my view, every Fusion composition should get an entry in the Media Pool, so they're clearly visible and adjustable. Every time you open Fusion on a given clip and start editing, it should automatically make a new Fusion Composition in the Media Pool, with a new field added that indicates which clip it's attached to, and another that shows how many inputs it supports ("0" for a Fusion Composition, "1" for a clip composition and an Adjustment Clip; "Many" for a Fusion Clip comp).

This would open up great new possibilities, like being able to copy a Fusion composition from one clip to another, or to many clips at once, using the Edit + Media Pool panes, without Fusion even being open. For example, select a group of clips on Edit, highlight a Fusion composition in Media, and then right-click on the clips in Edit and choose "Set Fusion Composition from Media Pool". Similar in operation to "Conform Clip with selected clip from Media Pool". That would be a non-destructive operation, because you'd either be adding a composition to a clip that didn't already have one, or else changing which comp was used by a clip. Either way, you could easily revert to the prior state, with no risk of compositions being lost or deleted.


This would make things horribly messy. You need to consider what sorts of use cases would be the most common and which are more niche. When would a clip composition be applicable to other clips? When it's an effect or color grade. That's exactly what the Color page is already made to do, so why not use it for that?

If the composition isn't made with any specific input clip in mind, then what does that imply? That it doesn't require integration at all. You could make the comp with any old clip and change the node at the end. That's why I suggested this idea in a previous post in the form of Fusion FX. However, I wouldn't put Fusion FX into the Media Pool, as that would be very inconsistent. Effects aren't footage, they're effects, so they'd belong in the Effects Library, just like Fusion Transitions are in the Transitions list and Fusion Titles are in the Titles list. Then you can just drag and drop the effect on top of a clip or adjustment clip without introducing any new behaviors or clip types.

That's why I call it the Create page. You could use it to create Compositions and Titles which can be dragged into timelines or you can create Transitions and Effects which can be applied to edit points, clips, and adjustment layers. Basically, if you don't have an OFX or VST effect that can do something, you can make it in Create.

Your idea and mine are similar in that turning a clip into a composition would also create an entry in the Media Pool. The difference is that mine would be stored as a separate file outside the project while yours would exist within it. So both ideas add some degree of clutter to the Media Pool. However, Create would use Loaders to refer to source media and only use MediaIns to refer to arbitrary clips on the timeline. That would allow the editor to remove the original clip from the Media Pool without breaking the composition. Your idea wouldn't allow for that, because the composition would be dependent on the clip.

TheBloke wrote:Therefore I think Resolve should support a many-to-one relationship between clips and comps, such that I could take a single Fusion comp and apply it to a whole bunch of clips of once. As well as being a major efficiency boost over the steps required today (open Fusion, click on every clip in turn and paste in some previously copied nodes) , that would also solve a major annoyance: currently if you have a clip on which you create a Fusion composition and then you cut that clip in Edit, you end up with two separate comps, no longer linked, and any changes made to one must be manually copied to the other.

Instead, with my suggested changes, you would still have a single comp, referenced by both clips. This would be made clear in the UI because the comp would have a single name and a single Media Pool entry. If you did actually want two comps across the two clips, you could simply copy the current comp to a new clip and then assign that new comp to the second clip.


I already mentioned this in my previous post too. By making it an effect, it can be used in the Color page in the same ways you can currently use OFX nodes. That also brings the advantage of allowing you to apply color grading to a clip before and after the Fusion effect. You could also apply more than one, and you could apply them to groups of clips, source clips, timelines, tracks, etc. That's literally the kind of batch-style work that the Color page was designed for.

TheBloke wrote:From that point, I could see all sorts of possibilities. Like expanding on the MediaIn/MediaOut concept to support compositions that reference each other. This has been a common request amongst the Fusion userbase: support for a 'live macro' where one definition can be used in multiple comps, allowing the single definition to be changed and affect all comps that use it.


Why would that be a feature of the MediaIn and MediaOut nodes? Shouldn't this instead be an import option, or its own Reference node? A live macro would be used for the same things as regular macros and templates, so it wouldn't necessarily contain "media".

TheBloke wrote:This could be combined with the existing Macro Editor. You would take an existing comp and turn it into a "Live Comp" or "Macro Comp" or whatever name. The UI for this would be similar to the existing Macro Editor: you define the input(s) and output(s), and you expose any parameters that need to be accessible to calling comps.

Now in any other comp, you can add this Live Comp as a node. This node has the appropriate number of inputs and outputs, and shows in the Inspector the configured parameters.


That's exactly how I was suggesting the macros would work in the Edit page. If the macro has inputs then it's clearly an effect, so it could also be applied in the Color page's node editor as well.

TheBloke wrote:So far, this is identical to a Macro. But the key difference is that this is not a static definition on disk which is copied and therefore duplicated into any comp that uses it. This is a node that references another comp in the same project, or in the cross-project Power Bins.


Exactly, something like this wouldn't need Fusion compositions to be part of the project.

TheBloke wrote:Therefore you can go and edit that comp definition at any time, and the effects of those edits apply to comps that use it.

That's the sort of extra power and flexibility I would love to see, and which I believe is totally possible. More flexibility, not less.


This was so frustrating to read, though. You reached all of the same conclusions I did, and introduced other concepts that aren't antithetical to my ideas, while claiming my idea is less flexible.
Last edited by Mark Grgurev on Wed Jul 08, 2020 3:58 pm, edited 2 times in total.

Mark Grgurev

  • Posts: 802
  • Joined: Fri Nov 03, 2017 7:22 am

Re: Rethinking How Resolve's Pages Interact 2.0

PostSun Jul 05, 2020 7:45 pm

TheBloke wrote:I think that in a perfect world, Fusion and Color would merge. As you say, there's a lot of overlap in what they do - both node based, both with trackers and qualifiers/keyers, both supporting OFX nodes, both using keyframes in a similar way, both using the same cache, and applying after the Edit page, and so on. You can do some colour grading in Fusion, and you can do some simple visual effects in Color. Combining them makes total sense to me.

I've recently started using Color for the first time, and it's fantastic. But several times I came up on a more complex task and thought "this would be easier if I could use such-and-such Fusion node here".


They absolutely shouldn't merge. Yes, there's a lot of behind-the-scenes code that can and should be shared between them, but they have entirely different strengths. There's a reason why color grading applications were sought after years after compositing software was already a thing. The best color grading software always functions like simplified compositing software, and is optimized for modifying a large number of clips quickly. It's why Autodesk acquired Lustre despite already having Flame, it's why Adobe made Lightroom years after Photoshop was already the top image editing software, and it's why Adobe acquired SpeedGrade even though they already had After Effects. In each case they work at different levels of complexity and are optimized for those different levels of complexity.

I said this in another topic. Fusion will always work on the assumption that it can't render something in real time, and will cache things in memory to ensure that you can always at least play back the cached part. The Color page will always try to render in real time, because it's optimized for very specific tasks and it expects that it'll be able to play them back just fine.

Separating the two in the pipeline also allows compositions to be graded without invalidating the comp's cache.

TheBloke wrote:It would require more UI flexibility than Resolve has today, because it would require being able to bring up the appropriate panels according to which nodes were selected, and have this work seamlessly as you moved between Color and Fusion nodes. Plus Color has a lot of existing users - more than Fusion I'd think - who will expect the UI to remain very similar to what they've used for years.


In general, Resolve could really use more customization. And yeah, that's true: the Color page does have more users than Fusion because it's easier for people to grasp.

TheBloke wrote:I think it'd need something akin to Adobe's UI profiles/workspaces, where at the click of a preset you could change the layout of the "Color/Fusion" UI according to which of those tasks you're currently focused on, and what information is most useful to see. Like how in Photoshop the entire UI can reconfigure when you start doing 3D stuff; with of course the ability to customise those workspace profiles.

They could even maintain the separate "Color" vs "Fusion" tabs, but have them simply be UI presets that configure the interface appropriately for that task. The underlying engine would be the same, and you'd be able to access all the features in either mode, but the interface would be tailored according to the tab you're on.


This works for the Fairlight, Cut, and Edit pages, but there's more of a difference between them, the Color page, and the Fusion page. There's especially a difference between the Create and Process pages, as the Process page is essentially a routing page for an NLE while Create is a self-contained asset and effect creator with a compositing-focused timeline.

TheBloke wrote:Actually that speaks to a general idea for future UI improvement - and solving some of the duplication you created this thread to discuss. All of the current "pages" could in effect become workspace presets. There'd then effectively be two underlying 'engines': Cut/Edit/Fairlight/Deliver (layer based), and Color/Fusion (node based). The separate tabs/pages would just configure the UI according to which task the user was focusing on, but would not preclude accessing all the features of the rest of that 'engine' via UI customisation.


That wouldn't work. Edit, Cut, and Fairlight aren't layer-based; they're sequencers. Audio doesn't work in layers: track order doesn't change how tracks mix. Yes, video tracks do work like layers, but it's important to remember that their primary job isn't compositing or VFX.

What makes Resolve so clean and powerful is that, unlike Premiere, FCPX, and Media Composer, it can keep its editing UI focused and doesn't have to worry about handling masking, tracking, etc. within the same UI. Instead, the Color page and Edit page are meant to complement each other's functionality.

Just think about the difference between an editing timeline and a VFX timeline. A VFX timeline has only one asset per track, and the track order doesn't usually determine draw order. An editing timeline is very different in that there are potentially hundreds of clips per track, and their track order does determine draw order. To deal with this, the Color page splits the timeline up and views it at different scales: the Clip scale, Group scale, and Timeline scale. Groups also each get two node graphs, for pre-group and post-group grades. The timeline view can't currently see tracks, but if it were expanded to work at the track level then it could behave more like a Fusion clip, and would be able to do track grades and adjustment grades without adjustment clips.

TheBloke wrote:Anyway, I can't see most or any of this happening. Certainly not a merge of Fusion and Color.


Which is partly why I like that my idea lets them play off each other while minimizing their redundancy.

TheBloke wrote:I think that if BMD started making Resolve today, they might well implement it like that from the start.


Have you used Lightworks? It's free to download. It's basically Resolve if the Color page and Fusion page were combined into a VFX page, and it just isn't good. The VFX page is node-based too, and the Edit page's timeline can be seen from it.

TheBloke wrote:But not at the expense of tight Fusion integration :)

That's extremely frustrating. I'm not suggesting less integration; you're just seeing dependencies that don't really exist.
Mark Grgurev

Re: Rethinking How Resolve's Pages Interact 2.0

PostSun Nov 01, 2020 6:04 am

Just updated the original post once more. I previously suggested that Compositions would be outside of the project but I'm realizing that's an unnecessary separation.
