Sam Biddle wrote:I think this is why I suggested the merge of the two. I do agree that they are two different apps, but you could kind of say that about coloring vs editing, although those are more closely related than compositing vs editing. My reason for the merge idea is that on a system with 64GB or more RAM, RAID 0 SSDs, plenty of storage, and high-end graphics cards (which seems to be pretty common for those who make money doing this, given that I have most of that and this is a hobby for me), it seems that if you are working in both, it would help not to have to go back and forth between the two with both apps loaded at the same time.

But then again, I would also think that in most professional use cases compositors and editors/colorists are different people, so a merge doesn't make sense in that regard. In fact, I am being purely selfish in saying that: as a hobbyist who uses both apps myself, it would benefit my use case not to have to load both apps at the same time. That way I could give 48GB of RAM to the one app rather than 24GB to each, and if they were merged, the two tabs (editor/compositor) would share access to the same media pool, and changes to, say, a comp could be reflected immediately in the NLE timeline rather than having to deal with export/import.
The thing is, coloring and editing can both be done together quite comfortably when you're working with ProRes or similar, but throwing multichannel EXR files into the mix would slow the process down no end, which is why compositing tends to be a separate affair. You couldn't change a comp and then see the changes on the timeline immediately, as you'd still need to render the node graph. Obviously one day this could change with faster hardware.
Speaking of RAM though, by the time you're working with 64GB+ it's hardly a problem to swap between programs. I can have multiple instances of Max open with Fusion's cache full and still edit happily. My home machine has substantially less RAM but I still do this: just purge the Fusion cache when you don't need it.
The other difference between editing/colouring on one hand and compositing on the other is that editing and grading are tasks completed at the project level. When you edit, you are editing an entire film, commercial, or whatever project you are working on. You need to have all your shots there to see how they work with each other. Grading is similar: although you grade shot by shot, you still need to see the entire thing to make sure the colours and mood match between similar shots.
Compositing mainly happens at the shot level. You even dissect a shot into its bare elements to do the compositing. Therefore, you need an application that focuses mainly on individual shots, for both technical and conceptual reasons. Technically, in projects that rely on heavy compositing, the comps tend to be ENORMOUS. Imagine having loads of these compositions living inside one timeline. Actually, you don't need to imagine, since you can see something slightly similar in Premiere if you have hundreds of live AE comps on the timeline: playback starts getting sluggish and you have to drop the preview quality to make it usable, even on computers that are supposed to be really powerful.
On a conceptual level, it would be really distracting for compositors, whose main job is to focus on the one shot they are working on, to work in an environment that contains every single other shot in the movie. It might be useful for them to see a couple of shots before and after the one they are tackling, but nothing more. Compositors are essentially like animators on fully animated movies, where each works on a single shot rather than the entire thing.
That's what I personally think.