
Thoughts on blending real and 3D VFX scenes together?

Learn about 3D compositing, animation, broadcast design and VFX workflows.

Dazzer

  • Posts: 228
  • Joined: Sat Aug 24, 2019 8:31 am
  • Real Name: Daz Harris

Thoughts on blending real and 3D VFX scenes together?

Posted: Fri Nov 15, 2019 10:58 am

Hi folks,

Imagine this:

Please check the YouTube film below.

It's me filming my sofa on my crappy Huawei smartphone.

This is just an experiment; I wouldn't actually want to do this!

OK, so, what I want to do is recreate this scene in 3D space so I can blend elements of the real scene (perhaps I had someone sitting on the sofa?) with 3D stuff.

I have the measurements of the sofa (length 218 cm, height 63 cm, depth 98 cm).

Now, in my naïve mind, I imagine the following should be possible:

- I create a box in 3D with the same proportions.

- I position my camera so it matches the first frame.

- I do the same for the other frames (maybe not every frame, that would be a lot of work; maybe once every 3 frames, then smooth the camera movement curves in between, as in the sketch below).

- Now I have recreated the scene in 3D. And because I have done it carefully, everything else in these two "worlds" should also match up.
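To make that smoothing step concrete, here's a rough Python sketch of the in-between part (the keyframe positions are made up, and I'm assuming SciPy purely for the spline):

```python
# Hand-matched camera positions at every 3rd frame (hypothetical numbers),
# smoothed into a per-frame camera path with a cubic spline.
import numpy as np
from scipy.interpolate import CubicSpline

key_frames = np.array([0, 3, 6, 9, 12])      # frames matched by hand
key_positions = np.array([                   # camera XYZ at those frames, metres
    [0.00, 1.50, 3.00],
    [0.05, 1.51, 2.95],
    [0.11, 1.52, 2.88],
    [0.18, 1.52, 2.80],
    [0.26, 1.53, 2.71],
])

spline = CubicSpline(key_frames, key_positions, axis=0)
all_frames = np.arange(key_frames[0], key_frames[-1] + 1)
smooth_path = spline(all_frames)             # one smoothed position per frame
print(smooth_path.round(3))
```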

Easy in theory, but in practice it doesn't work. My Huawei phone has a crap lens with lots of distortion, so it's never going to match up with my recreation.

But even with a good camera (I have a Sony A7), I think it'll be the same.

So I'm wondering: is this even possible in Fusion? Or is this more like high-end, Hollywood-type stuff, and I'm being over-ambitious?

This is really complicated, right? Getting a 3D VFX camera to have the same properties as a real camera?

And lastly, is this in fact the wrong way to do it? Is it more a job for tracking?

My suspicion is that in high-end (Hollywood-level) compositing they wouldn't use tracking to blend a real and a VFX scene together; they'd do something more along the lines of my idea.

I'm not looking for "do XYZ" replies, just some general thoughts!

If this isn't do-able in Fusion, any ideas what software is better for it?

Thanks!


xunile

  • Posts: 3075
  • Joined: Mon Apr 23, 2018 5:21 am
  • Real Name: Eric Eisenmann

Re: Thoughts on blending real and 3D VFX scenes together?

Posted: Fri Nov 15, 2019 7:45 pm

The main reason for 3D camera tracking is to blend VFX with camera footage.

Here are a couple of videos that show what I think you are describing and also what camera tracking can do in Fusion.

[embedded video links]
Win 10 Home | Intel i7-10700F | 64 GB RAM | 1 TB SSD + 2 TB SSD
RTX 3060 12 GB | Resolve Studio 18.6.6 | Fusion Studio 18.6.6

Win 10 Home | Intel Core i7-7700HQ | 32 GB RAM | 1 TB NVMe SSD + 1 TB SATA SSD
GTX 1060 6 GB | Resolve 17.4.6

Bryan Ray

  • Posts: 2491
  • Joined: Mon Nov 28, 2016 5:32 am
  • Location: Los Angeles, CA, USA

Re: Thoughts on blending real and 3D VFX scenes together?

Posted: Fri Nov 15, 2019 7:53 pm

Fusion does have a CameraTracker node, which can be used to solve a 3D camera move, assuming sufficient parallax (as you have in this shot).

Fusion's CameraTracker isn't as robust as software dedicated to that purpose, such as PFTrack or Syntheyes, but it can frequently get the job done.
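To see why parallax matters, here's a back-of-envelope sketch (hypothetical numbers): a sideways camera move shifts near features more than far ones, and that difference is exactly what the solver inverts to recover the camera.

```python
# For a sideways camera translation of B metres, a point at depth Z metres
# projects with a parallax shift of roughly d = f * B / Z pixels.
f_px = 1500.0    # focal length in pixels (hypothetical phone camera)
B = 0.10         # camera translation between two frames, in metres

for Z in (1.0, 2.0, 4.0):        # three scene points at different depths
    d = f_px * B / Z
    print(f"depth {Z:.0f} m -> parallax {d:.0f} px")

# A pure nodal pan (B = 0) gives zero parallax at every depth, which is why
# rotation-only shots can't be solved for 3D camera position.
```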

The tracking process will typically also produce a point cloud, which can assist with modeling, especially in cases like this one, where the deformations of the couch would be impossible to estimate. You might be able to eyeball them, but it would be a lot of work. A high-density point cloud makes it much easier.

If you shoot some footage of objects that have straight lines and place them near the edge of the frame, preferably perpendicular to the frame, you can estimate the lens distortion and remove it using Fusion's LensDistort tool. You'd want to do this prior to tracking, and save the parameters so you could later distort the CG to match the original footage. Any zooming (optical or digital) changes the distortion, so the solution is only valid as long as you're using the same zoom level (and lens, but lenses on smartphones aren't usually changeable).
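For illustration, here's what the removal step looks like outside Fusion, as a minimal OpenCV sketch (the camera matrix and coefficients are hypothetical; in practice you'd use whatever the estimation gives you, and keep the values so you can re-distort the CG later):

```python
# Remove radial lens distortion with OpenCV, given estimated parameters.
import cv2
import numpy as np

frame = cv2.imread("frame_0001.png")           # hypothetical source frame
h, w = frame.shape[:2]

K = np.array([[1500.0,    0.0, w / 2],         # focal length in pixels and
              [   0.0, 1500.0, h / 2],         # principal point (image centre)
              [   0.0,    0.0,   1.0]])
dist = np.array([-0.25, 0.08, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

undistorted = cv2.undistort(frame, K, dist)    # keep K and dist to re-distort CG
cv2.imwrite("frame_0001_undistorted.png", undistorted)
```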

The tricky thing in this shot (which obviously wouldn't happen if you shot properly with a manually-controlled camera) is the refocus artifacts. They'd disrupt the tracking, making it difficult to correlate features on opposite sides of the snap focus change. If I were tracking this shot, I'd probably try to reconstruct those frames from the ones on either side using Optical Flow, just to allow the trackers to cross that segment. Then in the eventual solution I'd probably just delete the camera's keyframes at those points and plan on doing some manual warping.
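As a rough sketch of that reconstruction idea (using OpenCV's Farnebäck flow as a stand-in for Fusion's Optical Flow tools; filenames and parameters are hypothetical):

```python
# Approximate a bad middle frame by warping a neighbour halfway along the
# dense optical flow between the frames on either side of it.
import cv2
import numpy as np

prev = cv2.imread("frame_0010.png", cv2.IMREAD_GRAYSCALE)
nxt = cv2.imread("frame_0012.png", cv2.IMREAD_GRAYSCALE)

# Dense flow from frame 10 to frame 12.
flow = cv2.calcOpticalFlowFarneback(prev, nxt, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)

# Crude backward warp: sample frame 10 half a flow vector back. Good enough
# to carry trackers across a few frames, not for final pixels.
h, w = prev.shape
gx, gy = np.meshgrid(np.arange(w), np.arange(h))
map_x = (gx - 0.5 * flow[..., 0]).astype(np.float32)
map_y = (gy - 0.5 * flow[..., 1]).astype(np.float32)
approx_mid = cv2.remap(prev, map_x, map_y, cv2.INTER_LINEAR)
cv2.imwrite("frame_0011_approx.png", approx_mid)
```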

One other thing to watch out for with a smartphone camera is an artifact called rolling shutter. The sensors record each line in sequence, from top to bottom, and if you're panning quickly enough, you might see vertical lines sort of lean a bit because the bottom of the sensor is recording a fraction of a second later than the top. This also creates the "jello effect" common in POV sports photography shot with a Go Pro. Even top-line cameras like the ARRI Alexa exhibit this effect, although they're usually quick enough that you only see it in the fastest of whip-pans. Anyway, this rolling shutter can make tracking difficult, so you're advised to shoot in such a way as to minimize it as much as possible. That is, keep your pans and trucks smooth and slow, especially when working with a slow camera.
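The size of that lean is easy to estimate; here's the back-of-envelope version (all numbers hypothetical):

```python
# Rolling shutter skew: the bottom row is exposed later than the top row,
# so horizontal motion makes vertical lines lean.
readout_time = 1 / 60    # seconds to scan the sensor top to bottom
pan_speed_px = 500       # horizontal image motion, pixels per second

skew_px = pan_speed_px * readout_time
print(f"vertical lines lean by about {skew_px:.1f} px")   # ~8.3 px

# Halving the pan speed halves the lean, which is why slow, smooth moves help.
```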

And make no mistake: Fusion is high-end Hollywood type software. I happen to be sitting in Hollywood and using Fusion right now. ;) Although I admit that I don't generally use its CameraTracker; I have PFTrack for that job.
Bryan Ray
http://www.bryanray.name
http://www.sidefx.com

Dazzer

  • Posts: 228
  • Joined: Sat Aug 24, 2019 8:31 am
  • Real Name: Daz Harris

Re: Thoughts on blending real and 3D VFX scenes together?

Posted: Fri Nov 15, 2019 8:30 pm

Many thanks, guys.

So say you're using an external tracker like PFTrack or Syntheyes.

You do the tracking. But then how do you get that information or "scene" back into Fusion? (some specific file format?)

Does any of this tracking software have a manual mode? What I mean is: I would imagine you could draw some lines on a frame and give them numbers. Then, say, you do the same thing every 5 frames or so. Then surely it's possible to make an algorithm that works out not only the geometry of your scene but also your focal length, especially if you can tell the algorithm some of the physical measurements of your scene.

I did a bit of googling and saw a program called Autodesk Recap, which seems to be along these lines, but I think it's more aimed at industrial applications.

Sorry if I'm over-thinking this. Just trying to learn!

Dazzer

  • Posts: 228
  • Joined: Sat Aug 24, 2019 8:31 am
  • Real Name: Daz Harris

Re: Thoughts on blending real and 3D VFX scenes together?

Posted: Fri Nov 15, 2019 8:38 pm

And @Bryan, regarding your comments about the technicalities of tracking:

Is it sometimes worth doing some processing on the video first, like edge detection or filtering, to emphasize the elements you want to track?

Bryan Ray

  • Posts: 2491
  • Joined: Mon Nov 28, 2016 5:32 am
  • Location: Los Angeles, CA, USA

Re: Thoughts on blending real and 3D VFX scenes together?

Posted: Fri Nov 15, 2019 10:14 pm

Dazzer wrote: Many thanks, guys.

So say you're using an external tracker like PFTrack or Syntheyes.

You do the tracking. But then how do you get that information or "scene" back into Fusion? (some specific file format?)


Typically FBX or Alembic is used to bring the tracking information back into Fusion. Both will also bring along any point clouds and geometry used in getting the solve (assuming you send them to the export module, that is).

Does any of this tracking software have a manual mode? What I mean is: I would imagine you could draw some lines on a frame and give them numbers. Then, say, you do the same thing every 5 frames or so. Then surely it's possible to make an algorithm that works out not only the geometry of your scene but also your focal length, especially if you can tell the algorithm some of the physical measurements of your scene.


Sure. PFTrack, for instance, has an Estimate Focal Length node, which lets you draw some perspective lines, and it estimates the Angle of View from those lines. And you can, if you need to, animate the focal length if you're solving a scene with a zoom or a large rack focus.

Lacking the focal length information, or having only an approximate value, the solver can estimate the focal length itself, although the accuracy of that estimate can vary wildly. The solver can estimate lens distortion, as well.
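For the curious, the maths behind "draw perspective lines, get a focal length" fits in a few lines. Two vanishing points of directions that are perpendicular in the real scene constrain the focal length; this sketch assumes the principal point sits at the image centre, and the coordinates are made up:

```python
# For vanishing points v1, v2 of two perpendicular sets of scene lines,
# and principal point c: f^2 = -(v1 - c) . (v2 - c).
import numpy as np

c = np.array([960.0, 540.0])      # principal point of a 1920x1080 image
v1 = np.array([2400.0, 700.0])    # vanishing point of one set of edges
v2 = np.array([-300.0, 420.0])    # vanishing point of a perpendicular set

f_squared = -np.dot(v1 - c, v2 - c)
if f_squared > 0:
    print(f"estimated focal length: {np.sqrt(f_squared):.0f} px")
else:
    print("degenerate line choice; pick different edges")
```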

I imagine Syntheyes (much cheaper than PFTrack) has similar capabilities, although I've only used it once.

I did a bit of googling and saw a program called Autodesk Recap, which seems to be along these lines, but I think it's more aimed at industrial applications.


My understanding is that Recap is a surveying tool, intended to help build a very detailed model from photographs and LIDAR scanning. RealityCapture is another package that does the same thing. This can be a very useful addition to tracking: you can feed the survey data into PFTrack, or whatever you're using, and the solver uses that information to get a more reliable understanding of the scene, and therefore a better track. PFTrack can do the same thing to a limited extent, and you can even extract texture for the model from the shot camera, allowing for some cool projection and set extension capabilities.

Sorry if I'm over-thinking this. Just trying to learn!


Not a problem at all! Tracking happens to be one of my favorite tasks, so I'm happy to talk about it.


Is it sometimes worth doing some processing on the video first, like edge detection or filtering, to emphasize the elements you want to track?


Frequently, yes. Oversharpening is a technique I use fairly often (denoise first). A high-pass filter is often useful. And sometimes I might do a Mocha planar track or point track, then insert something that PFTrack will have an easier time detecting. It likes texture, so blurry images give it problems.
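In OpenCV terms, that preprocessing chain might look like this (a rough sketch; filter strengths are just starting points, tune per shot):

```python
# Denoise first, then oversharpen (unsharp mask) or high-pass to emphasize
# the texture a tracker locks onto.
import cv2

frame = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)

denoised = cv2.fastNlMeansDenoising(frame, None, 10)

blurred = cv2.GaussianBlur(denoised, (0, 0), 3)
oversharpened = cv2.addWeighted(denoised, 2.5, blurred, -1.5, 0)  # unsharp mask

high_pass = cv2.subtract(denoised, blurred)  # image minus its low frequencies

cv2.imwrite("frame_0001_track_input.png", oversharpened)
```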
Bryan Ray
http://www.bryanray.name
http://www.sidefx.com

Kristof Indeherberge

  • Posts: 75
  • Joined: Fri Jul 01, 2016 8:15 pm

Re: Thoughts on blending real and 3D VFX scenes together?

Posted: Fri Nov 15, 2019 10:27 pm

Tim Dobbert's book on matchmoving is a good starting point. And to be honest, I don't like to mess with the image too much at first. PFTrack, carefully selected tracking features, distortion grid, survey data. All that jazz.

Dazzer

  • Posts: 228
  • Joined: Sat Aug 24, 2019 8:31 am
  • Real Name: Daz Harris

Re: Thoughts on blending real and 3D VFX scenes together?

Posted: Sun Nov 17, 2019 10:38 am

Thanks for the great info.

Damn, PFTrack is crazy expensive! So, yeah, perhaps something for the future...

So now I understand what an Alembic mesh is and how to import it into Fusion 3D via an Alembic node.

That is very powerful, because Fusion's 3D modelling is very basic, but now I see I could do the modelling in Blender and then import it.

One more quickie on this:

If I send an image (or clip) into a 3D shape, it'll project onto the shape.

But this doesn't work with an Alembic mesh. So how do you go about doing that?

Sander de Regt

  • Posts: 3584
  • Joined: Thu Nov 13, 2014 10:09 pm

Re: Thoughts on blending real and 3D VFX scenes together?

Posted: Sun Nov 17, 2019 10:44 am

PFTrack may be really expensive, but Syntheyes isn't. And it's really, really, really good.
Sander de Regt

ShadowMaker SdR
The Netherlands

Bryan Ray

  • Posts: 2491
  • Joined: Mon Nov 28, 2016 5:32 am
  • Location: Los Angeles, CA, USA

Re: Thoughts on blending real and 3D VFX scenes together?

Posted: Sun Nov 17, 2019 3:29 pm

First, don't import by just adding an Alembic Mesh node. Instead, use File > Import > Alembic. That will give you the entire hierarchy of your Alembic.

Many Alembic exporters don't include UVs. Go back to Blender and make sure the UVs are turned on when you export (and also that your object has UVs to start with).
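If it helps, the Blender-side export can be scripted; here's a hedged sketch of the operator call (parameter names as in recent Blender versions; check your version's docs):

```python
# Export the selected objects to Alembic with UVs included.
import bpy

bpy.ops.wm.alembic_export(
    filepath="/tmp/sofa.abc",  # hypothetical output path
    selected=True,             # only the selected objects
    uvs=True,                  # write UV coordinates so Fusion can texture the mesh
)
```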

Fusion's ability to create new UVs is pretty limited. You can do a simple projection—spherical, planar, or cylindrical. It also has a Cubic mode, but it acts weird. It's far better to set up UVs in 3d software, which usually has much better unwrapping tools.

Yeah, PFTrack is expensive, but it's the software I know, which is why I keep referencing it. Syntheyes, as Sander says, is much cheaper, and it's arguably just as powerful. I'd say its downfall is its idiosyncratic UI, but every other tracker is just as peculiar.

Blender, by the way, also has a 3D tracker; I've never used it and don't know how much control it gives you beyond what Fusion's CameraTracker can do, but it can't hurt to try it out.
Bryan Ray
http://www.bryanray.name
http://www.sidefx.com

Dazzer

  • Posts: 228
  • Joined: Sat Aug 24, 2019 8:31 am
  • Real Name: Daz Harris

Re: Thoughts on blending real and 3D VFX scenes together?

Posted: Wed Nov 27, 2019 10:33 am

Hi Bryan,

Sorry, forgot to say: thanks for the info!

Slowly getting my head around all this stuff!

grtz ............. D
