
Ursa Gyroscope With Fusion

PostPosted: Wed Apr 22, 2015 9:09 am
by Andrew Koutsou
The new 4.6K cameras have built-in gyroscopes which record all the camera's movement information. Has Blackmagic done any tests in Fusion using this information to track objects into a shot, or to help roto things out?

I would love to see a video on this/ further information.

Many thanks

Re: Ursa Gyroscope With Fusion

PostPosted: Wed Apr 22, 2015 6:17 pm
by Blazej Floch
Neat idea. I haven't seen this yet. I'm wondering which metadata is used to store the information. If Fusion could access it, stabilization would be possible.

Let's hear BMDs answer.

Re: Ursa Gyroscope With Fusion

PostPosted: Wed Apr 22, 2015 7:06 pm
by Andrew Koutsou
Blazej Floch wrote:Neat idea. I haven't seen this yet. I'm wondering which metadata is used to store the information. If Fusion could access it, stabilization would be possible.

Let's hear BMDs answer.


Well, I'm hoping that they built it into the URSA with Fusion effects in mind...

But it will probably work with AE as well, and with any other 3D program.

I will probably go the Fusion route - it seems like a good route for all sorts of effects.

Re: Ursa Gyroscope With Fusion

PostPosted: Wed Apr 22, 2015 7:07 pm
by Chad Capeland
To do camera tracking with no drift, it would have to be AMAZINGLY accurate. To do camera motion blur removal? Not so much. To give your tracking system a rough idea about your location and orientation to allow it to bundle multiple takes of a static scene (think structure from motion) or multiple cameras recording one scene (think mocap but with moving cameras)? Sure, it's absolutely enough.

I think motion blur removal, along with highlight restoration and super-resolution, would be the ideal applications. That's assuming there are inertial sensors too (like MEMS), not just a gyro.

Re: Ursa Gyroscope With Fusion

PostPosted: Wed Apr 22, 2015 7:24 pm
by Andrew Koutsou
Chad Capeland wrote:To do camera tracking with no drift, it would have to be AMAZINGLY accurate. To do camera motion blur removal? Not so much. To give your tracking system a rough idea about your location and orientation to allow it to bundle multiple takes of a static scene (think structure from motion) or multiple cameras recording one scene (think mocap but with moving cameras)? Sure, it's absolutely enough.

I think motion blur removal, along with highlight restoration and super-resolution, would be the ideal applications. That's assuming there are inertial sensors too (like MEMS), not just a gyro.


The fact is it probably does not have to be amazingly accurate. If it is close, then the operator/tracker using the software should hopefully be in the right area and can make the necessary adjustments.

Who knows? BMD will have to do a few tests. If I had my hands on an URSA Mini, I could use a contact at one of the biggest VFX post houses in London to see what they thought of it.

Let's wait and see.

Re: Ursa Gyroscope With Fusion

PostPosted: Thu Apr 23, 2015 7:24 am
by Holger Neuhaeuser
Well, the same gyro sensors have been in the RED Epic since it came on the market (end of 2011).
Up till now, no application has come up that uses that data with any compositing or 3D software.

The only project was an automatic horizon correction, which never turned into a full product.

As the RED Epic is widely used for effects work, I think there would have been some use if the data had produced usable results.

But maybe Blackmagic pulls off the stunt. Sensors are better nowadays, and the perfect software (Fusion) is in Blackmagic's portfolio too.

Re: Ursa Gyroscope With Fusion

PostPosted: Thu Apr 23, 2015 1:48 pm
by Chad Capeland
Chad Capeland wrote: To give your tracking system a rough idea about your location and orientation to allow it to bundle multiple takes of a static scene (think structure from motion) or multiple cameras recording one scene (think mocap but with moving cameras)? Sure, it's absolutely enough.


Andrew Koutsou wrote:The fact is it probably does not have to be amazingly accurate. If it is close, then the operator/tracker using the software should hopefully be in the right area and can make the necessary adjustments.


That's exactly what I said, if it's another input to the solver, then yes, it could help, but not much other than getting roughly aligned to a specific coordinate system. You're still going to be doing the real tracking optically. So it's not going to revolutionize anything unless A) It's going to be so amazingly accurate that you don't need the optical solve at all or B) It's used for something else, like camera motion blur or rolling shutter mitigation.

But hey, let's say you plan on using it for an ENG camera, or for location scouting, or using it in an industry like insurance adjustment, landscape design, or something where a casual user might just be trying to collect footage to document something, like a camera junkie tourist... Just automatically knowing their rough location and what general direction they are facing might be really useful.

Re: Ursa Gyroscope With Fusion

PostPosted: Thu Apr 23, 2015 7:34 pm
by Rakesh Malik
Holger Neuhaeuser wrote:Well, the same gyro sensors have been in the RED Epic since it came on the market (end of 2011).
Up till now, no application has come up that uses that data with any compositing or 3D software.


The big difference here is that BMD owns and therefore controls a major, high-end VFX application and a major, high-end finishing application that is about to become an NLE... which BMD also owns and controls.

Re: Ursa Gyroscope With Fusion

PostPosted: Thu Apr 23, 2015 10:45 pm
by Jun Yokoishi
The gyro sensor and its metadata will be useful for a post-effect stabilizer.
By reprojecting the shaky footage using the recorded pitch, roll and yaw data,
we can simulate a smooth camera, as if shooting on a 3-axis gimbal.
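That idea can be sketched in a few lines: if the per-frame pitch/roll/yaw are known and the lens behaves roughly like a pinhole with a known focal length, the counter-rotation is just a homography applied to each frame. A minimal sketch, assuming angles in radians, a pixel focal length, and the principal point at frame centre (none of which the camera's actual metadata format is documented to provide here):

```python
import numpy as np

def stabilizing_homography(pitch, roll, yaw, focal_px, width, height):
    """Homography that counter-rotates a frame by the recorded gyro
    angles (radians). Assumes a pinhole camera with a known focal
    length in pixels and the principal point at the frame centre;
    translation (and hence parallax) is ignored entirely."""
    cx, cy = width / 2.0, height / 2.0
    K = np.array([[focal_px, 0.0, cx],
                  [0.0, focal_px, cy],
                  [0.0, 0.0, 1.0]])
    sp, cp = np.sin(pitch), np.cos(pitch)
    sy, cyw = np.sin(yaw), np.cos(yaw)
    sr, cr = np.sin(roll), np.cos(roll)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])    # pitch
    Ry = np.array([[cyw, 0, sy], [0, 1, 0], [-sy, 0, cyw]])  # yaw
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])    # roll
    R = Rz @ Ry @ Rx
    # Map pixels through K * R^-1 * K^-1 to undo the camera rotation.
    return K @ R.T @ np.linalg.inv(K)
```

Each frame would then be warped by its own matrix (e.g. with a perspective-warp node); with zero recorded rotation it reduces to the identity. Parallax from camera translation is exactly what this cannot fix, which is why it only approximates a gimbal rather than replacing one.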

Re: Ursa Gyroscope With Fusion

PostPosted: Thu Apr 23, 2015 11:29 pm
by Jules Bushell
Rakesh Malik wrote:
Holger Neuhaeuser wrote:Well, the same gyro sensors have been in the RED Epic since it came on the market (end of 2011).
Up till now, no application has come up that uses that data with any compositing or 3D software.


The big difference here is that BMD owns and therefore controls a major, high-end VFX application and a major, high-end finishing application that is about to become an NLE... which BMD also owns and controls.

GPS is accurate to about 8 m (?) and delivers data at 1 fps. How can this and the gyroscope be useful for VFX?

Jules

Re: Ursa Gyroscope With Fusion

PostPosted: Fri Apr 24, 2015 1:11 am
by Chad Capeland
Jules Bushell wrote:
Rakesh Malik wrote:
Holger Neuhaeuser wrote:Well, the same gyro sensors have been in the RED Epic since it came on the market (end of 2011).
Up till now, no application has come up that uses that data with any compositing or 3D software.


The big difference here is that BMD owns and therefore controls a major, high-end VFX application and a major, high-end finishing application that is about to become an NLE... which BMD also owns and controls.

GPS is accurate to about 8 m (?) and delivers data at 1 fps. How can this and the gyroscope be useful for VFX?

Jules


Assuming your mattes only have to be accurate to +/- 500 pixels, you should be fine. :)

Re: Ursa Gyroscope With Fusion

PostPosted: Mon Apr 27, 2015 7:03 am
by Jan Van Akkere
The GPS is not VFX-related; it's there to track down shot locations for pickup shots.

Re: Ursa Gyroscope With Fusion

PostPosted: Mon Apr 27, 2015 5:01 pm
by Chad Capeland
Jan Van Akkere wrote:The GPS is not VFX-related; it's there to track down shot locations for pickup shots.


But the specs on the camera indicate no other positioning sensor, like an accelerometer. Just gyro and GPS. Seems odd that they would have a positioning sensor but not list it on the specs alongside the gyro.

EDIT: And just to be clear, if all it took to make a camera capable of generating its own track were an accelerometer and a gyro, wouldn't there be companies selling kits to strap a Wiimote or your cell phone to the camera? Likewise, wouldn't cell phone cameras have 3D tracking built into the app?

Re: Ursa Gyroscope With Fusion

PostPosted: Tue Apr 28, 2015 5:54 pm
by Jules Bushell
Hi,

Does anyone know of a third party device that can give accurate positional data of the camera per frame, assuming that the gyro in the URSA Mini 4.6K is good enough?

Cheers,
Jules

Re: Ursa Gyroscope With Fusion

PostPosted: Tue Apr 28, 2015 6:44 pm
by Chad Capeland
Jules Bushell wrote:Hi,

Does anyone know of a third party device that can give accurate positional data of the camera per frame, assuming that the gyro in the URSA Mini 4.6K is good enough?

Cheers,
Jules


https://neuronmocap.com/content/product ... ion-neuron

One sensor won't be enough to track accurately. But with 3+ you wouldn't even need the gyro, so it would work with any camera.

This is one of the newest systems out there, and it's nowhere near accurate enough. <1 degree in rotation is bad enough, but they don't even list the positional accuracy because it drifts so badly over time.

It's not just them either. This company doesn't list accuracy in their specs either. https://www.xsens.com/products/xsens-mvn/

So I suspect the drift would result in a track being off by hundreds or even thousands of pixels for anything other than a tripod shot.
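To put a rough number on that drift: position from an IMU comes from double-integrating acceleration, so even a tiny constant accelerometer bias grows quadratically with time. A toy illustration, where the bias value is a made-up but plausible consumer-MEMS figure, not a spec for any of these products:

```python
def position_drift(bias_m_s2, seconds, rate_hz):
    """Naive dead-reckoning: integrate a constant accelerometer
    bias once into velocity error, then again into position error."""
    dt = 1.0 / rate_hz
    v = 0.0  # accumulated velocity error, m/s
    x = 0.0  # accumulated position error, m
    for _ in range(int(seconds * rate_hz)):
        v += bias_m_s2 * dt
        x += v * dt
    return x

# A 0.01 m/s^2 bias over a 10-second handheld shot:
# x ~ 0.5 * bias * t^2, i.e. about half a metre of drift.
drift = position_drift(0.01, 10.0, 200)
```

Half a metre of positional error is indeed hundreds of pixels at typical focal lengths, which is why raw integration alone can't replace an optical solve.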

Re: Ursa Gyroscope With Fusion

PostPosted: Wed Apr 29, 2015 2:54 pm
by Ryan Bloomer
Just conceptually, couldn't something like iPi solve a camera's location in space? http://ipisoft.com/

I looked into this about a year ago, and the only real solution I could find that seemed to actually be viable for VFX camera tracking was http://www.lightcrafttech.com/.

Re: Ursa Gyroscope With Fusion

PostPosted: Wed Apr 29, 2015 3:51 pm
by Chad Capeland
Ryan Bloomer wrote:Just conceptually, couldn't something like iPi solve a camera's location in space? http://ipisoft.com/

I looked into this about a year ago, and the only real solution I could find that seemed to actually be viable for VFX camera tracking was http://www.lightcrafttech.com/.


You could stick some tracking fiducials on the camera, shoot it with a couple of other cameras, and have an optical motion capture system for handheld cameras, sure. Or use a mount with rotation-encoding joints. But the setup on all of those is way beyond having the camera run with internal sensors that are always ready to go.

Could BMD make such a tracking system for previz or post? Sure. I don't know if they have an interest in pure VFX camera tech; they haven't shown any niche cameras so far. The question is one of scaling, too: does having a high-volume line of traditional cameras allow you to make and market specialty systems in a way that's more profitable than the lost opportunity cost of investing more in mass-market products?

Re: Ursa Gyroscope With Fusion

PostPosted: Fri May 01, 2015 11:37 am
by Tanawat Wattanachinda
I can see it being used in tracking software as a soft constraint for the solver:
optical track + rotation from the gyro (as a soft input).
It doesn't need to be accurate, that way.
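That "soft input" idea is just a weighted least-squares term: the solver minimises the optical error plus a low-weight penalty for straying from the gyro reading. A one-dimensional sketch, where the weights are illustrative values rather than calibrated sensor covariances:

```python
import numpy as np

def fuse_rotation(optical_deg, gyro_deg, w_optical=1.0, w_gyro=0.05):
    """Blend a per-frame optical rotation estimate with the gyro
    reading used as a soft prior. Minimising
        w_optical*(x - optical)^2 + w_gyro*(x - gyro)^2
    per frame gives the closed-form weighted average below."""
    optical = np.asarray(optical_deg, dtype=float)
    gyro = np.asarray(gyro_deg, dtype=float)
    return (w_optical * optical + w_gyro * gyro) / (w_optical + w_gyro)
```

With equal weights the two estimates simply average; with a low gyro weight, the gyro only nudges the solve where the optical term is ambiguous, which is exactly the "doesn't need to be accurate" property.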

Re: Ursa Gyroscope With Fusion

PostPosted: Fri May 01, 2015 2:35 pm
by Chad Capeland
Tanawat Wattanachinda wrote:I can see it being used in tracking software as a soft constraint for the solver:
optical track + rotation from the gyro (as a soft input).
It doesn't need to be accurate, that way.


The problem, as I mentioned, is that rotation alone isn't enough information to be useful. Unless you can account for translation, you're going to be generating data that has a higher error than if you just did the optical track by itself. The gyro could simplify the creation of a coordinate system, but that's pretty trivial anyway. If you have a shoot where you know the optical track will be terrible, because of low light, massive occlusions, or a lack of rigid tracking features, you're best off using a motion capture system for the camera or a crane that records its joint rotations.

I still think the application is for location scouting or roughly recording the placements of different cameras in a shoot. The GPS and gyro can get you enough information to orient your shoot to LIDAR or to DEM or satellite images, plus you can have 1400 clips from a documentary and be able to sort them by location or whatnot, which is great, you'll spend less time manually logging that.

This could also be great for other industries like real estate or insurance adjustment. Oh, and for ENG. Imagine you are a local news producer and a gas main blew up a building in town. You've got a crew shooting the scene, and the metadata gets compared to your archive of footage from the past few years and the software finds all the clips that could have the building in it and sorts it by similarity to the position and angle of what you just shot. Now you've found your before clip to edit into the story.

Re: Ursa Gyroscope With Fusion

PostPosted: Fri May 01, 2015 3:05 pm
by Tanawat Wattanachinda
Chad Capeland wrote:
Tanawat Wattanachinda wrote:I can see it being used in tracking software as a soft constraint for the solver:
optical track + rotation from the gyro (as a soft input).
It doesn't need to be accurate, that way.


The problem, as I mentioned, is that rotation alone isn't enough information to be useful. Unless you can account for translation, you're going to be generating data that has a higher error than if you just did the optical track by itself. The gyro could simplify the creation of a coordinate system, but that's pretty trivial anyway. If you have a shoot where you know the optical track will be terrible, because of low light, massive occlusions, or a lack of rigid tracking features, you're best off using a motion capture system for the camera or a crane that records its joint rotations.

I still think the application is for location scouting or roughly recording the placements of different cameras in a shoot. The GPS and gyro can get you enough information to orient your shoot to LIDAR or to DEM or satellite images, plus you can have 1400 clips from a documentary and be able to sort them by location or whatnot, which is great, you'll spend less time manually logging that.

This could also be great for other industries like real estate or insurance adjustment. Oh, and for ENG. Imagine you are a local news producer and a gas main blew up a building in town. You've got a crew shooting the scene, and the metadata gets compared to your archive of footage from the past few years and the software finds all the clips that could have the building in it and sorts it by similarity to the position and angle of what you just shot. Now you've found your before clip to edit into the story.

I totally agree; that's why I said "soft", a very, very soft one.
I once had to camera-track with nothing but an actress's head and a sky with hardly any clouds; luckily, I had two witness cams to help the solve.
If I hadn't had those two witness cams, the rotation from the gyro might have been useful.