Board index » General » Fusion

Can I inverse the operation of the tracker

Learn about 3D compositing, animation, broadcast design and VFX workflows.

theblueroom

  • Posts: 5
  • Joined: Tue Mar 11, 2025 7:36 pm
  • Real Name: Matt Green

Can I inverse the operation of the tracker

Posted: Tue Mar 11, 2025 7:42 pm

Apologies for the lack of technical language, I'm new to Fusion. I'm a bit stuck with a project. Does anyone know if I can reverse the match move tracker? I've created a HUD for a video game to show a map. I have two angles: one face-on to the character, using her nose as the tracking point. The second shot is over the shoulder, using her hair clip as the tracking point. On the face-on shot the tracking works as intended (as she moves her head to the right, the HUD moves to the right, and so on). However, using this same method over the shoulder, since I'm tracking the back of her head, the opposite happens. Does anyone know of a fix for this? I'm sure I'm just being stupid.

https://imgur.com/a/1BnZ5pW

Thanks!
Last edited by theblueroom on Wed Mar 12, 2025 11:07 am, edited 1 time in total.

KrunoSmithy

  • Posts: 3967
  • Joined: Fri Oct 20, 2023 11:01 pm
  • Real Name: Kruno Stifter

Re: Can I inverse the operation of the tracker

Posted: Wed Mar 12, 2025 9:47 am

It would help to see what you are doing, but if you are just using a single point to track, that gives you X and Y position but no data on rotation, scale, perspective, shear, etc. You would either need more points, or a planar tracker, or something else like the surface or 3D tracker, depending on what you are doing. It would be easier if we could see it, at least for me.

The reverse of match move would be stabilize, if I'm not mistaken. In the point tracker or the new IntelliTrack, if you set the operation to Match Move and the Merge to BG Only, you can stabilize the background input.

Merge

The Merge control determines what is done (if anything) with the image provided to the green Foreground input of the Tracker. This menu appears when the operation is set to anything other
than None.

BG Only: The foreground input is ignored; only the background is affected. This is used primarily when stabilizing the background image.

FG Only: The foreground input is transformed to match the movement in the background, and this transformed image is passed through the Tracker’s output. This Merge technique is used when match moving one layer’s motion to another layer’s motion.

FG Over BG: The foreground image is merged over the background image, using the Merge method described by the Apply Mode control that appears.

BG Over FG: The background is merged over the foreground. This technique is often used when tracking a layer with an Alpha channel so that a more static background can be applied behind it.

Alternatively, there are several good macros in Reactor (a repository for Fusion goodies) that offer various assist-type tools to make it easier to work with the Tracker tool.
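The stabilize / match move relationship can also be written out numerically: stabilizing applies exactly the negated motion of the tracked point. A minimal Python sketch of that idea (made-up coordinates in Fusion-style 0-1 normalized space; this is just the math, not any Fusion API):

```python
def match_move_offsets(track):
    """Per-frame motion of a tracked point relative to the first frame.

    track: list of (x, y) positions, one per frame.
    """
    x0, y0 = track[0]
    return [(x - x0, y - y0) for (x, y) in track]

def stabilize_offsets(track):
    """Stabilizing is the inverse of match moving: negate the motion."""
    return [(-dx, -dy) for (dx, dy) in match_move_offsets(track)]

# A point drifting right and up; the stabilize offsets counter the drift.
track = [(0.50, 0.40), (0.52, 0.41), (0.55, 0.43)]
print(match_move_offsets(track))
print(stabilize_offsets(track))
```

Applying the stabilize offsets to the footage holds the tracked point still, which is what BG Only does to the background input.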

Sam Steti

  • Posts: 3033
  • Joined: Tue Jun 17, 2014 7:29 am
  • Location: France

Re: Can I inverse the operation of the tracker

Posted: Wed Mar 12, 2025 9:55 am

theblueroom wrote:However, using this same method over the shoulder, since I'm tracking the back of her head, the opposite happens. Does anyone know of a fix for this? I'm sure I'm just being stupid.
Track something other than the hair, something not far from it but more reliable.
BTW, "the opposite happens" doesn't mean anything and makes no sense...
*MacMini M1 16 Go - Sonoma - Ext nvme SSDs on TB3 - 14 To HD in 2 x 4 disks USB3 towers
*Legacy MacPro 8core Xeons, 32 Go ram, 2 x gtx 980 ti, 3SSDs including RAID
*Resolve Studio everywhere, Fusion Studio too
*https://www.buymeacoffee.com/videorhin

theblueroom

  • Posts: 5
  • Joined: Tue Mar 11, 2025 7:36 pm
  • Real Name: Matt Green

Re: Can I inverse the operation of the tracker

Posted: Wed Mar 12, 2025 8:16 pm

KrunoSmithy wrote:It would help to see what you are doing, but if you are just using a single point to track, that gives you X and Y position but no data on rotation, scale, perspective, shear, etc. You would either need more points, or a planar tracker, or something else like the surface or 3D tracker, depending on what you are doing. It would be easier if we could see it, at least for me.

The reverse of match move would be stabilize, if I'm not mistaken. In the point tracker or the new IntelliTrack, if you set the operation to Match Move and the Merge to BG Only, you can stabilize the background input.

Merge

The Merge control determines what is done (if anything) with the image provided to the green Foreground input of the Tracker. This menu appears when the operation is set to anything other
than None.

BG Only: The foreground input is ignored; only the background is affected. This is used primarily when stabilizing the background image.

FG Only: The foreground input is transformed to match the movement in the background, and this transformed image is passed through the Tracker’s output. This Merge technique is used when match moving one layer’s motion to another layer’s motion.

FG Over BG: The foreground image is merged over the background image, using the Merge method described by the Apply Mode control that appears.

BG Over FG: The background is merged over the foreground. This technique is often used when tracking a layer with an Alpha channel so that a more static background can be applied behind it.

Alternatively, there are several good macros in Reactor (a repository for Fusion goodies) that offer various assist-type tools to make it easier to work with the Tracker tool.


I've updated with an image; I did initially try to post an image but it was broken. Thanks for taking the time to reply.

KrunoSmithy

  • Posts: 3967
  • Joined: Fri Oct 20, 2023 11:01 pm
  • Real Name: Kruno Stifter

Re: Can I inverse the operation of the tracker

Posted: Thu Mar 13, 2025 12:06 am

theblueroom wrote:I've updated with an image; I did initially try to post an image but it was broken. Thanks for taking the time to reply.


I think when you are new to the forum and only have a few posts, you are limited in what you can upload etc.

Regarding your situation, it's hard to know without trying it or seeing video, but from the image you could maybe try offset tracking, and/or use more than one tracker to get more information, or use another kind of tracker: the Planar Tracker or the 3D tracker. But if what you are match moving is far from what you are tracking (which is basically offset tracking), you will lose some stability as you move away from the tracked area. So ideally you would track something close to where your HUD map element will be. When you tracked the nose it was easy, since they are close together, and the nose goes where the face goes. With the hair, as she moves right or left, you likely need the face, so I think I might know what you are asking, but it's hard for me to say without trying. If you post a short clip of the video with the map somewhere, I can try.

But off the top of my head, you can do your tracking and then try to flip the displacement curve in the Spline Editor. Or, if that doesn't work, maybe convert it to X and Y axes and flip just one of them: right-click on the displacement spline in the viewer and choose convert to X and Y path.
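Converting to an X and Y path and flipping just one of them boils down to negating that channel's motion relative to the first frame while leaving the other channel alone. A rough Python sketch of that math (invented positions in 0-1 normalized coordinates; not a Fusion API):

```python
def flip_axis(track, axis=0):
    """Mirror one axis of a tracked path around its starting value.

    track: list of (x, y) positions per frame; axis: 0 for X, 1 for Y.
    The chosen axis moves by the same amount in the opposite direction;
    the other axis keeps its original motion.
    """
    start = track[0]
    flipped = []
    for p in track:
        q = list(p)
        q[axis] = 2 * start[axis] - p[axis]
        flipped.append(tuple(q))
    return flipped

# Head turns right (X increases); the flipped path turns left instead,
# while the small vertical bob is preserved.
path = [(0.50, 0.40), (0.55, 0.41), (0.60, 0.43)]
print(flip_axis(path, axis=0))
```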

sshot-884.jpg

sshot-885.jpg

sshot-886.jpg

The more trackers you have, the more data (scale, rotation, perspective, etc.) you can use.


Fusion Tracking | Part - 2 - Tracker Tool



Try some of these things and see what works.

birdseye

  • Posts: 454
  • Joined: Fri Jun 12, 2020 2:36 pm
  • Real Name: Iain Fisher

Re: Can I inverse the operation of the tracker

Posted: Thu Mar 13, 2025 11:09 am

I can only surmise that you want to use the same tracking data from the forward view on the rear view. You could do it in a few ways; here are some permutations. The first one would be your forward view; the second and third, flipped horizontally by Transform2, would be your rear view. The second only match moves horizontally; the third, I think, is what you want: it negates the Y tracking data.
You'll need to retrack; the tracking data makes the post too large.

Code:
{
   Tools = ordered() {
      Background1 = Background {
         Inputs = {
            GlobalOut = Input { Value = 300, },
            Width = Input { Value = 1920, },
            Height = Input { Value = 1080, },
            ["Gamut.SLogVersion"] = Input { Value = FuID { "SLog2" }, },
            TopLeftRed = Input { Value = 1, },
         },
         ViewInfo = OperatorInfo { Pos = { -180.929, 55.8202 } },
      },
      Tracker1 = Tracker {
         Trackers = {
            {
               PatternTime = 0,
               PatternX = 0.223391420911528,
               PatternY = 0.329761904761905
            }
         },
         CtrlWZoom = false,
         Inputs = {
            Input = Input {
               SourceOp = "Merge1",
               Source = "Output",
            },
            Name1 = Input { Value = "Tracker 1", },
            PatternCenter1 = Input { Value = { 0.223391420911528, 0.329761904761905 }, },
            PatternWidth1 = Input { Value = 0.0438337801608579, },
            PatternHeight1 = Input { Value = 0.0785714285714285, },
            SearchWidth1 = Input { Value = 0.110857908847185, },
            SearchHeight1 = Input { Value = 0.173809523809524, },
         },
         ViewInfo = OperatorInfo { Pos = { 7.74066, 48.3143 } },
      },
      Merge1 = Merge {
         Inputs = {
            EffectMask = Input {
               SourceOp = "Rectangle1",
               Source = "Mask",
            },
            Background = Input {
               SourceOp = "Background2",
               Source = "Output",
            },
            Foreground = Input {
               SourceOp = "Background1",
               Source = "Output",
            },
            PerformDepthMerge = Input { Value = 0, },
         },
         ViewInfo = OperatorInfo { Pos = { -45.5964, 100.063 } },
      },
      Background2 = Background {
         Inputs = {
            GlobalOut = Input { Value = 300, },
            Width = Input { Value = 1920, },
            Height = Input { Value = 1080, },
            ["Gamut.SLogVersion"] = Input { Value = FuID { "SLog2" }, },
            TopLeftGreen = Input { Value = 1, },
         },
         ViewInfo = OperatorInfo { Pos = { -180.929, 99.4566 } },
      },
      Text1 = TextPlus {
         Inputs = {
            GlobalOut = Input { Value = 300, },
            Width = Input { Value = 1920, },
            Height = Input { Value = 1080, },
            ["Gamut.SLogVersion"] = Input { Value = FuID { "SLog2" }, },
            Center = Input { Value = { 0.222, 0.321 }, },
            StyledText = Input { Value = "text thing", },
            Font = Input { Value = "Open Sans", },
            Style = Input { Value = "Bold", },
            VerticalJustificationNew = Input { Value = 3, },
            HorizontalJustificationNew = Input { Value = 3, },
         },
         ViewInfo = OperatorInfo { Pos = { 222.842, 10.3152 } },
      },
      Rectangle1 = RectangleMask {
         Inputs = {
            Filter = Input { Value = FuID { "Fast Gaussian" }, },
            MaskWidth = Input { Value = 1920, },
            MaskHeight = Input { Value = 1080, },
            PixelAspect = Input { Value = { 1, 1 }, },
            ClippingMode = Input { Value = FuID { "None" }, },
            Center = Input {
               SourceOp = "Path1",
               Source = "Position",
            },
            Width = Input { Value = 0.09, },
            Height = Input { Value = 0.16, },
         },
         ViewInfo = OperatorInfo { Pos = { -46.2623, 133.063 } },
      },
      Path1 = PolyPath {
         DrawMode = "InsertAndModify",
         CtrlWZoom = false,
         Inputs = {
            Displacement = Input {
               SourceOp = "Path1Displacement",
               Source = "Value",
            },
            PolyLine = Input {
               Value = Polyline {
                  Points = {
                     { LockY = true, X = -0.321373551693405, Y = -0.247508137591273, RX = 0.0375932739350918, RY = -0.136966282434122 },
                     { LockY = true, X = 0.413874264705882, Y = -0.253241400545439, LX = 0.137810547187489, LY = -0.276352912483443, RX = -0.0690971918129052, RY = 0.0352213774460087 },
                     { X = 0.405885181152247, Y = -0.0944172268571967, LX = -0.0147120711233306, LY = -0.0711849425861277, RX = 0.0393279953475341, RY = 0.344599760507643 },
                     { X = 0.266738323195796, Y = -0.11666669343458, LX = -0.030943079874374, LY = 0.145438873753268, RX = 0.0566920992802611, RY = -0.308904355341198 },
                     { LockY = true, X = -0.089572192513369, Y = 0.0237529691211401, LX = -0.0963583558750425, LY = -0.203401329595518, RX = 0.0627639477803465, RY = 0.210995186222716 },
                     { LockY = true, X = 0.329771056149733, Y = 0.232574557930852, LX = 0.0377738295802234, LY = -0.115396985030781, RX = -0.0265837776637721, RY = 0.109158586587075 },
                     { LockY = true, X = -0.335069622101252, Y = 0.287427378420736, LX = 0.0490282839708146, LY = 0.0551569953423556, RX = -0.0575352883693742, RY = -0.0641865982721178 },
                     { LockY = true, X = -0.238598417098997, Y = -0.0608208365272486, LX = -0.151709278152855, LY = -0.132879120324601 }
                  }
               },
            },
         },
      },
      Path1Displacement = BezierSpline {
         SplineColor = { Red = 255, Green = 0, Blue = 255 },
         CtrlWZoom = false,
         KeyFrames = {
            [0] = { 0, RH = { 20, 0.077536470769376 }, Flags = { Linear = true, LockedY = true } },
            [60] = { 0.232609412308128, LH = { 40, 0.155072941538752 }, RH = { 80, 0.343253832101726 }, Flags = { Linear = true, LockedY = true } },
            [120] = { 0.564542671688922, LH = { 100, 0.453898251895324 }, RH = { 140, 0.60882326800362 }, Flags = { Linear = true, LockedY = true } },
            [180] = { 0.697384460633017, LH = { 160, 0.653103864318319 }, RH = { 200, 0.757397154248254 }, Flags = { Linear = true, LockedY = true } },
            [240] = { 0.877422541478729, LH = { 220, 0.817409847863492 }, RH = { 260, 0.918281694319153 }, Flags = { Linear = true, LockedY = true } },
            [300] = { 1, LH = { 280, 0.959140847159576 }, Flags = { Linear = true, LockedY = true } }
         }
      },
      Merge3 = Merge {
         Inputs = {
            Background = Input {
               SourceOp = "Merge1",
               Source = "Output",
            },
            Foreground = Input {
               SourceOp = "Transform1",
               Source = "Output",
            },
            PerformDepthMerge = Input { Value = 0, },
         },
         ViewInfo = OperatorInfo { Pos = { 339.198, 53.3381 } },
      },
      Transform1 = Transform {
         Inputs = {
            Center = Input {
               SourceOp = "Tracker1",
               Source = "UnsteadyPosition1",
            },
            Input = Input {
               SourceOp = "Text1",
               Source = "Output",
            },
         },
         ViewInfo = OperatorInfo { Pos = { 340.141, 10.5767 } },
      },
      Transform2 = Transform {
         Inputs = {
            FlipHoriz = Input { Value = 1, },
            Input = Input {
               SourceOp = "Merge1",
               Source = "Output",
            },
         },
         ViewInfo = OperatorInfo { Pos = { 527.766, 103.039 } },
      },
      Text1_1 = TextPlus {
         Inputs = {
            GlobalOut = Input { Value = 300, },
            Width = Input { Value = 1920, },
            Height = Input { Value = 1080, },
            ["Gamut.SLogVersion"] = Input { Value = FuID { "SLog2" }, },
            Center = Input { Value = { 0.866, 0.332 }, },
            StyledText = Input { Value = "text thing", },
            Font = Input { Value = "Open Sans", },
            Style = Input { Value = "Bold", },
            VerticalJustificationNew = Input { Value = 3, },
            HorizontalJustificationNew = Input { Value = 3, },
         },
         ViewInfo = OperatorInfo { Pos = { 715.938, -18.8365 } },
      },
      Merge3_1 = Merge {
         Inputs = {
            Background = Input {
               SourceOp = "Transform2",
               Source = "Output",
            },
            Foreground = Input {
               SourceOp = "Transform1_1",
               Source = "Output",
            },
            PerformDepthMerge = Input { Value = 0, },
         },
         ViewInfo = OperatorInfo { Pos = { 713.5, 61.0415 } },
      },
      Transform1_1 = Transform {
         Inputs = {
            Center = Input {
               SourceOp = "Tracker1",
               Source = "SteadyPosition",
            },
            Input = Input {
               SourceOp = "Text1_1",
               Source = "Output",
            },
         },
         ViewInfo = OperatorInfo { Pos = { 714.443, 21.7086 } },
      },
      Text1_2 = TextPlus {
         Inputs = {
            GlobalOut = Input { Value = 300, },
            Width = Input { Value = 1920, },
            Height = Input { Value = 1080, },
            ["Gamut.SLogVersion"] = Input { Value = FuID { "SLog2" }, },
            Center = Input { Value = { 0.864, 1.323 }, },
            StyledText = Input { Value = "text thing", },
            Font = Input { Value = "Open Sans", },
            Style = Input { Value = "Bold", },
            VerticalJustificationNew = Input { Value = 3, },
            HorizontalJustificationNew = Input { Value = 3, },
         },
         ViewInfo = OperatorInfo { Pos = { 981.81, 51.4457 } },
      },
      Merge3_2 = Merge {
         CtrlWZoom = false,
         Inputs = {
            Background = Input {
               SourceOp = "Transform2",
               Source = "Output",
            },
            Foreground = Input {
               SourceOp = "Transform1_2",
               Source = "Output",
            },
            PerformDepthMerge = Input { Value = 0, },
         },
         ViewInfo = OperatorInfo { Pos = { 980.315, 131.324 } },
      },
      Transform1_2 = Transform {
         Inputs = {
            Center = Input { Expression = "Point(Tracker1.SteadyPosition.X,-Tracker1.SteadyPosition.Y)", },
            Input = Input {
               SourceOp = "Text1_2",
               Source = "Output",
            },
         },
         ViewInfo = OperatorInfo { Pos = { 981.258, 92.8479 } },
      }
   },
   ActiveTool = "Merge3_2"
}

theblueroom

  • Posts: 5
  • Joined: Tue Mar 11, 2025 7:36 pm
  • Real Name: Matt Green

Re: Can I inverse the operation of the tracker

Posted: Thu Mar 13, 2025 2:48 pm

KrunoSmithy wrote:
theblueroom wrote:I've updated with an image; I did initially try to post an image but it was broken. Thanks for taking the time to reply.


I think when you are new to the forum and only have a few posts, you are limited in what you can upload etc.

Regarding your situation, it's hard to know without trying it or seeing video, but from the image you could maybe try offset tracking, and/or use more than one tracker to get more information, or use another kind of tracker: the Planar Tracker or the 3D tracker. But if what you are match moving is far from what you are tracking (which is basically offset tracking), you will lose some stability as you move away from the tracked area. So ideally you would track something close to where your HUD map element will be. When you tracked the nose it was easy, since they are close together, and the nose goes where the face goes. With the hair, as she moves right or left, you likely need the face, so I think I might know what you are asking, but it's hard for me to say without trying. If you post a short clip of the video with the map somewhere, I can try.

But off the top of my head, you can do your tracking and then try to flip the displacement curve in the Spline Editor. Or, if that doesn't work, maybe convert it to X and Y axes and flip just one of them: right-click on the displacement spline in the viewer and choose convert to X and Y path.

sshot-884.jpg


sshot-885.jpg


sshot-886.jpg


The more trackers you have, the more data (scale, rotation, perspective, etc.) you can use.


Fusion Tracking | Part - 2 - Tracker Tool


Try some of these things and see what works.


Thank you so much for taking the time to reply and to show your working. I've just tried this method and unfortunately it didn't give me the result I'm hoping for. I did make a YouTube video going over the exact problem I have, but unfortunately the forum won't let me post links :roll: . If there's a way I can get this to you that doesn't violate any community rules, please do let me know.


I tried your solution, but it reverses the operation in its entirety. I'll try my best to explain my issue in case the link doesn't work, and I apologise if it isn't clear; I'm new to Fusion and don't know any of the technical language yet.

So, clip one is face-on to the subject, using a match move tracker with her nose as the tracking point; this gives me the desired result (the HUD moves with the direction of her face). For the second clip, it's switched to an over-the-shoulder shot, and I'm using a bright spot on a hair clip on the back of her head as the tracking point. Currently it matches the motion of the bright spot on her hair clip, so when the hair clip moves to the left, the HUD moves to the left. This is the issue: as the camera is on the back of her head, the hair clip moves in the opposite direction to her face. So the current result is that when she moves her head to look to the right (while the hair clip moves to the left, as it is on the back of her head), the HUD moves to the left. I'm looking for a way to have the tracker use the same numerical values, but inverse them. So if the hair clip tracking point moves 5 pixels (or whatever unit of measurement is used here) to the right, the tracker will move the HUD 5 pixels to the left instead.

I hope this makes sense? I know some people are a little frustrated with my lack of explanation, I promise it is not because I don't value your time, it is just because I don't understand the technicality of fusion yet, hence why I am trying actively to learn and seeking help instead of just giving up. (I have tried to google this but I haven't been able to find a solution) I will reiterate that I have made a video of my exact issue, but I am not allowed to post it.

Thank you for your help!
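The "same numerical values, but inversed" behaviour described above is mirror math: anchor the HUD at its start position and subtract the tracker's per-frame offset instead of adding it. A small Python sketch of that idea (hypothetical coordinates, not tied to Fusion's API):

```python
def mirror_track(track):
    """Mirror a tracked path around its first position.

    If the tracked point moves +d on an axis, the returned path moves -d,
    i.e. the "5 pixels right -> 5 pixels left" behaviour.
    """
    x0, y0 = track[0]
    return [(2 * x0 - x, 2 * y0 - y) for (x, y) in track]

# The hair clip drifts right; the mirrored path drifts left by the same amount.
clip = [(0.50, 0.40), (0.55, 0.40), (0.60, 0.42)]
print(mirror_track(clip))
```

In Fusion terms this is what an expression on a Transform's Center can do with the tracker's published position, driving the HUD opposite to the tracked point.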

KrunoSmithy

  • Posts: 3967
  • Joined: Fri Oct 20, 2023 11:01 pm
  • Real Name: Kruno Stifter

Re: Can I inverse the operation of the tracker

Posted: Fri Mar 14, 2025 12:41 am

theblueroom wrote:Thank you so much for taking the time to reply and to show your working. I've just tried this method and unfortunately it didn't give me the result I'm hoping for. I did make a YouTube video going over the exact problem I have, but unfortunately the forum won't let me post links :roll: . If there's a way I can get this to you that doesn't violate any community rules, please do let me know.


You could maybe try to be more active on the forum to get your post count up, though I'm not sure what the threshold is; maybe 10. You could also try to send me a private message, or perhaps just post the link with [dot] instead of ".".

theblueroom wrote:I tried your solution, but it reverses the operation in its entirety. I'll try my best to explain my issue in case the link doesn't work, and I apologise if it isn't clear; I'm new to Fusion and don't know any of the technical language yet.

So, clip one is face-on to the subject, using a match move tracker with her nose as the tracking point; this gives me the desired result (the HUD moves with the direction of her face). For the second clip, it's switched to an over-the-shoulder shot, and I'm using a bright spot on a hair clip on the back of her head as the tracking point. Currently it matches the motion of the bright spot on her hair clip, so when the hair clip moves to the left, the HUD moves to the left. This is the issue: as the camera is on the back of her head, the hair clip moves in the opposite direction to her face. So the current result is that when she moves her head to look to the right (while the hair clip moves to the left, as it is on the back of her head), the HUD moves to the left. I'm looking for a way to have the tracker use the same numerical values, but inverse them. So if the hair clip tracking point moves 5 pixels (or whatever unit of measurement is used here) to the right, the tracker will move the HUD 5 pixels to the left instead.


I tried to find some similar footage online to give it a go, but I could only find this woman with her back turned, turning her head. Not much to use for tracking, so I tracked the back of her hat with the planar tracker, but it was not a great track, a bit jittery. Anyway, I thought I would give it a try.

Basically it seems the challenge is that the body moves differently than the head (and the eyes, if you want to be precise). You would essentially need a co-planar surface to track, and that is a bit tricky, since her face and the HUD map should be perpendicular to each other, and you can't track her back or backpack if the head is moving differently from them. Plus, unlike a classical corner pin operation, this HUD should move accounting for perspective and all that.

Ideally you would have a 3D model of the person and do it in 3D, but depending on what you are working with, you may be able to do it with a 3D tracker, something like SynthEyes. Or you could try to fake it manually if it's a simple enough clip.

Which is what I tried to do. I used the planar tracker to track the back of her hat, and used a DVE node to try to change the perspective, which got me close until she turns her head, when it's off again, so I just faked that part by manually animating it. In the end this is what I got, but there are some jitters, probably because the track was not good.

It is a challenge, since I've not done something quite like this; usually it's frontal tracking and corner pinning.

I used a template that comes with Resolve for HUD graphics, a bit of glow, and Magic Mask to select the woman so I could get the occlusion mask.

HUD test



theblueroom wrote:I hope this makes sense? I know some people are a little frustrated with my lack of explanation, I promise it is not because I don't value your time, it is just because I don't understand the technicality of fusion yet, hence why I am trying actively to learn and seeking help instead of just giving up. (I have tried to google this but I haven't been able to find a solution) I will reiterate that I have made a video of my exact issue, but I am not allowed to post it. Thank you for your help!


No problem. Try posting www.youtube[dot]com instead of www.youtube.com.

I didn't spend too much time on this, but it's an interesting challenge. If I come up with a better idea, I'll re-post it.

birdseye

  • Posts: 454
  • Joined: Fri Jun 12, 2020 2:36 pm
  • Real Name: Iain Fisher

Re: Can I inverse the operation of the tracker

Posted: Fri Mar 14, 2025 7:49 am

The only example I can remember seeing that might be used for that is this one. What is shown in the first half of this tutorial would allow the 3 axes to be extrapolated out from the surfaces that were tracked. It's quite advanced (oh what the hell, it's very advanced), but if you want to have a crack at it, I would like to see how you get on.


theblueroom

  • Posts: 5
  • Joined: Tue Mar 11, 2025 7:36 pm
  • Real Name: Matt Green

Re: Can I inverse the operation of the tracker

Posted: Fri Mar 14, 2025 11:52 am

KrunoSmithy wrote:
theblueroom wrote:Thank you so much for taking the time to reply and to show your working. I've just tried this method and unfortunately it didn't give me the result I'm hoping for. I did make a YouTube video going over the exact problem I have, but unfortunately the forum won't let me post links :roll: . If there's a way I can get this to you that doesn't violate any community rules, please do let me know.


You could maybe try to be more active on the forum to get your post count up, though I'm not sure what the threshold is; maybe 10. You could also try to send me a private message, or perhaps just post the link with [dot] instead of ".".

theblueroom wrote:I tried your solution, but it reverses the operation in its entirety. I'll try my best to explain my issue in case the link doesn't work, and I apologise if it isn't clear; I'm new to Fusion and don't know any of the technical language yet.

So, clip one is face-on to the subject, using a match move tracker with her nose as the tracking point; this gives me the desired result (the HUD moves with the direction of her face). For the second clip, it's switched to an over-the-shoulder shot, and I'm using a bright spot on a hair clip on the back of her head as the tracking point. Currently it matches the motion of the bright spot on her hair clip, so when the hair clip moves to the left, the HUD moves to the left. This is the issue: as the camera is on the back of her head, the hair clip moves in the opposite direction to her face. So the current result is that when she moves her head to look to the right (while the hair clip moves to the left, as it is on the back of her head), the HUD moves to the left. I'm looking for a way to have the tracker use the same numerical values, but inverse them. So if the hair clip tracking point moves 5 pixels (or whatever unit of measurement is used here) to the right, the tracker will move the HUD 5 pixels to the left instead.


I tried to find some similar footage online to give it a go, but I could only find this woman with her back turned, turning her head. Not much to use for tracking, so I tracked the back of her hat with the planar tracker, but it was not a great track, a bit jittery. Anyway, I thought I would give it a try.

Basically it seems the challenge is that the body moves differently than the head (and the eyes, if you want to be precise). You would essentially need a co-planar surface to track, and that is a bit tricky, since her face and the HUD map should be perpendicular to each other, and you can't track her back or backpack if the head is moving differently from them. Plus, unlike a classical corner pin operation, this HUD should move accounting for perspective and all that.

Ideally you would have a 3D model of the person and do it in 3D, but depending on what you are working with, you may be able to do it with a 3D tracker, something like SynthEyes. Or you could try to fake it manually if it's a simple enough clip.

Which is what I tried to do. I used the planar tracker to track the back of her hat, and used a DVE node to try to change the perspective, which got me close until she turns her head, when it's off again, so I just faked that part by manually animating it. In the end this is what I got, but there are some jitters, probably because the track was not good.

It is a challenge, since I've not done something quite like this; usually it's frontal tracking and corner pinning.

I used a template that comes with Resolve for HUD graphics, a bit of glow, and Magic Mask to select the woman so I could get the occlusion mask.

HUD test


theblueroom wrote:I hope this makes sense? I know some people are a little frustrated with my lack of explanation, I promise it is not because I don't value your time, it is just because I don't understand the technicality of fusion yet, hence why I am trying actively to learn and seeking help instead of just giving up. (I have tried to google this but I haven't been able to find a solution) I will reiterate that I have made a video of my exact issue, but I am not allowed to post it. Thank you for your help!


No problem. Try posting http://www.youtube[dot]com instead of http://www.youtube.com.

I didn't spend too much time on this, but it's an interesting challenge. If I come up with a better idea, I'll re-post it.


I think you've understood my issue. I'm probably going to simplify the shot to have less head movement and keyframe it; I'd wanted to avoid this as the clip is about 40 seconds long at 60 fps, so it's a lot of work. I'm just very surprised there isn't a simple operation to do this. Thank you for your help; it's really kind of you to have taken the time to do a mock-up for me.

Here's the link to the video posted in the format you suggested just in case it helps shed any light.
https://www.youtube[dot]com/watch?v=5HkWjfl0l1s

theblueroom

  • Posts: 5
  • Joined: Tue Mar 11, 2025 7:36 pm
  • Real Name: Matt Green

Re: Can I inverse the operation of the tracker

Posted: Fri Mar 14, 2025 11:54 am

birdseye wrote:The only example I can remember seeing that might be used for that is this one. What is shown in the first half of this tutorial would allow the 3 axes to be extrapolated out from the surfaces that were tracked. It's quite advanced (oh what the hell, it's very advanced), but if you want to have a crack at it, I would like to see how you get on.



Thanks for linking this. I have plenty of time on this project, so I might give it a go. I've been wanting to learn Fusion, so I'm regularly challenging myself to include more VFX. I can think of a lot of uses for this in the future, so it's worth seeing if I can work it out.
