- Posts: 1
- Joined: Wed Jan 28, 2015 6:32 pm
I have built a recording app around the DeckLink Quad (and the single-channel DeckLink that does HDMI). I am recording a driving simulation that runs on a clock set to 240 Hz; the projectors are genlocked at 60 Hz, and the cameras feeding the DeckLink cards are running at 30 Hz, 1920x1024.
I am using DirectShow (we converted our original capture app, which used SD cameras and someone else's capture card, over to HD + DeckLink capture). In my filter graph, I have a filter that renders a frame number (our 240 Hz clock tick) and some additional data on top of the captured frame. I pass the data and frame number directly to my rendering filter (it gets sent over the network from one of the computers). The problem is there seems to be about 1 second of latency by the time my rendering filter gets its frame, while it is receiving its data at 60 Hz, so the data that gets overlaid is about 1 second newer than the video frame it is drawn on. The rendering filter also saves out a file with the current simulator frame number + video time, so we can cross-reference the data later.
My question is: is there any way for me to know how much latency I am getting from the capture to the overlay filter? I can buffer my incoming data, but I need to know how much latency I have so I know how far back in my buffer to go.
I also notice there are timecode outputs from the capture card, and it seems I can get the raw VANC stream from my incoming sample, so it looks like I could add timecode generator and inserter hardware. But I would still need a way to read that timecode before it gets fed into the capture card, so I can correlate our simulator frame number with the timecode.
Right now I can live with about 100 ms of accuracy in terms of lining up our frame numbers; in the future, however, I will need to get down to single-frame accuracy.