Input frame timestamps drift

Ask software engineering and SDK questions for developers working on Mac OS X, Windows or Linux.

Paweł Kołodziej

  • Posts: 5
  • Joined: Tue Sep 08, 2015 8:09 am
  • Location: Poland

Input frame timestamps drift

Posted: Thu Jan 20, 2022 12:02 pm


From time to time I have issues with differing step values returned by two IDeckLinkVideoInputFrame methods:
GetStreamTime() and GetHardwareReferenceTimestamp().

As I mainly work with i50 and p25 signals, both streamTime and hardwareReferenceTimestamp should increase by 40 ms per frame. There is some noise in the hardware time, as it records the exact time the frame arrived, but generally speaking the step value is close to 40 ms.

Once in a while streamTime takes an extra 40 ms step (one or more) that is not visible in hardwareReferenceTimestamp. This usually coincides with a loss of input signal, flagged by bmdFrameHasNoInputSource.

In the example below you can find, for both hardware and stream time, the exact time of the frame, its duration and the calculated step, all in milliseconds.
Code:
hw: 172796090 40 0 st: 0 40 0
hw: 172796130 40 40 st: 40 40 40
hw: 172796170 40 40 st: 80 40 40 
hw: 172796250 40 39 st: 160 40 40
hw: 172796290 40 40 st: 240 40 80
2786.804 NO Input source! hw: 172796358 40 sw: 280 40
2786.816 NO Input source! hw: 172796370 40 sw: 360 40
hw: 172796410 40 120 st: 400 40 160
hw: 172796450 40 40 st: 440 40 40

In the second-to-last line you can see that while hardwareReferenceTimestamp increased by 120 ms, streamTime increased by 160 ms, indicating one extra frame.
Is this for a reason or just a bug?


  • Posts: 13
  • Joined: Sat Jan 16, 2021 9:00 pm
  • Real Name: Petr Novak

Re: Input frame timestamps drift

Posted: Thu Jan 20, 2022 2:19 pm

I believe this is a frame dropped on input. Please check the SDK 12.2.2 sample InputLoopThrough for Linux, file Linux/Samples/InputLoopThrough/DeckLinkInputDevice.cpp, around line 265:

// If there are any gaps in the stream time, then report the missing frames as dropped
while (streamTime >= m_lastStreamTime + 2 * frameDuration)
{
    m_lastStreamTime += frameDuration;
    m_videoInputFrameDroppedCallback(m_lastStreamTime, frameDuration, m_frameTimescale);
}

So you would need to do something similar in your application if you need to account for all frames (e.g. generate an empty frame for output).
Petr Novak

Paweł Kołodziej

  • Posts: 5
  • Joined: Tue Sep 08, 2015 8:09 am
  • Location: Poland

Re: Input frame timestamps drift

Posted: Fri Jan 21, 2022 7:32 am

I understand that dropped frames are involved, but how is it that we are dropping 3 frames when there was time for only 2? In the example given in my previous post, streamTime increased by 160 ms but hardwareReferenceTimestamp by only 120 ms.
It ends up with 26 frames being received in a single second (counting the 3 missing ones), and I have to use some tricks to eliminate the extra step in streamTime, which makes the code unclean and should be avoided.
