SDK autodetects a wrong input signal

ryansiu

Posted: Tue Jan 08, 2019 1:22 am

Hi,
I am using a DeckLink Mini 4K to capture a video signal. Before starting capture, I would like to know the current display mode of the input signal.
Following the example from the SDK, I query bmdDeckLinkStatusCurrentVideoInputMode to detect the current display mode of the input source. For modes up to 1080p, the SDK always reports the correct display mode. However, things go wrong at higher resolutions: the SDK reports a wrong display mode, e.g. a 2K input signal is reported as 1080p.
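
For reference, this is roughly how I query it (a minimal sketch of my own, not the SDK sample verbatim; it assumes deckLink is an already-selected IDeckLink device and omits device enumeration and most error handling):

Code:

#include "DeckLinkAPI.h"
#include <cstdio>

static void PrintCurrentInputMode(IDeckLink* deckLink)
{
    IDeckLinkStatus* status = nullptr;
    if (deckLink->QueryInterface(IID_IDeckLinkStatus, (void**)&status) != S_OK)
    {
        fprintf(stderr, "IDeckLinkStatus not available on this device\n");
        return;
    }

    // bmdDeckLinkStatusCurrentVideoInputMode returns a BMDDisplayMode FourCC
    // (e.g. bmdModeHD1080p25) describing the detected input signal.
    int64_t inputMode = 0;
    if (status->GetInt(bmdDeckLinkStatusCurrentVideoInputMode, &inputMode) == S_OK)
        printf("Current input mode FourCC: 0x%08llx\n", (long long)inputMode);
    else
        fprintf(stderr, "Could not read the current video input mode\n");

    status->Release();
}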
Furthermore, the behaviour is not consistent between SDI and HDMI. I feed both inputs from the same signal generator, yet they report totally different display modes.
Is it proper to use bmdDeckLinkStatusCurrentVideoInputMode to detect the display mode of the input signal? If not, what should I do to handle this?
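
My fallback guess is to enable automatic input format detection on IDeckLinkInput and react to the VideoInputFormatChanged callback instead, roughly like the sketch below (written against the Linux SDK headers; the pixel format is a placeholder and the IUnknown plumbing is stubbed out):

Code:

#include "DeckLinkAPI.h"

class InputCallback : public IDeckLinkInputCallback
{
public:
    explicit InputCallback(IDeckLinkInput* input) : m_input(input) {}

    // The driver calls this when the detected input format changes.
    HRESULT VideoInputFormatChanged(BMDVideoInputFormatChangedEvents events,
                                    IDeckLinkDisplayMode* newMode,
                                    BMDDetectedVideoInputFormatFlags flags) override
    {
        // Restart capture in the newly detected display mode.
        m_input->PauseStreams();
        m_input->EnableVideoInput(newMode->GetDisplayMode(),
                                  bmdFormat10BitYUV,  // placeholder pixel format
                                  bmdVideoInputEnableFormatDetection);
        m_input->FlushStreams();
        m_input->StartStreams();
        return S_OK;
    }

    HRESULT VideoInputFrameArrived(IDeckLinkVideoInputFrame* video,
                                   IDeckLinkAudioInputPacket* audio) override
    {
        return S_OK;  // frame handling elided
    }

    // IUnknown stubs, simplified for this sketch.
    HRESULT QueryInterface(REFIID, void**) override { return E_NOINTERFACE; }
    ULONG AddRef() override { return 1; }
    ULONG Release() override { return 1; }

private:
    IDeckLinkInput* m_input;
};

Capture would then start with SetCallback() and an initial EnableVideoInput() call carrying the bmdVideoInputEnableFormatDetection flag, after first checking that the device reports the BMDDeckLinkSupportsInputFormatDetection attribute. Is that the intended approach?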
Thank you!
