
SDK autodetects a wrong input signal

PostPosted: Tue Jan 08, 2019 1:22 am
by ryansiu
I am using a DeckLink Mini 4K for capturing a video signal. Before starting the capture, I would like to know what the current display mode of the input signal is.
So, following the example from the SDK, I use bmdDeckLinkStatusCurrentVideoInputMode to detect the current display mode of the input source. Up to 1080, the SDK always reports the correct display mode. However, things go wrong at higher resolutions: the SDK reports a wrong display mode, e.g. a 2K input signal is reported as 1080p.
Also, the behaviour is not consistent between SDI and HDMI. I feed the same signal generator into both SDI and HDMI, but they report totally different display modes.
I would like to know whether it is proper to use bmdDeckLinkStatusCurrentVideoInputMode to detect the display mode of the input signal. If not, how should I handle this?
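For reference, this is roughly how I query the status. It is a trimmed sketch: deckLink is assumed to be an IDeckLink* already selected via IDeckLinkIterator (not shown), and most error handling is omitted.

```cpp
#include <cstdio>
#include "DeckLinkAPI.h"

// deckLink is assumed to point at the Mini 4K, obtained earlier
// from IDeckLinkIterator (not shown here).
void printCurrentInputMode(IDeckLink* deckLink)
{
    IDeckLinkStatus* status = nullptr;
    if (deckLink->QueryInterface(IID_IDeckLinkStatus, (void**)&status) != S_OK)
        return;

    int64_t inputMode = bmdModeUnknown;
    if (status->GetInt(bmdDeckLinkStatusCurrentVideoInputMode, &inputMode) == S_OK)
    {
        // inputMode holds a BMDDisplayMode value; this is where I see
        // a 1080p mode reported even though the generator sends 2K.
        std::printf("Current input mode: 0x%08llx\n", (long long)inputMode);
    }

    status->Release();
}
```

This is called after the input is enabled, once the signal has been connected for a few seconds.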
Thank you!