11.6 Sometimes auto-detects 10-bit when 8-bit selected

Ask software engineering and SDK questions for developers working on Mac OS X, Windows or Linux.

andrewrh

  • Posts: 4
  • Joined: Sun Aug 23, 2020 4:31 pm
  • Location: Glasgow, UK
  • Real Name: Andrew Griffiths

11.6 Sometimes auto-detects 10-bit when 8-bit selected

Posted: Sun Aug 23, 2020 4:54 pm

DeckLink 8K Pro here

I'm outputting 8-bit (either YUV or RGB) and looping it back into the input (capture), but the input is always detected as 10-bit... I'm using SDK 11.6 because I want to add support for the new bit-depth detection, but it doesn't seem to be working correctly.
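
For reference, here's roughly the detection path in my code (a trimmed sketch rather than my exact source; the IDeckLinkInput pointer comes from the usual device iteration, names are placeholders, and error handling is omitted):

    #include <cstdio>
    #include "DeckLinkAPI.h"

    // Capture callback; only the format-detection path is shown.
    class InputCallback : public IDeckLinkInputCallback
    {
    public:
        HRESULT STDMETHODCALLTYPE VideoInputFormatChanged(
            BMDVideoInputFormatChangedEvents events,
            IDeckLinkDisplayMode* newMode,
            BMDDetectedVideoInputFormatFlags flags) override
        {
            // New in SDK 11.6: the detected signal depth arrives via these flags.
            if (flags & bmdDetectedVideoInput8BitDepth)
                printf("Detected 8-bit input\n");
            else if (flags & bmdDetectedVideoInput10BitDepth)
                printf("Detected 10-bit input\n");  // <-- this fires even with 8-bit output
            else if (flags & bmdDetectedVideoInput12BitDepth)
                printf("Detected 12-bit input\n");
            return S_OK;
        }

        HRESULT STDMETHODCALLTYPE VideoInputFrameArrived(
            IDeckLinkVideoInputFrame*, IDeckLinkAudioInputPacket*) override { return S_OK; }

        // Minimal IUnknown plumbing for the sketch (no real ref-counting).
        HRESULT STDMETHODCALLTYPE QueryInterface(REFIID, void**) override { return E_NOINTERFACE; }
        ULONG STDMETHODCALLTYPE AddRef() override { return 1; }
        ULONG STDMETHODCALLTYPE Release() override { return 1; }
    };

    // Enable capture with automatic format detection, then start streaming.
    void startCapture(IDeckLinkInput* input)
    {
        input->SetCallback(new InputCallback());
        input->EnableVideoInput(bmdModeHD1080p25, bmdFormat8BitYUV,
                                bmdVideoInputEnableFormatDetection);
        input->StartStreams();
    }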

This is happening in my own software, and I was able to replicate it using the DeckLink sample apps. I've attached a screenshot with Signal Generator outputting 8-bit while the input is detected as 10-bit.

Any ideas? Is it a bug?

Thanks,
Attachment: DeckLink10bit.JPG
Andrew Griffiths
RenderHeads Ltd
andrewrh

Re: 11.6 Sometimes auto-detects 10-bit when 8-bit selected

Posted: Mon Aug 24, 2020 8:19 pm

Some more clues, perhaps: it seems to be related to the video mode used.

If I use the Signal Generator and generate:

1080p23.97
1080p24
1080p25
1080p30

and select 8-bit, then 8-bit is detected at the input by Device Status (and by my software). This is what I would expect.

But for anything higher than that (e.g. 1080p50), the input is always detected as 10-bit, even though the output is 8-bit. This is not what I expect.

It's similar with 720p: 720p50, 720p59.94 and 720p60 all report the input as 10-bit, even when the output is 8-bit.

With 2K DCI, everything at or below 30 FPS is correct, but above that the detected depth doesn't match.

So for 2K and below, modes above 30 FPS seem to behave incorrectly. Above 2K it always behaves incorrectly (10-bit is detected even though an 8-bit signal is sent).

Any ideas? Is it some sort of bandwidth issue?

Thanks,
Andrew Griffiths
RenderHeads Ltd
andrewrh

Re: 11.6 Sometimes auto-detects 10-bit when 8-bit selected

Posted: Tue Aug 25, 2020 12:01 pm

I saw something in the SDK documentation about VANC requiring 10-bit, so perhaps that would explain it... but I'm not using VANC.

Can anyone else reproduce this issue (using Signal Generator and Device Status apps)? Perhaps someone at Blackmagic has some insight?

Thanks,
Andrew Griffiths
RenderHeads Ltd

Cameron Nichols

Blackmagic Design

  • Posts: 443
  • Joined: Mon Sep 04, 2017 4:05 am

Re: 11.6 Sometimes auto-detects 10-bit when 8-bit selected

Posted: Wed Aug 26, 2020 6:48 am

Hi Andrew,

The 10-bit detection will likely be correct, as this is the depth detected on the wire.

When EnableVideoOutput is called with bmdFormat8BitYUV, that is the format transferred to the card over PCIe. The DeckLink 8K Pro itself will transmit the signal as 10-bit data words by appending 2 LSBs. Conversely, the format passed to EnableVideoInput with bmdFormat8BitYUV is the format delivered back over PCIe; when using this format, the DeckLink will discard the 2 LSBs of a received 10-bit signal.
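
In other words, the depth conversion amounts to something like this (an illustrative sketch only; the actual conversion happens in the card's hardware, not in user code):

    #include <cstdint>

    // An 8-bit sample becomes a 10-bit data word on the wire by appending
    // two zero LSBs; an 8-bit capture then drops those two LSBs again.
    uint16_t to10BitWord(uint8_t sample8)  { return (uint16_t)(sample8 << 2); }  // e.g. 0xEB -> 0x3AC
    uint8_t  to8BitSample(uint16_t word10) { return (uint8_t)(word10 >> 2); }    // e.g. 0x3AC -> 0xEB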

Regards
Cameron
andrewrh

Re: 11.6 Sometimes auto-detects 10-bit when 8-bit selected

Posted: Thu Aug 27, 2020 4:52 pm

Cameron Nichols wrote: The 10-bit detection will likely be correct, as this is the depth detected on the wire. [...]


Thanks @Cameron!

That's very interesting and answers my question.

Out of interest, any idea why some modes (such as 1080p30), when output as 8-bit, are auto-detected as 8-bit then? It just seems odd that some modes behave one way and others another.

Thanks,
Andrew Griffiths
RenderHeads Ltd
