- Posts: 12
- Joined: Thu Oct 20, 2016 8:18 pm
Hello,
I have a DeckLink Duo 2 card that is attached to a custom camera/firmware device. That camera is outputting 10-bit RGB 4:4:4 1080p/23.98 imagery over 3G-SDI. My custom software is correctly capturing that imagery, using the DeckLink SDK (bmdFormat10BitRGB & bmdModeHD1080p2398).
There is no real colorspace information being sent - each 10-bit R/G/B value can range from 0 (0x000) to 1023 (0x3FF). I have verified that the captured frame data does indeed cover this full range, and every pixel I capture matches the expected sent frame. All good.
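For reference, this is roughly how I unpack the captured words when doing the bit-for-bit comparison. My understanding (an assumption worth double-checking against the SDK manual) is that bmdFormat10BitRGB ('r210') packs each pixel as one big-endian 32-bit word laid out x:R:G:B = 2:10:10:10. A quick sketch:

```python
import struct

def unpack_r210(word_bytes):
    """Unpack one bmdFormat10BitRGB pixel: big-endian 32-bit word,
    assumed layout x:R:G:B = 2:10:10:10 (check the SDK docs)."""
    (w,) = struct.unpack(">I", word_bytes)
    r = (w >> 20) & 0x3FF
    g = (w >> 10) & 0x3FF
    b = w & 0x3FF
    return r, g, b

def pack_r210(r, g, b):
    """Inverse: pack 10-bit R/G/B back into a big-endian 32-bit word."""
    w = ((r & 0x3FF) << 20) | ((g & 0x3FF) << 10) | (b & 0x3FF)
    return struct.pack(">I", w)

# Round-trip the full-range extremes to confirm the comparison itself
# isn't losing bits before blaming the hardware path.
for rgb in [(0, 0, 0), (1023, 1023, 1023), (64, 512, 940)]:
    assert unpack_r210(pack_r210(*rgb)) == rgb
```

With packing errors ruled out this way, any mismatch I see on re-capture should be coming from the output path itself.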
However, if I take that same captured frame data and replay it OUT from the DeckLink Duo (basically copying it directly into the frame buffer and configuring the output as above), and then re-capture it (essentially emulating the camera output), I find that the received values do [b]NOT[/b] always bit-for-bit match the sent values.
My assumption is that the DeckLink Output is somehow modifying the pixel values, perhaps to fit them into some colorspace range? If so, is there a way to avoid/disable that so that I can output full-range, 10-bit RGB data?
Hardware: DeckLink Duo 2
Software: DeckLink Driver/SDK: 10.11.4