
EnableAudioInput Channel

PostPosted: Thu Jul 30, 2020 6:40 am
by gksjqs94
Hello,
Sorry, I am using a translator and I have very little background knowledge, so please bear with me.

I am currently using a DeckLink Duo 2 and basing my code on the Capture Preview CSharp sample from Blackmagic DeckLink SDK 11.4.

I play the captured audio using the NAudio WaveOut and BufferedWaveProvider classes.

I now want to enable 8 channels with EnableAudioInput.

Below is my code.

Code: Select all
uint channelCount = 8;
m_deckLinkInput.EnableAudioInput(_BMDAudioSampleRate.bmdAudioSampleRate48kHz,
_BMDAudioSampleType.bmdAudioSampleType16bitInteger, channelCount);           

_waveFormat.nBlockAlign = (short)((_waveFormat.wBitsPerSample * _waveFormat.nChannels) / 8);

WaveOutInit();

//Waveout initialize
private void WaveOutInit()
{           
    m_waveProvider = new BufferedWaveProvider(new WaveFormat(48000, 16, 2));           
    m_waveProvider.DiscardOnBufferOverflow = true;

    m_waveOut.Init(m_waveProvider);
    m_waveOut.Volume = 0;           
}



I set it to 8 channels as above.

Below is the callback function that receives the audio packet.

Code: Select all
void IDeckLinkInputCallback.VideoInputFrameArrived(IDeckLinkVideoInputFrame videoFrame, IDeckLinkAudioInputPacket audioPacket)
{
    if (videoFrame != null)
    {
        bool inputSignal = videoFrame.GetFlags().HasFlag(_BMDFrameFlags.bmdFrameHasNoInputSource);
        if (inputSignal != m_validInputSignal)
        {
            m_validInputSignal = inputSignal;
            InputSignalChanged(m_validInputSignal);
        }
    }

    System.Runtime.InteropServices.Marshal.ReleaseComObject(videoFrame);

    if (audioPacket != null)
    {
        IntPtr pBuffer;
        audioPacket.GetBytes(out pBuffer);

        if (IntPtr.Zero != pBuffer)
        {
            var sampleCount = audioPacket.GetSampleFrameCount();
            var bufferLength = sampleCount * _waveFormat.nBlockAlign;
            var tempBuffer = new byte[bufferLength];
            Marshal.Copy(pBuffer, tempBuffer, 0, bufferLength);
            m_waveProvider.AddSamples(tempBuffer, 0, bufferLength);
        }

        System.Runtime.InteropServices.Marshal.ReleaseComObject(audioPacket);
    }
}


With 2 channels there is no problem, but with 8 channels the sound is strange.

Sorry for the very basic question; I would appreciate any help.

Re: EnableAudioInput Channel

PostPosted: Tue Aug 04, 2020 1:31 am
by Cameron Nichols
Hi,

I suspect that the calls to Marshal.Copy and AddSamples are taking longer than 1 frame period. If so, then ultimately the incoming frame buffer will be exhausted and frames/samples will be dropped.

You can test this theory by calling IDeckLinkInput.GetHardwareReferenceClock at entry to the VideoInputFrameArrived callback to check that the reference time increments by only one frame duration between callbacks. If the reference time increments by more than one frame duration, then you will eventually see dropped frames and audio samples.
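
For example, a minimal sketch of that check (assuming the C# interop signature GetHardwareReferenceClock(long, out long, out long, out long); m_lastHardwareTime is an illustrative field, not part of the sample):

Code: Select all
// Illustrative timing check at the top of VideoInputFrameArrived.
// A time scale of 1000 means all returned values are in milliseconds.
long hardwareTime, timeInFrame, ticksPerFrame;
m_deckLinkInput.GetHardwareReferenceClock(1000, out hardwareTime, out timeInFrame, out ticksPerFrame);

if (m_lastHardwareTime != 0 && (hardwareTime - m_lastHardwareTime) > ticksPerFrame * 3 / 2)
{
    // The callback arrived noticeably later than one frame period, so frames/samples may be dropped.
    Console.WriteLine($"Late callback: {hardwareTime - m_lastHardwareTime} ms since previous frame, expected ~{ticksPerFrame} ms");
}
m_lastHardwareTime = hardwareTime;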

In this case, my advice is to move these calls outside of the VideoInputFrameArrived callback by queuing the IDeckLinkAudioInputPacket. Create a dedicated thread (or threads) to manage your BufferedWaveProvider processing, as sketched below. I recommend looking at the StillsCSharp sample as an example of using multiple threads for processing.
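
As a rough illustration of that pattern (a sketch only, not from the SDK samples; it assumes a System.Collections.Concurrent.BlockingCollection<byte[]> field named m_audioQueue and the existing m_waveProvider field):

Code: Select all
// Producer side, inside VideoInputFrameArrived: copy the samples and hand them off quickly.
m_audioQueue.Add(tempBuffer);

// Consumer side: a dedicated background thread that feeds the BufferedWaveProvider.
private void StartAudioThread()
{
    var audioThread = new System.Threading.Thread(() =>
    {
        // GetConsumingEnumerable blocks until a buffer is available, so no Sleep/polling loop is needed.
        foreach (var buffer in m_audioQueue.GetConsumingEnumerable())
        {
            m_waveProvider.AddSamples(buffer, 0, buffer.Length);
        }
    });
    audioThread.IsBackground = true;
    audioThread.Start();
}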

Regards
Cameron

Re: EnableAudioInput Channel

PostPosted: Tue Aug 04, 2020 6:41 am
by gksjqs94
Hello,
In fact, I was already using a Queue class.

Below is my code.
Code: Select all
if (audioPacket != null)
{
    IntPtr pBuffer;
    audioPacket.GetBytes(out pBuffer);

    var sampleCount = audioPacket.GetSampleFrameCount();
    var bufferLength = (_waveFormat.wBitsPerSample / 8) * _waveFormat.nChannels * sampleCount;

    var tempBuffer = new byte[bufferLength];
    Marshal.Copy(pBuffer, tempBuffer, 0, bufferLength);

    _queue.Enqueue(tempBuffer);
}


Code: Select all
private void BufferThread()
{
    Task.Run(() =>
    {
        while (true)
        {
            Thread.Sleep(1);
            try
            {
                if (_queue.Count > 0)
                {
                    var buffer = (byte[])_queue.Dequeue();
                    m_waveProvider.AddSamples(buffer, 0, buffer.Length);
                }
            }
            catch (Exception ex)
            {
                _logManager.Logger.Trace($"BufferThread Error = {ex.ToString()}");
            }
        }
    });
}


The result is the same.

As you said, is the problem with Marshal.Copy?

Sorry again for using the translator.

Re: EnableAudioInput Channel

PostPosted: Tue Aug 04, 2020 11:20 pm
by Cameron Nichols
Hi,

I would move as much code as possible outside of the IDeckLinkInputCallback.VideoInputFrameArrived callback and pass the IDeckLinkAudioInputPacket object into your processing task.
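
For example, a sketch of that idea (illustrative only; it assumes a BlockingCollection<IDeckLinkAudioInputPacket> field named m_audioPacketQueue, and that the packet is kept alive until Marshal.ReleaseComObject is called in the worker, following the frame-queuing pattern in StillsCSharp):

Code: Select all
// In VideoInputFrameArrived: no copying here, just hand the packet to the worker.
// Do NOT call Marshal.ReleaseComObject(audioPacket) in the callback in this case.
if (audioPacket != null)
{
    m_audioPacketQueue.Add(audioPacket);
}

// Worker thread: all GetBytes/Marshal.Copy/AddSamples work happens off the callback thread.
private void AudioWorker()
{
    foreach (var packet in m_audioPacketQueue.GetConsumingEnumerable())
    {
        IntPtr pBuffer;
        packet.GetBytes(out pBuffer);
        if (pBuffer != IntPtr.Zero)
        {
            var bufferLength = packet.GetSampleFrameCount() * _waveFormat.nBlockAlign;
            var buffer = new byte[bufferLength];
            Marshal.Copy(pBuffer, buffer, 0, bufferLength);
            m_waveProvider.AddSamples(buffer, 0, bufferLength);
        }
        Marshal.ReleaseComObject(packet); // release the packet once processing is done
    }
}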

Regards
Cameron