Thu Sep 17, 2020 5:35 am
My guess is that the answer is (2), although you could probably test this by doing a local USB-C recording while streaming and checking whether the local recording is affected when the cache fills up and the internet stream quality drops.
If you are just doing live streaming without local recording, then I'm not sure that it really matters where the caching is done. If you consider the video processing pipeline when streaming, it might look something like:
uncompressed video frames -> H.264 encoding -> sending encoded video to streaming destination
So if the bottleneck is the last step of actually sending the encoded video out over the internet, this is going to cause the cache to fill up regardless of where it is inserted in the processing pipeline. Since the H.264 encoder is a hardware implementation, I'm pretty sure it can keep up with live encoding of up to 1080p60 video, so any caching is there to deal with network problems, not H.264 encoder speed.
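To make that concrete, here is a toy producer/consumer sketch (not the device's actual firmware logic; the capacity and rates are made-up numbers) showing that when the network drains frames slower than the encoder produces them, a bounded cache fills up no matter where it sits in the pipeline:

```python
from collections import deque

# Toy model: the encoder produces frames at a steady rate, the network
# drains them at a (possibly lower) rate, and a bounded cache sits
# between them. All numbers are illustrative, not from the device.
CACHE_CAPACITY = 100  # frames the cache can hold

def simulate(ticks, frames_in_per_tick, frames_out_per_tick):
    cache = deque()
    dropped = 0
    for _ in range(ticks):
        # Encoder side: hardware keeps up with real time, so input is steady.
        for _ in range(frames_in_per_tick):
            if len(cache) < CACHE_CAPACITY:
                cache.append("frame")
            else:
                dropped += 1  # cache full: stream quality suffers
        # Network side: drain whatever the connection can carry this tick.
        for _ in range(min(frames_out_per_tick, len(cache))):
            cache.popleft()
    return len(cache), dropped

# Healthy connection: the network keeps up, so the cache stays empty.
print(simulate(100, 2, 2))  # -> (0, 0)
# Congested connection: the cache fills, then frames start getting dropped.
print(simulate(200, 2, 1))
```

The point is that the drop behaviour depends only on the input rate, output rate, and cache size, which is why it shouldn't matter much whether the cache is before or after any particular stage.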
Note that if you get into the details of the H.264 encoder, I'm sure this is a highly simplified model of what is actually happening internally. With interframe codecs like H.264, there are probably multiple frame buffers in use, since there are multiple frames (often multiple seconds' worth) between each i-frame that is compressed in its entirety (without relying on previous frames). H.264 also has b-frames, which can have dependencies on both earlier and later frames in a sequence (until the next i-frame). This is part of the reason why H.264 encoders introduce some latency: they need to buffer a certain amount of video data to do this type of compression.
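Here's a toy illustration of why b-frames force that buffering (the GOP pattern is made up, and real encoders are far more complicated): a b-frame references a later I/P frame, so the encoder cannot emit it until that later frame has arrived.

```python
# Display order uses a made-up GOP pattern: I/P frames anchor the
# sequence, B-frames depend on the next I/P frame after them.
display_order = ["I0", "B1", "B2", "P3", "B4", "B5", "P6"]

encode_order = []
pending_b = []
for frame in display_order:
    if frame.startswith("B"):
        pending_b.append(frame)         # hold B-frames: their forward reference isn't here yet
    else:
        encode_order.append(frame)      # I/P frame: emit it first...
        encode_order.extend(pending_b)  # ...then the B-frames that referenced it
        pending_b = []

print(encode_order)  # -> ['I0', 'P3', 'B1', 'B2', 'P6', 'B4', 'B5']
```

Notice that B1 cannot be emitted until P3 has arrived, two frame times later; that reordering gap is exactly the kind of built-in latency described above.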
Anyway, for most purposes you can probably treat the H.264 encoding engine as a black box with a fixed amount of latency: uncompressed video data goes in, and compressed video data comes out the other end after a constant delay. Since that delay stays the same while the H.264 encoder is running, you don't really have to worry too much about it (at least from a caching perspective anyway).
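Put another way, per-frame end-to-end delay splits into a constant encoder term and a variable network term, and only the variable term needs a cache to smooth it over. A quick sketch with made-up numbers:

```python
import random
random.seed(0)

ENCODER_LATENCY_MS = 120  # made-up constant delay for the encoder black box

def network_latency_ms():
    # Made-up model: network delay varies over time, unlike the encoder's.
    return random.uniform(20, 200)

# Per-frame end-to-end latency: fixed encoder term + variable network term.
latencies = [ENCODER_LATENCY_MS + network_latency_ms() for _ in range(5)]
print([round(l) for l in latencies])
```

Subtract the constant encoder term and all the variation left over is the network's, which is what the cache has to absorb.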
This is different from network-related problems due to bandwidth or latency, which can vary over time due to the nature of internet delivery.