Wed Jun 26, 2019 12:37 pm
When hunting for latency, don't forget that the camera adds 1 frame to the path.
In the days of tube cameras and tube displays, a beam scanned the face of the pickup tube and modulated the signal on the wire, which in turn modulated the beam scanning the phosphor of the CRT (a very simplified description). That meant there were only wire delays in the path, so the timing seen on the monitor was as close to real time as we could get.
With the advent of chip sensors, the photosites have to absorb light for the frame time and then transfer that charge to the matrix to be scanned out onto the wire. That means the signal on the wire is 1 frame later than the light hitting the sensor. When that signal gets to a flat panel monitor, it has to be fully buffered before it is displayed, so the light emitted from the display is a minimum of 2 frames behind the light that hits the lens.
Toss in a frame sync device or some sort of conversion, and more frames are added to the delay path.
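The frame-counting above is just arithmetic, so here's a minimal sketch of tallying up a delay chain. The 1-frame figures for the sensor and the panel come from the post; the frame sync stage and the 60 fps rate are hypothetical examples for illustration.

```python
# Glass-to-glass latency accounting: sum whole-frame delays per stage,
# then convert to milliseconds at a given frame rate.

def glass_to_glass_ms(stage_frames, fps):
    """Total latency in ms for a chain of whole-frame delay stages."""
    return sum(stage_frames.values()) * 1000.0 / fps

chain = {
    "sensor integration + readout": 1,  # chip sensor adds 1 frame
    "flat-panel frame buffering": 1,    # display buffers a full frame
    "frame sync / conversion": 1,       # hypothetical extra device
}

print(glass_to_glass_ms(chain, fps=60))  # 3 frames at 60 fps -> 50.0 ms
```

At 24 fps the same 3-frame chain is 125 ms, which is why extra conversion boxes are felt so much more at film rates.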
Your existential question for the day is: What is real time?
Marty Brenneis
Pixel Corps Engineering Droid