MScDre wrote:But that's exactly what I am saying, it would be trivial to write an app that controls a wireless focus system like the Tilta Nucleus to give a cinema camera autofocus. Without having to write code for the camera.
The iPhone and the camera being controlled would not be focused on the same point. Even if you calibrated the offset between the iPhone's lens and the other camera's lens and pointed both at the same target, that calibration would only hold at one specific distance: aiming two cameras at the same target means one has to sit at an angle to the other, and the resulting focus difference changes with the distance to the subject. That makes the autofocus feature useless except at one very specific distance, at which point you could just as easily have focused manually and left it there.
If the cameras were instead set up with their lenses parallel to each other, the rig would only work for subjects large enough to be in the field of view of both cameras, and only when the surface facing the cameras is perfectly parallel to the baseline between the two cameras. People's bodies generally are not shaped that way.
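The geometry above is easy to put numbers on. This is just a rough sketch of the parallel-lens case (the 0.15 m baseline is an assumed mounting offset, not anything from the original post): a subject straight ahead of the main lens sits farther away as seen from the side-mounted phone, and the size of that error shrinks as the subject moves away, so no single calibration offset can be correct at every distance.

```python
import math

def focus_offset(subject_dist, baseline):
    """Distance error, in metres, for a side-mounted camera measuring a
    subject that is `subject_dist` m straight ahead of the main lens.
    Lenses are parallel and separated sideways by `baseline` m."""
    return math.sqrt(subject_dist ** 2 + baseline ** 2) - subject_dist

# Hypothetical example: phone mounted 0.15 m to the side of the cine lens.
for d in (0.5, 1.0, 2.0, 5.0):
    print(f"{d:4.1f} m subject -> offset {focus_offset(d, 0.15) * 1000:5.1f} mm")
```

The offset runs from roughly 22 mm at half a metre down to about 2 mm at five metres, so a correction dialled in at one distance is wrong at every other, which is exactly the calibration problem described above.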
The only way I could potentially see this working would be to establish a real-time video feed from the camera being recorded on to the iPhone, and have the app process that signal. At that point the iPhone becomes a processing module for the camera, and it *might* be possible to make this work, but it would almost certainly require additional hardware to make that connection.
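If the app did get the camera's own feed, the standard trick is contrast-detect autofocus: score each frame for sharpness and drive the focus motor in whichever direction raises the score. A minimal sketch of one common sharpness metric (variance of a Laplacian filter; the image format here is an assumed plain 2-D list of grayscale values, not any real camera API):

```python
def focus_score(gray):
    """Variance of a 3x3 Laplacian response over a 2-D grayscale image
    (list of lists of pixel values).  Sharper frames score higher, so a
    contrast-detect autofocus loop nudges the lens toward higher scores."""
    h, w = len(gray), len(gray[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (gray[y - 1][x] + gray[y + 1][x] + gray[y][x - 1]
                   + gray[y][x + 1] - 4 * gray[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

# A high-contrast checkerboard (in focus) outscores a flat grey frame
# (completely defocused).
sharp = [[255 if (x + y) % 2 else 0 for x in range(8)] for y in range(8)]
flat = [[128 for _ in range(8)] for _ in range(8)]
print(focus_score(sharp) > focus_score(flat))  # True
```

The hard part, as noted above, is not this computation but getting the video signal into the phone in the first place, which is where the extra hardware comes in.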