Although I don't have one of these new Video Assist 12G HDR units myself yet, my best guess (based on experience with other Blackmagic devices) is that this is probably a chroma subsampling (4:2:2/4:2:0) or color depth (8-bit/10-bit) issue with the HDMI signal. A second possibility is some kind of HDMI EDID detection issue (also related to the signal format).
A bit of background. Up through HDMI 1.4, which supports video signals up to 2160p30, HDMI always used at least 4:2:2 chroma subsampling (or in some cases RGB/4:4:4). With the introduction of HDMI 2.0 and support for 2160p50/2160p60 video signals, the spec boosted the maximum video bandwidth of an HDMI signal from the 10.2 Gbps of HDMI 1.4 to 18 Gbps. The HDMI 2.0 spec also introduced 4:2:0 subsampling for the first time, which made it possible to fit a 2160p60 video signal (@ 4:2:0 sampling) within the 10.2 Gbps bandwidth of the older spec (2160p50/60 @ 10-bit 4:2:2 requires the full 18 Gbps of bandwidth).
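To put rough numbers on this, here is a quick back-of-the-envelope sketch in Python. It assumes the CTA-861 total timing of 4400 x 2250 pixels for 2160p60 (a 594 MHz pixel clock) and the 10b/8b TMDS encoding overhead; the bits-per-pixel figures are a simplification, since real HDMI packs 4:2:2 into a fixed-depth container rather than scaling linearly with bit depth.

```python
# Rough HDMI bandwidth estimates for 2160p60 (UHD 4K at 60 Hz).
# Assumption: CTA-861 total timing (4400 x 2250) and 10b/8b TMDS
# overhead; bits-per-pixel values are simplified approximations.

PIXEL_CLOCK_HZ = 4400 * 2250 * 60   # 594 MHz for 2160p60
TMDS_OVERHEAD = 10 / 8              # every 8 data bits are sent as 10

def tmds_gbps(bits_per_pixel):
    """Approximate link rate in Gbps for a given payload depth."""
    return PIXEL_CLOCK_HZ * bits_per_pixel * TMDS_OVERHEAD / 1e9

formats = {
    "8-bit 4:2:0": 8 * 1.5,   # 12 bpp: chroma at quarter resolution
    "10-bit 4:2:2": 10 * 2,   # 20 bpp: chroma at half resolution
    "8-bit 4:4:4": 8 * 3,     # 24 bpp: full-resolution chroma
}

for name, bpp in formats.items():
    rate = tmds_gbps(bpp)
    link = "fits HDMI 1.4 (10.2 Gbps)" if rate <= 10.2 else "needs HDMI 2.0 (18 Gbps)"
    print(f"2160p60 {name}: {rate:.2f} Gbps -> {link}")
```

Running this shows why 2160p60 8-bit 4:2:0 (about 8.9 Gbps) squeezes under the old 10.2 Gbps ceiling while the 10-bit 4:2:2 and 4:4:4 variants need an 18 Gbps HDMI 2.0 link.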
This meant that it was possible to take the 10.2 Gbps HDMI 1.4 chips and apply a few firmware tweaks to make them HDMI 2.0 compatible. These chips would only support 4:2:0 sampling for higher-frame-rate 4K signals (2160p50/60), but the HDMI specification lets device manufacturers support a subset of the possible signal formats for a given HDMI version and still claim compatibility. So these could be considered HDMI 2.0 devices without supporting the full 18 Gbps bandwidth.
The practical effect of this is that many early HDMI 2.0 devices (on both the output and display side) only supported this 2160p50/60 4:2:0 signal format. It is also possible that staying within the 10.2 Gbps bandwidth helped maximize device compatibility and minimize cable bandwidth issues.
At any rate, the HDMI output of a lot of early cameras (including models from Sony, Canon, and Panasonic) that could record at 2160p50/60 was similarly limited to 4:2:0 over HDMI at that resolution. Even if the camera could record internally at 2160p60 10-bit 4:2:2, the HDMI output was often still 8-bit 4:2:0.
I am not sure about the FS7 Mark 2, but the original FS7 Mark I definitely had this limitation (some discussion about this here):
https://us.community.sony.com/s/questio ... anguage=es

As HDR and 18 Gbps HDMI chips have become more prevalent, this situation has started to change somewhat, with more recent camera releases supporting 2160p50/60 10-bit 4:2:2 output over HDMI (although these cameras often have an option to select the subsampling when outputting at 2160p50/60 for compatibility with older devices).
The overall effect of all this is that, because the HDMI 2.0+ spec offers more signal format options for 2160p50/60, there are more opportunities for signal format incompatibilities or EDID negotiation issues between devices.
In particular, I have noticed that Blackmagic devices often do not like 2160p50/60 @ 8-bit 4:2:0 signals on input and will often behave strangely when given this signal format. I've seen issues with the Teranex Mini HDMI to SDI units, the HyperDeck 12G, and DeckLink devices when fed this signal format. The exact problem may vary depending on the device, but I have seen screen flickering/scrolling issues like in your video. These same devices often work fine when fed 2160p25/30 video (which is always 4:2:2) or 2160p50/60 at 10-bit 4:2:2.
Now, I don't know for sure if that is the problem with the new Video Assist 12G, but that would be my initial guess. Unfortunately, it may be a little difficult to troubleshoot this problem unless you have another device that can output 2160p50/60 video over HDMI to test with.