Differences between Sony 50M 4:2:2 10bit and 100M 4:2:2 10bit

Apologies for a slightly off-topic post, but I hope someone may have an answer. I just purchased a Sony A7Cii; I'll be recording in XAVC HS 4K (Sony's flavor of H.265), and am trying to decide between recording in 50M 4:2:2 10 bit and 100M 4:2:2 10 bit. I'd prefer to get the best image quality possible, but I'm starting a long-form documentary that will probably require a lot of footage and my storage is not unlimited.
What sort of testing would best reveal the quality differences between the 50M and 100M recordings? Should I be looking for resolution and noise differences? Motion artifacts? Degradation when pushing color grades to the limit? I guess another way of asking this is: should I shoot some lens charts at different ISOs, shoot some sort of test that involves (repeatable) motion, or play with some crazy color grades—or all of the above? Any suggestions appreciated!
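One idea I've been toying with, beyond eyeballing stills: if I can get two near-identical frames (say, a locked-off chart shot recorded once at each bitrate, exported as uncompressed frames from Resolve), I could compute a rough numeric difference such as PSNR between them. This is just a sketch in plain Python, not a rigorous test—the frame lists here are hypothetical flattened pixel values, and real frames would need to be exported and aligned first:

```python
import math

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio (dB) between two equal-length
    sequences of pixel values. Higher = closer to the reference."""
    if len(ref) != len(test) or not ref:
        raise ValueError("frames must be non-empty and the same size")
    # Mean squared error over all pixel positions.
    mse = sum((a - b) ** 2 for a, b in zip(ref, test)) / len(ref)
    if mse == 0:
        return math.inf  # identical frames
    return 10.0 * math.log10(max_val ** 2 / mse)

# Toy example: a 16-pixel "frame" and a copy with one pixel off by 16.
frame_a = [0] * 16
frame_b = [0] * 15 + [16]
print(psnr(frame_a, frame_a))  # identical -> inf
print(psnr(frame_a, frame_b))  # ~36.1 dB
```

Since neither bitrate gives a true lossless reference, this only measures how far the two encodes drift from each other (or from a still-photo reference of the same chart), not absolute quality—but it might flag whether the gap is even measurable on static scenes versus motion-heavy ones.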
Resolve 18.1.4 Studio (soon to be upgraded to 19 or 20), M1 Mac Mini 16GB ram, Mac OS 12.7.6