AdrianSierkowski wrote:I don't care if you have a 30K image shot at 120fps and played back at 120fps-- it'll look like crap unless you have enough data depth to render subtle colors, and luminance.
Try the other extremes of the triangle:
• 18 stops of Dynamic Range, 120 FPS, at VGA resolution will also look like crap.
• 18 stops of Dynamic Range, 6K resolution, at 12 FPS is not even a movie, just a pretty slideshow.
I do agree with you, but the data stream needs to be balanced among the three. I think we may agree that resolution was already “maxed out” for all practical purposes at 1080p. It’s been proven time and again that the average Joe can’t tell 1080p from 4K. Sure, you can use the extra resolution to reframe a shot, but you can also zoom in during the actual shoot, use two cameras, etc.
If I could have my way, I would have capped resolution at 1080p for a decade and focused on DR and frame rate. The average Joe does notice good DR, and especially notices when slow-motion footage plays back smoothly at 1/500th speed. The focus most companies put on resolution makes no sense to me either. If, ten years from now, all cameras capture the same colors as the human eye and 500 fps capture is common, THEN I would say higher resolutions are worth exploring.
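To put some rough numbers on the "balanced data stream" point, here's a back-of-the-envelope sketch of raw (uncompressed, no chroma subsampling) data rates for a few resolution / frame rate / bit depth combinations. The specific configurations are just illustrative examples, not anything from this thread:

```python
# Rough uncompressed data-rate comparison illustrating the
# resolution / frame-rate / bit-depth trade-off.
# Assumes 3 color channels, no chroma subsampling, no compression.

def data_rate_gbps(width, height, fps, bits_per_channel, channels=3):
    """Raw video data rate in gigabits per second."""
    return width * height * fps * bits_per_channel * channels / 1e9

configs = {
    "1080p, 24 fps, 10-bit":  (1920, 1080, 24, 10),
    "1080p, 500 fps, 10-bit": (1920, 1080, 500, 10),
    "4K, 24 fps, 12-bit":     (3840, 2160, 24, 12),
    "4K, 120 fps, 12-bit":    (3840, 2160, 120, 12),
}

for name, (w, h, fps, bits) in configs.items():
    print(f"{name}: {data_rate_gbps(w, h, fps, bits):.1f} Gbit/s")
```

The takeaway: 1080p at 500 fps (~31 Gbit/s raw) costs less bandwidth than 4K at 120 fps (~36 Gbit/s), so within a fixed data budget you really do trade resolution against frame rate and bit depth.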