Interesting.
But from a technical point of view his article is wrong on many levels:
the real issue is that the gray-scale range of video today is too limited and doesn’t accurately represent reality.
Dynamic range is not a gray-scale range but a contrast ratio. Capturing a dynamic range similar to that of human vision is very possible with analog film or modern digital cameras like the Arri Alexa and RED Dragon.
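To make the contrast-ratio point concrete: dynamic range is usually quoted in photographic stops, and each stop is a doubling of light, so the conversion is just a base-2 logarithm. A minimal sketch (the 16384:1 ratio is an illustrative value, not a published camera spec):

```python
import math

def contrast_ratio_to_stops(ratio):
    """Convert a contrast ratio (brightest:darkest) to photographic stops.

    Each stop doubles the amount of light, so the stop count is
    simply log2 of the ratio.
    """
    return math.log2(ratio)

# A camera rated around 14 stops covers roughly a 16384:1 ratio,
# since 2**14 = 16384:
print(contrast_ratio_to_stops(16384))  # 14.0
```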
Come to find out, “reality” is a pretty complex thing. The sun generates about 1.6 billion NITS (“nits” is a measure of the amount of light). Starlight generates about 0.00001 nits. Your eye can easily see both, though not at the same time. That’s a pretty incredible range. In fact, the eye can easily see about 100,000 gray scale values at a time.
Again, this is about contrast ratio, not gray-scale values. A contrast ratio of 100,000:1 does not mean the eye can distinguish 100,000 steps in between. The eye applies a gamma-like adjustment and compresses the bright end.
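A rough way to see why a 100,000:1 ratio does not imply 100,000 distinguishable levels: perception is approximately ratio-based (Weber's law), so each just-noticeable step is a fixed percentage brighter than the last, not a fixed increment. The sketch below assumes an illustrative 1% Weber fraction; the exact fraction varies with viewing conditions:

```python
import math

def distinguishable_steps(contrast_ratio, weber_fraction=0.01):
    """Rough count of distinguishable brightness steps across a contrast
    ratio, assuming each step is a fixed ratio (here 1%) above the last.

    This is the number of times you can multiply by (1 + weber_fraction)
    before spanning the whole ratio.
    """
    return math.log(contrast_ratio) / math.log(1 + weber_fraction)

# A 100,000:1 contrast ratio yields on the order of a thousand steps,
# nowhere near 100,000:
print(round(distinguishable_steps(100_000)))  # ~1157
```

The takeaway is that the step count grows with the logarithm of the ratio, which is exactly why log-style encoding of video is so effective.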
Today, most video formats compress images into an 8-bit gray-scale straitjacket. AVCHD, DV, and H.264 are all 8-bit formats. 10-bit video, such as ProRes, provides up to 1,024 grayscale values. Better, but nowhere close to that 100,000 range. Expanding to 12-bit video, which some higher-end cameras provide, yields 4,096 grayscale values. At the high end, 16-bit video provides 65,536 grayscale values, which pretty closely emulates what the eye can see.
You can compress a very wide dynamic range into 8 bits, and then there is the power of log encoding. According to his theory you would just need 16-bit video to emulate what the eye can see. That is a plain misunderstanding of how things work.
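The point about log encoding can be sketched in a few lines: a log curve maps equal *ratios* of light to equal steps of code value, so a wide dynamic range fits into few code values. The curve below is a generic illustration, not any camera's published formula (real curves like Arri Log C, S-Log, or REDlogFilm have their own documented math); the `black` and `white` bounds are assumed values spanning roughly 20 stops:

```python
import math

def log_encode_8bit(linear, black=0.001, white=1000.0):
    """Map a linear scene value onto an 8-bit code value with a log curve.

    `black` and `white` are illustrative linear bounds covering
    log2(1000 / 0.001) ~= 20 stops. The key property of any log curve:
    each stop of exposure consumes about the same number of code values,
    instead of the shadows being starved and the highlights wasted.
    """
    linear = min(max(linear, black), white)
    normalized = math.log(linear / black) / math.log(white / black)
    return round(normalized * 255)

# A stop in the midtones and a stop in the highlights take a similar
# number of codes (small differences are rounding):
print(log_encode_8bit(1.0) - log_encode_8bit(0.5))
print(log_encode_8bit(100.0) - log_encode_8bit(50.0))
```

With ~20 stops squeezed into 256 codes you get about 13 codes per stop, which is why 8-bit log footage is gradable at all but 10-bit log (as in DPX or ProRes) holds up much better.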
Today’s cameras capture FAR more dynamic range than any of our current codecs support. So much so, that Dolby was able to go back to older film footage, rescan the negatives and color grade them using their new system – which they showed implemented on a BaseLight system – which made the old footage look amazing.
Again, this is totally wrong. Codecs support all the dynamic range the cameras are able to capture; the limit is the sensor. And rescanning negatives with better scanners clearly delivers better scans - that shouldn't come as a surprise.
The problem is that our entire workflow, from production through post to compression and distribution, is locked into an 8-bit video format, which destroys all the dynamic range in an image. But existing infrastructure is too complex to easily replace with all-new gear.
Did he live under a rock for the last decade?
Post production has not been limited to 8-bit for many, many years. The DPX file format has been around since 1994.
If you read Dolby's own post at http://blog.dolby.com/2014/04/introduct ... by-vision/ you can see that what they write is quite different from what Larry wrote about the topic.