bentheanimator wrote: Fusion is a giant graphing calculator. It works with raw numbers. On a graph from 0.0 to 1.0, halfway across is 0.5; or in a color channel, 0.0 is black, 1.0 is white and 0.5 is mid gray. That's a linear interpretation of color. It's a straight line.
0.5 is mid gray in a nonlinear (gamma-encoded) interpretation, unless you just mean the middle value, which has little to do with mid gray in the color-perception sense. In the linear domain, mid gray (perceptually halfway between black and 1.0 white) is around 0.18. In gamma-corrected images like sRGB, the correction lifts that value to roughly 0.46. What the display does with the values is another story, but in the end what must be achieved is light emission of roughly 18% of maximum luminance to be perceived as middle gray.
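To put numbers on it, here's a quick Python sketch using the standard sRGB transfer function (the 0.18 linear mid gray is the usual photographic convention, not an exact perceptual constant):

```python
def srgb_encode(v):
    """Standard sRGB OETF: linear-light value in, display-referred value out."""
    if v <= 0.0031308:
        return 12.92 * v
    return 1.055 * v ** (1 / 2.4) - 0.055

print(srgb_encode(0.18))  # ~0.46: mid gray after the gamma correction lifts it
print(srgb_encode(0.5))   # ~0.74: linear 0.5 sits well above perceptual mid gray
```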
bentheanimator wrote: Where things get real funky is when you use a gamut like sRGB (aka viewable computer screen). Now mid gray is no longer at 0.5, it's at something like 0.24 (I'm not in front of my computer). So all the math that you would use to make a number higher or lower either cranks way faster in certain ranges or has to be aware of the color gamut and compensate for the weird curvature of something like sRGB.
The sRGB gamut has no correlation to nonlinearity; it's the gamma that affects it. sRGB encoding lifts the middle value through the applied gamma correction. Math is math; the question is rather whether it's easy to model some physical behavior using simple operations like addition or multiplication. The linear-light domain lets you handle values as if they were quantities of light (within limits), where most effects are linear by nature.
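A small sketch of why linear light makes the math trivial: one stop more exposure is a plain multiply by 2 on linear values, while the same multiply on gamma-encoded values gives the wrong result (same srgb_encode as above):

```python
def srgb_encode(v):
    return 12.92 * v if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

lin = 0.18                   # mid gray in linear light
print(srgb_encode(lin * 2))  # one stop up done in linear, then encoded: ~0.63

enc = srgb_encode(lin)       # ~0.46, the encoded mid gray
print(min(enc * 2, 1.0))     # same multiply on encoded values: ~0.92, blown out
```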
bentheanimator wrote: A linear image is one where the highlights and shadows fall across the whole range from 0.0 to 1.0. It gives equal amounts of information to each pixel? When you use sRGB, it shoves all the highlights and shadows into a really small area on the edges and makes adjusting those hard, because there aren't enough pixels to cover the breadth of the image.
There is an equal amount of information in each pixel whichever gamma you use; the question is what that information means. It's the linearized representation, not sRGB, that bunches a lot of data into the low end, because each stop up covers twice the value range: 0.25-0.5, then 0.5-1.0, then 1.0-2.0, and so on. Gamma-encoded imagery (and log encoding) alleviates that by bending the encoding curve so that more code values are spent on storing the low end while the highlights are compressed. I won't go into detail about why that is useful, but that's what happens.
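You can see the allocation directly by counting how many 8-bit code values each stop gets under a linear encoding versus sRGB (a rough sketch; real encoders differ in rounding details):

```python
def srgb_encode(v):
    return 12.92 * v if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

# Five stops below 1.0, each covering half the linear range of the next one up
for lo, hi in [(1/32, 1/16), (1/16, 1/8), (1/8, 1/4), (1/4, 1/2), (1/2, 1.0)]:
    linear = round(hi * 255) - round(lo * 255)
    srgb = round(srgb_encode(hi) * 255) - round(srgb_encode(lo) * 255)
    print(f"stop {lo:.4f}-{hi:.4f}: {linear:3d} linear codes, {srgb:3d} sRGB codes")
```

Linear hands the top stop half of all 256 codes and leaves the deep shadows almost nothing; sRGB spreads them far more evenly.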
bentheanimator wrote: To make it more confusing, the whole idea is intertwined with bit depth. An sRGB JPEG in 8-bit only has 256 values per channel to pull from. It's a glorified GIF to a compositing program like Fusion. You work linear so that all your numbers are easier, and that really only works when you have billions of colors so that your math can get really subtle. That's why compositing works with floats instead of integers if you can help it.
Bit depth and linearity are not correlated; bit depth only affects precision. A 1-bit image can be perfectly linear and cover a humongous dynamic range, it just has no granularity in between.
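A tiny illustration: quantizing linear values at different bit depths stays perfectly linear, you just lose granularity, and the shadows suffer first:

```python
def quantize(v, bits):
    """Snap v to the nearest representable level at the given integer bit depth."""
    levels = 2 ** bits - 1
    return round(v * levels) / levels

for bits in (1, 8, 16):
    print(bits, quantize(0.18, bits), quantize(0.002, bits))
# 1-bit:  both crush to 0.0 -- still linear, zero granularity
# 8-bit:  0.002 snaps to ~0.0039, nearly double the true value
# 16-bit: both survive essentially intact
```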
bentheanimator wrote: That's a simplistic take on it, because something like ACEScg is close enough to linear that all the tools are indiscernible in their use, but you could linearize if you want to.
ACEScg is literally linear, not just close enough. The only difference from linear sRGB is the gamut (color primaries and white point).
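That's easy to show: going from linear sRGB to ACEScg is nothing but a 3x3 matrix applied to linear values, no curve anywhere. The matrix below is the commonly published Bradford-adapted (D65 to D60) one, rounded; treat the exact coefficients as an assumption and verify against a color-science library before relying on them:

```python
# Approximate linear-sRGB -> ACEScg (AP1) matrix; coefficients rounded,
# double-check against a color-science library before production use.
SRGB_TO_ACESCG = [
    [0.6131, 0.3395, 0.0474],
    [0.0702, 0.9164, 0.0134],
    [0.0206, 0.1096, 0.8698],
]

def srgb_lin_to_acescg(rgb):
    """Gamut conversion: a plain matrix multiply, nothing nonlinear."""
    return [sum(m * c for m, c in zip(row, rgb)) for row in SRGB_TO_ACESCG]

print(srgb_lin_to_acescg([0.18, 0.18, 0.18]))  # neutral gray stays neutral
```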
Long story short, your description is somewhat backwards. I blame the hideous misuse of terminology in Resolve for that.
shebbe wrote: KrunoSmithy wrote: Fusion is natively linear
In what way would you say it is natively linear? What would that mean? Last time I checked, it is fully up to the user to decide what to do with any data inside Fusion.
This ^. But given that Loaders don't linearize by default, there is no default viewer LUT, and the color palette is pure system sRGB, one could even say Fusion is natively sRGB. Obviously that sounds wrong, but that's how it is. Compare it to Nuke, for example, which by default linearizes all reads, applies a display LUT, and has a linear-value color palette...