Stefan Ihringer wrote: I don't know how many people have the eagle eyes necessary to discern different shades on 10-bit monitors. This discussion reminds me of an audiophile forum.
I've never heard the difference between 48kHz and 96kHz.
In the end, there ARE high-end graphics apps on Mac. There ARE cross-platform apps (Nuke), and there ARE apps that can do full-screen video on every OS (Flash Player). So I think the world will not end because Fusion will finally no longer be locked to Windows.
Eagle eyes? I'm not talking about 16-bit, I'm talking about the difference between 256 and 1024 levels, which pretty much everyone can see. That's like saying the difference between 8kHz and 22kHz is stuff for audiophiles only.
Code:
{
	Tools = ordered() {
		Background1 = Background {
			CtrlWZoom = false,
			Inputs = {
				Width = Input { Value = 4096, },
				Height = Input { Value = 2160, },
				Depth = Input { Value = 4, },
				Type = Input { Value = FuID { "Horizontal", }, },
				TopRightRed = Input { Value = 1, },
				TopRightGreen = Input { Value = 1, },
				TopRightBlue = Input { Value = 1, },
				Gradient = Input {
					Value = Gradient {
						Colors = {
							[0] = { 0, 0, 0, 1, },
							[1] = { 1, 1, 1, 1, },
						},
					},
				},
			},
			ViewInfo = OperatorInfo { Pos = { 935, 181.5, }, },
		},
	},
	ActiveTool = "Background1",
}
That's a 32-bit float gradient at a "normal" resolution, and on an 8-bit display you can plainly see HUGE banding issues, even with dithering on, right? There's even a Mach band effect. Enable 10-bit display and those 16-pixel-wide bands become just 4 pixels wide, something dithering can reasonably take care of, and small enough that you don't get a Mach band effect.
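The band widths follow directly from the comp's numbers: a 4096-pixel ramp quantized to 8 bits gives 4096 / 256 = 16-pixel bands, while 10 bits gives 4096 / 1024 = 4-pixel bands. Here's a minimal sketch of that arithmetic in C (my own illustration, nothing from Fusion itself):
Code:
#include <stdio.h>

/* Band width of a horizontal black-to-white ramp after quantization:
   width_px / 2^bits = pixels per distinguishable step (band). */
int main(void) {
    const int width_px = 4096;          /* the comp's Background width */
    const int depths[] = { 8, 10 };
    for (int i = 0; i < 2; i++) {
        int levels = 1 << depths[i];    /* 256 or 1024 levels */
        printf("%2d-bit: %4d levels -> %2d px per band\n",
               depths[i], levels, width_px / levels);
    }
    return 0;
}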
If running a 10-bit GUI on OSX were easy, wouldn't Resolve be doing it? And 10-bit isn't just for Windows; Linux supports it too. Other professional cross-platform apps have this segmentation as well: Adobe offers a 10-bit GUI for Photoshop and Premiere, just not under OSX.
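As an aside, requesting a deep-color framebuffer is not much code on the application side. Here's a minimal sketch using GLFW and a legacy OpenGL query (GLFW is my choice for illustration; nothing suggests Fusion or Resolve actually uses it). Note the driver may silently fall back to 8-bit if the platform can't provide a 10-bit visual:
Code:
#include <GLFW/glfw3.h>
#include <stdio.h>

int main(void) {
    if (!glfwInit()) return 1;

    /* Ask for a 10-10-10-2 framebuffer; this is only a request. */
    glfwWindowHint(GLFW_RED_BITS, 10);
    glfwWindowHint(GLFW_GREEN_BITS, 10);
    glfwWindowHint(GLFW_BLUE_BITS, 10);
    glfwWindowHint(GLFW_ALPHA_BITS, 2);

    GLFWwindow *win = glfwCreateWindow(640, 480, "10-bit test", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    /* Query what the driver actually granted. */
    int red_bits = 0;
    glGetIntegerv(GL_RED_BITS, &red_bits);
    printf("red bits granted: %d\n", red_bits);

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}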
I don't think the world will end with Fusion running on OSX, no. I just think it's crazy to say "you can't have these features, Windows and Linux users, because it would be unfair to OSX customers, and they might complain." Adobe doesn't take this approach; they allow 10-bit color on platforms that support it, and they haven't been crushed under the weight of public outcry.
Forcing 100% parity across all platforms puts undue stress on developers or on users. Would you cripple OpenGL and OpenCL in the Windows and Linux versions just because you don't want an ifdef? There's no benefit, really. Heck, having 10-bit display even on some platforms gives Fusion a competitive advantage over Nuke and AE.
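And the per-platform gate really can be that small. A minimal sketch of such an ifdef, where the feature-flag name is my own invention, not a real Fusion define:
Code:
#include <stdio.h>

/* FUSION_ENABLE_10BIT_GUI is hypothetical, purely for illustration. */
#if defined(_WIN32) || defined(__linux__)
#  define FUSION_ENABLE_10BIT_GUI 1
#else  /* e.g. __APPLE__, where the OS doesn't expose 10-bit GL surfaces */
#  define FUSION_ENABLE_10BIT_GUI 0
#endif

int main(void) {
#if FUSION_ENABLE_10BIT_GUI
    puts("requesting 10-bit-per-channel framebuffer");
#else
    puts("falling back to 8-bit framebuffer");
#endif
    return 0;
}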