jdelisle wrote: Andrew Kolakowski wrote: Maybe there is a reason for the gazillion lossy formats and none of the lossless ones
I'm curious how that makes sense; it feels like I must be misunderstanding something here.
If I were a professional working on e.g. a movie, I'd want to retain as many artifacts as possible in the best possible quality. I'd keep all the original footage and everything I render through Resolve in a lossless compressed format. This way, in 10 years, even if I cannot reproduce the render using Resolve, I still have the master render to use as an input to further edits/improvements etc. in any tool I wish. Why would I want to start from a lossy version? It'd be like a photocopy of a photocopy: each generation loses quality and original data.
Forgive my lack of insight into the video industry, but assuming the final movie output is rendered by Resolve, wouldn't a lossless compressed master and lossy compressed versions be required by the client, or for my own safekeeping?
What are pros delivering when, say, a 4K Blu-ray is getting pressed? Lossy codecs?
Pro movies use DPX or EXR sequences, so that's uncompressed (from 10-bit up to 32-bit float precision). No one uses lossless because it's just not worth it: a simple lossless codec will give you less than 2:1 on straight-out-of-camera footage, and a complex one will manage 3:1 at best, but will be slow to decode.
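To put rough numbers on that (a back-of-the-envelope Python sketch; the frame size, 32 bits/pixel DPX packing and 24 fps are my own assumptions, not from any spec):

```python
# Rough data rates for a 4K DCI DPX sequence at various lossless ratios.
# Assumptions: 4096x2160 frame, 10-bit RGB DPX padded to 32 bits/pixel, 24 fps.

WIDTH, HEIGHT = 4096, 2160      # 4K DCI
BYTES_PER_PIXEL = 4             # 10-bit RGB packed into 32 bits in DPX
FPS = 24

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
uncompressed_mb_s = frame_bytes * FPS / 1e6   # decimal megabytes per second

for ratio in (1, 2, 3):
    rate = uncompressed_mb_s / ratio
    per_hour_tb = rate * 3600 / 1e6
    print(f"{ratio}:1 -> {rate:7.1f} MB/s, {per_hour_tb:4.2f} TB/hour")
```

Even at 3:1 you're still moving roughly a terabyte per hour of footage, which is why nobody bothers with lossless at that scale.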
At the same time, many movies have been made with intermediate codecs, as they are simply good enough even for the big screen. Depending on the job profile you may have an uncompressed master (definitely for big Hollywood titles), but there are cases where the best master available is actually an intermediate codec. It doesn't mean much, to be honest; such a movie can still win an Oscar.
In most cases, even if there is a 16-bit TIFF master, most delivery masters will be made from intermediate codecs. Then they are compressed again and again and again.
Only a few chosen deliveries will be made from the main TIFF master, mainly the DCP master. Netflix is picky about quality, but even they won't be able to tell whether you made your IMF master from ProRes or from a TIFF sequence. They also later compress this crazy-high-quality master down to about 20 Mbit for UHD, so the whole amazing master is almost pointless (though you always need a higher-quality master than your final delivery format).
Have you actually looked at what you lose with intermediate codecs? Do you know what the bitrate-vs-quality curve looks like?
If 50 Mbit H.264 can give you 99% quality, then to get to 99.5% you may need e.g. 100 Mbit. It's a crazily non-linear curve. If a codec has a 5:1 mode, that is already very high quality, and something like 3:1 can simply replace uncompressed/lossless in the real world.
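To make the non-linearity concrete, here's a toy model (purely illustrative, my own fit, not measured data) where distortion falls off as 1/bitrate, chosen so that 50 Mbit lands on 99%:

```python
# Toy diminishing-returns model for bitrate vs quality.
# quality(b) = 100 - C / b, with C fitted so quality(50) == 99.0.
# Illustrative only; real rate-distortion curves depend on codec and content.

C = 50.0  # fitted constant: 100 - 50/50 = 99.0

def quality(bitrate_mbit: float) -> float:
    return 100.0 - C / bitrate_mbit

for b in (25, 50, 100, 200, 400):
    print(f"{b:4d} Mbit/s -> {quality(b):6.3f}% quality")
```

Under this toy model, every extra half-step toward 100% costs a doubling of bitrate (50 gives 99.0, 100 gives 99.5, 200 gives 99.75, and so on), which is the shape of the curve I mean.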
For SD it's easy these days to use uncompressed or lossless. Try doing the same for UHD.
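Rough numbers again (my own assumptions: 8-bit 4:2:2 for SD, 10-bit 4:2:2 for UHD, both at 25 fps):

```python
# Why "just go uncompressed" is fine for SD but painful for UHD.
# Assumptions: 8-bit 4:2:2 SD (2 bytes/pixel), 10-bit 4:2:2 UHD
# (20 bits = 2.5 bytes/pixel), both at 25 fps.

def rate_mb_s(width: int, height: int, bytes_per_pixel: float, fps: int) -> float:
    return width * height * bytes_per_pixel * fps / 1e6

sd = rate_mb_s(720, 576, 2.0, 25)      # ~21 MB/s: any modern disk copes
uhd = rate_mb_s(3840, 2160, 2.5, 25)   # ~518 MB/s: serious storage needed

print(f"SD  uncompressed: {sd:6.1f} MB/s")
print(f"UHD uncompressed: {uhd:6.1f} MB/s ({uhd * 3600 / 1e6:.2f} TB/hour)")
```

Uncompressed SD is around 21 MB/s, trivial on any modern machine; uncompressed UHD is around 518 MB/s, nearly 2 TB per hour, before you even start layering streams in an edit.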
In the real world it's all driven by money and budgets, so perfect scenarios basically don't happen.