Kays Alatrakchi wrote: About a month ago I suggested that Blackmagic might introduce an 8K URSA at next year's NAB, and people kinda mocked the idea, but based on these announcements just this past week, I think I might have been onto something:
https://www.cinema5d.com/canons-upcomin ... planation/
https://www.eoshd.com/2018/11/sony-deve ... iii-delay/

I think 8K is closer than we might think. Whether or not it's something that anyone actually needs is quite arguable, but in my opinion it's pretty clear where things are headed. I am sure that an 8K URSA Mini Pro is probably already being prototyped as we speak, and I have to wonder if the Blackmagic RAW codec was designed in large part to address the recording and playback needs of 8K raw footage.
Don't sweat it, Kays; the naysayers are out in force, and most of the comments here are mistaken, except for ones like David's and Chad's.
The data rates are small compared to modern memory buses, and storage I/O is surpassing even uncompressed requirements; the Cinemartin Fran 8K is designed to record 8K to flash uncompressed. People with good vision actually see the difference at 8K; people without good vision whine about it, and they would rather whine to look significant than earn respect by arguing something actually useful. Oversampling helps render the delivery format more reliably and, according to some, preserves the lens's shaping of the image that cinematographers love. That's why people are already talking about 16K+, and holographic filming will be many times that. One reason for the roughly 4x oversampling is the inadequacy of raw Bayer capture; with vertical colour filtering (stacking the colour samples per photosite) you could get better results from a far smaller increase in resolution.
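To put the "small compared to modern memory buses" claim in numbers, here's a back-of-envelope sketch. The 12-bit Bayer depth and the comparison figures in the comments are my own illustrative assumptions, not anything from the Fran's spec sheet:

import math

def raw_rate_gbs(width, height, bit_depth, fps):
    """Uncompressed sensor readout rate in gigabytes per second."""
    bits_per_frame = width * height * bit_depth
    return bits_per_frame * fps / 8 / 1e9

for fps in (24, 30, 60):
    rate = raw_rate_gbs(7680, 4320, 12, fps)
    print(f"8K {fps}fps 12-bit Bayer: {rate:.2f} GB/s")

# Rough comparison figures (approximate, mine):
#   dual-channel DDR4 memory bus : ~35-40 GB/s
#   PCIe 3.0 x4 NVMe flash       : ~3.5 GB/s sequential write
# Uncompressed 8K raw at 24fps comes out around 1.2 GB/s, so a single
# fast NVMe drive can swallow it, which is the point being made above.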
A few CESes ago Ambarella announced a consumer-level 8K camera chipset. I expect next CES will bring a lower-powered version with maybe 400-600 Mb/s H.265, possibly even some H.266 mode as well; a version doing 800 Mb/s+ is plausible. NVIDIA's Tegra ARM chips have had enough camera processing power for 8K for years; the now-old chip does 1600 Mb/s H.264, and likely has enough processing power to run a CineForm-RAW-style codec (the technology GoPro owns) even at a professional 8K 60fps in a mini-style camera. The NVIDIA board is credit-card sized. The latest processing platforms could probably handle 16K, even on less than the top card (but development is only as good as the developers, usually).
I don't see Blackmagic RAW as superior, with its JPEG-style technique and samples; it's built down to a price. RED's technology is probably licensed from CineForm, so much of its output under 4:1 compression is likely effectively lossless, and 8:1 maybe visually lossless, like high-end ProRes, you might conjecture; it is probably pretty high end. CinemaDNG is based on JPEG, and under JPEG you can get lower data rates and higher speeds simply by sacrificing detail. By sculpting other parts of the image you can preserve a feel suitable for low-grade distribution (which you could say cinema is, at 4K with its tight data rates), as I and Ambarella are aware.
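A minimal sketch of the trade-off being described: in a JPEG-style codec, a coarser quantization step zeroes out more high-frequency DCT coefficients, which is exactly "sacrificing detail" in exchange for fewer bits. The 8x8 block, the flat quantization step, and the scale values are my own toy choices, not Blackmagic's or CineForm's actual parameters:

import numpy as np
from scipy.fftpack import dct

def dct2(block):
    """2-D type-II DCT, the transform JPEG-style codecs build on."""
    return dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')

rng = np.random.default_rng(0)
# A fake 8x8 pixel block: a smooth gradient plus fine "texture".
block = np.add.outer(np.linspace(0, 64, 8), np.linspace(0, 64, 8))
block = block + rng.normal(0, 8, (8, 8))

coeffs = dct2(block)
for step in (4, 16, 64):            # coarser and coarser quantization
    quantized = np.round(coeffs / step)
    survivors = np.count_nonzero(quantized)
    print(f"quant step {step:3d}: {survivors:2d}/64 coefficients survive")
# Fewer surviving coefficients means fewer bits after entropy coding,
# and the high-frequency texture is what gets zeroed out first.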
8K is a product that could have come out professionally more than 10 years ago, even earlier. In 2007 NVIDIA was trying to push an 8K-like resolution as an industry standard, with 8K-like modes on high-end GPU cards. 8K displays and panels have been shown in recent years (I think the first 8K camera I heard of might have been around 2003, but I forget). In any case, an 8K+ stills camera is still an 8K+ camera, and those have been bought for a while. The Toshiba PureView sensor used in the old Nokia phone was effectively UHD-like 8K, good for its day; its 8K's worth of pixels were oversampled down to produce its FullHD video at better quality (they actually went into detail about this), and a large reduction in noise was the result.
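That noise benefit is easy to demonstrate: averaging a block of noisy photosites down to one output pixel cuts the noise standard deviation by the square root of the block size. A toy sketch with my own numbers, not Nokia's actual pipeline:

import numpy as np

rng = np.random.default_rng(1)
# A flat grey frame with additive read noise, oversampled 4x on each
# axis relative to the delivery resolution, so 16 photosites feed
# each output pixel (roughly the 808's many-megapixels-to-FullHD case).
sensor = 128 + rng.normal(0, 10, (2048, 2048))

# Downsample by averaging non-overlapping 4x4 blocks.
binned = sensor.reshape(512, 4, 512, 4).mean(axis=(1, 3))

print(f"sensor noise std : {sensor.std():5.2f}")   # ~10.00
print(f"binned noise std : {binned.std():5.2f}")   # ~ 2.50
# Averaging 16 samples divides the noise std by sqrt(16) = 4:
# the "large reduction in noise" the oversampled PureView video got.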
The industry has been holding 4K and 8K back for marketing and cost reasons. FullHD was a genuine hurdle in the early days, but eventually, as it became cheaper, it went mass market, as 4K now has at current screen sizes. Now they seem to want to wait until things become so cheap that Ambarella becomes the market leader, while they make money drip-feeding minor updates and standards to the older generation, likely to raise their profit margins on cheap-to-make hardware.

If a sub-$100 media player can handle 4K, you can be reasonably certain 8K is not going to cost 4x $100. The price difference between 720p and FullHD panels was not much after a few years, and the difference between FullHD and 4K is likewise small now. Once new 8K display production lines are paid down, there will likely be little difference between producing a 4K or an 8K display: they use about the same materials in slightly different ratios (more cell walls, etc.), with some extra cost to produce and verify. Somebody could perhaps produce an 8K TV now for 20% more than a 4K 65-inch, by tiling 4K 32-inch panel tech x4, or around 80 inches using around-40-inch panel tech. 50-55-inch panel tech may be the cheapest to reuse, but that would give a TV up to 100 inches, and the larger sizes have freight and installation cost issues. 4K 24-inch monitor tech might yield the cheapest result in a 48-inch screen, though a decent monitor tech would have to be used (mind you, I don't want another TN LCD TV).

But a high-quality 8K head-mounted display could be done under $500, and eventually $200, and would outdo most TV and cinema experiences. Even putting a mid-range phone in a headset could do that.
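The panel-tiling arithmetic checks out on pixel pitch: a 2x2 tile of 4K 32-inch panels has the same pixel density as one 8K 64-inch panel, so the same cell process serves both. A quick sanity check, with the diagonals as my round numbers:

import math

def ppi(h_pixels, v_pixels, diagonal_inches):
    """Pixels per inch of a panel with the given diagonal."""
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

print(f'4K 32": {ppi(3840, 2160, 32):.0f} ppi')   # ~138 ppi
print(f'8K 64": {ppi(7680, 4320, 64):.0f} ppi')   # ~138 ppi, same cell size
print(f'8K 48": {ppi(7680, 4320, 48):.0f} ppi')   # ~184 ppi, 4K 24" monitor tech
# Same pixel pitch means the same cell geometry and process,
# which is why the marginal-cost argument above is plausible.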
People who can resolve 8K's worth of detail will get the benefit, and even those who resolve 4K's worth should get some; those who see 2K or 720p, not so much. I love the texture of fine detail, which needs the local contrast that consumer codecs tend to smooth away.