8bit + frc vs true 10bit monitor

Get answers to your questions about color grading, editing and finishing with DaVinci Resolve.

halfmanhalfalligatorhalfshark

  • Posts: 43
  • Joined: Wed Feb 10, 2021 12:15 am
  • Real Name: tim archibald

8bit + frc vs true 10bit monitor

Posted: Sat Oct 16, 2021 10:53 pm

hi,

I'm looking for a 4K monitor (for editing and grading), and while I cannot quite justify the cost of a true 10-bit 4K panel, I've seen affordable 8-bit+FRC monitors that claim 10-bit colour output. My question is: how much appreciable difference is there between true 10-bit and 8-bit+FRC panels? Will I be seeing flickering or other ill effects (and then wind up with a migraine)?

Cheers all.
I'm losing my Resolve.

ZRGARDNE

  • Posts: 145
  • Joined: Sun May 16, 2021 12:32 am
  • Real Name: Zeb Gardner

Re: 8bit + frc vs true 10bit monitor

Posted: Sun Oct 17, 2021 12:04 am

I am interested to hear others' justifications for purchasing a 10-bit monitor.

Obviously 10-bit footage is great to edit: as we stretch it, it is less likely to break or show banding.

But you can edit 10-bit footage just fine on an 8-bit monitor, so the two aren't related.
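The stretching point can be sketched numerically: quantize the same narrow ramp at 8 and 10 bits, stretch it to full range the way an aggressive grade would, and count the distinct levels that survive. This is my own illustrative NumPy sketch, not anything from Resolve:

```python
import numpy as np

# A smooth ramp covering only 5% of the signal range, as underexposed
# footage might, quantized at two bit depths.
ramp = np.linspace(0.40, 0.45, 1920)

q8 = np.round(ramp * 255) / 255      # 8-bit quantization
q10 = np.round(ramp * 1023) / 1023   # 10-bit quantization

def stretch(x):
    """Expand the narrow range to full scale, like a strong grade."""
    return (x - x.min()) / (x.max() - x.min())

levels8 = len(np.unique(stretch(q8)))
levels10 = len(np.unique(stretch(q10)))
print(levels8, levels10)  # the 10-bit ramp keeps roughly 4x more distinct steps
```

Fewer surviving levels means wider, more visible bands once the grade expands the range, which is why 10-bit source footage holds up better under heavy correction.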

Peter Cave

  • Posts: 2699
  • Joined: Thu Aug 23, 2012 6:45 am
  • Location: Melbourne, Australia

Re: 8bit + frc vs true 10bit monitor

Posted: Sun Oct 17, 2021 12:32 am

I use an 8-bit FSI monitor with a DeckLink card for calibrated grading.
To check for 8-bit artefacts like banding, the 10-bit macOS-fed GUI is perfectly adequate.
Resolve 16.2.6 AND Resolve 17.3.1
Mac OSX 10.14.6, Hackintosh i7 8700 32GB RAM, Radeon RX 580 8GB
Decklink Studio 2, FSI LM1770 Monitor

georgekg

  • Posts: 114
  • Joined: Sat Sep 23, 2017 8:23 pm
  • Real Name: Aleksandar Djordjevic

Re: 8bit + frc vs true 10bit monitor

Posted: Sun Oct 17, 2021 8:31 am

You won't end up with a migraine. If 8-bit monitors caused migraines, everyone on the planet would have one.
The most important question is: does your work justify investing in a true 10-bit monitor? If you work mostly with 8-bit footage, or, even worse, if you are just starting to learn, then a true 10-bit monitor isn't for you yet. Only if money isn't an issue should you buy a true 10-bit monitor.
But if you want a 10-bit monitor as your reference monitor, then it is much better to invest in an OLED TV (like the LG C1 or CX series). That is a much better investment for a low-budget colour grading monitor.
If you are buying a monitor for the GUI, 4K monitors are good in terms of real estate, but there are still some issues in DR with the interface on high-DPI monitors. You can check it here: viewtopic.php?f=21&t=126722

halfmanhalfalligatorhalfshark

  • Posts: 43
  • Joined: Wed Feb 10, 2021 12:15 am
  • Real Name: tim archibald

Re: 8bit + frc vs true 10bit monitor

Posted: Sun Oct 17, 2021 9:47 am

georgekg wrote:You won't end up with a migraine. If 8-bit monitors caused migraines, everyone on the planet would have one.
The most important question is: does your work justify investing in a true 10-bit monitor? If you work mostly with 8-bit footage, or, even worse, if you are just starting to learn, then a true 10-bit monitor isn't for you yet. Only if money isn't an issue should you buy a true 10-bit monitor.
But if you want a 10-bit monitor as your reference monitor, then it is much better to invest in an OLED TV (like the LG C1 or CX series). That is a much better investment for a low-budget colour grading monitor.
If you are buying a monitor for the GUI, 4K monitors are good in terms of real estate, but there are still some issues in DR with the interface on high-DPI monitors. You can check it here: viewtopic.php?f=21&t=126722



When I said migraine, I was referring to the 'FRC' part (not the 8-bit part) of 8-bit+FRC, because it's essentially flickering very fast between two values.

Interesting to hear about DR's issues with high-resolution displays. Less than ideal.
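To make the "flickering between two values" concrete: FRC shows the two adjacent 8-bit codes in a duty cycle whose temporal average lands on the intended 10-bit level. A simplified sketch of the idea (my own illustration; real panels use combined spatial-temporal dither patterns and clamp the edge cases):

```python
def frc_frames(v10, n_frames=4):
    """8-bit code shown on each refresh to approximate a 10-bit code v10."""
    lower, frac = divmod(v10, 4)  # four 10-bit steps per 8-bit step
    # Show the upper code on `frac` out of every 4 frames, the lower otherwise.
    return [lower + 1 if (i % 4) < frac else lower for i in range(n_frames)]

frames = frc_frames(514, n_frames=8)  # 10-bit 514 sits between 8-bit 128 and 129
print(frames, sum(frames) / len(frames))  # the alternating codes average to 128.5
```

At typical refresh rates the eye integrates the alternation into a single in-between shade; whether any residual flicker is perceptible comes down to how well the panel's dithering is implemented.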
I'm losing my Resolve.

Steve Fishwick

  • Posts: 333
  • Joined: Wed Mar 11, 2015 11:35 am

Re: 8bit + frc vs true 10bit monitor

Posted: Sun Oct 17, 2021 10:31 am

halfmanhalfalligatorhalfshark wrote:When I said migraine, I was referring to the 'FRC' part (not the 8-bit part) of 8-bit+FRC, because it's essentially flickering very fast between two values.


Just about every modern LED consumer UHD/HDR TV is 8-bit+FRC; very few are true 10-bit panels, and very few people complain of migraines watching them. Even some critical reference monitors are actually FRC. If it's implemented well, there should be little difference or variation from a true 10-bit panel.

An 8-bit-only monitor is not suitable for critical reference grading and finishing. The point of reference monitors is accuracy and confidence, and as far as possible a 1:1 true pixel representation, not whether they can show 10-bit footage without banding or otherwise.

halfmanhalfalligatorhalfshark

  • Posts: 43
  • Joined: Wed Feb 10, 2021 12:15 am
  • Real Name: tim archibald

Re: 8bit + frc vs true 10bit monitor

Posted: Sun Oct 17, 2021 11:42 am

Steve Fishwick wrote:
halfmanhalfalligatorhalfshark wrote:When I said migraine, I was referring to the 'FRC' part (not the 8-bit part) of 8-bit+FRC, because it's essentially flickering very fast between two values.


Just about every modern LED consumer UHD/HDR TV is 8-bit+FRC; very few are true 10-bit panels, and very few people complain of migraines watching them. Even some critical reference monitors are actually FRC. If it's implemented well, there should be little difference or variation from a true 10-bit panel.

An 8-bit-only monitor is not suitable for critical reference grading and finishing. The point of reference monitors is accuracy and confidence, and as far as possible a 1:1 true pixel representation, not whether they can show 10-bit footage without banding or otherwise.



Sooo... bottom line is, 8-bit+FRC is good, FRC isn't any kind of negative, and I can happily carry on looking for an 8-bit+FRC monitor to take home with me?
I'm losing my Resolve.

georgekg

  • Posts: 114
  • Joined: Sat Sep 23, 2017 8:23 pm
  • Real Name: Aleksandar Djordjevic

Re: 8bit + frc vs true 10bit monitor

Posted: Sun Oct 17, 2021 11:56 am

It depends on what you plan to do with it. As a reference monitor only, an OLED is the better solution at the same price. For other purposes, 8-bit+FRC can be quite satisfactory.
Last edited by georgekg on Thu Oct 21, 2021 1:11 pm, edited 1 time in total.

Steve Fishwick

  • Posts: 333
  • Joined: Wed Mar 11, 2015 11:35 am

Re: 8bit + frc vs true 10bit monitor

Posted: Sun Oct 17, 2021 12:44 pm

halfmanhalfalligatorhalfshark wrote:Sooo... bottom line is, 8-bit+FRC is good, FRC isn't any kind of negative, and I can happily carry on looking for an 8-bit+FRC monitor to take home with me?


A true 10-bit panel is theoretically always better than 8-bit+FRC, but, as I said, it depends on how manufacturers state their specs: there are in fact some pricey reference monitors which claim true 10-bit when they are, strictly speaking, FRC (honesty there at least gives confidence in the other specs). In most modern high-quality professional implementations, and good-quality consumer ones, the difference is invisible to the naked eye and in practical operational use. I recently bought an LED Panasonic consumer UHD/HDR TV as a client monitor; it is 8-bit+FRC, and it looks very good to me for 10-bit output downstream of my 'true 10-bit' reference monitor.

In short, you should be fine, but as always you only get what you pay for, and there are a host of other equally important considerations in this space too.

IsraEliteMedia

  • Posts: 316
  • Joined: Fri May 31, 2019 6:08 pm
  • Location: Israel
  • Real Name: Erik Davis

Re: 8bit + frc vs true 10bit monitor

Posted: Mon Oct 18, 2021 5:30 am

georgekg wrote:But if you want a 10-bit monitor as your reference monitor, then it is much better to invest in an OLED TV (like the LG C1 or CX series).

I am currently looking into one of the LG TVs you mentioned. My understanding is that you can only view 10-bit on these in "HDR" mode.

Which leads me to my question:

Would I only see proper 10-bit output if my colour space is HDR?

I am slowly moving to ACES for my workflow, but I am not delivering or editing in HDR.

I would like to be able to see the full potential of my FX6's 10-bit image, though, so this is the reason I am looking into the LG models.

Erik
SYSTEM SPECS
DR Studio 17.4.2 (2 seats) Win 10 Pro (fully updated)
AMD Ryzen Threadripper 2950x 16-core processor 3.50 GHz
128GB RAM
GeForce GTX 1080i, 11GB Memory, Driver 472.47 Studio Driver

Erik Davis, IsraEliteMedia--Zichron Yaakov, Israel

shebbe

  • Posts: 126
  • Joined: Tue Mar 06, 2018 11:48 am
  • Location: Amsterdam
  • Real Name: Shebanjah Klaasen

Re: 8bit + frc vs true 10bit monitor

Posted: Mon Oct 18, 2021 9:30 am

You haven't mentioned what your deliverables are.
If you only ever deliver for the web, there's zero use for 10-bit monitoring unless it's HDR, but that's a whole other category.

There's no point seeing 10-bit colour if your deliverable is always 8-bit.
Workstation: ASUS PRIME X299-A / i9 7960X 16c/32t / 64GB DDR4 / 2x EVGA GTX1080Ti / 2x M.2 512GB & 1TB

mpetech

  • Posts: 281
  • Joined: Wed Sep 04, 2013 9:52 pm
  • Real Name: Dom Silverio

Re: 8bit + frc vs true 10bit monitor

Posted: Mon Oct 18, 2021 1:28 pm

IsraEliteMedia wrote:Would I only see proper 10-bit output if my colour space is HDR?

I am slowly moving to ACES for my workflow, but I am not delivering or editing in HDR.

I would like to be able to see the full potential of my FX6's 10-bit image, though, so this is the reason I am looking into the LG models.

Erik


No, as long as your media and output settings are set to 10-bit, you will see 10-bit.
HDR standards like Dolby Vision require 10-bit or higher; traditional SDR supports 8-bit and higher.

mpetech

  • Posts: 281
  • Joined: Wed Sep 04, 2013 9:52 pm
  • Real Name: Dom Silverio

Re: 8bit + frc vs true 10bit monitor

Posted: Mon Oct 18, 2021 1:31 pm

Steve Fishwick wrote:Just about every modern LED consumer UHD/HDR TV is 8-bit+FRC; very few are true 10-bit panels, and very few people complain of migraines watching them. Even some critical reference monitors are actually FRC. If it's implemented well, there should be little difference or variation from a true 10-bit panel.


Is this true? Almost every mid-range to upper-range UHD TV (Samsung QLEDs, Sony and LG OLEDs) I see is listed as 10-bit in its specs.

georgekg

  • Posts: 114
  • Joined: Sat Sep 23, 2017 8:23 pm
  • Real Name: Aleksandar Djordjevic

Re: 8bit + frc vs true 10bit monitor

Posted: Mon Oct 18, 2021 1:34 pm

IsraEliteMedia wrote:I would like to be able to see the full potential of my FX6's 10-bit image, though, so this is the reason I am looking into the LG models.

Erik


No matter how good an LED/LCD monitor is (I'm NOT talking about high-end devices priced like an average sedan), OLED screens will always have better blacks. With good blacks you are always on the right road, whether it's 8-bit, 10-bit, or HDR. When your blacks are represented (almost) as they are, it is much easier to correct the other colours. Try lifting the blacks as far as the software allows and you'll see how dependent the other colours are on them. That is one of the reasons all manufacturers strive to achieve true black and good contrast.

Steve Fishwick

  • Posts: 333
  • Joined: Wed Mar 11, 2015 11:35 am

Re: 8bit + frc vs true 10bit monitor

Posted: Wed Oct 20, 2021 7:15 am

mpetech wrote:Is this true? Almost every mid-range to upper-range UHD TV (Samsung QLEDs, Sony and LG OLEDs) I see is listed as 10-bit in its specs.


Yes, it's true. I specifically mentioned LEDs; OLEDs are 10-bit by their very nature, and the QLEDs from Samsung aim to compete in this area. You have to consider that any of these consumer TVs, no matter how premium, are considerably less expensive than even small HD critical reference grading monitors, some of which are, as I say, 8-bit+FRC. There are many other reasons why such a monitor costs so much to produce and sell, not just volume.

You do not need HDR content to display 10-bit, only an HDMI input with extended RGB 12/10-bit. I can see no banding on good SDR content on the Panasonic. Deep blacks are very popular with consumers currently, hence the premium for OLEDs, but they can be artificial for proper calibration and are alien to the experience of decades of cinema, where no celluloid or digital projection has truly opaque areas in the picture. The main problem with calibrating OLEDs is that their infinite blacks can create sub-black issues when grading, though this can be calibrated for. A calibrated image is not about getting the most pleasing image but the most accurate one.

Ellory Yu

  • Posts: 2604
  • Joined: Wed Jul 30, 2014 5:25 pm

Re: 8bit + frc vs true 10bit monitor

Posted: Wed Oct 20, 2021 7:20 am

Might want to read this post too…

viewtopic.php?f=3&t=148021&p=797216#p797216
Blackmagic Design URSA Mini Pro 4.6K G2, Blackmagic Design Pocket Cinema Camera 6K, RED Komodo 6K
PC Workstation Core I7 64Gb, 2 x AMD R9 390X 8Gb, Blackmagic Design DeckLink 4K Mini Monitor, Windows 10 Pro 64-bit, Blackmagic Design DaVinci Resolve 17.1.4

Mario Kalogjera

  • Posts: 908
  • Joined: Sat Oct 31, 2015 8:44 pm

Re: 8bit + frc vs true 10bit monitor

Posted: Wed Oct 20, 2021 8:54 am

Is no one discussing the requirement that the GUI GPU also needs to be capable of 10-bit in OpenGL surfaces (Quadro and RTX cards are) if there is no separate display card?
If it's only the GUI GPU, it is also unclear, AFAIK, whether Resolve supports 10-bit (clean feed included) on Windows. One needs to sort these out before determining whether one even needs an 8-bit+FRC monitor.

Sent from my Mi 9T using Tapatalk
Asus Prime X370-Pro, R7 3700X@PBO, 32 GB G.Skill AEGIS DDR-4@3200MHz, Asrock RX 580 8GB, Adata A400 120 GB system SSD, A2000 500 GB scratch SSD, Media storage:"Always in motion is it", Windows 10 Pro+Resolve Studio 17.3.2+Fusion Studio 17.3.2

Steve Fishwick

  • Posts: 333
  • Joined: Wed Mar 11, 2015 11:35 am

Re: 8bit + frc vs true 10bit monitor

Posted: Wed Oct 20, 2021 12:18 pm

Mario Kalogjera wrote:Is no one discussing the requirement that the GUI GPU also needs to be capable of 10-bit in OpenGL surfaces (Quadro and RTX cards are) if there is no separate display card?
If it's only the GUI GPU, it is also unclear, AFAIK, whether Resolve supports 10-bit (clean feed included) on Windows. One needs to sort these out before determining whether one even needs an 8-bit+FRC monitor.


If you're doing this as a hobby, or just editing, then the GPU can suffice, but bit depth may be the least of your concerns. Professionally, it is absolutely universal and necessary that grading monitors are used with a DeckLink or UltraStudio device (in the case of Resolve), where you can be assured of bit depth and accurate colour. As it is, I see no mention of the OP asking specifically for a monitor to work with the GPU only or otherwise.

mpetech

  • Posts: 281
  • Joined: Wed Sep 04, 2013 9:52 pm
  • Real Name: Dom Silverio

Re: 8bit + frc vs true 10bit monitor

Posted: Wed Oct 20, 2021 1:27 pm

Steve Fishwick wrote:
mpetech wrote:Is this true? Almost every mid-range to upper-range UHD TV (Samsung QLEDs, Sony and LG OLEDs) I see is listed as 10-bit in its specs.


Yes, it's true. I specifically mentioned LEDs; OLEDs are 10-bit by their very nature, and the QLEDs from Samsung aim to compete in this area. You have to consider that any of these consumer TVs, no matter how premium, are considerably less expensive than even small HD critical reference grading monitors, some of which are, as I say, 8-bit+FRC. There are many other reasons why such a monitor costs so much to produce and sell, not just volume.

You do not need HDR content to display 10-bit, only an HDMI input with extended RGB 12/10-bit. I can see no banding on good SDR content on the Panasonic. Deep blacks are very popular with consumers currently, hence the premium for OLEDs, but they can be artificial for proper calibration and are alien to the experience of decades of cinema, where no celluloid or digital projection has truly opaque areas in the picture. The main problem with calibrating OLEDs is that their infinite blacks can create sub-black issues when grading, though this can be calibrated for. A calibrated image is not about getting the most pleasing image but the most accurate one.


Ahh, OK. We never checked TVs below the LG/Sony OLEDs and Samsung QLEDs. Makes sense.
