10 bits Monitor on Davinci

Get answers to your questions about color grading, editing and finishing with DaVinci Resolve.

AndréRodrigues

  • Posts: 48
  • Joined: Tue Apr 25, 2017 2:49 pm

10 bits Monitor on Davinci

Posted: Tue Aug 08, 2017 12:04 pm

So, after researching a LOT around the internet and gathering lots of inconclusive answers, I need someone who can give me a real answer to this.

DaVinci CAN or CAN'T output images in 10-bit (not export; output to the monitor) with GTX cards?

So, here is what I have found so far, using a 10-bit monitor:

GTX 10xx - As far as I can tell, you can activate the 10bpc option in the NVIDIA control panel on Windows, but it will only work through DisplayPort and for software that uses DirectX (I don't know whether DaVinci uses OpenGL or DirectX).

NVIDIA Quadro - You can get a 10-bit output as far as I know, but I'm not sure whether you get the 10-bit option on Premiere's scopes (I don't have a Quadro to test), or whether it's a real 10-bit output.

Blackmagic DeckLink - I have no idea.

So, can anyone give me the real answer to this? Will DaVinci work with a 10-bit monitor with one or more of these three cards? Will it work with the GTX 10xx?

Thanks!
Davinci Resolve Studio 17 - Windows 10 Updated - MOBO: Aorus Master X570 - PROC: Ryzen 9 3900x - GPU: RTX2070 8GB Super - RAM: 128GB 3200Mhz - STORAGE: 3 NVME 3500/3000mbps + HD's and SSD's

Micha Clazing

  • Posts: 240
  • Joined: Sat Apr 01, 2017 3:26 pm

Re: 10 bits Monitor on Davinci

Posted: Tue Aug 08, 2017 12:21 pm

Resolve only outputs "accurate" video over a playout card like the Decklink, and yes, 10-bit is supported there. Primary display cards like a GeForce or Quadro will only get "basic" 8-bit output in the GUI viewer.

AndréRodrigues

  • Posts: 48
  • Joined: Tue Apr 25, 2017 2:49 pm

Re: 10 bits Monitor on Davinci

Posted: Tue Aug 08, 2017 12:25 pm

Micha Clazing wrote:Resolve only outputs "accurate" video over a playout card like the Decklink, and yes, 10-bit is supported there. Primary display cards like a GeForce or Quadro will only get "basic" 8-bit output in the GUI viewer.


100% sure? Thx for the answer!

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: 10 bits Monitor on Davinci

Posted: Tue Aug 08, 2017 12:35 pm

Micha Clazing wrote:Resolve only outputs "accurate" video over a playout card like the Decklink, and yes, 10-bit is supported there. Primary display cards like a GeForce or Quadro will only get "basic" 8-bit output in the GUI viewer.


On some Macs with the correct GPU (and El Capitan or later) the Resolve GUI can be 10-bit (this is the whole reason there is a setting for it in Preferences). Accuracy is a different matter :)

André, you have a 5K Dell monitor? That should support 10-bit. Are you running it over 2x DP cables? What if you try just one cable at UHD resolution?

Micha Clazing

  • Posts: 240
  • Joined: Sat Apr 01, 2017 3:26 pm

Re: 10 bits Monitor on Davinci

Posted: Tue Aug 08, 2017 1:04 pm

Andrew Kolakowski wrote:On some Macs with the correct GPU (and El Capitan or later) the Resolve GUI can be 10-bit (this is the whole reason there is a setting for it in Preferences).

Oh, that must be why: I use Windows exclusively and that option isn't there. I wonder why it's Mac-only; Windows supports 10-bit output, and both AMD and NVIDIA extended that capability from their workstation cards to Radeons and GeForces quite a while ago. Maybe once Blackmagic has sold enough DeckLink Mini Monitor cards they'll finally add support for proper GUI monitoring (full screen on a second/third monitor) without requiring extra hardware.

AndréRodrigues

  • Posts: 48
  • Joined: Tue Apr 25, 2017 2:49 pm

Re: 10 bits Monitor on Davinci

Posted: Tue Aug 08, 2017 1:27 pm

Andrew Kolakowski wrote:
Micha Clazing wrote:Resolve only outputs "accurate" video over a playout card like the Decklink, and yes, 10-bit is supported there. Primary display cards like a GeForce or Quadro will only get "basic" 8-bit output in the GUI viewer.


On some Macs with the correct GPU (and El Capitan or later) the Resolve GUI can be 10-bit (this is the whole reason there is a setting for it in Preferences). Accuracy is a different matter :)

André, you have a 5K Dell monitor? That should support 10-bit. Are you running it over 2x DP cables? What if you try just one cable at UHD resolution?



I have a dual-boot machine, Mac and Windows. And the monitor is this one:

https://www.bhphotovideo.com/c/product/ ... n_led.html


And yes, DP cable, 1 DP cable.

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: 10 bits Monitor on Davinci

Posted: Tue Aug 08, 2017 3:20 pm

It should still work, as this monitor is also 10-bit.
Probably only Apple's official AMD cards work on the Mac.

AndréRodrigues

  • Posts: 48
  • Joined: Tue Apr 25, 2017 2:49 pm

Re: 10 bits Monitor on Davinci

Posted: Tue Aug 08, 2017 3:35 pm

Andrew Kolakowski wrote:It should still work, as this monitor is also 10-bit.
Probably only Apple's official AMD cards work on the Mac.



How can I be sure it is working at 10 bits?

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: 10 bits Monitor on Davinci

Posted: Tue Aug 08, 2017 3:45 pm

I already answered this question: you need to see 30-bit ARGB2101010 in the GPU details.

[image: GPU details showing 30-bit ARGB2101010]
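
If you want to script that check on the Mac side, here is a minimal sketch (assuming macOS; the pattern it greps for is the wording System Information uses, e.g. "24-Bit Colour (ARGB8888)" vs "30-Bit Colour (ARGB2101010)"):

Code:
# Minimal sketch, macOS only: dump display info and print any
# framebuffer/pixel depth lines. A 10-bit pipe shows up as ARGB2101010.
import re
import subprocess

out = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.splitlines():
    if re.search(r"Depth|ARGB", line):
        print(line.strip())  # e.g. "Framebuffer Depth: 30-Bit Colour (ARGB2101010)"

On Windows, the equivalent check is the output color depth setting shown in the NVIDIA/AMD control panel, as discussed above.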

AndréRodrigues

  • Posts: 48
  • Joined: Tue Apr 25, 2017 2:49 pm

Re: 10 bits Monitor on Davinci

Posted: Tue Aug 08, 2017 7:53 pm

Andrew Kolakowski wrote:I already answered this question: you need to see 30-bit ARGB2101010 in the GPU details.



Thank you!

Davis Saïd

  • Posts: 14
  • Joined: Mon Aug 07, 2017 11:05 pm

Re: 10 bits Monitor on Davinci

Posted: Tue Aug 08, 2017 9:06 pm

Not to muddy the waters (too much), but as Andrew alluded, you can get a 10-bit signal from a more recent NVIDIA GPU under macOS 10.11 or later.

Since Apple currently only ships or sells AMD GPUs (not counting integrated graphics), automatic support for NVIDIA GPUs is not active by default. One of my systems is an “old” Mid-2010 Mac Pro. It has a GeForce GTX Titan X (Maxwell) GPU installed in it and I have it set (via SwitchResX) to output 10-bit signals.

[attachment: Graphics-Displays System Info (DS).png]


A portion of the EDID report from SwitchResX:
Code:
Display type:
-------------
   RGB 4:4:4 & YCrCb 4:4:4 & YCrCb 4:2:2 Color Encoding Formats
   Display is non continuous frequency
   Default color space is not sRGB standard
   Preferred timing mode includes Native Pixel Format


Input signal & sync:
--------------------
Digital Input
   10 Bits per Primary Color        <------------
   DisplayPort interface
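
For the curious: that "10 Bits per Primary Color" line is decoded from a single byte of the monitor's EDID. Here is a rough sketch of the decode, assuming you have an EDID 1.4 base block as raw bytes (the file path in the example is hypothetical):

Code:
# Decode bits-per-primary from byte 20 of an EDID 1.4 base block.
# Bit 7: 1 = digital input; bits 6-4: colour depth; bits 3-0: interface.
DEPTHS = {0b001: 6, 0b010: 8, 0b011: 10, 0b100: 12, 0b101: 14, 0b110: 16}
IFACES = {0b0001: "DVI", 0b0010: "HDMI-a", 0b0011: "HDMI-b", 0b0101: "DisplayPort"}

def bits_per_primary(edid: bytes):
    b = edid[20]
    if not (b & 0x80):
        return None  # analog input: this field doesn't apply
    return DEPTHS.get((b >> 4) & 0b111), IFACES.get(b & 0x0F, "undefined")

# e.g. bits_per_primary(open("my_display.edid", "rb").read())
# -> (10, "DisplayPort") for the display quoted above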



All that said, I would agree with others who have said that using the timeline Viewer in the Resolve GUI isn't a good idea for color-critical work. There are many reasons for this. Among them:

  • The quality of the image displayed in the Viewer of Resolve's GUI depends on the quality of the physical display itself, and on how well the display is calibrated (and how well it holds that calibration).
  • I haven't found specific mention of this in the Resolve manual, but the Viewer windows in Resolve are almost always scaled (not showing the image pixel-for-pixel, at least for non-SD-sized projects). It also isn't clear whether the Viewer uses the systemwide ColorSync profile or not.
  • Signals sent out via GPUs to monitors may not be representative of an actual video signal.
  • I would agree with others who’ve said that using a BMD video output device to a properly calibrated (and suitable) external video display is the only guaranteed way to know that the signal path is one that is proper for previewing video. Also, having a display that can show all the pixels from a video frame without any scaling is very valuable when scrutinizing the effects of filters and other grading choices.
  • (Edited/Added for clarity) As Andrew K. mentions a bit further down in the thread, it is possible for a monitor directly connected to a computer via a GPU to display a proper image (with a valid signal path). As Andrew also mentions, a benefit would be not having to convert from RGB to YUV/YCbCr to RGB. It’s more of a software issue than a hardware issue nowadays.

    Applications (at least on macOS) can bypass any systemwide (ColorSync) color profile in a Viewer window. QuickTime Player (7 and earlier), as well as FCP 7 (and earlier) were able to properly use a ColorSync profile (and allow for gamma changes) in order to present a valid image on a GUI monitor.

    One remaining issue with using a GUI monitor for displaying video images is the size of the Viewer window/widget (and whether one can get a pixel-accurate image/frame size)…

    And it's a tough call for BMD, as adding such a feature to Resolve and Fusion would largely obviate the need (in many cases) for someone to also buy a video output/monitor card. I/O cards would still be in demand. Again, using a BMD video monitor/output card to a suitable, calibrated display provides a guaranteed, known-valid signal path for preview, without having to rely on any black magic by the OS, GPU drivers, etc.
    ;)

Hope this helps…
Davis S.

MacPro5,1 - 128 GB RAM - 12-core 3.33 GHz - GTX Titan X 12 GB
macOS 10.13.6
Resolve Studio

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: 10 bits Monitor on Davinci

Posted: Tue Aug 08, 2017 9:53 pm

At least it's as easy as using SwitchResX (no need for deep hacks).
Accuracy of the Resolve GUI preview is a second and much bigger problem. It would need proper validation against the BM preview with a measuring probe.

Craig Marshall

  • Posts: 949
  • Joined: Sun Apr 07, 2013 4:49 am
  • Location: Blue Mountains, Australia

Re: 10 bits Monitor on Davinci

Posted: Tue Aug 08, 2017 10:12 pm

AndréRodrigues wrote:
...Blackmagic DeckLink - I have no idea...



You should probably seriously consider using a DeckLink (or Mini Monitor) out of Resolve to your 10-bit monitor, because it guarantees a 'baseband' video signal, not a computer signal. Although we succeeded in achieving a 10-bit path from a Quadro GPU to a DisplayPort IPS display on Win 10 64-bit, we now use SDI exclusively, with a simple SDI to DisplayPort converter, to achieve 10-bit 4:4:4 monitoring on a 10-bit 'computer' IPS monitor calibrated to Rec.709 with internal 3D LUTs.

Jason Myres, a professional colorist and moderator on the Lift/Gamma/Gain forum, explains why succinctly, so I will quote his recent comment on a similar question:

"GPU output (DP/HDMI) and a Decklink/mini monitor card output) ..are two very different things and it goes way beyond whether they are 8 or 10-bit. The first one is a standard graphics card output, the second (Mini Monitor) is a Baseband SDI/ HDMI video output. The difference comes from the fact that one is intended (and modified) to suit a computer display, and the other is a fully legitimate video signal intended for broadcast monitoring. They are two different signal types with different color spaces and signal paths. Don't try to compare them, as they literally have no relation to one another.

In certain cases it's possible for images to "look the same" on both graphics cards and video I/O cards, but they were really designed for two different purposes.

Graphics cards are based around computer signaling standards (VGA, DVI), and are purpose-built to render Graphical User Interfaces and text. They can display video, but that video has to be rasterized, or mapped to the appropriate pixels in your monitor, before it can be displayed. The output from graphics cards is generally limited to legal levels (16-235), which means no full range output (0-255), including sub-blacks and super whites. They also have no sense of interlacing.

Those things, along with any processing being applied by the graphics card, including sharpness, contrast, gamma, or color space adjustments (usually to sRGB), are what generally makes them unuseable for critical viewing. The signal has been embellished, so what you see is not what you get. And, even if you were to try, it's very difficult to completely defeat, especially in OS X.

Video I/O cards are built around SMPTE standards, and are designed for passing unaltered ("baseband"), video signals used in film and broadcast television, in and out of your computer. They are specifically designed not to modify the signal unless you specify it, and when they do it's usually limited to formatting tasks like changing frame rate, frame size, etc. They provide a "proper video signal", which is exactly what you want for critical monitoring."
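
To put rough numbers on the "legal levels" point from that quote, here is a toy sketch of the 8-bit full-to-legal range mapping (an illustration only; real pipelines scale chroma differently and work at higher precision):

Code:
# Toy sketch: squeeze 8-bit full-range codes (0-255) into the 8-bit
# "legal" luma range (16-235) and back. 256 codes fold into 220,
# so the round trip cannot be lossless.
def full_to_legal(v: int) -> int:
    return 16 + round(v * 219 / 255)

def legal_to_full(v: int) -> int:
    return round((v - 16) * 255 / 219)

lost = sum(legal_to_full(full_to_legal(v)) != v for v in range(256))
print(lost, "of 256 codes do not survive the round trip")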
4K Post Studio, Freelance Filmmaker, Media Writer
Win10/Lightworks/Resolve 15.1/X-Keys 68 Jog-Shuttle/OxygenTec ProPanel
12G SDI Decklink 4K Pro/Calibrated 10bit IPS SDI Monitor
HDvideo4K.com

Uli Plank

  • Posts: 21107
  • Joined: Fri Feb 08, 2013 2:48 am
  • Location: Germany and Indonesia

Re: 10 bits Monitor on Davinci

Posted: Tue Aug 08, 2017 10:43 pm

And all this is valid IF you work for TV ;-)
Maybe AI can help you. Or make you obsolete.

Studio 18.6.5, MacOS 13.6.5
MacBook M1 Pro, 16 GPU cores, 32 GB RAM and iPhone 15 Pro
Speed Editor, UltraStudio Monitor 3G, iMac 2017, eGPU

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: 10 bits Monitor on Davinci

Posted: Tue Aug 08, 2017 11:02 pm

Craig Marshall wrote:....
Jason Myres, a professional colorist and moderator on the Lift/Gamma/Gain forum, explains why succinctly, so I will quote his recent comment on a similar question:

"GPU output (DP/HDMI) and a DeckLink/Mini Monitor card output are two very different things, and it goes way beyond whether they are 8- or 10-bit. The first is a standard graphics card output; the second (Mini Monitor) is a baseband SDI/HDMI video output.
....


Well, this is a bit of a myth and a legacy approach.
Here is another view, adjusted to what current technology makes possible.

Resolve and other grading tools work in RGB and use the GPU to do their magic. The GPU is connected directly (in the typical case) to a monitor. This is all we need.
This link is actually more accurate and better than using a video card because:
- data goes directly from the GPU to the monitor without any additional delay
- it is essentially always 4:4:4, whereas probably 80% of typical studio setups still use a 4:2:2 YUV path (to save bandwidth)
- it avoids the RGB->YUV->RGB conversion (which is never 100% lossless) that happens on every YUV video chain (see the round-trip sketch at the end of this post)
- it uses fewer resources: no need to copy data from the GPU to a card
- it saves money and a slot (no need for any additional card)
- it can even be a 16-bit pipe, whereas most video cards can do 12-bit at most
- it can use V-sync to guarantee proper sync, like a video chain does
- it avoids problems from wrong conversion between RGB<->YUV (it's a 1:1 RGB pipe from GPU to monitor)
- it's not restricted to specific refresh rates (only by connection bandwidth limits, e.g. HDMI 2.0 etc.); it can do just about anything your monitor will accept, e.g. 120Hz
- it's currently the only easy solution that allows monitoring 8K (or 4K 50/60p 4:4:4)

Accuracy is just a matter of software. It's fairly easy to isolate the preview from any OS influence; there is software which already does this - just not Resolve.
When we talk about grading software which works in RGB, and then about a video pipe (which in most cases is YUV), the whole point of the video pipe almost loses its sense. An RGB pipe to the monitor is what you ideally want; a YUV pipe is just a compromise to save bandwidth.
If we were talking about a broadcast chain which operates in YUV, then yes - you don't want to go back to RGB anymore (we already left the RGB world when the YUV master was made). In the case of Resolve, compositing and finishing tools, you want an RGB preview on your device, and the GPU is ideal for providing it.

Issues with GPU monitoring:
- because it uses HDMI/DP technology, cable length is limited (use a converter to SDI to gain distance if needed)
- possibly an interlacing issue, although it can be sorted, and interlacing will soon be gone anyway

The refresh rate issue is very easy to sort out, and some tools (e.g. Flame) already do it, even for PC monitors which have to be "forced" to different refresh rates. When you start a Flame project it checks what fps you have chosen and tries to force the monitor to the correct refresh rate (you can even use e.g. 48Hz or 72Hz for 24p projects). All you need is a proper "video" monitor.

For small studios where the monitors/TVs are next to the machines, GPU monitoring is ideal.
Video pipes are good for broadcast and places where the signal has to travel a long way. Even this will soon be replaced by IP-based workflows, which allow sending video (as data) over much longer distances. The days of old-fashioned SDI video chains are numbered.
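
If anyone wants to see the RGB->YUV->RGB loss I mentioned in the list above, here is a quick sketch (BT.709 coefficients, full-range 8-bit quantisation; an illustration only, not any particular card's pipeline):

Code:
# 8-bit RGB -> BT.709 Y'CbCr -> back to RGB. The 8-bit quantisation of
# Y'CbCr means some RGB triplets don't return bit-exact, even with no
# chroma subsampling involved.
KR, KB = 0.2126, 0.0722
KG = 1 - KR - KB

def rgb_to_ycbcr(r, g, b):
    y = KR * r + KG * g + KB * b
    cb = (b - y) / (2 * (1 - KB)) + 128
    cr = (r - y) / (2 * (1 - KR)) + 128
    return round(y), round(cb), round(cr)  # quantise to 8 bits

def ycbcr_to_rgb(y, cb, cr):
    r = y + 2 * (1 - KR) * (cr - 128)
    b = y + 2 * (1 - KB) * (cb - 128)
    g = (y - KR * r - KB * b) / KG
    return round(r), round(g), round(b)

bad = sum(ycbcr_to_rgb(*rgb_to_ycbcr(r, g, b)) != (r, g, b)
          for r in range(0, 256, 17)
          for g in range(0, 256, 17)
          for b in range(0, 256, 17))
print(bad, "of", 16 ** 3, "sampled RGB triplets fail the round trip")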

Peter Cave

  • Posts: 3754
  • Joined: Thu Aug 23, 2012 6:45 am
  • Location: Melbourne, Australia

Re: 10 bits Monitor on Davinci

Posted: Wed Aug 09, 2017 10:55 am

Final rendered output is YUV in ProRes, Avid codecs (DNxHD etc.), H.264 and most broadcast formats. RGB is rare as a delivery format. THAT is the reason a proper YUV monitoring path is the best and most accurate solution.

I have a Flanders Scientific monitor fed by a BM Mini Monitor, running with a 5K iMac. I have the iMac screen closely matched to the Flanders. It IS possible to make them match, but it requires the appropriate knowledge and skills, and it is not something that is easily explained on a forum.
I learned this during my training at our national broadcaster, where I used to line up production studio control room monitors every morning. Adjusting 20 to 30 monitors each morning (and matching them) with a proper probe REALLY sharpens one's skills.

I also do NOT use a probe on the iMac, except to set the white point to an accurate 6500K.

I have also had my eyes tested for colour accuracy, as 10% of the male population has some degree of colour blindness, most commonly in the red/green part of the spectrum - slap bang in the skin tone area!

I have followed a lot of these forum discussions, and most people have an incomplete understanding of how and why the technical side works. There are no simple explanations.
Resolve 18.6.2 Mac OSX 13.6 Ventura
Mac Studio Max 32GB UltraStudio Monitor 3G

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: 10 bits Monitor on Davinci

Posted: Wed Aug 09, 2017 11:12 am

Peter Cave wrote:Final rendered output is YUV in ProRes, Avid codecs (DNxHD etc.), H.264 and most broadcast formats. RGB is rare as a delivery format. THAT is the reason a proper YUV monitoring path is the best and most accurate solution.



Don't assume that YUV and only YUV = a proper video chain - that is not the rule at all.
YUV is just a way of saving bandwidth and making the signal more compression-friendly. RGB doesn't compress as well, because luminance info is mixed into the chrominance, which is why delivery codecs tend to be YUV-based.
Again: you work in Resolve in RGB. When you monitor with BM (over a typical YUV pipe), your RGB pixels are sent to the BM card as RGB (or as already-converted YUV). The card converts them to YUV (just to save bandwidth) and sends them to your monitor. At the monitor, the YUV is converted back to RGB (with 4:2:2 up-sampled back to 4:4:4) for the final display - hello, we are back at RGB! This whole YUV conversion is not actually needed at all, and it doesn't happen if you monitor in 4:4:4 on higher-end cards. The end result on the monitor is the same if all conversions are done correctly (except for the chroma having been up-sampled). It's actually the RGB chain which is more accurate.
By your logic, if I use RGB 4:4:4 video monitoring it is no longer "proper video monitoring" and my final YUV masters will look wrong on TVs (due to not using a YUV monitoring chain) :) Tell that to the high-end studios which use RGB monitoring, since it is the way to judge the quality of keying/edges (YUV 4:2:2 smooths edges due to chroma subsampling).
For grading/compositing etc., YUV is actually useless; it exists mainly due to technology limitations (the need to save bandwidth).
When you export from Resolve for final/broadcast delivery, your RGB is converted to YUV. This YUV is meant to give the same end result as your RGB, but in most cases it is actually a "worse" signal/representation of your work, as it uses special techniques (chroma subsampling/compression) to save bandwidth. YUV is just a middleman that helps with technology limitations; in an ideal scenario it wouldn't be needed at all, but of course the real world is a different story. This is the point where you trust your software: in the end this is a simple step done with fixed math, yet so many apps still mess it up. Once exported, this is your master (YUV-based), and from then on it should stay YUV; any further conversion should be done in YUV (no real need to go back to RGB anymore). If all the conversions it goes through (until you see it on your TV) are correct, it should look "the same" as on your grading screen (regardless of an RGB or YUV monitoring chain, as those should also look "the same"). You could "capture" it from a TV signal, YouTube etc., put it into Resolve, and it should (after it goes through the YUV->RGB conversion on import) look the same as your original timeline.
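
Here is a tiny illustration of what 4:2:2 chroma subsampling alone does to a sharp chroma edge (a naive box filter; real encoders use better filters, but the softening is the point):

Code:
# Naive 4:2:2 round trip on one line of chroma samples: average
# horizontal pairs (decimate), then repeat each value (upsample).
def to_422_and_back(chroma):
    half = [(chroma[i] + chroma[i + 1]) // 2 for i in range(0, len(chroma), 2)]
    return [v for v in half for _ in (0, 1)]

edge = [64] * 7 + [192] * 9   # hard chroma edge, deliberately off a pair boundary
print(to_422_and_back(edge))
# -> [64, 64, 64, 64, 64, 64, 128, 128, 192, 192, 192, 192, 192, 192, 192, 192]

Luma is untouched, which is why 4:2:2 is usually acceptable for delivery, but this is exactly what smooths keyed edges, as mentioned above.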

You are making one big mistake in your judgment: you are using two different displays to prove that the YUV chain is better - a properly calibrated display for the YUV chain and a random monitor for the RGB chain. Send the RGB signal to your FS monitor and it will be as good as YUV - actually better, as less conversion happens on the way and it's 4:4:4. Now remove the BM card, use software which supports a proper GUI preview (isolated from the OS color engine, e.g. by using an OpenGL surface), and your result will be 100% the same as using BM RGB monitoring, plus much simplified. These will be exactly the same RGB pixels which Resolve sends to the BM card, which passes them to the monitor (and actually "better" pixels compared to the YUV chain).

It's all legacy. And now we are getting a similar story again: in the chase for 8K pipes, DP technology is proposing (actually, it's already here) compression in the actual GPU->monitor link. This is the equivalent of YUV's appearance in the old days - a way of saving bits.

A proper video chain is mainly about a proper/locked refresh rate; this is the biggest problem when using a GPU with PC monitors for video display. Accuracy of color is another story. Both can be sorted quite easily, and some software already does it.

Just a side note: just about all pro monitors, including Dolby's, have HDMI or DP ports. When you use the correct software and feed the signal from the GPU, it's as good as an SDI feed (quite often even more accurate). Sony has just added an HDMI port to their BVM-X300 v2.

And another one: the data which travels over SDI isn't (when it comes to the actual video data) any better than what travels over HDMI or DP. It may use a different structure or packing scheme, but that's a different story; in the end it forms exactly the same digital signal - there is no magic there. This is not the analog world but digital: 0s and 1s. SDI 4:2:2 YUV data is actually (as far as I know) exactly (or about) the same structure as v210 (4:2:2 10-bit uncompressed, which is so well supported in many apps): just 3x 10-bit YUV packed into 32-bit words. Again - no magic, just bits. RGB will probably be very similar to BM's 10-bit uncompressed RGB (r210) or AJA's r10k. The packing or bit order may differ, but those are details.
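
To make "just bits" concrete, here is a rough sketch of that v210-style packing, three 10-bit components per little-endian 32-bit word (the real format also fixes the component ordering and aligns lines to 128 bytes; details omitted):

Code:
# Pack/unpack three 10-bit components into one 32-bit word
# (bits 0-9, 10-19, 20-29; the top two bits are unused).
import struct

def pack3(c0, c1, c2):
    assert all(0 <= c < 1024 for c in (c0, c1, c2))
    return struct.pack("<I", c0 | (c1 << 10) | (c2 << 20))

def unpack3(word):
    (w,) = struct.unpack("<I", word)
    return w & 0x3FF, (w >> 10) & 0x3FF, (w >> 20) & 0x3FF

print(unpack3(pack3(502, 64, 940)))  # -> (502, 64, 940): same bits out as in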

And another one: when you use a crappy monitor, the fact that you have a BM card won't help you much at all.

Last one: don't try to use GPU monitoring from software which is not designed (or validated) for it, like Resolve. But that is still not proof that such monitoring isn't good or correct.

Uli Plank

  • Posts: 21107
  • Joined: Fri Feb 08, 2013 2:48 am
  • Location: Germany and Indonesia

Re: 10 bits Monitor on Davinci

Posted: Wed Aug 09, 2017 12:40 pm

Andrew Kolakowski wrote:Last one: don't try to use GPU monitoring from software which is not designed (or validated) for it, like Resolve. But that is still not proof that such monitoring isn't good or correct.


Right. This is part of BM's business model. If you don't like it, BUY other software.

AndréRodrigues

  • Posts: 48
  • Joined: Tue Apr 25, 2017 2:49 pm

Re: 10 bits Monitor on Davinci

Posted: Wed Aug 09, 2017 1:31 pm

Perfect technical answers, thank you so much. :D

But they also got me more confused.

So, say I buy a DeckLink, plug it into my computer, and plug a 10-bit monitor into it. What do I have to do? Will DaVinci automatically send the signal to that monitor? Is there nothing to configure, or do I need other steps?!

What I didn't understand is this "timeline preview" thing. Do you mean the monitor you use when editing? Or is there another monitoring window in DaVinci?

And one more question: could I use a Quadro 2000, for example, instead of the DeckLink? Because it's way cheaper.

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: 10 bits Monitor on Davinci

Posted: Wed Aug 09, 2017 1:52 pm

Your BM preview out of Resolve is automatic - well, almost. There are various settings for it. Read the manual!
Forget about the Quadro (the Quadro 2000 is old and crap, and a new one costs twice as much as the BM card). Buy a Mini Monitor 4K and you are done. It just won't do 50/60p UHD, if you need that.
Also, make sure your monitor supports multiple refresh rates (if it's not a pro one but more PC-like).

Davis Saïd

  • Posts: 14
  • Joined: Mon Aug 07, 2017 11:05 pm

Re: 10 bits Monitor on Davinci

Posted: Wed Aug 09, 2017 3:12 pm

AndréRodrigues wrote:What I didn't understand is this "timeline preview" thing. Do you mean the monitor you use when editing? Or is there another monitoring window in DaVinci?


At least in my post, the “Viewer” I referred to was the timeline Viewer (as referred to in the DR manual). That is, the timeline Viewer, or the source Viewer (unless you have only one Viewer active), is the window/viewport in the DR GUI that shows the video of the timeline (or source/Media Pool clips).

This contrasts with source/timeline “previews” shown on an external broadcast monitor connected via a BM video output device/card.

AndréRodrigues

  • Posts: 48
  • Joined: Tue Apr 25, 2017 2:49 pm

Re: 10 bits Monitor on Davinci

Posted: Wed Aug 09, 2017 3:36 pm

Andrew Kolakowski wrote:Your BM preview out of Resolve is automatic - well, almost. There are various settings for it. Read the manual!
Forget about the Quadro (the Quadro 2000 is old and crap, and a new one costs twice as much as the BM card). Buy a Mini Monitor 4K and you are done. It just won't do 50/60p UHD, if you need that.
Also, make sure your monitor supports multiple refresh rates (if it's not a pro one but more PC-like).



Perfect!

So, a DeckLink + Mini Monitor 4K and that's it. I'll have 4K + 10-bit monitoring and will be happy, right?

That's what I need. Thanks man!

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: 10 bits Monitor on Davinci

Posted: Wed Aug 09, 2017 3:47 pm

DeckLink = Mini Monitor :)

All you need is a DeckLink Mini Monitor 4K, or even a DeckLink Mini Monitor (if you do just HD and no 50/60p HD).

AndréRodrigues

  • Posts: 48
  • Joined: Tue Apr 25, 2017 2:49 pm

Re: 10 bits Monitor on Davinci

Posted: Wed Aug 09, 2017 4:17 pm

Andrew Kolakowski wrote:DeckLink = Mini Monitor :)

All you need is a DeckLink Mini Monitor 4K, or even a DeckLink Mini Monitor (if you do just HD and no 50/60p HD).



OOOO, I thought you were referring to some "video monitor" called the Mini Monitor hahahahah

I didn't know the "second" name of the DeckLink. OK, I'll use it with the Dell I mentioned before.

Thank you!

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: 10 bits Monitor on Davinci

Posted: Wed Aug 09, 2017 4:36 pm

Hmm... not 100% sure whether this monitor will accept e.g. 24p or 23.976p. It's a bit unclear.
Vertical scan range:
50 Hz to 75 Hz - this suggests it won't

Video display capabilities
(DP & HDMI & MHL playback):
480i, 480p, 576i, 576p, 720p, 1080i, 1080p, QHD - it may be covered here under 1080p

Then MHL modes:
640 x 480p 60
720 x 480p 60
720 x 576p 50
720 (1440) x 480i 60
720 (1440) x 576i 50
1280 x 720p 60
1280 x 720p 50
1920 x 1080i 60
1920 x 1080i 50
1920 x 1080p 30
1920 x 1080p 60

1920x1080 at 23.976/24p/25p is not listed here.

I think it will work, but you may want to ask Dell support. It definitely supports HD 50i/60i and PAL/NTSC SD.
24p/23.976p/25p is probably covered under the 1080p entry.

This:

http://en.community.dell.com/support-fo ... t/19658478

suggests it should do 23.976p/24p/25p 1080.

AndréRodrigues

  • Posts: 48
  • Joined: Tue Apr 25, 2017 2:49 pm

Re: 10 bits Monitor on Davinci

Posted: Wed Aug 09, 2017 4:44 pm

Andrew Kolakowski wrote:Hmm... not 100% sure whether this monitor will accept e.g. 24p or 23.976p. It's a bit unclear.
....
I think it will work, but you may want to ask Dell support. It definitely supports HD 50i/60i and PAL/NTSC SD.
24p/23.976p/25p is probably covered under the 1080p entry.

This:

http://en.community.dell.com/support-fo ... t/19658478

suggests it should do 23.976p/24p/25p 1080.


Perfect man! Thanks one more time! :D

Craig Marshall

  • Posts: 949
  • Joined: Sun Apr 07, 2013 4:49 am
  • Location: Blue Mountains, Australia

Re: 10 bits Monitor on Davinci

Posted: Wed Aug 09, 2017 8:21 pm

Andrew Kolakowski wrote:DeckLink = Mini Monitor :)

All you need is a DeckLink Mini Monitor 4K, or even a DeckLink Mini Monitor (if you do just HD and no 50/60p HD).


The 'original' Mini Monitor was HD 4:2:2 at 3G speed, but the current Mini Monitor 4K is 6G, so it should deliver 4:4:4 over SDI at up to 30p. We opted for the faster 12G DeckLink 4K Pro card, which will deliver 4K at up to 60p to our PG2401PT monitor, calibrated to Rec.709, via a Blackmagic 'HD-Link' SDI to DisplayPort converter.

One handy feature of the DeckLink SDI Pro is that output #2 scales 4K to HD, so we can continue to grade on our calibrated HD monitor (with on-board 3D LUTs) while clients watch their material on a big 4K TV fed from SDI #1 via a BMD SDI to HDMI 2 converter. Accurate 4K SDI 'video' monitors are still extremely expensive...

It took some fiddling to get everything aligned, configured and calibrated, but we have not needed to upgrade our hardware for some time now; we can accurately monitor virtually anything that clients bring in, and we know that what we see is an accurate representation of what was shot, not the computer's impression of a video signal.

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: 10 bits Monitor on Davinci

Posted: Wed Aug 09, 2017 9:06 pm

Craig Marshall wrote:
Andrew Kolakowski wrote:DeckLink = Mini Monitor :)

All you need is a DeckLink Mini Monitor 4K, or even a DeckLink Mini Monitor (if you do just HD and no 50/60p HD).


We opted for the faster 12G DeckLink 4K Pro card which will deliver RGB 4:4:4 4K at up to 60p to our PG2401PT monitor, calibrated to Rec.709, via a Blackmagic 'HD-Link' SDI to DisplayPort converter.


How are you getting 4K 60p 4:4:4? That requires a dual 12G connection, and then it needs to be converted to DP 1.2+. I don't know of any monitor which supports dual 12G (or quad 6G).

The 12G DeckLink 4K Pro doesn't even support 4K 60p 4:4:4:
SDI Color Precision:
8-, 10-, 12-bit RGB 4:4:4 in all modes up to UHD 30p, and 8-, 10-bit YUV 4:2:2 in all modes

Has this changed with some recent firmware update?
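
Back-of-envelope numbers behind the dual-link point (active pixels only, ignoring blanking and SDI overhead, so the real requirement is even higher):

Code:
# UHD 60p RGB 4:4:4 10-bit vs one 12G-SDI link (~11.88 Gb/s line rate).
w, h, fps, bpp = 3840, 2160, 60, 3 * 10   # 10 bits per channel, 3 channels
payload_gbps = w * h * fps * bpp / 1e9
print(f"{payload_gbps:.2f} Gb/s payload vs 11.88 Gb/s per 12G-SDI link")
# -> 14.93 Gb/s payload vs 11.88 Gb/s per link, hence dual-link territory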

Craig Marshall

  • Posts: 949
  • Joined: Sun Apr 07, 2013 4:49 am
  • Location: Blue Mountains, Australia

Re: 10 bits Monitor on Davinci

Posted: Wed Aug 09, 2017 9:57 pm

Andrew Kolakowski wrote:...Has this changed with some recent firmware update?


There may have been some improvements to the spec, but thanks for clarifying this. We've only had 4K at 24/25p come in to date, so we have yet to see any 60p material.

Peter Cave

  • Posts: 3754
  • Joined: Thu Aug 23, 2012 6:45 am
  • Location: Melbourne, Australia

Re: 10 bits Monitor on Davinci

Posted: Thu Aug 10, 2017 12:24 am

Andrew Kolakowski wrote:
Peter Cave wrote:Final rendered output is YUV in ProRes, Avid codecs (DNxHD etc.), H.264 and most broadcast formats. RGB is rare as a delivery format. THAT is the reason a proper YUV monitoring path is the best and most accurate solution.


Don't assume that YUV and only YUV = a proper video chain - that is not the rule at all.
....


Incredibly detailed response that really misses the point. I don't need a lecture; I have been dealing with signal paths and digital systems at a technical level since we moved from analog to digital. I get a lot of requests to sort out monitoring accuracy in larger edit/grading suites because some IT guy thought the computer graphics card output was a more modern and more accurate way to do it. When their output files were technically rejected by broadcasters, reality suddenly hit home. For non-broadcast, computer-screen-destined output, this kind of accuracy is not absolutely necessary, but it certainly helps! It also helps to have proper scopes on the output, which is much harder to achieve using computer GPU outputs.
