HDR 'low' cost monitoring option?

Do you have questions about Desktop Video, Converters, Routers and Monitoring?

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

HDR 'low' cost monitoring option?

Posted: Tue Sep 07, 2021 6:20 pm

https://www.lg.com/global/business/oled-signage/lg-65ep5g#

with SDI and SFP and many post-related features.

£10K, possibly less if you buy directly from LG. No official list price.

JonPais

  • Posts: 441
  • Joined: Fri Mar 21, 2014 2:17 am

Re: HDR 'low' cost monitoring option?

Posted: Wed Sep 08, 2021 4:04 am

Andrew Kolakowski wrote:https://www.lg.com/global/business/oled-signage/lg-65ep5g#

with SDI and SFP and many post-related features.

£10K, possibly less if you buy directly from LG. No official list price.
How would that compare to FSI XM651U?
https://daejeonchronicles.com

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: HDR 'low' cost monitoring option?

Posted: Wed Sep 08, 2021 8:30 am

No idea, but I think it could be better (if it uses a newer panel).
FSI is not magic: they most likely use LG panels, so they will be bound to the same core limitations of those OLED panels. Just guessing, though.

FSI has a new model coming soon, but it's zone-based and not even 4K, only UHD resolution (and still only 31 inches). It clearly shows the lack of available panels suited for critical monitors. I don't think even 2,300 zones (still far short of the 8 million individual pixels in a UHD panel) is enough to minimise the "zoning" problem to the point where it's no longer problematic.

JonPais

  • Posts: 441
  • Joined: Fri Mar 21, 2014 2:17 am

Re: HDR 'low' cost monitoring option?

Posted: Wed Sep 08, 2021 10:01 pm

Andrew Kolakowski wrote:No idea, but I think it could be better (if it uses a newer panel).
FSI is not magic: they most likely use LG panels, so they will be bound to the same core limitations of those OLED panels. Just guessing, though.

FSI has a new model coming soon, but it's zone-based and not even 4K, only UHD resolution (and still only 31 inches). It clearly shows the lack of available panels suited for critical monitors. I don't think even 2,300 zones (still far short of the 8 million individual pixels in a UHD panel) is enough to minimise the "zoning" problem to the point where it's no longer problematic.
I was under the impression that we were talking about 65” OLED displays.

No magic required - using LG OLED panels, both Sony and Panasonic arguably make better televisions than LG themselves.

Whether or not you agree with that assessment, the point is that they're not limited to the same picture and feature set as LG, any more than manufacturers who purchase Sony sensors are restricted to the identical color reproduction, in-camera noise reduction, or countless other attributes of Sony MILCs. Even the same Sony-made sensor can be, and is, processed differently to yield different results in different camera bodies.

Returning to OLED displays - to take just one example - some of Sony’s more recent flagship OLED TVs reproduce richly saturated bright colors (for a WRGB panel) that uncannily resemble those of their BVM-HX310 reference monitor, while LG’s own displays reproduce those same colors as washed out.

“In my review of the Sony A90J, I questioned whether the company’s quad sub-pixel boosting technique which made brighter colors more saturated was still faithful to the artistic intent and today we have the answer. In this unforgettable scene from Mad Max Fury Road you can see that the Sony A90J came closer to matching the HX310 in terms of the roaring flames as well as the shiny reflections of the vehicles. Of course, the Sony HX310, which is capable of 1,000 nits full screen was still way out in front, but the A90J definitely fared better than the LG G1, which rendered the flames in a more muted manner owing to white sub-pixel dilution. Whatever processing Sony has implemented allowed the A90J to consistently outperform the LG G1 when it came to reproduction of bright saturated colors”. - HDTVTest

It's important to note that neither the LG nor the FSI OLEDs are suitable for use as HDR grading monitors.
https://daejeonchronicles.com

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: HDR 'low' cost monitoring option?

Posted: Thu Sep 09, 2021 10:43 am

FSI uses third-party panels that are on the market, and at the moment there really are not that many OLED panels (Sony uses LG panels for consumer TVs, no?). All manufacturers struggle with critical HDR TVs due to the lack of proper panels. Where is Panasonic's dual-layer MegaCon 55-inch panel, which was announced quite some time ago?
You can be Sony, Panasonic, etc., but when you use the same panel as others, there is only a limited amount you can do differently/better. One can look "better" than another, for sure. The core panel limitation will always be the same regardless of who makes the actual TV, though.

You can use calibrated consumer OLEDs for HDR grading; it all depends on what level of projects you work on. Everything has its price and compromises.

JonPais

  • Posts: 441
  • Joined: Fri Mar 21, 2014 2:17 am

Re: HDR 'low' cost monitoring option?

Posted: Thu Sep 09, 2021 11:51 am

Andrew Kolakowski wrote:FSI uses third-party panels that are on the market, and at the moment there really are not that many OLED panels (Sony uses LG panels for consumer TVs, no?). All manufacturers struggle with critical HDR TVs due to the lack of proper panels. Where is Panasonic's dual-layer MegaCon 55-inch panel, which was announced quite some time ago?
You can be Sony, Panasonic, etc., but when you use the same panel as others, there is only a limited amount you can do differently/better. The core panel limitation will always be the same regardless of who makes the actual TV.

You can use calibrated consumer OLEDs for HDR grading; it all depends on what level of projects you work on. Everything has its price and compromises.
First of all, there's a lot one can do when you've got brilliant engineers, as I just demonstrated.

LG, Panasonic and Sony all make beautiful HDR televisions, good enough to please the most discriminating viewer.

You are free to do as you want, but when it comes to HDR grading, consumer televisions are not acceptable for high-end production.
Last edited by JonPais on Thu Sep 09, 2021 11:54 am, edited 1 time in total.
https://daejeonchronicles.com

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: HDR 'low' cost monitoring option?

Posted: Thu Sep 09, 2021 11:53 am

No idea why so many assume that we only talk about "high-end productions" on this forum, when 95% of its users work in anything but that area :D

I worked for a high-end studio, and even there not everything was done on 30K reference monitors, or they would have gone bankrupt within a few months. As I said, every project has its budget, and it's basically always done with some compromises.
That is a separate matter anyway, as so many projects can be done on a calibrated consumer TV with a good end result.

Jack Fairley

  • Posts: 1863
  • Joined: Mon Oct 03, 2016 7:58 pm
  • Location: Los Angeles

Re: HDR 'low' cost monitoring option?

Posted: Thu Sep 09, 2021 8:27 pm

55" LG C9/X/1s are everywhere in post production. I haven't read any pro reviews of this TV, but it would be great if it's passable for HDR.
Ryzen 5800X3D
32GB DDR4-3600
RTX 3090
DeckLink 4K Extreme 12G
Resolve Studio 17.4.1
Windows 11 Pro 21H2

JonPais

  • Posts: 441
  • Joined: Fri Mar 21, 2014 2:17 am

Re: HDR 'low' cost monitoring option?

Posted: Fri Sep 10, 2021 12:44 am

Jack Fairley wrote:55" LG C9/X/1s are everywhere in post production. I haven't read any pro reviews of this TV, but it would be great if it's passable for HDR.
No consumer televisions are approved for Dolby Vision mastering. Consumer OLEDs are popular as client monitors and consumer reference displays only, not as HDR grading monitors. Neither the LG nor the FSI OLEDs discussed here is acceptable for HDR grading. The $10,000 LG OLED Pro 65EP5G is rated VESA DisplayHDR 400, the lowest ranking possible and not acceptable for grading by the remotest stretch of the imagination, and FSI does not recommend their $14,000 65” panel as an HDR reference mastering monitor. Industry professionals do not recommend consumer TVs for color grading HDR. Consumer televisions suffer from auto-dimming and tone mapping. WOLED televisions cannot reproduce bright saturated colors, typically above 400 nits. Most consumer OLED televisions cannot hit the 1,000-nit peak brightness minimum required for HDR mastering. On the other hand, WOLEDs are good for SDR grading.

Re: VESA DisplayHDR 400

"Our main concern lies at the lower entry-level end of the VESA scheme – the “DisplayHDR 400” standard. This we consider to be a very weak classification that has already encouraged shoddy, misleading marketing from display manufacturers. The whole point of the VESA certification scheme was to avoid misleading consumers about a screens HDR capability, but we feel that this entry level HDR 400 badge achieves the opposite. You will see it slapped on nearly every new display emerging, making consumers believe it offers an HDR experience (after all, it has the approved badge, right?), when in reality it doesn’t offer them anything meaningful or useful. In many cases the screen carries the badge, but lacks all the basic fundamental requirements that would constitute an HDR experience we talked about earlier!... The HDR 400 badge is in our opinion misleading, and we don’t feel it really delivers much beyond what can be achieved from most displays available in the market already – even before the advent of HDR. It’s not much more than a label to say the screen can accept an HDR input source, but can’t produce any meaningful HDR output from the display itself!... Labeling a screen as supporting HDR through this HDR 400 certification is meaningless and misleading we feel. We would love to see this level either dropped or revamped to a more useful standard if it is to continue to be used".
Last edited by JonPais on Fri Sep 10, 2021 2:02 am, edited 1 time in total.
https://daejeonchronicles.com

Jack Fairley

  • Posts: 1863
  • Joined: Mon Oct 03, 2016 7:58 pm
  • Location: Los Angeles

Re: HDR 'low' cost monitoring option?

Posted: Fri Sep 10, 2021 1:44 am

I didn't say they were everywhere and being used for HDR. DisplayHDR 400 should not even exist, and if that's all they got for this display, count me out.
Ryzen 5800X3D
32GB DDR4-3600
RTX 3090
DeckLink 4K Extreme 12G
Resolve Studio 17.4.1
Windows 11 Pro 21H2

JonPais

  • Posts: 441
  • Joined: Fri Mar 21, 2014 2:17 am

Re: HDR 'low' cost monitoring option?

Posted: Fri Sep 10, 2021 2:55 am

Hundreds of millions of consumers are carrying in their pockets mobile devices superior in picture quality to any television set in their homes and surpassing anything found on the sets of multimillion dollar productions. Both the iPhone 12 Pro and 12 Pro Max have staggering picture quality, DisplayMate summarizing the performance of the Pro Max as follows: "In most respects close to text book perfect calibration accuracy and performance that is visually indistinguishable from perfect."

Of the iPhone 12 Pro, Vincent Teoh of HDTVTest exclaimed, "I'm absolutely bowled over by its color accuracy, bearing in mind that this is a store-bought retail unit with no calibration controls at all beyond a brightness slider. Our iPhone 12 Pro sample reached 1,100 nits in peak brightness and 800 nits full screen with faithful tracking of ST2084 PQ EOTF standard and accurate reproduction of DCI P3 colors. Its HDR presentation wasn't too far off a 30,000 pound mastering monitor [the Sony BVM-X300] in a side-by-side comparison. Not only does this lend tremendous depth and impact to HDR movies when watched on the phone..."

I don't have a Sony BVM-HX310 lying around, but comparing the picture quality of the iPhone 12 Pro Max to my Calman calibrated LG CX, the difference is appreciable, in favor of the iPhone. It's not even close. The CX is fine for review purposes, unfit for professional grading. If someone says that the most they're prepared to spend on a display is $1,300, what they're really saying is they're not equipped to take on high-end work.

When it comes to professional applications, grade 1 reference monitors are the standard for color critical work.

https://tech.ebu.ch/docs/tech/tech3320.pdf
https://daejeonchronicles.com

Uli Plank

  • Posts: 21107
  • Joined: Fri Feb 08, 2013 2:48 am
  • Location: Germany and Indonesia

Re: HDR 'low' cost monitoring option?

Posted: Fri Sep 10, 2021 7:13 am

I second the notion about iPhones, and iPads too. Many professionals use an iPad for presentations to teams and clients because of their accuracy. I'm really hoping that Apple might aim for similar quality with the next generation of MBPs with micro-LED screens.
Maybe AI can help you. Or make you obsolete.

Studio 18.6.5, MacOS 13.6.5
MacBook M1 Pro, 16 GPU cores, 32 GB RAM and iPhone 15 Pro
Speed Editor, UltraStudio Monitor 3G, iMac 2017, eGPU

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: HDR 'low' cost monitoring option?

Posted: Fri Sep 10, 2021 10:47 am

Jack Fairley wrote:I didn't say they were everywhere and being used for HDR. DisplayHDR 400 should not even exist, and if that's all they got for this display, count me out.


VESA and their standards :lol:
They create them "on request".

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: HDR 'low' cost monitoring option?

Posted: Fri Sep 10, 2021 10:49 am

JonPais wrote:Hundreds of millions of consumers are carrying in their pockets mobile devices superior in picture quality to any television set in their homes and surpassing anything found on the sets of multimillion dollar productions. Both the iPhone 12 Pro and 12 Pro Max have staggering picture quality, DisplayMate summarizing the performance of the Pro Max as follows: "In most respects close to text book perfect calibration accuracy and performance that is visually indistinguishable from perfect."

Of the iPhone 12 Pro, Vincent Teoh of HDTVTest exclaimed, "I'm absolutely bowled over by its color accuracy, bearing in mind that this is a store-bought retail unit with no calibration controls at all beyond a brightness slider. Our iPhone 12 Pro sample reached 1,100 nits in peak brightness and 800 nits full screen with faithful tracking of ST2084 PQ EOTF standard and accurate reproduction of DCI P3 colors. Its HDR presentation wasn't too far off a 30,000 pound mastering monitor [the Sony BVM-X300] in a side-by-side comparison. Not only does this lend tremendous depth and impact to HDR movies when watched on the phone..."


Yes, but there is one key reason why they can be so good: screen size.
This is also the reason why the Sony BVM-X300 was only 31 inches and a bigger one was never made. Sony struggled with developing a bigger panel.

Steve Fishwick

  • Posts: 999
  • Joined: Wed Mar 11, 2015 11:35 am
  • Location: United Kingdom

Re: HDR 'low' cost monitoring option?

Posted: Sat Sep 11, 2021 1:02 pm

JonPais wrote:When it comes to professional applications, grade 1 reference monitors are the standard for color critical work.

https://tech.ebu.ch/docs/tech/tech3320.pdf


A very good point, but I should say that very few monitors even in SDR meet those requirements. I use FSI DM240s at work for onlining a UK broadcast show, and strictly speaking they don't meet the contrast-ratio specs of a grade 1 monitor, but they are fantastic and very colour-accurate for SDR, and they are used everywhere here for the same. Even a monitor without a tally lamp is, strictly speaking, not grade 1, which is a bit silly for finishing suites.

However, even a modest reference monitor like the Flanders is about 4K, and the reason is colour accuracy: they always calibrate identically, right on the button. A reference monitor is not there to show a beautiful picture but an accurate one, which may not always be pleasant. I am sure the Apple screens show a much more beautiful picture, but at their price point they cannot be any more accurately calibratable, if that's even a word.

When it comes to HDR there is a significant jump in price, and sadly there is nothing 'low cost' about good, accurate ones yet. We have the LGs too, and whilst they are great for consumer HDR and will calibrate well for SDR, they are still nowhere near as accurate or stable as the Flanders, even for SDR-only use.

JonPais

  • Posts: 441
  • Joined: Fri Mar 21, 2014 2:17 am

Re: HDR 'low' cost monitoring option?

Posted: Sat Sep 11, 2021 2:26 pm

Steve Fishwick wrote:
JonPais wrote:When it comes to professional applications, grade 1 reference monitors are the standard for color critical work.

https://tech.ebu.ch/docs/tech/tech3320.pdf


A very good point, but I should say that very few monitors even in SDR meet those requirements. I use FSI DM240s at work for onlining a UK broadcast show, and strictly speaking they don't meet the contrast-ratio specs of a grade 1 monitor, but they are fantastic and very colour-accurate for SDR, and they are used everywhere here for the same. Even a monitor without a tally lamp is, strictly speaking, not grade 1, which is a bit silly for finishing suites.

However, even a modest reference monitor like the Flanders is about 4K, and the reason is colour accuracy: they always calibrate identically, right on the button. A reference monitor is not there to show a beautiful picture but an accurate one, which may not always be pleasant. I am sure the Apple screens show a much more beautiful picture, but at their price point they cannot be any more accurately calibratable, if that's even a word.

When it comes to HDR there is a significant jump in price, and sadly there is nothing 'low cost' about good, accurate ones yet. We have the LGs too, and whilst they are great for consumer HDR and will calibrate well for SDR, they are still nowhere near as accurate or stable as the Flanders, even for SDR-only use.
You're right about the purpose of a grade 1 monitor, which the document I linked to refers to as a ‘measuring instrument’ for the visual evaluation of image quality. We want to see the unadorned signal, not a bunch of enhancements.

Which is why, prior to calibrating an LG OLED television, it's necessary to disable no fewer than a dozen [!] different settings, including: 1) Energy Saving, 2) OLED Panel Settings > Screen Shift, 3) OLED Panel Settings > Logo Luminance Adjustment, 4) Expert Settings > Dynamic Contrast, 5) Expert Settings > Super Resolution, 6) Additional Settings > Eye Comfort Mode, 7) Picture Options > Noise Reduction, 8) Picture Options > MPEG Noise Reduction, 9) Picture Options > Real Cinema, 10) Picture Options > Motion Eye Care, 11) Picture Options > TruMotion, plus a couple of auto-dimming features in the service menu.

And even after going through most of that and calibrating my LG CX, the TV updated itself automatically overnight, and a dreadful app launcher suddenly appeared every time I was about to begin grading, until I finally figured out how to disable it!

Few, if any, HDR reference monitors are able to display the full color gamut or the 10,000 nits specified in ITU-R BT.2100 PQ, which is why two categories were created, 1A HDR and 1B HDR, with the expectation that the latter would be withdrawn at some future date.

As the LG OLED Pro is rated VESA DisplayHDR 400, that would place it below even a grade 3 monitor (≥500 cd/m²), and it is not suitable for color correction. Without knowing much about either the LG or the FSI, I'm still pretty certain that the Flanders is the better of the two.

While I guess it's not possible to calibrate the iPhone 12, it's evident that color accuracy is paramount. A couple of weeks ago, I listened to the fourth quarter earnings conference call of Pixelworks, a Silicon Valley company that among other things, specializes in mobile technologies, in which they identify color calibration as one of the major trends gaining momentum, explaining that "as mobile OEMs increasingly shift away from LCD panels and the majority of next-generation models feature high resolution high color OLED panels, customers are actively looking to differentiate these OLED displays from the competition. This trend is driving many OEMs to pay closer attention to color accuracy, which can only be achieved through color calibration of each individual handset display, a traditionally tedious, time-consuming and costly process that all but a few OEMs have historically chosen to forego. This represents a meaningful and growing opportunity for PixelWorks..."

I think OLED smartphone models now number in the hundreds, which is pretty crazy. I bought mine nine months ago just for use as a consumer reference display, since that's how most people watching my channel would be viewing my work anyhow; but since shooting and uploading a Dolby Vision video to Vimeo this afternoon, I just might start using it as a second camera from here on out. The quality is insane.
https://daejeonchronicles.com

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: HDR 'low' cost monitoring option?

Posted: Sat Sep 11, 2021 10:41 pm

1A is basically theory for now. Nothing (at a realistic price that you could put into your studio) will be able to do 10K nits with full Rec.2020 coverage for quite some time.
Also note that in that spec, peak brightness is always measured on just a 1% window of the screen (I assume you will never have a 10K-nit area covering the full screen, or even, e.g., half of it).
I'm not sure whether the fact that a monitor can do 1,000 nits full-screen is actually that important (compared to some other features). I would say it would be far more important/impactful for a monitor to do 2K nits over 20% of the screen (or, e.g., to have 'perfect' uniformity).

JonPais

  • Posts: 441
  • Joined: Fri Mar 21, 2014 2:17 am

Re: HDR 'low' cost monitoring option?

Posted: Sun Sep 12, 2021 1:46 am

Andrew Kolakowski wrote:1A is basically theory for now. Nothing (at a realistic price that you could put into your studio) will be able to do 10K nits with full Rec.2020 coverage for quite some time.
Also note that in that spec, peak brightness is always measured on just a 1% window of the screen (I assume you will never have a 10K-nit area covering the full screen, or even, e.g., half of it).
I'm not sure whether the fact that a monitor can do 1,000 nits full-screen is actually that important (compared to some other features). I would say it would be far more important/impactful for a monitor to do 2K nits over 20% of the screen (or, e.g., to have 'perfect' uniformity).
You're absolutely right - no monitor will be doing 10,000 nits for several years - but FSI manufactures the XM312U, a dizzying 5,000-nit mastering monitor with 2,300 individually controlled LED backlight zones, priced at $22,000.

Keep in mind, though, that with PQ HDR, anything above 100-200 nits is reserved for specular highlight detail only - reflections off the chrome of an automobile, fire, lightbulb filaments, backlit clouds, etc. After all, HDR is not about increasing the brightness of the entire picture, but about an increased luminance range. Which is why it shouldn't be necessary to reach for the remote to adjust brightness when switching between SDR and HDR content on TVs.

The average picture level (APL) of HDR PQ content should not, for the most part, depart radically from that of SDR content; in fact, much HDR content actually has a lower APL than the SDR version. The sample video I posted in my Vimeo Dolby Vision topic, of some stuff scattered on my desk, is a good example of how not to expose HDR: watched in a darkened room on an iPhone 12 Pro Max, you can feel your pupils contract because it is too intense, which is fine for very bright scenes that call for it, but not so cool for an entire clip, let alone an entire film.

Interestingly, when it comes to SDR monitors, grade 3 displays are actually required to be brighter than grade 1 monitors:

Grade 1 Monitor:
70 to at least 100 cd/m2.
Grade 2 Monitor:
70 to at least 200 cd/m2.
Grade 3 Monitor:
70 to 250 cd/m2 or to 400 cd/m2 in adverse conditions.

Fun fact: 100 nits already sits at roughly 50% of the PQ signal range, whose peak represents 10,000 nits, and to the human visual system (HVS), the jump from 5,000 nits to 10,000 nits is much smaller than the leap from SDR 100 nits to HDR 1,000 nits.
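That fun fact can be checked directly against the SMPTE ST 2084 (PQ) inverse EOTF, whose constants are published in the standard. Here is a minimal Python sketch (the function name pq_encode is mine, not from the spec):

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance (nits) -> signal level 0..1.
# The five constants below are the exact rational values from ST 2084.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits: float) -> float:
    """Map display luminance in cd/m2 (0..10000) to a normalized PQ code value."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for nits in (100, 1000, 5000, 10000):
    print(f"{nits:>5} nits -> PQ signal {pq_encode(nits):.3f}")
```

On this scale 100 nits lands near 0.51, i.e. roughly half the PQ signal range, while the entire step from 5,000 to 10,000 nits occupies only about the top 0.07 of it.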



Picture from LightSpace.

“Technical details aside, the most important thing to understand about HDR is that it doesn’t represent an enhancement as much as the removal of an artificial limitation. In the realm of human vision and physical light, high dynamic range is a default condition, not an added gimmick”. – Cullen Kelly

700 nits is certainly enough to appreciate the enormous advantage HDR has over SDR. The real reason most HDR content is not as impressive as it should be is not that consumer TVs can't reach 10,000 nits, but that the overwhelming majority of shows continue to be lit in an SDR environment and monitored in SDR, so the very first time anyone sees the footage on an HDR display is in the grading suite.

“To date, most HDR content has been an emulation of the existing SDR experience. This is a new aesthetic and I’m intrigued when filmmakers will embrace that aesthetic rather than try to just reproduce the existing low-con films onto the HDR format.” – Peter Doyle, Supervising Colorist at Technicolor (2016)

And it doesn’t end there. While colorists do have access to the tools to see HDR, that alone is no guarantee that the master file will end up preserving all the dynamic range, tonality and color envisioned by the director and the DP. For example, the decision in post-production to constrain the levels in the HDR pass to maintain consistency with the SDR version (for reasons both technical and aesthetic) can prevent HDR from taking wing. This is super common in the industry.

Not infrequently, a project gets the green light for an HDR master after the fact; both the post-production house and producer preemptively rule out a version that dramatically departs from the SDR version; the result being that HDR turns out to be little more than a marketing gimmick.

Another seldom discussed issue is that colorists are resorting to compromising their HDR grades in order to avoid judder artifacts.

HDR means not only lighting differently, but also (1) moving the camera differently, (2) framing differently, (3) exposing differently and (4) cutting differently. Panning more slowly for HDR to avoid judder artifacts. The 7-second rule (based on traditional theatrical viewing at 24 fps with a 180° shutter angle) no longer applies. Framing differently to either include or exclude intense light in the frame. Exposing to take the most advantage of the sensor's dynamic range. Cutting differently, because juxtaposing bright and dark scenes in HDR is much different from SDR, where the overall brightness differences are insignificant. The editing suite is one place where a consumer OLED TV can actually do some good.

It’s been five years since Peter Doyle challenged filmmakers to begin creating films expressly for HDR, but the industry still lags behind.
https://daejeonchronicles.com

Steve Fishwick

  • Posts: 999
  • Joined: Wed Mar 11, 2015 11:35 am
  • Location: United Kingdom

Re: HDR 'low' cost monitoring option?

Posted: Sun Sep 12, 2021 9:17 am

JonPais wrote:Interestingly, when it comes to SDR monitors, grade 3 displays are actually required to be brighter than grade 1 monitors:

Grade 1 Monitor:
70 to at least 100 cd/m2.
Grade 2 Monitor:
70 to at least 200 cd/m2.
Grade 3 Monitor:
70 to 250 cd/m2 or to 400 cd/m2 in adverse conditions.


The reason Grade 2/3 monitors are specified at higher nit values, Jon, is their intended use in higher ambient-light situations, e.g. galleries and OB trucks, or the offices of execs/producers for review. Finishing suites should be calibrated for very low ambient light; quite often they are not. I like to arrange a low-level, colour-temperature-correct backlight in mine if possible, when the suite isn't a dedicated, engineered installation.

Grade 2 monitors are usually used on set, in galleries and in OB trucks, where a fair deal of confidence is required but not necessarily colour-critical accuracy. The purpose of grade 3 monitors is to replicate the consumer experience (usually around 200 nits SDR) to some extent and to be fairly reliable for the aforementioned producer review.

In the case of HDR, there are very few, if any, consumer monitors that can achieve the nit levels of colour-critical reference monitors.

JonPais

  • Posts: 441
  • Joined: Fri Mar 21, 2014 2:17 am

Re: HDR 'low' cost monitoring option?

Posted: Sun Sep 12, 2021 9:21 am

Steve Fishwick wrote:
JonPais wrote:Interestingly, when it comes to SDR monitors, grade 3 displays are actually required to be brighter than grade 1 monitors:

Grade 1 Monitor:
70 to at least 100 cd/m2.
Grade 2 Monitor:
70 to at least 200 cd/m2.
Grade 3 Monitor:
70 to 250 cd/m2 or to 400 cd/m2 in adverse conditions.


The reason Grade 2/3 monitors are specified at higher nit values, Jon, is their intended use in higher ambient-light situations, e.g. galleries and OB trucks, or the offices of execs/producers for review. Finishing suites should be calibrated for very low ambient light; quite often they are not. I like to arrange a low-level, colour-temperature-correct backlight in mine if possible, when the suite isn't a dedicated, engineered installation.

Grade 2 monitors are usually used on set, in galleries and in OB trucks, where a fair deal of confidence is required but not necessarily colour-critical accuracy. The purpose of grade 3 monitors is to replicate the consumer experience (usually around 200 nits SDR) to some extent and to be fairly reliable for the aforementioned producer review.

In the case of HDR, there are very few, if any, consumer monitors that can achieve the nit levels of colour-critical reference monitors.
Thanks for that!
https://daejeonchronicles.com

JonPais

  • Posts: 441
  • Joined: Fri Mar 21, 2014 2:17 am

Re: HDR 'low' cost monitoring option?

Posted: Sun Sep 12, 2021 12:35 pm

I was mistaken. Sony already built an 8K 10,000 nit HDR display.

https://www.avsforum.com/threads/best-o ... 998/page-2
https://daejeonchronicles.com

Marc Wielage

  • Posts: 10852
  • Joined: Fri Oct 18, 2013 2:46 am
  • Location: Hollywood, USA

Re: HDR 'low' cost monitoring option?

Posted: Mon Sep 13, 2021 12:46 am

JonPais wrote:I was mistaken. Sony already built an 8K 10,000 nit HDR display.

https://www.avsforum.com/threads/best-o ... 998/page-2

What I call "retina-searing levels." :o
marc wielage, csi • VP/color & workflow • chroma | hollywood

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: HDR 'low' cost monitoring option?

Posted: Mon Sep 13, 2021 4:04 pm

JonPais wrote:You're absolutely right - no monitor will be doing 10,000 nits for several years - but FSI manufactures the XM312U, a dizzying 5,000-nit mastering monitor with 2,300 individually controlled LED backlight zones, priced at $22,000.


Sounds like a very low number compared to the existing display in the 12.9-inch iPad Pro, which has over 10,000 mini-LEDs grouped to create a total of 2,596 local dimming zones. The FSI is 31 inches. I assume the same number of zones works much better on a smaller screen? Or maybe that's not true, and the number of zones should be correlated just with screen resolution?
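The question can be put into numbers. Using the zone counts and resolutions mentioned in this thread (31" UHD for the FSI XM312U, 2732x2048 for the 12.9" iPad Pro) and deriving the physical screen size from the diagonal and pixel aspect ratio, a quick sketch (the panel_stats helper is mine) compares pixels per zone with zones per square inch:

```python
import math

def panel_stats(diag_in, px_w, px_h, zones):
    """Return (pixels per dimming zone, zones per square inch of screen)."""
    aspect = px_w / px_h
    height = diag_in / math.sqrt(aspect ** 2 + 1)  # physical height in inches
    area = aspect * height * height                 # width * height
    return px_w * px_h / zones, zones / area

# Figures from the thread; physical sizes derived, not manufacturer-published.
fsi = panel_stats(31.0, 3840, 2160, 2300)    # FSI XM312U
ipad = panel_stats(12.9, 2732, 2048, 2596)   # 12.9" iPad Pro

print(f"FSI XM312U:    {fsi[0]:,.0f} px/zone, {fsi[1]:.1f} zones/sq in")
print(f"iPad Pro 12.9: {ipad[0]:,.0f} px/zone, {ipad[1]:.1f} zones/sq in")
```

Relative to resolution the two are in the same ballpark (roughly 3,600 vs 2,200 pixels per zone), but physically the iPad's zones are nearly six times denser, so each zone covers a much smaller patch of screen. Which of those ratios matters more for visible blooming presumably also depends on viewing distance.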
Offline

JonPais

  • Posts: 441
  • Joined: Fri Mar 21, 2014 2:17 am

Re: HDR 'low' cost monitoring option?

PostMon Sep 13, 2021 4:55 pm

Andrew Kolakowski wrote:
JonPais wrote:You're absolutely right - no monitor will be doing 10,000 nits for several years - but FSI manufactures the XM312U, a dizzying 5,000 nit mastering monitor with 2,300 individually controlled LED backlight zones, priced at $22,000.


That sounds like a very low number compared to the existing display in the 12.9-inch iPad Pro, which has over 10,000 mini-LEDs grouped to create a total of 2,596 local dimming zones. The FSI is 31 inches. I assume the same number of zones works much better on a smaller screen? Or maybe that's not true, and the number of zones should simply be correlated with screen resolution?
Apparently, some users are noticing more blooming on the 12.9-inch iPad Pro's Liquid Retina XDR mini-LED display than expected.

[Attachment: screenshot of iPad Pro blooming reports]


Below, there's a side-by-side of the FSI XM310K (2,000 zones) (L), the Apple Pro Display XDR (576 zones) (M) and the FSI XM311K (dual panel) (R). The dual-panel FSI XM311K fares the best of the bunch.

And here's an article about the iPad Pro blooming.
https://www.macrumors.com/2021/05/24/us ... r-display/

Of course, blooming is not noticeable all the time, and bias lighting helps, I guess.

The FSI XM312U qualifies as a Dolby Vision mastering monitor.
[Attachment: side-by-side comparison of the FSI XM310K, Apple Pro Display XDR and FSI XM311K]
https://daejeonchronicles.com
Offline

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: HDR 'low' cost monitoring option?

PostMon Sep 13, 2021 5:03 pm

Yes, it looks bad on the iPad - so maybe size doesn't matter, only the zones-to-resolution ratio does. There's probably one more variable: refresh rate. If the iPad Pro runs at a very high refresh rate, that's probably having a big impact as well. I wonder what native 24p would look like.

Blooming is still quite visible on the FSI, and that model had around 2,000 zones as well.
The new one has much higher peak brightness, so I expect similar blooming (newer technology and slightly more zones, set against a much higher peak).
FSI had a setting for minimising blooming, but it comes at the cost of other parameters (so in reality it's cheating), and that sounds like too big a compromise in a reference monitor.
I would say the zoning approach should not be used in reference monitors, but then we won't have any new reference monitors at all :D
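To put the zones-vs-size-vs-resolution question into numbers, here is a rough back-of-the-envelope sketch. The resolutions and diagonals are the published panel specs; the derived areas assume exact aspect ratios, so treat the results as indicative only:

```python
# Pixels-per-zone and zones-per-square-inch for the two displays discussed.
# Published specs; back-of-the-envelope arithmetic, not measurements.

displays = {
    # name: (h_px, v_px, diagonal_in, dimming_zones)
    "iPad Pro 12.9 XDR": (2732, 2048, 12.9, 2596),
    "FSI XM312U":        (3840, 2160, 31.0, 2300),
}

for name, (w, h, diag, zones) in displays.items():
    px_per_zone = w * h / zones
    # Screen area from diagonal + aspect ratio (Pythagoras).
    aspect = w / h
    height_in = diag / (1 + aspect ** 2) ** 0.5
    area_in2 = (aspect * height_in) * height_in
    print(f"{name}: {px_per_zone:,.0f} px/zone, "
          f"{zones / area_in2:.1f} zones/in^2")
```

By this arithmetic the iPad packs roughly six times more zones per square inch (about 32 vs about 6) and has fewer pixels per zone (~2,200 vs ~3,600) than the 31-inch FSI - consistent with the idea that blooming depends on both physical zone density and the zones-to-pixels ratio.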
Last edited by Andrew Kolakowski on Mon Sep 13, 2021 5:11 pm, edited 1 time in total.
Offline

JonPais

  • Posts: 441
  • Joined: Fri Mar 21, 2014 2:17 am

Re: HDR 'low' cost monitoring option?

PostMon Sep 13, 2021 5:11 pm

My Asus had 1,152 zones. You can see the blooming here.

https://www.google.com/amp/s/daejeonchr ... oming/amp/

I only noticed it off axis with bright text on black background.
https://daejeonchronicles.com
Offline

Andrew Kolakowski

  • Posts: 9206
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: HDR 'low' cost monitoring option?

PostMon Sep 13, 2021 5:14 pm

Also, off-angle viewing makes things way worse.

All of this sounds 'manageable', but it's certainly not a nice thing in a $20K+ reference monitor.
Funnily enough, the old Dolby SDR monitor had these issues as well, but almost no one talked about them :D
It was not a perfect monitor by any means, but nothing better was available, so (again) you had no choice except to treat it as reference.
Offline
User avatar

amiroxx

  • Posts: 29
  • Joined: Thu Nov 22, 2012 1:54 pm
  • Location: Deutschland
  • Real Name: Michael Radeck

Re: HDR 'low' cost monitoring option?

PostFri Sep 24, 2021 1:09 pm

JonPais wrote:
I don't have a Sony BVM-HX310 lying around, but comparing the picture quality of the iPhone 12 Pro Max to my Calman calibrated LG CX, the difference is appreciable, in favor of the iPhone. It's not even close. The CX is fine for review purposes, unfit for professional grading. If someone says that the most they're prepared to spend on a display is $1,300, what they're really saying is they're not equipped to take on high-end work.

When it comes to professional applications, grade 1 reference monitors are the standard for color critical work.

https://tech.ebu.ch/docs/tech/tech3320.pdf


Well, which probe do you use to calibrate your LG OLED?
HP Z8G4 win11pro 64bit- Quadro RTX A4000 GUI, Geforce RTX 3080 - Decklink 4k extreme 12g
m2max laptop osx 14.1.2
18.6.4
Offline

Steve Fishwick

  • Posts: 999
  • Joined: Wed Mar 11, 2015 11:35 am
  • Location: United Kingdom

Re: HDR 'low' cost monitoring option?

PostSat Sep 25, 2021 8:37 am

amiroxx wrote:Well, which probe do you use to calibrate your LG OLED?


The X-Rite i1Display Pro OEM colorimeter is a very common and accurate probe, short of the Kleins, which cost thousands. Our chief engineer uses a specific one from FSI that has been calibrated with offsets against a very expensive reference spectroradiometer:

https://www.shopfsi.eu/shop/product/i1d ... ategory=34
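For anyone wondering what "calibrated with offsets against a reference spectroradiometer" means in practice, the usual approach is to fit a small correction matrix so the colorimeter's XYZ readings best match the spectro's on a set of test patches. A simplified sketch of that idea - all readings below are made-up illustrative numbers, not real probe data:

```python
# Fit a 3x3 correction matrix mapping a colorimeter's XYZ readings onto a
# reference spectroradiometer's readings of the same patches, then apply it.
# Patch values are hypothetical, chosen only to illustrate the technique.
import numpy as np

# XYZ readings of the same four patches from each instrument (rows: R, G, B, W).
probe_xyz   = np.array([[41.2,  21.3,   1.9],
                        [35.5,  71.6,  11.4],
                        [18.1,   7.3,  95.0],
                        [95.0, 100.2, 108.1]])
spectro_xyz = np.array([[41.0,  21.0,   2.0],
                        [35.8,  71.5,  11.9],
                        [18.0,   7.2,  95.1],
                        [95.1, 100.0, 108.9]])

# Least-squares fit: find M such that probe_xyz @ M ~ spectro_xyz.
M, *_ = np.linalg.lstsq(probe_xyz, spectro_xyz, rcond=None)

def correct(reading):
    """Apply the fitted correction to a raw probe XYZ reading."""
    return np.asarray(reading) @ M

print(np.round(correct(probe_xyz[3]), 1))  # corrected white-patch reading
```

Since the probe here reads close to the reference, the fitted matrix lands near identity with small off-diagonal offsets - which is essentially what a per-unit "offset-calibrated" probe ships with.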
Offline

JonPais

  • Posts: 441
  • Joined: Fri Mar 21, 2014 2:17 am

Re: HDR 'low' cost monitoring option?

PostSat Sep 25, 2021 1:17 pm

"Rec.2020 was 72% but more impressively, Apple has nailed the DCI-P3 color tracking within the Rec.2020 container, putting the vast majority of consumer TVs to shame, even after calibration". - Comparison of iPhone 12 Pro to Sony BVM-X300 reference monitor, Vincent Teoh, HDTVTest

Vincent uses a £5,100 Klein K10-A colorimeter and CalMAN 5 Ultimate calibration software, whose license runs US$2,999.
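As a rough sanity check of that 72% figure: the area ratio of the DCI-P3 and Rec.2020 primary triangles in CIE 1931 xy chromaticity comes out very close to it. (Gamut coverage is often computed in other colour spaces such as CIE 1976 u'v', so treat this as indicative rather than a formal coverage metric.)

```python
# Area of the DCI-P3 primaries' triangle relative to the Rec.2020 triangle
# in CIE 1931 xy chromaticity. P3 lies entirely inside Rec.2020, so the
# area ratio doubles as a rough coverage estimate.

REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # R, G, B primaries
DCI_P3  = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

def triangle_area(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

coverage = triangle_area(DCI_P3) / triangle_area(REC2020)
print(f"P3 / Rec.2020 area: {coverage:.1%}")   # ~71.7%
```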
https://daejeonchronicles.com
