Lightroom: Add 10-bit/channel color display support

  • 48
  • Idea
  • Updated 9 months ago
  • (Edited)
Lightroom: Add 10-bit support for an end-to-end 10-bit workflow. That is, Application -> Driver -> Graphics Card -> Monitor for true 10-bit color. At the moment, all that's missing is the application. CS5 supports 10-bit color, so why not Lightroom?
Photo of beeawwb

beeawwb

  • 2 Posts
  • 1 Reply Like

Posted 8 years ago

  • 48
Photo of Lee Jay

Lee Jay

  • 994 Posts
  • 137 Reply Likes
"At the moment, all that's missing is the application."

There are thousands of feature requests, so obviously that isn't all.
Photo of David Jensen

David Jensen

  • 131 Posts
  • 46 Reply Likes
He means that the driver, graphics card, and monitor support 10-bit color. Only the app is missing 10-bit support.

OS X doesn't support 10 bit, though. Windows does.
Photo of Chris Cox

Chris Cox

  • 20280 Posts
  • 846 Reply Likes
Also, we can only support 10 bit/channel through OpenGL - which requires a compatible card and driver version.
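For context, here is a rough sketch of what "10 bit/channel through OpenGL" means at the application level: the app has to explicitly request a 10-bit-per-channel framebuffer, and the driver decides whether to honor it. This uses the third-party Python glfw bindings purely for illustration; it is an assumption-laden sketch, not Lightroom code.

import glfw

# Sketch only: request a 10-bit-per-channel (30-bit) framebuffer from the driver.
# Whether this is honored depends on the GPU, driver, and display; consumer
# drivers may silently fall back to 8 bits per channel.
if not glfw.init():
    raise RuntimeError("GLFW failed to initialize")

glfw.window_hint(glfw.RED_BITS, 10)
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.ALPHA_BITS, 2)

window = glfw.create_window(640, 480, "10-bit framebuffer request", None, None)
if window:
    glfw.make_context_current(window)
    # The application would render here; the driver determines the actual
    # bit depth of the buffer it handed back.
    glfw.destroy_window(window)
glfw.terminate()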
Photo of beeawwb

beeawwb

  • 2 Posts
  • 1 Reply Like
Correct, yes, the compatible card and driver are required, as well as a monitor.

My point is really:
If a user has invested in a 10-bit monitor and a 10-bit compatible card / driver version (e.g. FirePro from ATI), they should be able to get applications that support this.

The prime example is that CS5 does support 10-bit (if all the other pieces are in place) through OpenGL, so as a related product it would be great if Lightroom did too.
Photo of Brianna Byman

Brianna Byman

  • 1 Post
  • 0 Reply Likes
DSLRs are increasingly pushing the 8-bit envelope. Why would Adobe be content letting their software limit my photography?
Photo of Chris Cox

Chris Cox

  • 20280 Posts
  • 842 Reply Likes
The bit depth of the data doesn't matter here -- Lightroom already processes images in the depth of the camera or 32 bit floating point format. This topic is about supporting newer video cards and displays that can use 10 bit/channel framebuffers.
Photo of LRuserXY

LRuserXY

  • 426 Posts
  • 41 Reply Likes
Furthermore, bit depth of cameras (raw) and monitor output are not comparable at all: Camera bit depth refers to a linear tonal scale (gamma = 1) and monitor output to a non-linear scale (mostly gamma = 2.2).

Existing 8 bit output resolution will not really limit your photography (e.g. in terms of dynamic range). The advantage of 10 Bit is "only" that it gives you more tonal resolution on the display. For critical photos, this can improve display accuracy by reducing banding/posterization in regions with subtle tonal or color changes. So the feature is of course desirable, but not essential.
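As a rough numeric illustration of that gamma-versus-linear point (a plain NumPy sketch, not anything from Lightroom): with gamma 2.2, the 256 display codes are spaced far more finely in the shadows than a linear scale would be, which is why camera bit depth and display bit depth measure different things.

import numpy as np

codes = np.arange(256) / 255.0          # 8-bit display code values, 0..1
luminance = codes ** 2.2                # relative light output per code (gamma 2.2)
print(luminance[1] - luminance[0])      # shadow step: ~5e-6 of full scale
print(luminance[255] - luminance[254])  # highlight step: ~9e-3 of full scale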
Photo of LRuserXY

LRuserXY

  • 426 Posts
  • 41 Reply Likes
Additional question: Does LR use dithering in the develop module? I just loaded the 10-bit test ramp from http://www.imagescience.com.au/kb/que... into Photoshop and LR (8-bit output). In Photoshop I see the 8-bit banding, and also in the LR loupe (because previews are 8-bit JPEGs). But in LR develop, there is no visible banding!

This would actually be great, because it is the next best thing to 10-bit support.

Edit: Yes, LR develop really seems to use dithering. Here is a small portion of the ramp, with extremely amplified contrast (screenshot from the develop module, contrast amplified in PS):

Photo of Dorin Nicolaescu-Musteață

Dorin Nicolaescu-Musteață, Champion

  • 703 Posts
  • 39 Reply Likes
Yes, Develop and Export (to 8 bpp) both dither.
http://forums.adobe.com/message/3655226
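To illustrate what dithering buys at 8 bpp, here is a generic sketch (plain NumPy; Lightroom's actual dither algorithm isn't documented here, so this is only the textbook idea):

import numpy as np

ramp = np.linspace(0.0, 1.0, 4096)                  # smooth 10-bit-like gradient

plain = np.round(ramp * 255) / 255                  # straight 8-bit quantization
noise = (np.random.rand(ramp.size) - 0.5) / 255     # +/- half an 8-bit step
dithered = np.round(np.clip(ramp + noise, 0, 1) * 255) / 255

# The plain version has long flat runs (visible bands); the dithered version
# replaces them with fine noise whose local average tracks the original ramp.
print(np.max(np.diff(np.flatnonzero(np.diff(plain)))))  # longest flat run, plain
print(np.abs(dithered - ramp).max())                    # dither error stays within ~1 LSB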
Photo of LRuserXY

LRuserXY

  • 426 Posts
  • 41 Reply Likes
Thanks... yes, you already mentioned that here, too: http://feedback.photoshop.com/photosh... ... But I seem to have overlooked it, or simply forgotten ;-)

So... is it safe to assume that because of this, LR does not really need 10-bit output? I mean, if the dithering works so well on artificial images, it should work nearly perfectly on most real-world photos. The only case where 10-bit would have an advantage is when there are subtle tonal *patterns* (e.g. fabrics, etc.) of small size (a few pixels), which cannot be reproduced using dithering. But I doubt that such patterns can really be seen even with 10-bit.
Photo of Lyndon Heap

Lyndon Heap

  • 6 Posts
  • 2 Reply Likes
This reply was created from a merged topic originally titled
Lightroom: 10 bit monitor support.


I fully understand that you can't give Lightroom too many of the attributes that Photoshop has, but a 10-bit monitor is just superb [using PS CC] and, dare I say, the de facto standard for still digital image editing. It would be really nice to see Lightroom 6 having the same facility. It would also give the two applications greater synergy when viewing/editing.
Photo of Giorgia Battecca

Giorgia Battecca

  • 1 Post
  • 2 Reply Likes
I am a very satisfied Lightroom user, but the lack of 30-bit color display capability is a major shortcoming for pro-grade software.
I hope this feature will be added soon.
Photo of ali fatemi

ali fatemi

  • 6 Posts
  • 0 Reply Likes
This reply was created from a merged topic originally titled Lightroom: 10-bit display support.

We photographers need Lightroom to support 10-bit display output, so that our advanced displays (models from Eizo or NEC, for example) can produce their full 10-bit, billion-color depth.
Photo of iain malcolm

iain malcolm

  • 2 Posts
  • 0 Reply Likes
Having just tested 10-bit support in video media with a good - but not ridiculous - rig and seen the difference it makes, I am very keen to get 10-bit support in ALL my photo apps.

Windows 10 - NVIDIA 980 Ti - Dell U2713H. I know this monitor is doing slightly fake 10-bit (as many laptops do fake 8-bit), but the improvement is significant.
Photo of Phil Burton

Phil Burton

  • 44 Posts
  • 9 Reply Likes
My Nikon D3 produces RAWs with 14-bit color depth.  With 8-bit processing in LR, I'm throwing away 6 bits of color depth smoothness and gradation.  As others have pointed out, Photoshop has 10-bit color support, so it's not something that Adobe can't do. And nowadays, both Windows and OS X support 10-bit color.
Photo of Steve Sprengel

Steve Sprengel, Champion

  • 2675 Posts
  • 348 Reply Likes
I'm pretty sure that LR uses a 16-bit internal workspace, while the display is only 8 bits, so the term "8-bit processing" isn't quite right.
Photo of Simon Chen

Simon Chen, Principal Computer Scientist

  • 1673 Posts
  • 575 Reply Likes
Official Response
Steve is correct.

On Mac: Lightroom has always used 16-bit math throughout the develop module (with ProPhotoRGB as the internal working color space). With GPU acceleration, this requires Lr 6.2 or later.

On Windows: It also uses the same 16-bit ProPhotoRGB math for internal image renders. But the resulting image data of the final monitor color transform (for display) is currently limited to 8-bit.
(Edited)
Photo of Stefan Klein

Stefan Klein

  • 144 Posts
  • 76 Reply Likes
Simon,
I guess I don't understand. You mean LR is rendering in only 8-bit on Windows??? I'm not talking about the display, but the internal processing.
Photo of Simon Chen

Simon Chen, Principal Computer Scientist

  • 1673 Posts
  • 575 Reply Likes
The internal image pipeline processes in 16-bit on Windows as well. Since the thread is about color display, my comment above is purely about the last stage: the display color transform for the monitor.
Photo of Steve Sprengel

Steve Sprengel, Champion

  • 2675 Posts
  • 348 Reply Likes

So on Windows LR truncates the 16-bit PPRGB to 8-bit PPRGB and then transforms to 8-bit "Monitor RGB"? Isn't this non-optimal, and won't it lead to unnecessary banding and posterization?

If I were converting color spaces in Photoshop from 16-bit PPRGB to a narrower 8-bit monitor colorspace, I'd transform to the monitor colorspace while still in 16-bit and then truncate to the monitor color depth.

Is this non-optimal transform being done for speed purposes, or because the GPU can't do the transform in 16-bit? Or am I wrong that the order of the color-space conversion and bit-depth conversion makes a difference?

(Edited)
Photo of Simon Chen

Simon Chen, Principal Computer Scientist

  • 1673 Posts
  • 575 Reply Likes
For the GPU rendering path, the display color transform is entirely handled within the internal OpenGL rendering pipeline, so the behavior there is just as you expected. The 8-bit display color transform path applies only to the legacy CPU rendering path, and only on Windows.
(Edited)
Photo of Steve Sprengel

Steve Sprengel, Champion

  • 2675 Posts
  • 348 Reply Likes
So why is the CPU rendering non-optimal?
Photo of Simon Chen

Simon Chen, Principal Computer Scientist

  • 1673 Posts
  • 575 Reply Likes
Maybe I did not make it clear. 

Lr always uses the ProPhotoRGB color space and 16-bit math internally for processing (it will use floating-point math if the source is an HDR). This is true for both the CPU and GPU renders. In general, Lightroom is also responsible for the final color transform from 16-bit ProPhotoRGB to the destination monitor color space, to avoid the sub-optimal result you mentioned. The only exception to this is CPU rendering on Mac, where Lightroom leaves it to Mac OS X to do the final monitor display color transform. In that case, Lightroom passes the 16-bit ProPhotoRGB image data directly to Mac OS X, so there is no loss of fidelity there. On Windows, the image data resulting from the final monitor color transform is currently limited to 8-bit. On Mac, it is 16-bit.
Photo of Steve Sprengel

Steve Sprengel, Champion

  • 2675 Posts
  • 348 Reply Likes

This answer is less specific than the previous one, so let me be clear about what I'm saying. Yes, it is about the conversion of the 16-bit ProPhotoRGB colorspace to the 8-bit monitor colorspace on Windows, which would be close to either AdobeRGB for wide-gamut monitors or sRGB for non-wide-gamut monitors; neither is as wide as ProPhotoRGB.

It was the particular ordering of the color-space conversion and the 16-bit to 8-bit truncation that seemed wrong to me.  Let me explain in more detail so there is no confusion:

If I was doing the 16-bit ProPhotoRGB to 8-bit monitor-RGB in PS I would do it in two steps:

1)  16-bit ProPhotoRGB (64K colors per RGB channel) to 16-bit sRGB (for example): 64K minus the colors out of gamut, which is less but still many more than 256 colors per channel.

2)  16-bit sRGB to 8-bit sRGB - 256 colors per channel


Now, from what you said happens in the CPU computation of internal LR colors to monitor colors, this is the order:

1)  16-bit ProPhotoRGB (64K colors per RGB channel) to 8-bit ProPhotoRGB (256 colors per RGB channel).

2)  8-bit ProPhotoRGB (256 colors per RGB channel) to 8-bit sRGB (256 colors per RGB channel minus the out-of-gamut colors for sRGB, so less than 256 colors per RGB channel).

So with the optimal ordering we get 256 colors in each of the three RGB channels.

And with the non-optimal ordering we get fewer than 256 colors per channel--only the colors left after the PPRGB colors that are out of gamut in sRGB are thrown away.

Each level in each channel leads to a band on the screen. In a smooth gradient such as the sky, those bands are visible. With fewer than 256 colors per RGB channel there are fewer bands and they are more prominent.

So my question is: why is this non-optimal conversion of 16-bit PPRGB to 8-bit monitor RGB being done? Or is my analysis of it being non-optimal for the colors-per-channel not right? Or is my interpretation of what you said not right, and the ordering is actually the optimal one--the colorspace conversion occurs in 16 bits on both the GPU and the CPU, and the CPU path truncates to 8 bits afterwards?
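A toy single-channel sketch of the difference between those two orderings (the 0.7 "gamut" factor below is a made-up stand-in for how much of the PPRGB range survives in sRGB, not a real ICC transform):

import numpy as np

ramp = np.linspace(0.0, 0.7, 65536)               # in-gamut 16-bit-like PPRGB gradient

to_monitor = lambda x: np.clip(x / 0.7, 0.0, 1.0) # stand-in for PPRGB -> monitor RGB
quant8 = lambda x: np.round(x * 255) / 255        # 8-bit truncation for display

optimal = quant8(to_monitor(ramp))                # convert first, quantize last
non_optimal = quant8(to_monitor(quant8(ramp)))    # quantize to 8-bit PPRGB first

print(np.unique(optimal).size)       # 256 distinct display levels
print(np.unique(non_optimal).size)   # ~179 levels -> gaps between levels, i.e. banding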
---
As an appendix, here is a real-world example comparing the number of colors in a document converted optimally and non-optimally from 16-bit ProPhotoRGB to 8-bit sRGB. 

To create this example document, I drag-sized the PS color-palette to maybe 1/6th of the screen, then did a screenshot, copy-pasted that screenshot into a PS document, converted to 16-bits, assigned my monitor profile, converted to ProPhotoRGB, bi-linear resampled up to 1024x1024 pixels, g-blurred it by 8-pixels, increased the saturation by 20, g-blurred it by 8-pixel radius again, resulting in the starting document with the following properties:

1024x1024 dimensions = 1048576 total pixels, 1045410 total colors as counted by this plug-in: http://telegraphics.com.au/sw/product/CountColours

If I convert optimally from 16-bit PPRGB to 16-bit sRGB then to 8-bits there are 305598 colors.

If I convert non-optimally from 16-bit PPRGB to 8-bits then convert to sRGB there are 165603 colors, which is a bit more than half as many colors compared to doing it the optimal way.

I had dithering turned off for the ProPhotoRGB to sRGB color profile conversion, assuming this is what happens when converting to the monitor profile via the CPU to make it faster. If not, then please enlighten me.

Photo of Eric Chan

Eric Chan, Camera Raw Engineer

  • 631 Posts
  • 133 Reply Likes
Steve, when saving 8-bit images to disk or opening them in Photoshop from ACR or Lightroom, the quantization to 8 bits is always done as the final step. Any color space conversions (e.g., from raw to Adobe RGB or sRGB) are done before that.

When displaying images to the screen inside of ACR or Lr's Develop module, the exact math and precision level will depend on a few things (including CPU vs GPU use), but the color space conversion to the display profile happens before any needed quantization to the display precision.  For example, for a "standard" 8-bit display pipeline we first convert to the display profile (with >= 16-bit math) and then quantize to 8 bits (with dither).
Photo of Steve Sprengel

Steve Sprengel, Champion

  • 2675 Posts
  • 348 Reply Likes
OK, that's good. I had incorrectly interpreted Simon's statement -- "But the result image data of the final monitor color transform (for display) is currently limited to 8-bit" -- to mean that the monitor profile conversion happened in 8-bit.
Photo of iain malcolm

iain malcolm

  • 2 Posts
  • 0 Reply Likes
Also remember that Lightroom manages the color profiles on Windows (a good thing too, as Windows is - still - the village idiot when it comes to color management), so Lightroom *could* do the conversion in the best order to drive an 8-bit monitor. Having said that, even with dithering, there is a distinct (visible) quality improvement in going from
  • 10 bit source to 8 bit monitor interface
    -to-
  •  10 bit source to 10 bit monitor interface
Even the projector used by my camera club is 10-bit capable and is visibly better with a full 10-bit interface.

This seems to apply just as much to nature (trees, leaves, stone etc. - especially when distant) as it does to artificial patterns. I assume that our internal image processing pipeline has evolved to pull as much detail as possible from the photons entering the eyeball.

There is also some interesting stuff happening around video as the extended video formats arrive. It is obvious that the eye/brain processing that goes on picks up on some remarkably fine detail given the opportunity. The difference between YUYV (4:2:2) video and full YUV (4:4:4) can be quite dramatic, creating a richer and more immersive image without apparently 'looking' much different. In some ways the result is similar to moving from 8-bit to 10-bit monitor depth.
Photo of Phil Burton

Phil Burton

  • 44 Posts
  • 9 Reply Likes
So if I have LR 6.5.1 on Windows, will I get 10-bit display output if I have the right graphics card? If yes, can anyone recommend a suitable card? I don't want to spend "too little", but I don't need the crazy-fast performance that gamers crave.

Thanks.
Photo of Nicola Tullio Cataldo

Nicola Tullio Cataldo

  • 3 Posts
  • 0 Reply Likes
With the current version of LR, there is no chance of having a 10-bit display. With Photoshop, an NVIDIA Quadro K1200 is more than enough. Only workstation cards like the Quadro enable 10 bit/channel through OpenGL (which is what Photoshop supports). 10-bit display in LR would be very welcome; they lag considerably behind the Photoshop team.
Photo of Po-Ting Huang

Po-Ting Huang

  • 4 Posts
  • 0 Reply Likes
No need for a Quadro anymore. Mine is an older GTX 670, and it supports 10-bit now. Go to the NVIDIA Control Panel and there is an option to select color depth. You need a 10-bit capable monitor and, most importantly, a DisplayPort connection.
Photo of Nicola Tullio Cataldo

Nicola Tullio Cataldo

  • 3 Posts
  • 0 Reply Likes
Nope. You do need a Quadro.
You can enable the color depth option on your GTX 670, but you will still see banding on your 10-bit capable monitor connected with DisplayPort in Photoshop (check it with the "10bit test ramp" file).
GeForce cards can output 10-bit color, but they only support that with applications that use DirectX. Quadro cards support 10-bit color via OpenGL. Photoshop uses OpenGL, so you need a Quadro card for 10-bit with Photoshop.
More info here:
http://nvidia.custhelp.com/app/answers/detail/a_id/3011/~/10-bit-per-color-support-on-nvidia-geforce...
(Edited)
Photo of Po-Ting Huang

Po-Ting Huang

  • 4 Posts
  • 0 Reply Likes
Thanks for the info. It's a shame that NVIDIA refuses to support 10-bit OpenGL on mainstream cards. I get the feeling that Intel, MS, NVIDIA, and ATI are the ones holding the PC back from progress.
Photo of Denis Protopopov

Denis Protopopov

  • 1 Post
  • 0 Reply Likes
This reply was created from a merged topic originally titled Lightroom: Please, add 10 bit per color (30 bit per channel) support.

The application is the only thing missing in a true 10-bit workflow now.
Photo of Виталий Протасов

Виталий Протасов

  • 1 Post
  • 0 Reply Likes
Please add 10- and 12-bit per channel output to the driver.
My monitor can handle 12 bits per channel, and only Photoshop can use it via the 30-bit option.
Photo of Po-Ting Huang

Po-Ting Huang

  • 4 Posts
  • 0 Reply Likes
There is no 12-bit panel. Some TVs may be able to accept 12-bit data, but displaying 12-bit color is another story.
Photo of Maximilian Heinrich

Maximilian Heinrich

  • 2 Posts
  • 0 Reply Likes
After reading this thread I summarize it as:

Even with a pro 10-bit capable display AND a pro 10-bit graphics card like an NVIDIA Quadro or ATI FirePro, you can only use 8-bit color depth in Lightroom. At least when you're on Windows.

Is that correct? And if yes - WHY? I know I might not be able to tell the difference between two nearly identical colors, no matter whether I have 8 or 10 bits per channel, BUT I can indeed see the difference across a wide, smoothly graduated tonal range. Compare a 256-shade (8-bit) greyscale to a 1024-shade (10-bit) greyscale - that's a massive (visible) difference.
Photo of Roy McLaren

Roy McLaren

  • 44 Posts
  • 7 Reply Likes

I'm in the process of considering upgrading my computer/monitor(s)/printer to enable full 30-bit workflow. I am currently using a PC with Windows 7 SP1, Radeon R9 390 graphics card and two 8-bit (6+2 in reality) 23" Dell monitors as well as Lightroom 6. I am relatively new to Lightroom 6 and still learning, but really like it so far - very easy and intuitive software. Having just spent $150 on my single, perpetual license I am reluctant to change to new photo editing software again.

So, it is clear that Lightroom 6 will not support 10-bit display output on Windows, not even the latest standalone version 6.14. However, after reading this thread I am left with the impression (rightly or wrongly) that Lightroom 6 will support a 10-bit final monitor color transform (for display) on a Mac using a 4K or 5K 10-bit display with the latest OS. Is this impression correct?

What I am really trying to understand here is whether there is a distinct advantage to changing to a Mac with a 4K-5K 10-bit display as opposed to upgrading my existing Windows-based system. Will I unlock the potential for a true 30-bit end-to-end workflow with Lightroom 6, or do I need to change to Photoshop even with a Mac?

Looking at this from a different angle, will I be able to perceive differences in photos generated by Lightroom 6 with a true end-to-end 30-bit system compared to a 24-bit system? How much will this in practice affect color range (i.e., the ability to utilize and display sRGB vs. aRGB spaces), banding in softly graduated tones, etc.? Put another way, does Lightroom 6 already somehow compensate for these issues in the event that it is in fact restricted to 8-bit output even on a Mac?

(Edited)
Photo of Skippi

Skippi

  • 3 Posts
  • 1 Reply Like
When Adobe runs out of ideas for LR and the whole, often misunderstood HDR hype takes over the world, then and only then will this "nobody needs this" feature become an "essential must-have". Many modern image formats, including HEIF, do have more than 8-bit per channel support, unlike JPEG. In a situation where your phone can display it but your mega rig with the latest Lightroom cannot, we might see some change.
That being said, my latest Win10 machine with a Quadro card and a 10-bit display just would not get 10-bit through in any application, no matter what. It worked before, but not anymore.
Photo of Axel Roland

Axel Roland

  • 1 Post
  • 0 Reply Likes
Sad, indeed. I'm looking to move away from LR/Adobe since that whole Lightroom Classic / Lightroom CC / Lightroom bitchmove, anyway.
(Edited)
Photo of Geoff Murray

Geoff Murray

  • 5 Posts
  • 4 Reply Likes
What would be interesting to know is WHY Adobe has limited Lightroom in this way. There must be a justifiable reason; otherwise I am sure 10-bit would have been enabled. However, it is definitely a desirable attribute, and I would add my voice to the request for full 10-bit monitor output.
Photo of Skippi

Skippi

  • 3 Posts
  • 1 Reply Like
I think reason one is that only a teeny-tiny portion of LR users is actually capable of running a 10-bit display workflow.
Reason two: they did not really make it work even in Photoshop yet (it works at particular zoom levels, without any tools, including the selection / marquee tool, being used).
Reason three: human vision: on a photograph it would be a challenge to see the difference, from a viewing distance, between dithered 8-bit and true 10-bit - UNLESS in a specific use case (black and white photography, or monochromatic photography of whatever color).
Reason four: as mentioned above, internally all the calculations are 16/32-bit, so the display depth is not a processing limit.
Reason five: a possible performance tax on GPU acceleration. Essentially, GPU acceleration would not be able to share the same code for Radeon/GeForce with FirePro/Quadro cards.

And last but not least: you would have to hire extra support staff just to troubleshoot people's buggy 10-bit workflows, over which you have very little control, but in the end the trouble will land on Lightroom.

Do I want 10bit in Lightroom? Hell yes!
Do I need it? Hm....
Would I be able to see the difference in my pictures? Oh, I wish...
Are there any other more important issues I would benefit from more if Lightroom got them addressed? DON'T GET ME STARTED.... :-D

Get a high-quality wide-gamut display with hardware calibration and guaranteed uniform backlighting (SpectraView or equivalent).
Make sure you have the means for monthly HW calibration.
Secure a high-quality photo-printing workflow with a custom-calibrated printer for all the media you plan to use.
Now you have done a lot for high-quality output. 99 percent.
You want that extra one percent? That is your 10-bit display in Lightroom (and believe me, there are far better apps with much deeper control to print from than Lightroom).
Photo of Dennis Hyde

Dennis Hyde

  • 4 Posts
  • 0 Reply Likes
I agree that a wide-gamut monitor is important, but as near as I can tell they all promote 10-bit color as an integral part of their improved display. The number of photos I would edit would be much greater than the number I would print. I would not put a printer in front of 10-bit color.
Photo of Geoff Murray

Geoff Murray

  • 5 Posts
  • 4 Reply Likes
Very few manufacturers promote true 10-bit monitors. Mostly Eizo, NEC and BenQ. I don't understand what you mean by not putting a printer in front of 10-bit colour.
Photo of Dennis Hyde

Dennis Hyde

  • 4 Posts
  • 0 Reply Likes
I was referencing Skippi's hierarchy of importance where he saw purchasing a custom calibrated printer as something that should be done prior to purchasing a 10-bit color capable wide gamut monitor. That would only hold if you printed every photo you edited. For most, the monitor would be used for editing a much larger number of photos than what would be printed, it seems to me. Also, it is not unusual to farm out the printing to a specialist.

There's no doubt that the vast majority of monitors in use are not 10-bit color wide gamut. But 10-bit wide-gamut monitors are considered the best for a variety of reasons, and I am convinced of those reasons. It seems obvious that if Adobe sees itself as making the best editing software, it needs to support the best monitors. I am not a programmer, but I assume supporting 10-bit color monitors would not prevent folks with standard monitors from using the programs.
Photo of Todd Shaner

Todd Shaner, Champion

  • 1589 Posts
  • 537 Reply Likes
Very few manufacturers promote true 10-bit monitors. Mostly Eizo, NEC and BenQ.
All of these manufacturers use 8 bit/color LCD screen panels with FRC dithering to achieve 10 bit/color, plus a 16 bit/color internal LUT for calibration. This sets them apart from lesser displays that need to be calibrated using the monitor's external controls and profiled using the 8 bit/color LUT in the graphics card, which reduces the display's effective bit depth.

More here on the subject:

https://forums.adobe.com/message/10958873#10958873
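To make the FRC idea concrete, here is a rough numeric sketch (hypothetical values; real panel controllers use fixed spatial/temporal patterns rather than random frame selection):

import numpy as np

# A 10-bit target level is approximated by alternating the two nearest 8-bit
# panel levels across frames so the time-averaged output matches the target.
target_code = 523                          # hypothetical 10-bit code, 0..1023
target = target_code / 1023.0

lo = int(np.floor(target * 255))           # nearest 8-bit level below
hi = min(lo + 1, 255)                      # nearest 8-bit level above
frac_hi = target * 255 - lo                # fraction of frames shown at 'hi'

frames = np.where(np.random.rand(600) < frac_hi, hi, lo)  # ~10 s of frames at 60 Hz
print(frames.mean() / 255, target)         # time-averaged level is approximately the target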



(Edited)
Photo of Skippi

Skippi

  • 3 Posts
  • 1 Reply Like
First of all, I'm glad the conversation on this topic is going on. 10-bit display is deep in my heart and I would love to see it just as much as you would - if not more, because I have had a 10-bit workflow on my desk for over three years now.

@Dennis> I was referring to the printer because making prints significantly improves your photography. Nobody prints every picture they take. But spending time on finessing this very last step (OK, the very last step is framing and display) will get you thinking about your picture a lot more.

Based on my experience (and the experience of my friends who are gallery specialists doing scans of artworks priced at 6+ figures and saving them as 16-bit uncompressed TIFF files), to put it in layman's terms: for general photography, 10-bit output is p0rn only YOUR MIND can appreciate. With the LUT tables, high-precision internal processing and a properly calibrated screen, the difference between FRC dither and true 10-bit output is indiscernible to your vision. Yes, you can tell in artificial monochromatic gradients. And (maybe) yes, a highly skilled radiologist would be able to tell the monitors apart. But for the rest of the cases...

@Todd> You are misinterpreting the facts. Yes, all of those manufacturers use 8-bit LCD panels. But for their premium-priced prosumer and higher product lines, they also have 10-bit LCD panels, with the microcontroller really sending the 10-bit value to the subpixel and really displaying all the discrete levels of brightness in each subpixel, without requiring any dithering, etc.

Actually, going 10-bit is NOT that hard. What is hard (expensive) is achieving a more or less linear wide gamut across a useful spectrum of colors. What is even harder and even more expensive is achieving uniform backlighting and acceptable black levels on LCD screens. But without these, 10-bit would make even less sense than it does now.
Photo of Dennis Hyde

Dennis Hyde

  • 4 Posts
  • 0 Reply Likes
I'm not certain that is correct, Todd. I know that Viewsonic gives its color space support spec as 8-Bit+A-FRC, with FRC being dithering that adds a 2 bit equivalent.
Photo of Todd Shaner

Todd Shaner, Champion

  • 1589 Posts
  • 537 Reply Likes
Skippi said:
Yes, all of those manufacturers use 8bit LCD panels. But, for their premium priced prosumer and higher lines of products they also have 10bit LCD panels
Interesting, please provide links to specific models so we can give them consideration. Thank you Skippi.

Dennis Hyde said:

I'm not certain that is correct, Todd. I know that Viewsonic gives its color space support spec as 8-Bit+A-FRC, with FRC being dithering that adds a 2 bit equivalent.
That's exactly what I said, "All of these manufacturers use 8 bit/color LCD screen panels with FRC dithering to achieve 10 bit/color."
What's not correct?
Photo of Cameron Rad

Cameron Rad

  • 161 Posts
  • 48 Reply Likes
@Todd



See the NEC Manual: http://www.support.nec-display.com/dl_service/data/display_en/manual/pa271q/PA271Q_manual_EN_v4.pdf

The iMac Pro has 10-bit with dithering, but it's also advertised as such.


Edit:

I've found some conflicting reports. NEC states in this press release that the PA271Q features true 10-bit color: https://www.necdisplay.com/about/press-release/nec-display-updates-popular-professional-desktop-d/797

However, this site states the panel is 8-bit + FRC: https://www.displayspecifications.com/en/model/73ae1389

This other model, though, is 10-bit:
https://www.displayspecifications.com/en/model/178b1656
(Edited)
Photo of Todd Shaner

Todd Shaner, Champion

  • 1589 Posts
  • 537 Reply Likes
The NEC PA271Q and PA272W monitors use an 8 bit/color panel with FRC dithering to achieve 10 bit/color depth, which is 30-bit RGB... same as the iMac Pro.

https://www.displayspecifications.com/en/comparison/115762319
Photo of Todd Shaner

Todd Shaner, Champion

  • 1589 Posts
  • 537 Reply Likes
OK, good to know–Thanks Cameron!
Photo of Geoff Murray

Geoff Murray

  • 5 Posts
  • 4 Reply Likes
So we do have quite a few genuine 10-bit panels out there. From what I understand, it gives a truer rendition of the image, without the gaps created by the inability to show intermediate colours, whether we can actually see them or not :)
Photo of Todd Shaner

Todd Shaner, Champion

  • 1589 Posts
  • 537 Reply Likes
It's debatable whether you can actually see the difference between 10 bit/color and 8 bit/color with FRC dithering. However, depending on how the dithering is implemented, it may cause eyestrain for some users. I've been using my NEC PA272W monitor with 8 bit/color + FRC dithering for three years now with no issues.

Keep in mind that the FRC dithering is only used by the monitor with 10 bit/color enabled applications. Currently (for me) only PS has this capability, which is used for no more than 10% of my daily screen time (web, email, editing, etc.).
Photo of Dennis Hyde

Dennis Hyde

  • 4 Posts
  • 0 Reply Likes
Todd, I meant to suggest with my ViewSonic comment that a reputable manufacturer would specify if it uses 8-bit + A-FRC to achieve a 10-bit equivalent. I am surprised that NEC and presumably others would not, and it is yet another complication. I am thankful to learn of the displayspecifications website and that knowledgeable folks rely on it. Tell me if I am naive to think that.
Photo of Cameron Rad

Cameron Rad

  • 161 Posts
  • 48 Reply Likes
I'm finding conflicting reports about the NEC PA271Q's panel. I think it might actually be a true 10-bit panel and displayspecifications might need to update their site/verify with NEC. 

https://www.bhphotovideo.com/c/compare/NEC_PA271Q-BK-SV_27%22_16%3A9_Color-Critical_IPS_Monitor_with...


Photo of Todd Shaner

Todd Shaner, Champion

  • 1589 Posts
  • 537 Reply Likes
Agreed. I don't know why it should be so hard to find the "real" panel specification rather than just "10 bit/color, 1.07 billion colors". I found two search pages at TFT Central for panel model lookup by monitor model, with specifications. Unfortunately the Panel Search pulls up incomplete panel part numbers, such as LM270WQ3 for the NEC PA272W, and the even less precise "27"WS LG.Display AH-IPS" for the NEC PA271Q.

http://www.tftcentral.co.uk/panelsearch.htm

http://www.tftcentral.co.uk/articles/monitor_panel_parts.htm

Below are all of the 27" 10 bit/color panels listed in the Monitor Panel Parts database.

(Edited)
Photo of Cameron Rad

Cameron Rad

  • 161 Posts
  • 48 Reply Likes
Yeah, this panel is quite frustrating to find info for... I suspect this new one might actually be a 10-bit panel. The "True 10 Bit Color" phrase in the press release leads me to believe that.

I'd love to know, as I currently have a PA272W and it might be time to replace mine soon. It's starting to become less and less uniform. It has lived a good life, though.
Photo of Todd Shaner

Todd Shaner, Champion

  • 1589 Posts
  • 537 Reply Likes
I can't find anything on the NEC PA271Q-BK that is definitive. Try contacting NEC Tech Support and let us know if they even have a clue–Thanks!

Desktop Monitors
Large Screen Displays
Multimedia Projectors
Telephone Support 7:00 AM to 7:00 PM CT
(800) 632-4662

Email Support
techsupport@necdisplay.com