Adobe Photoshop Family

2 Messages

 • 

652 Points

Fri, Aug 5, 2011 2:11 AM

62

Lightroom: Add 10-bit/channel color display support

Lightroom: Add 10-bit support for end-to-end 10-bit workflow. That is, Application->Driver->Graphics Card->Monitor for true 10-bit color. At the moment, all that's missing is the application. CS5 supports 10-bit color, why not Lightroom?

Responses

4 Messages

 • 

90 Points

4 years ago

There is no 12-bit panel. Some TVs may be able to accept 12-bit data, but displaying 12-bit color is another story.

2 Messages

 • 

72 Points

4 years ago

After reading this thread I summarize it as:

Even with a pro 10-bit-capable display AND a pro 10-bit graphics card like an NVIDIA Quadro or ATI FirePro, you can only use 8-bit color depth in Lightroom. At least when you're on Windows.

Is that correct? And if yes - WHY? I know I might not be able to tell the difference between two nearly identical colors, no matter whether I have 8 or 10 bits per channel, BUT I do see the difference as the dynamic range increases. Compare a 256-shade (8-bit) greyscale gradient to a 1024-shade (10-bit) one - that's a massive (visible) difference.
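To make that concrete, here is a minimal Python sketch (an illustration only, nothing to do with Lightroom's internals) that quantizes the same ideal gradient to 8 and 10 bits and compares the number of distinct shades and the step size between them:

# Illustrative only: count the distinct grey levels in an 8-bit vs. 10-bit ramp.
import numpy as np

width = 4096                               # samples across the test ramp
ramp = np.linspace(0.0, 1.0, width)        # ideal, continuous gradient from black to white

ramp_8bit = np.round(ramp * 255) / 255     # quantized to 256 levels
ramp_10bit = np.round(ramp * 1023) / 1023  # quantized to 1024 levels

print("distinct 8-bit shades :", np.unique(ramp_8bit).size)    # 256
print("distinct 10-bit shades:", np.unique(ramp_10bit).size)   # 1024
print("8-bit step size       :", 1 / 255)                      # ~0.0039 of full scale
print("10-bit step size      :", 1 / 1023)                     # ~0.0010 of full scale

Each 8-bit step is roughly four times coarser than a 10-bit step, which is where visible banding in smooth gradients comes from.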

44 Messages

 • 

750 Points

3 years ago

I'm in the process of considering upgrading my computer/monitor(s)/printer to enable a full 30-bit workflow. I am currently using a PC with Windows 7 SP1, a Radeon R9 390 graphics card and two 8-bit (6+2 in reality) 23" Dell monitors, as well as Lightroom 6. I am relatively new to Lightroom 6 and still learning, but really like it so far - very easy and intuitive software. Having just spent $150 on my single, perpetual license, I am reluctant to change to new photo editing software again.

So, it is clear that Lightroom 6 will not support 10-bit display output on Windows, not even the latest standalone version 6.14. However, after reading this thread I am left with the impression (rightly or wrongly) that Lightroom 6 will support a 10-bit final monitor color transform (for display) on a Mac using a 4K or 5K 10-bit display with the latest OS. Is this impression correct?

What I am really trying to understand here is whether there is a distinct advantage to changing to a Mac with a 4K/5K 10-bit display as opposed to upgrading my existing Windows-based system. Will I unlock the potential for a true 30-bit end-to-end workflow with Lightroom 6, or do I need to change to Photoshop even with a Mac?

Looking at this from a different angle, will I be able to perceive differences in photos generated by Lightroom 6 on a true end-to-end 30-bit system compared to a 24-bit system? How much will this in practice affect color range (i.e., the ability to utilize and display sRGB vs. Adobe RGB spaces), banding in softly graduated tones, etc.? Put another way, does Lightroom 6 already somehow compensate for these issues in the event that it is in fact restricted to 8-bit output even on a Mac?

3 Messages

 • 

104 Points

2 years ago

When Adobe runs out of ideas for LR, and when the whole (and often misunderstood) HDR hype takes over the world, then and only then will this "nobody needs this" feature become an "essential must-have". Many modern image formats, including HEIF, support more than 8 bits per channel, unlike JPEG. In a situation where your phone can display it but your mega rig with the latest Lightroom cannot, we might see some change.
That being said, my latest Win10 machine with a Quadro card and a 10-bit display just will not get 10-bit through in any application, no matter what. It worked before, but not anymore.

1 Message

 • 

60 Points

2 years ago

Sad, indeed. I'm looking to move away from LR/Adobe anyway, after that whole Lightroom Classic / Lightroom CC / Lightroom renaming move.

5 Messages

 • 

194 Points

2 years ago

What would be interesting to know is WHY Adobe has limited Lightroom in this way. There must be a justifiable reason, otherwise I am sure 10-bit would have been enabled. However, it is definitely a desirable attribute, and I would add my voice to the request for full 10-bit monitor output.

3 Messages

 • 

104 Points

I think reason one is that only a teeny-tiny portion of LR users are actually capable of running a 10-bit display workflow.
Reason two: they have not really made it work even in Photoshop yet (it only works at particular zoom levels and only while no tools, including the selection/marquee tool, are in use).
Reason three: human vision: on a photograph it would be a challenge to see the difference between dithering and true 10-bit from a viewing distance - unless in a specific use case (black-and-white photography, or monochromatic photography in whatever color).
Reason four: as was mentioned, internally all the calculations are 16/32-bit, so the display depth is not a limit on the internal processing - it only matters at the final output (see the sketch below).
Reason five: a possible performance tax on GPU acceleration. Essentially, GPU acceleration would not be able to share the same code path for Radeon/GeForce and FirePro/Quadro cards.

And last but not least: you would have to hire extra support staff just to troubleshoot people's buggy 10-bit workflows, over which you have very little control, but in the end the trouble will land on Lightroom.
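As a footnote to reason four, here is a small Python sketch (my own illustration, assuming a normalized 0-1 pipeline; it is not Adobe's code) showing that the precision squeeze only happens when the high-precision internal value is quantized for the display, and that dithering the output can statistically preserve what plain rounding throws away:

# Toy example: one high-precision internal pixel value, quantized for display.
import numpy as np

rng = np.random.default_rng(0)
internal = np.full(100_000, 0.50037)            # a 16/32-bit-precision value, repeated

# Plain rounding to the display depth snaps everything to the nearest level:
to_8bit = np.round(internal * 255) / 255        # -> 128/255  (~0.5020)
to_10bit = np.round(internal * 1023) / 1023     # -> 512/1023 (~0.5005)

# Adding noise before rounding (dithering) trades a little spatial/temporal noise
# for precision: the average over many pixels/frames converges back to the target.
dithered_8bit = np.round(internal * 255 + rng.uniform(-0.5, 0.5, internal.size)) / 255

print("internal value     :", internal[0])
print("plain 8-bit output :", to_8bit[0])
print("plain 10-bit output:", to_10bit[0])
print("dithered 8-bit mean:", dithered_8bit.mean())   # ~0.50037

A true 10-bit output path simply needs less of that trickery at the last step.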

Do I want 10-bit in Lightroom? Hell yes!
Do I need it? Hm....
Would I be able to see the difference in my pictures? Oh, I wish...
Are there other, more important issues I would benefit from more if Lightroom addressed them? DON'T GET ME STARTED.... :-D

Get a high-quality wide-gamut display with hardware calibration and guaranteed uniform backlighting (SpectraView or equivalent).
Make sure you have the means for monthly HW calibration.
Secure a high-quality photo-printing workflow with a custom-calibrated printer for all the media you plan to use.
Now you have done a lot for high-quality output. 99 percent.
If you want that extra one percent, that is your 10-bit display in Lightroom (and believe me, there are far better apps with much deeper control to print from than Lightroom).

4 Messages

 • 

90 Points

I agree that a wide-gamut monitor is important, but as near as I can tell they all promote 10-bit color as an integral part of their improved display. The number of photos I would edit would be much greater than the number I would print. I would not put a printer in front of 10-bit color.

5 Messages

 • 

194 Points

Very few manufacturers promote true 10-bit monitors. Mostly Eizo, NEC and BenQ. I don't understand what you mean by not putting a printer in front of 10-bit colour.

4 Messages

 • 

90 Points

I was referencing Skippi's hierarchy of importance where he saw purchasing a custom calibrated printer as something that should be done prior to purchasing a 10-bit color capable wide gamut monitor. That would only hold if you printed every photo you edited. For most, the monitor would be used for editing a much larger number of photos than what would be printed, it seems to me. Also, it is not unusual to farm out the printing to a specialist.

There's no doubt that the vast majority of monitors in use are not 10-bit wide-gamut. But those are considered the best for a variety of reasons, and I am convinced by those reasons. It seems obvious that if Adobe sees itself as making the best editing software, it needs to support the best monitors. I am not a programmer, but I assume supporting 10-bit monitors would not prevent folks with standard monitors from using the programs.

Champion

 • 

2.2K Messages

 • 

37.4K Points

Very few manufacturers promote true 10-bit monitors. Mostly Eizo, NEC and BenQ.
All of these manufacturers use 8 bit/color LCD screen panels with FRC dithering to achieve 10 bit/color, plus a 16 bit/color internal LUT for calibration. This sets them apart from lesser displays that need to be calibrated using the monitor's external controls and profiled using the 8 bit/color LUT in the graphics card, which reduces the display's effective bit depth.

More here on the subject:

https://forums.adobe.com/message/10958873#10958873
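For what it's worth, here is a rough Python sketch of the FRC idea (my own simplification, not any manufacturer's documented algorithm): the panel flips each pixel between two adjacent 8-bit levels over a short frame cycle so that the time-averaged brightness lands on a 10-bit step.

# Rough FRC simulation: approximate one 10-bit level on an 8-bit panel by
# alternating between two neighbouring 8-bit levels over a 4-frame cycle.
import numpy as np

def frc_frames(level_10bit: int, frames: int = 4) -> np.ndarray:
    """8-bit level shown on each frame for one 10-bit target level."""
    base, remainder = divmod(level_10bit, 4)   # 10-bit = 4 x 8-bit + remainder
    shown = np.full(frames, base, dtype=int)
    shown[:remainder] += 1                     # bump 'remainder' of the frames up one step
    return shown                               # (top of the range would need clamping; ignored here)

target = 513                                   # a 10-bit level with no exact 8-bit equivalent
frames = frc_frames(target)
print("8-bit level per frame:", frames)                        # [129 128 128 128]
print("time-averaged, in 10-bit terms:", frames.mean() * 4)    # 513.0

Whether your eye integrates that flicker invisibly, or registers it as strain, is exactly the debate further down this thread.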



3 Messages

 • 

104 Points

First of all, I'm glad the conversation on this topic is going on. 10-bit display is dear to my heart and I would love to see it just as much as you do - if not more, because I have had a 10-bit workflow on my desk for over three years now.

@Dennis> I was referring to the printer because making prints significantly improves your photography. Nobody prints every picture they take. But spending time on finessing this very last step (OK, the very last step is framing and display) will get you thinking about your picture a lot more.

Based on my experience (and the experience of my friends who are gallery specialists doing scans of artworks priced at 6+ figures and saving them in 16-bit uncompressed TIFF files), to put it in layman's terms: for general photography, 10-bit output is p0rn only YOUR MIND can appreciate. With LUT tables, high-precision internal processing and a properly calibrated screen, the difference between FRC dither and true 10-bit output is indiscernible to your vision. Yes, you can tell in artificial monochromatic gradients. And (maybe) yes, a highly skilled radiologist would be able to tell the monitors apart. But for the rest of the cases...

@Todd> You are misinterpreting the facts. Yes, all of those manufacturers use 8-bit LCD panels. But for their premium-priced prosumer and higher product lines they also have 10-bit LCD panels, where the microcontroller really sends the 10-bit value to the subpixel and really displays all the discrete levels of brightness in each subpixel, without requiring any dithering, etc.

Actually, going 10-bit is NOT that hard. What is hard (expensive) is achieving a more or less linear wide gamut across a useful spectrum of colors. What is even harder, and even more expensive, is achieving uniform backlighting and acceptable black levels on LCD screens. But without these, 10-bit would make even less sense than it does now.

4 Messages

 • 

90 Points

I'm not certain that is correct, Todd. I know that ViewSonic gives its color space support spec as 8-bit + A-FRC, with FRC being dithering that adds a 2-bit equivalent.

Champion

 • 

2.2K Messages

 • 

37.4K Points

Skippi said:
Yes, all of those manufacturers use 8-bit LCD panels. But for their premium-priced prosumer and higher product lines they also have 10-bit LCD panels
Interesting, please provide links to specific models so we can give them consideration. Thank you Skippi.

Dennis Hyde said:

I'm not certain that is correct, Todd. I know that ViewSonic gives its color space support spec as 8-bit + A-FRC, with FRC being dithering that adds a 2-bit equivalent.
That's exactly what I said, "All of these manufacturers use 8 bit/color LCD screen panels with FRC dithering to achieve 10 bit/color."
What's not correct?

163 Messages

 • 

2.7K Points

@Todd



See the NEC Manual: http://www.support.nec-display.com/dl_service/data/display_en/manual/pa271q/PA271Q_manual_EN_v4.pdf

The iMac Pro has 10-bit with dithering, but it's also advertised as such.


Edit:

I've found some conflicting reports. NEC states in this press release that the PA271Q features true 10-bit color: https://www.necdisplay.com/about/press-release/nec-display-updates-popular-professional-desktop-d/797

However, this site states the panel is 8-bit + FRC: https://www.displayspecifications.com/en/model/73ae1389

This other model, though, is 10-bit:
https://www.displayspecifications.com/en/model/178b1656

Champion

 • 

2.2K Messages

 • 

37.4K Points

The NEC PA271Q and PA272W monitors use an 8 bit/color panel with FRC dithering to achieve 10 bit/color depth, which is 30-bit RGB... same as the iMac Pro.

https://www.displayspecifications.com/en/comparison/115762319

Champion

 • 

2.2K Messages

 • 

37.4K Points

OK, good to know–Thanks Cameron!


5 Messages

 • 

194 Points

So we do have quite a few genuine 10-bit panels out there. From what I understand, it gives a truer rendition of the image without the gaps created by the inability to show intermediate colours, whether we can actually see them or not :)

Champion

 • 

2.2K Messages

 • 

37.4K Points

It's debatable whether you can actually see the difference between 10 bit/color and 8 bit/color with FRC dithering. However, depending on how the dithering is implemented, it may cause eyestrain for some users. I've been using my NEC PA272W monitor with 8 bit/color + FRC dithering for three years now with no issues.

Keep in mind that the FRC dithering is only used by the monitor with 10 bit/color enabled applications. Currently (for me) only PS has this capability, which accounts for no more than 10% of my daily screen time (Web, Email, Editing, etc.).

4 Messages

 • 

90 Points

Todd, I meant to suggest with my ViewSonic comment that a reputable manufacturer would specify if it uses 8-bit + A-FRC to achieve a 10-bit equivalent. I am surprised that NEC and presumably others would not, and it is yet another complication. I am thankful to learn of the displayspecifications website and that knowledgeable folks rely on it. Tell me if I am naive to think that.

163 Messages

 • 

2.7K Points

I'm finding conflicting reports about the NEC PA271Q's panel. I think it might actually be a true 10-bit panel and displayspecifications might need to update their site/verify with NEC. 

https://www.bhphotovideo.com/c/compare/NEC_PA271Q-BK-SV_27%22_16%3A9_Color-Critical_IPS_Monitor_with...


Champion

 • 

2.2K Messages

 • 

37.4K Points

Agreed. I don't know why it should be so hard to find the "real" panel specification rather than just a statement of 10 bit/color, 1.07 billion colors. I found two search pages at TFT Central for panel model lookup by monitor model with specifications. Unfortunately, the Panel Search pulls up incomplete panel part numbers such as LM270WQ3 for the NEC PA272W and the even less precise 27"WS LG.Display AH-IPS for the NEC PA271Q.

http://www.tftcentral.co.uk/panelsearch.htm

http://www.tftcentral.co.uk/articles/monitor_panel_parts.htm

Below are all of the 27" 10 bit/color panels listed in the Monitor Panel Parts database.

[Screenshot: the 27" 10 bit/color panel listings from the TFT Central Monitor Panel Parts database]

163 Messages

 • 

2.7K Points

Yeah, this panel is quite frustrating to find info for... I suspect this new one might actually be a 10-bit panel. The "True 10 Bit Color" phrase in the press release leads me to believe that.

I'd love to know as I currently have a PA272W and it might be time to replace mine soon. It's starting to become less and less uniform. It lived a good life though. 

Champion

 • 

2.2K Messages

 • 

37.4K Points

I can't find anything on the NEC PA271Q-BK that is definitive. Try contacting NEC Tech Support and let us know if they even have a clue–Thanks!

Desktop Monitors
Large Screen Displays
Multimedia Projectors
Telephone Support 7:00 AM to 7:00 PM CT
(800) 632-4662

Email Support
techsupport@necdisplay.com