Adobe Photoshop Family

2 Messages • 652 Points • Fri, Aug 5, 2011 2:11 AM

Lightroom: Add 10-bit/channel color display support

Lightroom: Add 10-bit support for an end-to-end 10-bit workflow, that is, Application -> Driver -> Graphics Card -> Monitor for true 10-bit color. At the moment, all that's missing is the application. CS5 supports 10-bit color, so why not Lightroom?

Responses

Official Solution

Employee • 1.7K Messages • 32.4K Points • 4 years ago

Steve is correct.

On Mac: Lightroom has always used 16-bit math throughout the Develop module (with ProPhotoRGB as the internal working color space). With GPU acceleration, this requires Lr 6.2 or later.

On Windows: it also uses the same 16-bit ProPhotoRGB math for internal image renders, but the output of the final monitor color transform (for display) is currently limited to 8 bits.

Principal Scientist, Adobe Lightroom

165 Messages • 3.8K Points

Simon,
I guess I don't understand. You mean LR is rendering in only 8-bit on Windows??? I'm not talking about the display, but the internal processing.

Employee • 1.7K Messages • 32.4K Points

The internal image pipeline processes in 16-bit on Windows as well. Since this thread is about color display, my comment above is purely about the last stage: the display color transform for the monitor.

Principal Scientist, Adobe Lightroom

Champion • 2.6K Messages • 33.7K Points

So on Windows, LR truncates the 16-bit PPRGB to 8-bit PPRGB and then transforms to 8-bit "Monitor RGB"? Isn't this non-optimal, and won't it lead to unnecessary banding and posterization?

If I were converting color spaces in Photoshop from 16-bit PPRGB to an 8-bit narrower monitor color space, I'd transform to the monitor color space while still in 16-bit and then truncate to the monitor color depth.

Is this non-optimal transform being done for speed, or because the GPU can't do the transform in 16-bit? Or am I wrong that the order of the color-space conversion and the bit-depth conversion makes a difference?
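
To make the concern concrete, here is a minimal numpy sketch. The "monitor transform" below is a made-up one-dimensional stretch standing in for a wide-gamut-to-narrow-gamut conversion (not Lightroom's actual pipeline); it only illustrates why quantizing to 8-bit before the transform leaves fewer distinct display codes than transforming first and quantizing last.

```python
import numpy as np

# A smooth high-precision ramp in the working space (values in 0..1).
ramp = np.linspace(0.0, 1.0, 1 << 16)

def to_monitor(x):
    # Hypothetical stand-in for the monitor transform: the monitor gamut
    # covers only part of the wide working space, so an in-gamut sub-range
    # gets stretched to the full 0..1 output range.
    return np.clip((x - 0.2) / 0.5, 0.0, 1.0)

def quantize(x, bits=8):
    levels = (1 << bits) - 1
    return np.round(x * levels) / levels

# Path A: quantize to 8-bit in the working space first, then transform
# (the order being questioned above for the Windows CPU path).
out_a = quantize(to_monitor(quantize(ramp)))

# Path B: transform while still high-precision, quantize only for display.
out_b = quantize(to_monitor(ramp))

print(len(np.unique(out_a)))   # ~129 distinct display codes -> coarser steps
print(len(np.unique(out_b)))   # 256 distinct display codes
```

In this toy setup roughly half the display codes go unused on the quantize-first path, which is exactly the kind of gap that shows up as banding in smooth gradients.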

Employee • 1.7K Messages • 32.4K Points

For the GPU rendering path, the display color transform is handled entirely within the internal OpenGL rendering pipeline, so the behavior there is just as you expected. The 8-bit display color transform path applies only to the legacy CPU rendering path, and only on Windows.

Principal Scientist, Adobe Lightroom

Champion • 2.6K Messages • 33.7K Points

So why is the CPU rendering path non-optimal?

946 Messages • 13.8K Points • 9 years ago

"At the moment, all that's missing is the application."

There are thousands of feature requests, so obviously that isn't all.

129 Messages • 3.2K Points

He means that the driver, graphics card, and monitor support 10-bit color. Only the app is missing 10-bit support.

OS X doesn't support 10-bit, though. Windows does.

15.1K Messages • 195.8K Points

Also, we can only support 10 bit/channel through OpenGL - which requires a compatible card and driver version.
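
For reference, here is a minimal sketch (Python with the glfw and PyOpenGL packages; nothing Lightroom-specific, and the window size and title are my own) of how an OpenGL application typically requests a 10 bit/channel framebuffer and then checks what the driver actually granted. On a card/driver/monitor chain that can't do it, the driver silently falls back to 8 bits.

```python
# Sketch only: requires the `glfw` and `PyOpenGL` packages and a running display.
import glfw
from OpenGL.GL import glGetIntegerv, GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS

if not glfw.init():
    raise SystemExit("GLFW init failed")

# Ask for a 10-10-10-2 framebuffer; an unsupported stack silently gives 8-8-8 instead.
glfw.window_hint(glfw.RED_BITS, 10)
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.ALPHA_BITS, 2)

window = glfw.create_window(640, 480, "10-bit framebuffer test", None, None)
if not window:
    glfw.terminate()
    raise SystemExit("window creation failed")
glfw.make_context_current(window)

# Query what the driver actually granted (valid in a default/compatibility context).
print("R/G/B bits:",
      glGetIntegerv(GL_RED_BITS),
      glGetIntegerv(GL_GREEN_BITS),
      glGetIntegerv(GL_BLUE_BITS))

glfw.terminate()
```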

2 Messages • 652 Points

Correct, yes, the compatible card and driver are required, as well as a monitor.

My point is really:
If a user has invested in a 10-bit monitor and a 10-bit compatible card/driver (e.g. a FirePro from ATI), they should be able to get applications that support this.

The prime example is that CS5 does support 10-bit (if all the other pieces are in place) through OpenGL, so as a related product it would be great if Lightroom did too.

1 Message • 60 Points • 8 years ago

DSLRs are increasingly pushing the 8-bit envelope. Why would Adobe be content letting their software limit my photography?

15.1K Messages • 195.8K Points

The bit depth of the data doesn't matter here -- Lightroom already processes images at the depth of the camera or in 32-bit floating point format. This topic is about supporting newer video cards and displays that can use 10 bit/channel framebuffers.
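
As a side note on what a "10 bit/channel framebuffer" means at the pixel level, here is a small illustrative sketch (my own toy code, using one common 10-10-10-2 bit layout; real APIs such as OpenGL's RGB10_A2 or DXGI's R10G10B10A2 define the exact channel order) showing three 10-bit channels plus a 2-bit alpha packed into the same 32 bits per pixel that an ordinary 8-8-8-8 pixel uses.

```python
def pack_rgb10a2(r, g, b, a=3):
    """Pack 10-bit R, G, B (0..1023) and 2-bit alpha (0..3) into one 32-bit word."""
    assert 0 <= r < 1024 and 0 <= g < 1024 and 0 <= b < 1024 and 0 <= a < 4
    return (a << 30) | (b << 20) | (g << 10) | r

def unpack_rgb10a2(word):
    return (word & 0x3FF, (word >> 10) & 0x3FF, (word >> 20) & 0x3FF, word >> 30)

# 1024 levels per channel instead of 256: four times finer tonal steps,
# still in the same 32 bits per pixel as ordinary 8-8-8-8.
word = pack_rgb10a2(512, 700, 1023)
print(hex(word), unpack_rgb10a2(word))
```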

427 Messages • 7.7K Points

Furthermore, the bit depths of cameras (raw) and monitor output are not comparable at all: camera bit depth refers to a linear tonal scale (gamma = 1), while monitor output uses a non-linear scale (mostly gamma = 2.2).

The existing 8-bit output resolution will not really limit your photography (e.g. in terms of dynamic range). The advantage of 10-bit is "only" that it gives you more tonal resolution on the display. For critical photos, this can improve display accuracy by reducing banding/posterization in regions with subtle tonal or color changes. So the feature is of course desirable, but not essential.
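
To illustrate why linear camera bits and gamma-encoded display bits aren't directly comparable, here is a small numpy sketch (simplified to a pure 2.2 power law, ignoring real transfer-curve details) counting how many of the 256 display codes land in the darkest 1% of linear light, compared with a hypothetical linear 8-bit encoding.

```python
import numpy as np

codes = np.arange(256) / 255.0           # normalized 8-bit code values

linear_light_gamma = codes ** 2.2         # display codes are gamma-encoded
linear_light_linear = codes               # hypothetical linear 8-bit encoding

# Codes that fall within the darkest 1% of linear light:
print(np.sum(linear_light_gamma <= 0.01))   # 32 codes devoted to deep shadows
print(np.sum(linear_light_linear <= 0.01))  # only 3 codes
```

Gamma encoding concentrates the available codes where the eye is most sensitive, which is why 8-bit gamma-encoded output goes much further than 8 linear bits would.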

427 Messages • 7.7K Points • 8 years ago

Additional question: does LR use dithering in the Develop module? I just loaded the 10-bit test ramp from http://www.imagescience.com.au/kb/que... into Photoshop and LR (8-bit output). In Photoshop I see the 8-bit banding, as well as in the LR Loupe (because previews are 8-bit JPEGs). But in LR Develop, there is no visible banding!

This would actually be great, because it is the next best thing to 10-bit support.

Edit: Yes, LR Develop really does seem to use dithering. Here is a small portion of the ramp, with extremely amplified contrast (screenshot from the Develop module, contrast amplified in PS):
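
For anyone curious what dithering does numerically, here is a minimal sketch of the general technique (simple random dither before rounding; which algorithm Lightroom actually uses is not stated here): a subtle gradient that would otherwise quantize into a handful of flat 8-bit bands gets noise-broken steps whose local average still tracks the original values.

```python
import numpy as np

rng = np.random.default_rng(0)

# A very subtle gradient spanning only ~10 of the 256 output levels.
ramp = np.linspace(0.30, 0.34, 5000)

plain = np.round(ramp * 255)                                    # hard quantization
dithered = np.round(ramp * 255 + rng.uniform(-0.5, 0.5, ramp.size))

# Hard quantization collapses the gradient into about a dozen flat bands ...
print("level changes, plain:   ", np.count_nonzero(np.diff(plain)))
# ... while dithering scatters the rounding so no band edges line up.
print("level changes, dithered:", np.count_nonzero(np.diff(dithered)))

# The payoff: averaged over small neighborhoods (roughly what the eye does),
# the dithered output tracks the original gradient much more closely.
block_mean = lambda x: x.reshape(-1, 50).mean(axis=1)
print("worst local error, plain:   ", np.abs(block_mean(plain) / 255 - block_mean(ramp)).max())
print("worst local error, dithered:", np.abs(block_mean(dithered) / 255 - block_mean(ramp)).max())
```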

677 Messages • 8.7K Points

Yes, Develop and Export (to 8 bpp) both dither.
http://forums.adobe.com/message/3655226

427 Messages • 7.7K Points

Thanks... yes, you already mentioned that here, too: http://feedback.photoshop.com/photosh... ... but I seem to have overlooked it, or simply forgotten ;-)

So... is it safe to assume that because of this, LR does not really need 10-bit output? I mean, if the dithering works so well on artificial images, it should work nearly perfectly on most real-world photos. The only case where 10-bit would have an advantage is when there are subtle tonal *patterns* (e.g. fabrics, etc.) of small size (a few pixels), which cannot be shown using dithering. But I doubt that such patterns can really be seen even with 10-bit.

6 Messages • 672 Points • 6 years ago

This reply was created from a merged topic originally titled Lightroom: 10 bit monitor support.

I fully understand that you can't give Lightroom too many of the attributes that Photoshop has, but a 10-bit monitor is just superb [using PSCC] and, dare I say, the de facto standard for still digital image editing. It would be really nice to see Lightroom 6 having the same facility. It would also give the two applications greater synergy when viewing/editing.

1 Message • 100 Points • 5 years ago

I am a very satisfied Lightroom user, but the lack of 30-bit color display capability is a major shortcoming for pro-grade software.
I hope this feature will be added soon.

7 Messages • 152 Points • 5 years ago

This reply was created from a merged topic originally titled Lightroom: 10-bit display support.

We photographers need Lightroom to support 10-bit display output so that our advanced displays, like models from Eizo or NEC, can produce 10-bit (over a billion colors) color depth.

2 Messages • 72 Points • 5 years ago

Having just tested 10-bit support in video media with a good (but not ridiculous) rig and seen the difference it makes, I am very keen to get 10-bit support in ALL my photo apps.

Windows 10, NVIDIA 980 Ti, Dell U2713H. I know this monitor is doing slightly fake 10-bit (as many laptops do fake 8-bit), but the improvement is significant.

82 Messages • 1.6K Points • 4 years ago

My Nikon D3 produces RAWs with 14-bit color depth.  With 8-bit processing in LR, I'm throwing away 6 bits of color depth smoothness and gradation.  As others have pointed out, Photoshop has 10-bit color support, so it's not something that Adobe can't do. And nowadays, both Windows and OS X support 10-bit color.

Champion • 2.6K Messages • 33.7K Points • 4 years ago

I'm pretty sure that LR uses a 16-bit internal workspace, while the display is only 8-bit, so the term "8-bit processing" isn't quite right.

Employee • 1.7K Messages • 32.4K Points • 4 years ago

On Windows, it is still 8-bit ProPhotoRGB.

Principal Scientist, Adobe Lightroom

2 Messages • 72 Points • 4 years ago

Also remember that Lightroom manages the color profiles on Windows (a good thing too, as Windows is, still, the village idiot when it comes to color management), so Lightroom *could* do the conversion in the best order to drive an 8-bit monitor. Having said that, even with dithering, there is a distinct (visible) quality improvement going from
  • a 10-bit source to an 8-bit monitor interface
    to
  • a 10-bit source to a 10-bit monitor interface.
Even the projector used by my camera club is 10-bit capable and is visibly better with a full 10-bit interface.

This seems to apply just as much to nature (trees, leaves, stone, etc., especially when distant) as it does to artificial patterns. I assume that our internal image-processing pipeline has evolved to pull as much detail as possible from the photons entering the eyeball.

There is also some interesting stuff happening around video as the extended video formats arrive. It is obvious that the eye-brain processing picks up on some remarkably fine detail given the opportunity. The difference between YUYV (4:2:2) video and full YUV (4:4:4) can be quite dramatic, creating a richer and more immersive image without apparently 'looking' much different. In some ways the result is similar to moving from 8-bit to 10-bit monitor depth.
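
For readers unfamiliar with the notation, here is a tiny sketch (toy numbers, not a real codec) of what 4:2:2 chroma subsampling discards relative to 4:4:4: the chroma channels are stored at half the horizontal resolution and reconstructed afterwards, so rapidly alternating color detail is averaged or collapsed while luma is untouched.

```python
import numpy as np

# One scanline of chroma (Cb) with rapidly alternating color detail.
cb_444 = np.array([0.2, 0.8, 0.2, 0.8, 0.2, 0.8, 0.2, 0.8])   # 4:4:4 keeps every sample

# 4:2:2 stores chroma at half horizontal resolution; here we keep every
# other sample and reconstruct by repetition (real encoders filter, but
# the per-pixel alternation is lost either way).
cb_422 = np.repeat(cb_444[::2], 2)

print(cb_444)   # [0.2 0.8 0.2 0.8 ...]  fine color detail preserved
print(cb_422)   # [0.2 0.2 0.2 0.2 ...]  the alternation has collapsed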

82 Messages • 1.6K Points • 4 years ago

So if I have LR 6.5.1 on Windows, will I get 10-bit display output if I have the right graphics card? If yes, can anyone recommend a suitable card? I don't want to spend "too little", but I don't need the crazy-fast performance that gamers crave.

Thanks.

3 Messages • 102 Points

With the current version of LR, there is no chance of getting a 10-bit display. With Photoshop, an NVIDIA Quadro K1200 is more than enough. Only workstation cards like the Quadro enable 10 bit/channel through OpenGL (which is what Photoshop supports). 10-bit display in LR would be very welcome; they lag considerably behind the Photoshop team.

4 Messages • 90 Points

No need for a Quadro anymore. Mine is an older GTX 670, and it supports 10-bit now. Go to the NVIDIA Control Panel and there is an option to select color depth. You need a 10-bit capable monitor and, most importantly, to connect with a DisplayPort cable.

3 Messages • 102 Points

Nope. You do need a Quadro.
You can enable the color-depth option on your GTX 670, but you will still see banding in Photoshop on your 10-bit capable monitor connected via DisplayPort (check it with the "10-bit test ramp" file).
GeForce cards can output 10-bit color, but they only support it in applications that use DirectX. Quadro cards support 10-bit color via OpenGL. Photoshop uses OpenGL, so you need a Quadro card for 10-bit with Photoshop.
More info here:
http://nvidia.custhelp.com/app/answers/detail/a_id/3011/~/10-bit-per-color-support-on-nvidia-geforce...

4 Messages • 90 Points

Thanks for the info. It's a shame that NVIDIA refuses to support 10-bit OpenGL on mainstream cards. I get the feeling that Intel, MS, NVIDIA, and ATI are the ones holding the PC back from progress.

2 Messages • 82 Points • 4 years ago

This reply was created from a merged topic originally titled Lightroom: Please, add 10 bit per color (30 bit per channel) support.

The application is the only thing missing from a true 10-bit workflow now.

1 Message • 60 Points • 4 years ago

Please add 10- and 12-bit per channel output to the driver.
My monitor can handle 12 bits per channel, and only Photoshop can use it via the 30-bit option.