This would actually be great, because it is the next best thing to 10-bit support.
Edit: Yes, LR develop really seems to use dithering. Here is a small portion of the ramp, with extremely amplified contrast (screenshot from the Develop module, contrast amplified in PS):
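For anyone who wants to repeat that check, here is a minimal sketch of the idea (assuming numpy and Pillow; the file names are just placeholders, not anything Lightroom produces): load a screenshot crop of the ramp and stretch the contrast around its mean, so any dithering shows up as a noise-like pattern instead of clean bands.

```python
# Sketch: reveal dithering in a screenshot of a smooth ramp by stretching
# contrast around the mean (assumes numpy + Pillow; file names are placeholders).
import numpy as np
from PIL import Image

crop = np.asarray(Image.open("screenshot_crop.png").convert("L"), dtype=np.float32)

gain = 40.0                                   # exaggerate tiny level differences
amplified = (crop - crop.mean()) * gain + 128.0
amplified = np.clip(amplified, 0, 255).astype(np.uint8)

# If Develop dithers its 8-bit output, this shows grain rather than clean,
# hard-edged vertical bands.
Image.fromarray(amplified).save("ramp_amplified.png")
```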
Lightroom: 10 bit monitor support.
I fully understand that you can't give Lightroom too many of the attributes that Photoshop has, but a 10-bit monitor is just superb [using PSCC] and, dare I say, the de facto standard for still digital image editing. It would be really nice to see Lightroom 6 having the same facility. It would also give the two applications greater synergy when viewing/editing.
Windows 10, NVIDIA 980 Ti, Dell U2713H. I know this monitor is doing slightly fake 10-bit (just as many laptops do fake 8-bit), but the improvement is significant.
On Mac: Lightroom has always used 16-bit math throughout the Develop module (with ProPhotoRGB as the internal working color space). With GPU acceleration, this requires Lr 6.2 or later.
On Windows: It uses the same 16-bit ProPhotoRGB math for internal image renders, but the image data coming out of the final monitor color transform (for display) is currently limited to 8 bits.
- 10-bit source to 8-bit monitor interface
- 10-bit source to 10-bit monitor interface
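To make the practical difference at that last step concrete, here is a minimal sketch (assuming numpy; this is just the arithmetic of the final quantization, not Adobe's code): quantize smooth working values to an 8-bit and a 10-bit display interface and compare the worst-case rounding error.

```python
# Sketch of where precision is lost at the final display hand-off
# (assumes numpy; illustrative arithmetic only, not Adobe's pipeline code).
import numpy as np

rng = np.random.default_rng(0)
working = rng.random(100_000)        # stand-in for 16-bit working values, normalized 0..1

def to_display(values, bits):
    """Quantize normalized values to an n-bit display interface and map back to 0..1."""
    levels = 2 ** bits - 1
    return np.round(values * levels) / levels

err8 = np.abs(to_display(working, 8) - working).max()
err10 = np.abs(to_display(working, 10) - working).max()
print(f"max quantization error, 8-bit : {err8:.6f}")   # about 1/510 of full scale
print(f"max quantization error, 10-bit: {err10:.6f}")  # about 1/2046, roughly 4x smaller
```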
This seems to apply just as much to nature (trees, leaves, stone etc. - especially when distant) as it does to artificial patterns. I assume that our internal image processing pipeline has evolved to pull as much detail as possible from the photons entering the eyeball.
There is also some interesting stuff happening around video as the extended video formats arrive. It is obvious that the eye-brain processing going on picks up some remarkably fine detail given the opportunity. The difference between YUYV (4:2:2) video and full YUV (4:4:4) can be quite dramatic, creating a richer and more immersive image without apparently 'looking' much different. In some ways the result is similar to moving from 8-bit to 10-bit monitor depth.
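A toy sketch of the 4:2:2 vs 4:4:4 point (assuming numpy; real encoders filter chroma properly, this just averages pixel pairs): 4:2:2 keeps luma at full resolution but halves the horizontal chroma resolution, so a sharp colour edge gets smeared while the brightness stays crisp.

```python
# Toy illustration of 4:2:2 chroma subsampling vs 4:4:4 (assumes numpy;
# real codecs use proper filtering, this simply averages horizontal pairs).
import numpy as np

def subsample_422(y, cb, cr):
    """Keep luma (Y) at full resolution, halve chroma (Cb/Cr) horizontally."""
    cb2 = cb.reshape(cb.shape[0], -1, 2).mean(axis=2)   # average each pixel pair
    cr2 = cr.reshape(cr.shape[0], -1, 2).mean(axis=2)
    # Naive decoder: upsample chroma back by repetition.
    return y, np.repeat(cb2, 2, axis=1), np.repeat(cr2, 2, axis=1)

# One image row: constant luma, but a sharp colour edge between pixels 2 and 3.
y  = np.full((1, 8), 128.0)
cb = np.array([[90.0, 90, 90, 170, 170, 170, 170, 170]])
cr = np.array([[170.0, 170, 170, 90, 90, 90, 90, 90]])

_, cb422, _ = subsample_422(y, cb, cr)
print(cb422)   # the edge is smeared across a pixel pair; in 4:4:4 it would stay sharp
```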
The application is the only thing missing in a true 10-bit workflow now.
Even with a pro 10-bit capable display AND a pro 10-bit graphics card like an NVIDIA Quadro or ATI FirePro, you can only use 8-bit color depth in Lightroom - at least when you're on Windows.
Is that correct? And if yes, WHY? I know I might not be able to tell the difference between two nearly identical colors, no matter whether I have 8 or 10 bits per channel, BUT I do see the difference on a smooth gradient spanning a wide tonal range. Compare a 256-shade (8-bit) greyscale ramp to a 1024-shade (10-bit) one - that's a massive (visible) difference.
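For anyone who wants to make that comparison on their own screen, a minimal sketch (assuming numpy and Pillow; the output file name is just a placeholder): generate the two ramps and count the distinct shades and the step size between them.

```python
# Sketch: 8-bit vs 10-bit greyscale ramps (assumes numpy + Pillow;
# illustration only, nothing to do with Lightroom's internals).
import numpy as np
from PIL import Image

width, height = 1024, 100

# 8-bit ramp: 256 distinct shades across 1024 pixels -> 4-pixel-wide bands.
ramp8 = np.tile(np.linspace(0, 255, width), (height, 1)).round().astype(np.uint8)
Image.fromarray(ramp8).save("ramp_8bit.png")
print("distinct 8-bit shades :", np.unique(ramp8).size)    # 256

# 10-bit ramp: a different shade in every pixel column at this width.
ramp10 = np.round(np.linspace(0, 1023, width)).astype(np.uint16)
print("distinct 10-bit shades:", np.unique(ramp10).size)   # 1024
print("step size: 1/255 = %.2f%%  vs  1/1023 = %.2f%% of full scale"
      % (100 / 255, 100 / 1023))
```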
I'm in the process of considering upgrading my computer/monitor(s)/printer to enable a full 30-bit workflow. I am currently using a PC with Windows 7 SP1, a Radeon R9 390 graphics card and two 8-bit (6+2 in reality) 23" Dell monitors, as well as Lightroom 6. I am relatively new to Lightroom 6 and still learning, but really like it so far - very easy and intuitive software. Having just spent $150 on my single, perpetual license, I am reluctant to change to new photo editing software again.
So, it is clear that Lightroom 6 will not support 10-bit display output on Windows, not even the latest standalone version 6.14. However, after reading this thread I am left with the impression (rightly or wrongly) that Lightroom 6 will support a 10-bit final monitor color transform (for display) on a Mac with a 4K or 5K 10-bit display and the latest OS. Is this impression correct?
What I am really trying to understand here is whether there is a distinct advantage to changing to a Mac with a 4K/5K 10-bit display as opposed to upgrading my existing Windows-based system. Will I unlock the potential for a true 30-bit end-to-end workflow with Lightroom 6, or do I need to change to Photoshop even with a Mac?
Looking at this from a different angle, will I be able to perceive differences in photos generated by Lightroom 6 on a true end-to-end 30-bit system compared to a 24-bit system? How much will this in practice affect color range (i.e., the ability to utilize and display sRGB vs. Adobe RGB spaces), banding in softly graduated tones, etc.? Put another way, does Lightroom 6 already somehow compensate for these issues in the event that it is in fact restricted to 8-bit output even on a Mac?
That being said, my latest Win10 machine with a Quadro card and a 10-bit display just would not get 10-bit through in any application, no matter what. It worked before, but not anymore.