2 Messages • 652 Points • Fri, Aug 5, 2011 2:11 AM
Lightroom: Add 10-bit/channel color display support
Lightroom: add 10-bit support for an end-to-end 10-bit workflow, that is, Application -> Driver -> Graphics Card -> Monitor for true 10-bit color. At the moment, the application is the only piece missing. Photoshop CS5 supports 10-bit color, so why not Lightroom?
Ideas • Updated 2 years ago
Tags: wide gamut, 10-bit, lightroom
Responses
Official Solution
SimonChen (Employee) • 1.7K Messages • 32.4K Points • 5 years ago
On Mac: Lightroom has always used 16-bit math throughout the Develop module (with ProPhotoRGB as the internal working color space). GPU acceleration requires Lr 6.2 or later.
On Windows: Lightroom uses the same 16-bit ProPhotoRGB math for internal image renders, but the image data produced by the final monitor color transform (for display) is currently limited to 8 bits.
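The practical difference between those two output depths is just the number of distinct levels the display path can carry. A minimal sketch of the arithmetic (illustrative Python, not Adobe's code):

```python
# Illustrative only: distinct code values per channel at each display
# bit depth, and the normalized brightness jump between adjacent values.
# Fewer levels means wider, more visible bands in smooth gradients.

def levels(bits: int) -> int:
    """Distinct code values per channel at a given bit depth."""
    return 2 ** bits

def step(bits: int) -> float:
    """Normalized brightness jump between adjacent code values."""
    return 1.0 / (levels(bits) - 1)

for bits in (8, 10, 16):
    print(f"{bits}-bit: {levels(bits):>5} levels, step = {step(bits):.6f}")
```

A 10-bit link carries four times as many levels as an 8-bit one (1024 vs 256), which is why a 16-bit internal render can still band when squeezed through an 8-bit display transform.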
Principal Scientist, Adobe Lightroom
lee_jay_fingersh • 946 Messages • 13.8K Points • 9 years ago
There are thousands of feature requests, so obviously that isn't all.
brianna_byman • 1 Message • 60 Points • 9 years ago
lrsuer24 • 427 Messages • 7.7K Points • 9 years ago
This would actually be great, because it is the next best thing to 10-bit support.
Edit: Yes, the Lr Develop module really does seem to use dithering. Here is a small portion of the ramp, with extremely amplified contrast (screenshot from the Develop module, contrast amplified in PS):
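For readers wondering what the dithering buys: before rounding to 8-bit, a renderer can add sub-step noise so that an in-between level scatters across the two neighbouring bands instead of pinning to one. A rough sketch of the idea (illustrative Python, not Lightroom's actual pipeline):

```python
# Illustrative sketch of why dithering hides banding: quantizing a smooth
# value straight to 8-bit produces a flat band, while adding noise of up
# to half a code value before rounding keeps the *average* level smooth.
import random

random.seed(0)

def quantize(v: float, bits: int = 8) -> int:
    """Round a 0..1 value to the nearest code value at `bits` depth."""
    top = (1 << bits) - 1
    return round(v * top)

def dithered_quantize(v: float, bits: int = 8) -> int:
    """Same, but with +/- half a code value of noise added first."""
    top = (1 << bits) - 1
    noise = random.random() - 0.5
    return min(top, max(0, round(v * top + noise)))

# A level exactly between two 8-bit codes: plain rounding always picks
# one band, but the dithered pixels average back to the true value.
v = 100.5 / 255
plain = [quantize(v) for _ in range(10000)]
dith = [dithered_quantize(v) for _ in range(10000)]
print(set(plain))             # a single band
print(sum(dith) / len(dith))  # close to 100.5
```

That scattering of pixels between two adjacent bands is exactly the speckle pattern visible in an amplified-contrast screenshot of a dithered ramp.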
lyndon_heap • 6 Messages • 672 Points • 7 years ago
Lightroom: 10-bit monitor support.
I fully understand that you can't give Lightroom too many of Photoshop's attributes, but a 10-bit monitor is just superb [using PS CC] and, dare I say, the de facto standard for still digital image editing. It would be really nice to see Lightroom 6 gain the same facility. It would also give the two applications greater synergy when viewing and editing.
giorgia_battecca • 1 Message • 100 Points • 5 years ago
I hope this feature will be added soon.
ali_fatemi • 7 Messages • 152 Points • 5 years ago
We photographers need Lightroom to support 10-bit display output, so that advanced displays such as models from Eizo or NEC can deliver their full 10-bit, billion-color depth.
iain_malcolm • 2 Messages • 72 Points • 5 years ago
Windows 10, Nvidia 980 Ti, Dell U2713H. I know this monitor does slightly fake 10-bit (much as many laptops do fake 8-bit), but the improvement is significant.
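For context, that "slightly fake" 10-bit is frame-rate control (FRC): the panel alternates between two adjacent 8-bit levels over successive frames so the eye averages them into an in-between level. A simplified sketch (the real temporal pattern is more elaborate than this consecutive-frame version):

```python
# Illustrative FRC sketch: emulate one 10-bit level on an 8-bit panel
# by showing a mix of two adjacent 8-bit levels across 4 frames.

def frc_frames(level_10bit: int, n_frames: int = 4) -> list[int]:
    """Map a 10-bit level (0..1023) onto n_frames of 8-bit frame values."""
    base, frac = divmod(level_10bit, 4)  # 10-bit = 8-bit * 4 + remainder
    # Show the higher 8-bit level in `frac` out of every 4 frames.
    return [min(255, base + (1 if i < frac else 0)) for i in range(n_frames)]

frames = frc_frames(514)           # 514 = 128 * 4 + 2
print(frames)                      # [129, 129, 128, 128]
print(sum(frames) / len(frames))   # 128.5, i.e. 514 in 10-bit units
```

The same trick run on a 6-bit panel is how many laptop screens fake 8-bit, which is the comparison made above.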
phil_burton_7735307 • 83 Messages • 1.6K Points • 5 years ago
steve_sprengel (Champion) • 2.6K Messages • 33.7K Points • 5 years ago
SimonChen (Employee) • 1.7K Messages • 32.4K Points • 5 years ago
Principal Scientist, Adobe Lightroom
iain_malcolm • 2 Messages • 72 Points • 5 years ago
- 10-bit source to an 8-bit monitor interface
- 10-bit source to a 10-bit monitor interface
Even the projector used by my camera club is 10-bit capable and is visibly better with a full 10-bit interface.
This seems to apply just as much to nature (trees, leaves, stone, etc., especially when distant) as it does to artificial patterns. I assume that our internal image-processing pipeline has evolved to pull as much detail as possible from the photons entering the eyeball.
There is also some interesting stuff happening in video as the extended video formats arrive. It is obvious that the eye-brain processing going on picks up remarkably fine detail given the opportunity. The difference between YUYV (4:2:2) video and full YUV (4:4:4) can be quite dramatic, creating a richer and more immersive image without apparently 'looking' much different. In some ways the result is similar to moving from 8-bit to 10-bit monitor depth.
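To make the 4:2:2 vs 4:4:4 point concrete: 4:2:2 stores the two chroma channels at half horizontal resolution, so colour detail one pixel wide is lost. A toy sketch along one row (this version simply drops and duplicates samples; real encoders filter):

```python
# Illustrative sketch of 4:2:2 chroma subsampling along one row:
# one chroma sample is kept per horizontal pixel pair, then spread
# back over both pixels on reconstruction.

def subsample_422(chroma_row: list[int]) -> list[int]:
    """Keep one chroma sample per horizontal pair (4:2:2 along a row)."""
    return chroma_row[::2]

def reconstruct_422(samples: list[int], width: int) -> list[int]:
    """Duplicate each kept sample back over its pixel pair."""
    out: list[int] = []
    for s in samples:
        out.extend([s, s])
    return out[:width]

# Alternating pixel-wide colour detail is exactly what 4:2:2 loses:
row = [10, 240, 10, 240, 10, 240]
kept = subsample_422(row)
print(reconstruct_422(kept, len(row)))  # [10, 10, 10, 10, 10, 10]
```

Luma keeps full resolution in both formats, which is why 4:2:2 footage rarely "looks" obviously different even though fine colour detail has been flattened, much like the subtle 8-bit vs 10-bit comparison above.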
phil_burton_7735307 • 83 Messages • 1.6K Points • 5 years ago
Thanks.
denis_protopopov • 2 Messages • 82 Points • 4 years ago
The application is now the only thing missing from a true 10-bit workflow.
4eycz2ew78qnj • 1 Message • 60 Points • 4 years ago
My monitor can handle 12 bits per channel, and only Photoshop can make use of that, via its 30-bit display option.