Lightroom Classic: GPU for processing turned off by an update

Noticed a slowdown with Develop slider adjustments that others were not experiencing. The problem started sometime over the past year; I can't pinpoint it any closer than that.
 
Checked Use Graphics Processor and it was set to Auto. On my system, with Auto selected, Lightroom chose to use the GPU only for the display. Switched to Custom and checked Use GPU for image processing. Most of the sliders are now real time, and Luminance Noise Reduction is down from 3 seconds to a maximum of 1 second.
 
Folks were blaming my video card (Quadro K1200), but it seems that this setting was the problem.
 
Having set this years ago, I hadn't thought to check it again.

Bill



Victoria Bampton - Lightroom Queen, Champion

Glad to hear it's working well for you. When you looked at it previously, it was a simple checkbox that only enabled display processing - the same as the Auto setting was doing for you - so that wouldn't explain it getting slower. However, the new image processing option was added in 8.4 (August 2019), and its main aim is to speed up the sliders, so it's great to hear that you're seeing the benefits now.

Bill

But sad that Adobe thought I wouldn't want the GPU to speed up the Develop module. 
 
Thanks, 
Bill 


Victoria Bampton - Lightroom Queen, Champion

The image processing setting is much more likely to hit big bugs, so they've erred on the side of caution for graphics card/driver combinations they haven't been able to test. Many cards are on full automatically, but apparently not that one.

Stefan Klein

The folks who were blaming your video card also told you to take a look at Task Manager. If you had done that, you would have seen that the GPU isn't being used as much as one would expect if it were set to do the image calculation.
Those folks were also told that LR was set to use the GPU for image processing, which it obviously was not.

And by the way, 1 second is still pretty slow for Luminance noise reduction, which is down to your K1200.
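
For reference, here's one way to watch GPU utilization from outside Lightroom while dragging a slider - a minimal sketch in Python, assuming an NVIDIA card (such as the K1200) and that the nvidia-smi command-line tool is on the PATH. Task Manager's GPU tab gives the same picture without any code.

    import subprocess
    import time

    def gpu_utilization_percent() -> int:
        # Ask nvidia-smi for the current GPU utilization (first GPU only).
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        return int(out.splitlines()[0].strip())

    if __name__ == "__main__":
        # Sample once a second for 30 seconds; drag a Develop slider such as
        # Luminance Noise Reduction during this window and watch the numbers.
        # Consistently low readings suggest the GPU isn't doing the image processing.
        for _ in range(30):
            print("GPU utilization: {}%".format(gpu_utilization_percent()))
            time.sleep(1)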


Bill

I've ordered a P2200, which is over 3 times faster, so we'll see if that helps. As only the Luminance Noise Reduction slider is a bit slow, I can live with that. All the other sliders are real time, like they were when the K1200 was driving a FHD monitor. Just needed something upon which to spend my stimulus check. 
 
Later I'll probably get the next-generation RTX 4000. As it is the same price as the P4000, which is much slower, I'm expecting a comparable price for a card that should be 25-50% faster. Though if the P2200 delivers real-time sliders, I'll save the money and put it toward a lens instead.