Lightroom Classic: GPU upgrade without performance improvement for Luminance slider

  • Question
  • Updated 3 months ago
  • (Edited)
After all the harassment I've received about my GPU (K1200), I upgraded to a P2200, which is rated at over three times the speed. My focus is on the sliders, which, except for Luminance Noise Reduction, all operate in real time on my 4K monitor even with the K1200.
 
It turns out that this particular slider makes very little use of the GPU. It drives all 12 CPU threads to 100%.
 
Yes, I did make sure that the GPU preference is still set correctly. 
 
Since the K1200 was never running at 100% busy, installing a faster GPU couldn't help with GPU-related tasks; the old card always had spare capacity.
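To make the argument concrete, here's a minimal sketch of that reasoning in Python. The stage times are made-up numbers, not measurements of Lightroom Classic's actual pipeline; the point is just that when the CPU stage dominates each slider update, speeding up the GPU stage barely moves the total (Amdahl's law).

```python
# Hypothetical per-update timings for one slider change (seconds).
# Illustration only -- these are NOT measured Lightroom Classic numbers.
CPU_STAGE = 1.00   # luminance NR work that stays on the CPU threads
GPU_STAGE = 0.05   # small GPU-assisted portion (upload, display, etc.)

def update_time(gpu_speedup: float) -> float:
    """Total time for one update if only the GPU stage gets faster."""
    return CPU_STAGE + GPU_STAGE / gpu_speedup

for speedup in (1.0, 3.0, 10.0):   # K1200 baseline, roughly a P2200, fantasy card
    print(f"GPU {speedup:4.1f}x faster -> {update_time(speedup):.3f} s per update")

# Prints 1.050, 1.017, and 1.005 s: even a 10x faster GPU only shaves
# about 4% off the update, because the CPU stage is the bottleneck.
```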
 
Oh well, at least I made a contribution to the economy. 
 
Now to wait for a 128-thread, 20 GHz processor with 64 memory channels. I'll probably die first.

Bill

  • 252 Posts
  • 36 Reply Likes

Posted 3 months ago


Simon Chen, Principal Computer Scientist

  • 1738 Posts
  • 601 Reply Likes
Have you updated to the latest LrClassic 9.2.x? Based on our internal tests, Luminance NR performance has improved a lot compared with earlier versions.

Bill

  • 252 Posts
  • 36 Reply Likes
Yes, I'm on 9.2.1. Since all the other sliders were already operating in real time with the K1200 (at 4K), there is no visible improvement in them. Of course, looking at the GPU monitor, GPU usage goes down with the faster card: where the worst case used to hit about 40% busy, it now tops out around 10%. If a process is not waiting on the GPU, then installing a faster one can't help; that is just the way hardware works.
 
It's just like horsepower in a car engine. If a car with a 100 HP engine uses only 25 HP while cruising at 50 MPH, replacing the engine with a much more powerful one doesn't change the car's speed.
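As a sanity check on those utilization numbers, here's a back-of-the-envelope sketch, assuming the vendor's "over 3x" rating roughly applies to this workload (that factor is an assumption, not a measurement):

```python
# Rough check: same GPU work per slider update, finished proportionally faster,
# so the busy percentage should scale down by about the speedup factor.
OLD_BUSY = 0.40        # worst-case busy fraction observed on the K1200
RATED_SPEEDUP = 3.2    # assumed effective P2200/K1200 ratio for this workload

new_busy = OLD_BUSY / RATED_SPEEDUP
print(f"Expected P2200 busy: ~{new_busy:.0%}")   # ~12%, close to the ~10% seen
```

So the drop from roughly 40% to roughly 10% busy is exactly what you'd expect when the amount of GPU work per update stays the same and only the card gets faster.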
 
Not that the Luminance slider is slow, it's just not real time: for a 12 MP D300S raw file converted to DNG, an update typically takes 0.5 to 1.5 seconds.
 
Since all 12 threads (3.6 GHz) are running at 100%, they can't deliver data to the GPU any faster, so nothing is ever waiting on the GPU.
 
At least there will likely be fewer driver bugs with the newer card, though they have been rare over the past six months.
 

Simon Chen, Principal Computer Scientist

  • 1738 Posts
  • 601 Reply Likes

Bill

  • 252 Posts
  • 36 Reply Likes
I was the originator of that thread. It doesn't contradict anything I said above.
 
Adobe has done some terrific work in making use of the GPU and hopefully will continue. Monitoring shows that the Luminance slider makes almost no use of the GPU, and that is a HARD problem. I have a mathematician friend who worked on image processing for the military, and he shared how difficult it is to parallelize image enhancements. Since Adobe has managed to parallelize this on the CPU, they are probably close to getting it working on the GPU. Odds are that some developer already has it running on a development machine and just hasn't gotten the funding to move it into the development stream.