Lightroom Classic: No. 1 Performance Improvement, my vote

  • 1
  • Idea
  • Updated 3 months ago
  • (Edited)
Assuming a reasonably powerful computer, not an i5 with 8GB of RAM and an embedded GPU, the number one performance improvement (for me) would be real-time sliders in the Develop module. These sliders are already close to real time, though slightly slower with the larger Fuji raw files (timed with Nikon D300s and Fuji X-T3 raw images):
  • Exposure
  • Contrast 
  • Texture 
  • Clarity 
  • Dehaze
  • Vibrance 
  • Sharpening Radius 
A bit slow:
  • Sharpening Amount
Snail's pace:
  • Luminance Noise Reduction, 2-3 second delay per adjustment 
Technical Baseline:
  • 2020-03-29 
  • Windows 10 Pro, all updates
  • Lightroom 9.2, all updates
  • HP Z440 Workstation
  • Intel Xeon E5-1650, 6-core, 12-thread, 3.6/4.0 GHz
  • 64GB ECC RAM
  • NVMe system drive on the PCIe bus, Samsung Pro
  • SATA drive dedicated to Lightroom on the PCIe bus, RAPID mode enabled (library, catalog and cache), Samsung EVO 2TB  
  • Microsoft Defender for security 
  • Nvidia Quadro K1200 
  • dual 4K monitors 
AND, no quality reductions please. HDR stacking is now excellent in quality, though a bit slow; faster would be nice, but not at the cost of quality. It's not perfect: stacking for noise reduction does not actually reduce noise, but Adobe doesn't claim that it does.
Bill

  • 252 Posts
  • 36 Reply Likes

Posted 3 months ago

  • 1
Stefan Klein

  • 163 Posts
  • 85 Reply Likes
Your GPU (Quadro K1200) is pretty slow compared to what's possible today:
https://gpu.userbenchmark.com/Compare/Nvidia-Quadro-K1200-vs-Nvidia-RTX-2080-Ti/m28490vs4027

I for one have a GTX 970 and an i7-5930K (6-core @ 4 GHz) and Luminance Noise Reduction is real time, but I "only" have a 1200x1600 px monitor (and 2 additional 1280x1024 monitors). GPU usage goes up to about 80% when I drag the Luminance Noise Reduction slider.
So on your 4K display you need something faster than my GTX 970 (which is already faster than your Quadro K1200).
Bill

  • 252 Posts
  • 36 Reply Likes
There are 2 possible uses for a GPU in a program like the Lightroom Develop module:
  1. calculate the effect of slider adjustments
  2. final render of the image for display on the monitor
Adobe has stated that it only uses the GPU for calculations for the Texture slider, no others. The GPU usage you are seeing is use 2, and because you only need to render less than 1/4 as many pixels, you get a much faster response. All the sliders performed in real time for me on FHD monitors too.
 
Furthermore, the benchmark you reference is for gaming and is almost meaningless for Lightroom. Here is the one that Adobe recommends for Lightroom users, though it is important to keep in mind that this only covers the "send to monitor" phase of the process. For processing, only the Texture slider uses the GPU.
 
https://www.videocardbenchmark.net/directCompute.html 


 
Rikk Flohr, Official Rep

  • 7535 Posts
  • 1726 Reply Likes
Reference: "Adobe has stated that it only uses the GPU for calculations for the Texture slider, no others." What is your source? 
Stefan Klein

  • 163 Posts
  • 85 Reply Likes
Bill, I partly disagree with you.

Yes, the benchmark is gaming-related, but it gives a hint of general performance. If you look at the comparison you posted, the result is similar: the RTX 2080 Ti is about 10 times as fast as your K1200.

As to the 2 use cases: it's not true anymore that only the Texture slider uses the GPU for image calculations. Nearly every slider uses the GPU for both cases (you can ask the developers, it's a fact) and you can check that very easily. Go to the prefs and disable the GPU for calculations, using it only for display purposes. You will see a MASSIVE difference in GPU usage, no matter what slider you use. Look at Task Manager for GPU usage (you have to configure it correctly, because there are many different measures of GPU usage) or use any other tool, like desktop widgets etc.


Just as an example, with Luminance Noise Reduction:

1: GPU active for display purposes only, dragging the slider:
CPU usage: 65%
GPU usage: 3%

2: GPU active for display and computation, dragging the slider:
CPU usage: 25%
GPU usage: 85%

I could do this for every slider with similar results.
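
If you want to log the numbers instead of eyeballing Task Manager, here is a rough sketch (assuming an Nvidia card with nvidia-smi on the PATH; the 30-second window is arbitrary) that polls GPU utilization once per second while you drag a slider:

    import subprocess
    import time

    # Poll overall GPU utilization once per second for 30 seconds while you
    # drag a slider in the Develop module. nvidia-smi reports utilization as
    # a percentage over its sampling window.
    for _ in range(30):
        reading = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu,utilization.memory",
             "--format=csv,noheader,nounits"],
            text=True,
        ).strip()
        print(time.strftime("%H:%M:%S"), reading)
        time.sleep(1)

Run it once with 'Use GPU for image processing' enabled and once with it disabled, and compare the readings for the same slider.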

Additionally, I did another test for you with a Fuji X-T3 raw file:

I spread LR across my 3 monitors so that the image in the Develop module nearly filled all 3 monitors; of course the panels also needed a little space.
That's a total of about 3600x1100 px. Not as many pixels as your 4K monitor, but nearly 50%.


GPU active for display and computation, dragging the Luminance Noise Reduction slider:

CPU usage: 30%
GPU usage: 90%

Still REAL TIME!





(Edited)
Stefan Klein

  • 163 Posts
  • 85 Reply Likes
Bill,

I simply know it, although I am not allowed to tell you more...

Why don't you just ask a developer or try it yourself? It's pretty simple. Change your prefs, look at Task Manager or any other tool you want to use, and take a look at the on-screen response.

All of those things are simple and prove that more than just the Texture slider is supported.

I don't need an understanding of marketing, but of some technical basics. And it would not be the first time, by the way, that Adobe marketing is less than optimal.


That's it for me, end of discussion. I wanted to help you; it's up to you to draw your own conclusions.


My recommendation is to buy a faster GPU. Yours is slow. According to your link, even my GTX 970 is about 4 times as fast and is already nearly maxed out for some LR interactions.
No surprise that your GPU isn't up to the task on a 4K monitor for LR compute functionality.

By the way, regarding the link Jerry gave you:
ACR and LR share the same engine. But then you probably don't even believe that, right...?
(Edited)
Bill

  • 247 Posts
  • 32 Reply Likes
You are wasting everybody's time. Let's just end this. 
Stefan Klein

  • 163 Posts
  • 85 Reply Likes
Of course, Bill, the one with arguments and the will to help is wasting the time of someone who ignores facts, has no knowledge, and has no clue how to test things... yeah... sure...

Yes, let's end it, and have fun with your "2-3 sec. delay".
Victoria Bampton - Lightroom Queen, Champion

  • 5640 Posts
  • 2250 Reply Likes
Bill, Stefan is correct. GPU improvements were made in 8.4 and apply to many Develop sliders, not just Texture. Much of the information in that article also applies to Lightroom, as it's based on the camera raw engine.
Stefan Klein

  • 163 Posts
  • 85 Reply Likes
Thanks Victoria!
But be careful not to waste Bill's precious time... ;)
Bill

  • 252 Posts
  • 36 Reply Likes
The #1 complaint about Lightroom is performance; it is on forums all over the web. If it is true that 8.4 extended GPU support to many sliders, why is Adobe keeping it a secret?

Todd Shaner, Champion

  • 1945 Posts
  • 653 Reply Likes
Your RTX 4000 PassMark G3D Mark is incorrect (see below).

"I don't think the file size affects the time it takes to move the pixels to the monitor. I've compared performance between 6MP and 26MP files and performance seems to be identical.

That may be the case now, but I remember years ago (LR 2) that moving from a 6 megapixel Canon 300D to a 21 megapixel 5D Mk II brought LR to its knees.

" It's the number of pixels to be displayed that matters."
To test that I maximized the Develop module loupe by closing the top, bottom, and left side panels with Full Screen mode and adjusted the Luminance slider with a 50 mp 5DS raw file. There is no perceptible delay. Next I unchecked 'Use GPU for Image Processing' and there was ~1.0 second delay. So the GPU is clearly making a big difference!
Just as an FYI the Secondary display does NOT use the GPU for any image processing function. That coupled with a larger loupe size will greatly slow down screen updates with your dual 4K display setup.


Bill

  • 252 Posts
  • 36 Reply Likes
There are two uses for a GPU: processing and presentation. Unfortunately there is no way for us, without access to specialized monitoring tools, to measure the time spent on each on our machines.
 
For processing on the GPU, the number of megapixels has a huge impact: double the number of pixels and the processing time roughly doubles (it's more complicated than that; it's closer to 4x for something like noise reduction). Until recently, Lightroom made essentially no use of the GPU for processing, so processing time depended only on the CPU.
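
A back-of-the-envelope illustration of that scaling (the baseline size and exponents are assumptions for illustration, not measured Lightroom figures):

    # Rough relative cost as pixel count grows, using the rule of thumb above:
    # simple per-pixel sliders scale ~linearly with megapixels, while something
    # like noise reduction scales closer to the square (double the pixels -> ~4x).
    BASE_MP = 12.0  # hypothetical baseline image size in megapixels

    for mp in (12.0, 24.0, 48.0):
        simple = mp / BASE_MP          # e.g. Exposure, Contrast
        heavy = (mp / BASE_MP) ** 2    # e.g. Luminance Noise Reduction
        print(f"{mp:4.0f} MP: simple slider ~{simple:.0f}x, noise reduction ~{heavy:.0f}x")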
 
Presentation depends on:
  1. the amount of preparation done by Lightroom
  2. the amount of image adjustment performed by the GPU 
Some time back I was working on a project which had a performance problem when displaying the final result of our processing. In photo terms, we were sending about 100 megapixels' worth of data per image to the video card. As the monitor was FHD (~2 MP), the GPU had to scale that 100 MP down to 2 MP. In the days of slow video cards, this was taking 10-15 seconds per frame. By resizing the image within our software to match the resolution of the monitor, the GPU processing time became invisible to the human eye (< 1/60th of a second).
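
The same idea in miniature, as a sketch using Pillow (this is not how Lightroom itself is implemented; file names and sizes are hypothetical):

    from PIL import Image

    # Downsample a large render to the monitor's pixel dimensions before
    # handing it off for display, so the video card never has to scale a
    # ~100 MP frame down to ~2 MP on every refresh.
    MONITOR_W, MONITOR_H = 1920, 1080  # FHD target; use 3840x2160 for 4K

    img = Image.open("full_res_render.tif")               # hypothetical 100 MP source
    img.thumbnail((MONITOR_W, MONITOR_H), Image.LANCZOS)  # in-place, keeps aspect ratio
    img.save("display_sized.tif")
    print("display-sized image:", img.size)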
 
As I'm not a Lightroom developer, I don't know whether Lightroom processes the image to be displayed to match the size of the window on the monitor. But based on some very rough measurements I've made with small and large files, my guess is that Lightroom is doing this.
  
BTW: G3D Mark is essentially useless for evaluating video card performance for Lightroom. G2D Rating underestimates the effect of a GPU on Lightroom performance. PassMark GPU Compute Benchmark Chart is far more accurate for Lightroom, Photoshop and Premiere. 
 
A response from a Lightroom developer would be interesting; we could all learn a lot more.
 
This is the link provided by Adobe for comparing video cards for their image / video processing software: 

 https://www.videocardbenchmark.net/directCompute.html
(Edited)
Todd Shaner, Champion

  • 1945 Posts
  • 653 Reply Likes
"BTW: G3D Mark is essentially useless for evaluating video card performance for Lightroom. G2D Rating underestimates the effect of a GPU on Lightroom performance. "

I agree the G2D Mark rating underestimates the effect of the GPU on Lightroom performance, but the G3D Mark and Compute Benchmark are pretty close to the same. I'm willing to bet you will see an improvement close to the Compute Benchmark ratio in Luminance Noise Reduction and other slider delays. Based on that, the Quadro P2000 should provide a significant improvement, but you may need the Quadro RTX 4000 with a 4K display to completely eliminate slider lag. Another tip is to make sure the image you are editing is set to Process Version 5, which uses all available graphics acceleration.


GPU               Compute Benchmark   G3D Mark        G2D Mark
Quadro K1200      1263 (base)         2769 (base)     615 (base)
Quadro P2000      3102 (245%)         6747 (244%)     759 (123%)
Quadro RTX 4000   6488 (513%)         16920 (611%)    1042 (169%)
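
For what it's worth, the percentages are just each card's score divided by the K1200 baseline; a quick sketch of that arithmetic using the scores above:

    # Relative performance vs. the Quadro K1200 baseline, from the table above.
    base = {"Compute": 1263, "G3D": 2769, "G2D": 615}
    cards = {
        "Quadro P2000":    {"Compute": 3102, "G3D": 6747,  "G2D": 759},
        "Quadro RTX 4000": {"Compute": 6488, "G3D": 16920, "G2D": 1042},
    }
    for name, scores in cards.items():
        ratios = {k: f"{100 * v / base[k]:.0f}%" for k, v in scores.items()}
        print(name, ratios)  # rounds to roughly the percentages in the table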
Bill

  • 252 Posts
  • 36 Reply Likes
Silly me, I was looking at absolute numbers rather than percentages. Still, I would ignore G3D, as it is only accidental that the relationships are similar (correlation is not causation). Thanks for the reminder about Process Version 5. I try to remember to change that each time I edit an image; perhaps it is time to just update the whole library, as I've not done that in a couple of years. I recognize that it will change some already edited images, but I'll re-edit them before I use them again anyway.
 
BTW, a comparison of the M4000 to the P2000 is interesting. While the PassMark score of the P2000 is 15% higher, the M4000 has 8 GB of RAM vs. 5 GB, 60% more CUDA cores, 30+% greater memory bandwidth, and is 40% cheaper. I wonder which would be faster on a 4K monitor? A problem with the PassMark scores (Compute, G3D or whatever) is the wide range of systems upon which the tests were run.
 
The responsible thing for me to do would be to download the benchmark and run it on my machine against my current card and whatever card I settle on.
 
With the next version of the 4000 series cards imminent and the much faster RTX selling for the same price as the P, I think it might be wise to wait for that one and just go cheap for now. 
Todd Shaner, Champion

  • 1945 Posts
  • 653 Reply Likes
Puget Systems ran a GPU benchmark with LR 8.2 for Enhanced Details performance. That's probably the closest you'll find to a published LR benchmark on GPU performance. They tested with 16 MP .RAF (Fuji X-Pro1), 22 MP .CR2 (Canon EOS 5D Mk III), and 45 MP .NEF (Nikon D850) raw files. This should correlate closely with improvement in the Develop module Global controls slider lag.

https://www.pugetsystems.com/labs/articles/Lightroom-Classic-CC-2019-Enhanced-Details-GPU-Performanc...