bill_3305731's profile

1.2K Messages • 13K Points

Sun, Mar 29, 2020 7:36 PM


Lightroom Classic: No. 1 Performance Improvement, my vote

Assuming a reasonably powerful computer (not an i5 with 8GB of RAM and an integrated GPU), the number one performance improvement, for me, would be real-time sliders in the Develop module. These sliders are already close, though slightly slower with the larger Fuji raw files (timed with Nikon D300s and Fuji X-T3 raw images):
  • Exposure
  • Contrast 
  • Texture 
  • Clarity 
  • Dehaze
  • Vibrance 
  • Sharpening Radius 
A bit slow:
  • Sharpening Amount
Snail's pace:
  • Luminance Noise Reduction, 2-3 second delay per adjustment 
Technical Baseline:
  • 2020-03-29 
  • Windows 10 Pro, all updates
  • Lightroom 9.2, all updates
  • HP Z440 Workstation
  • Intel Xeon E5-1650, 6-core, 12-thread, 3.6/4.0 GHz
  • 64GB ECC RAM
  • NVMe system drive on the PCIe bus, Samsung Pro
  • SATA SSD dedicated to Lightroom (library, catalog, and cache), RAPID mode enabled, Samsung EVO 2TB
  • Microsoft Defender for security 
  • Nvidia Quadro K1200 
  • dual 4K monitors 
AND, no quality reductions, please. HDR stacking is now excellent in quality, though a bit slow; faster would be nice, but not at the cost of quality. It isn't perfect: attempting noise-reduction stacking does not reduce noise, but Adobe doesn't claim that it does.

Responses

187 Messages • 4K Points

1 y ago

Your GPU (Quadro K1200) is pretty slow compared to what's possible today:
https://gpu.userbenchmark.com/Compare/Nvidia-Quadro-K1200-vs-Nvidia-RTX-2080-Ti/m28490vs4027

I for one have a GTX 970 and an i7-5930K (6-core @ 4 GHz), and Luminance Noise Reduction is real time, but I "only" have a 1200x1600px monitor (plus two additional 1280x1024 monitors). GPU usage goes up to about 80% when I drag the Luminance Noise Reduction slider.
So on your 4K display you need something faster than my GTX 970 (which is already faster than your Quadro K1200).

1.2K Messages • 13K Points

1 y ago

There are two possible uses for a GPU in a program like the Lightroom Develop module:
  1. calculating the effect of slider adjustments
  2. the final render of the image for display on the monitor
Adobe has stated that it uses the GPU for calculations only for the Texture slider, no others. The GPU usage you are seeing is use 2, and because you need to render fewer than 1/4 as many pixels, you get a much faster response. All the sliders performed in real time for me on FHD monitors too.
 
Furthermore, the benchmark you reference is for gaming and is almost meaningless for Lightroom. Here is the one Adobe recommends for Lightroom users. It is important to keep in mind, though, that this applies only to the "send to monitor" phase of the process; for processing, only the Texture slider uses the GPU.
 
https://www.videocardbenchmark.net/directCompute.html 

187 Messages • 4K Points

1 y ago

Bill, I partly disagree with you.

Yes, the benchmark is gaming-related, but it gives a hint of general performance. If you look at the comparison you posted, the result is similar: the RTX 2080 Ti is about 10 times as fast as your K1200.

As to the two use cases: it's no longer true that only the Texture slider uses the GPU for image calculations. Nearly every slider uses the GPU for both cases (you can ask the developers; it's a fact), and you can check that very easily. Go to the preferences and disable the GPU for calculations, leaving it enabled only for display purposes. You will see a MASSIVE difference in GPU usage, no matter what slider you use. Look at Task Manager for GPU usage (you have to configure it correctly, because there are many different measures of GPU usage) or use any other tool, such as desktop widgets.
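If you prefer the command line over Task Manager, the same check can be scripted. A minimal sketch, assuming an NVIDIA card with `nvidia-smi` (which ships with the driver) on the PATH; the `sample_gpu` helper and its parameters are my own illustration, not anything from Lightroom or Adobe:

```python
# Poll GPU utilization while you drag a slider in the Develop module.
# Parsing is kept separate from sampling so it can be tested without a GPU.
import subprocess
import time

def parse_utilization(csv_text):
    """Parse output of:
    nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader,nounits
    (one integer percentage per line) into a list of ints."""
    return [int(line.strip()) for line in csv_text.strip().splitlines()
            if line.strip()]

def sample_gpu(seconds=5, interval=0.25):
    """Sample GPU utilization every `interval` seconds for `seconds` total."""
    samples = []
    end = time.time() + seconds
    while time.time() < end:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            text=True)
        samples.extend(parse_utilization(out))
        time.sleep(interval)
    return samples

if __name__ == "__main__":
    samples = sample_gpu()
    print(f"peak {max(samples)}%, mean {sum(samples) / len(samples):.0f}%")
```

Run it once with GPU image processing enabled and once with the GPU limited to display only, dragging the same slider each time; per the numbers below, you would expect the peak to drop from around 85% to single digits in the display-only case.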


Just an example with Luminance Noise Reduction, while dragging the slider:

1. GPU active for display purposes only:

CPU usage: 65%
GPU usage: 3%

2. GPU for display and computation:

CPU usage: 25%
GPU usage: 85%

I could do this for every slider with similar results.

Additionally, I ran another test for you with a Fuji X-T3 raw file:

I spread LR across my 3 monitors so that the image in the Develop module nearly filled all three, with the panels also taking a little space.
That's a total of about 3600x1100px: not as many pixels as your 4K monitor, but nearly 50% of it.


GPU for display and computation, while dragging the slider:

CPU usage: 30%
GPU usage: 90%

Luminance Noise Reduction slider: STILL REAL TIME!

1.2K Messages • 13K Points

1 y ago

The #1 complaint about Lightroom is performance; it's on forums all over the web. If this were true about 8.4 affecting many sliders, why is Adobe keeping it a secret?