17 Messages • 408 Points • Fri, Nov 27, 2020 6:22 PM
Lightroom Classic: debunking GPU efficiency claims with a hardware benchmark plugin 🤦♂️ (Lua + SDK)
I made a plugin that runs a benchmark to test the efficiency of different hardware configurations. For now, the plugin just plays with the develop sliders of one picture in a consistent loop.
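For reference, here is a minimal sketch of the kind of loop I mean, using the SDK's LrDevelopController (the slider names, test values, and iteration count are illustrative, not my exact plugin code):

```lua
local LrTasks = import 'LrTasks'
local LrApplicationView = import 'LrApplicationView'
local LrDevelopController = import 'LrDevelopController'
local LrDate = import 'LrDate'
local LrDialogs = import 'LrDialogs'

LrTasks.startAsyncTask(function()
  -- LrDevelopController only works while the Develop module is active.
  LrApplicationView.switchToModule('develop')

  -- Sliders with a -100..100 range, so the test values below are valid.
  local sliders = { 'Contrast', 'Highlights', 'Shadows', 'Whites', 'Blacks', 'Clarity' }
  local iterations = 20 -- illustrative; use enough passes for a stable average

  LrDevelopController.resetAllDevelopAdjustments()
  local t0 = LrDate.currentTime()

  for i = 1, iterations do
    for _, slider in ipairs(sliders) do
      -- Alternate between two values so every pass forces a re-render.
      LrDevelopController.setValue(slider, (i % 2 == 0) and 50 or -50)
    end
  end

  -- Caveat: setValue may return before the preview finishes rendering,
  -- so a real benchmark needs care about what it is actually timing.
  local elapsed = LrDate.currentTime() - t0
  LrDialogs.message('Benchmark done', string.format('%.2f seconds', elapsed))
end)
```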
The problem is that with hardware acceleration (GPU) set to FULL (ON), it takes 7.07 seconds, and with hardware acceleration OFF it takes 3.61 seconds. 🤦♂️ I'll add exporting to the plugin soon, but the initial results are not promising.
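For the export test, my plan is to time an LrExportSession roughly like this (a sketch only; the destination path and export settings are placeholders):

```lua
local LrTasks = import 'LrTasks'
local LrApplication = import 'LrApplication'
local LrExportSession = import 'LrExportSession'
local LrDate = import 'LrDate'
local LrDialogs = import 'LrDialogs'

LrTasks.startAsyncTask(function()
  local catalog = LrApplication.activeCatalog()
  local photos = catalog:getTargetPhotos() -- exports the current selection

  local session = LrExportSession {
    photosToExport = photos,
    exportSettings = {
      LR_format = 'JPEG',
      LR_jpeg_quality = 0.9,
      LR_export_destinationType = 'specificFolder',
      LR_export_destinationPathPrefix = 'C:/temp/lr-bench', -- placeholder path
      LR_export_useSubfolder = false,
      LR_collisionHandling = 'overwrite',
    },
  }

  local t0 = LrDate.currentTime()
  session:doExportOnCurrentTask() -- blocks this task until the export completes
  local elapsed = LrDate.currentTime() - t0

  LrDialogs.message('Export benchmark done', string.format('%.2f seconds', elapsed))
end)
```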
So either the LrC devs don't know how to use the GPU, haven't updated the code to actually take advantage of it, are under-resourced (no people, no time, etc.), or just don't give a f*** about LrC?
For real, I am really looking for answers here.
Test with machine #1:
- CPU i7-3770 (from 2013), GPU RTX 2070
- GPU FULL ON: 7.07 seconds
- GPU OFF: 3.61 seconds

Edit: Test with machine #2:
- CPU AMD A10-9600P (4/4) 2.4 GHz, GPU Radeon R5
- GPU FULL ON: 11.64 seconds
- GPU OFF: 5.74 seconds
Both tests with LrC version 10.
Problems • Updated 2 months ago • Edited
Tags: performance, plugin, bug, lua, sdk, api, windows, lightroom
Responses
steven_friedman_7261016 • 16 Messages • 304 Points • 2 months ago
I'll ask the obvious clarifying question: are the stats you posted for the Intel or the AMD CPU? You list two systems but only show one set of numbers. Also, what feature did you test? It's hard for anyone to assess whether there are really any issues here, or exactly which feature is not showing improvement.
I have not always had great performance gains with GPU acceleration; sometimes I have even had lockups when using it. I think there are still a lot of issues that can be traced to specific CPU/graphics board configurations. Quite possibly it involves the motherboard and its chipset as well. I am not an expert here by any means, but I think GPU acceleration is still in its infancy and doesn't really do that much yet.
bill_3305731 • 799 Messages • 9.5K Points • 2 months ago
With the Quadro P2200 on my system, all develop sliders in LrC V10 are now instantaneous on a 4K monitor. They were very slightly slower with a Quadro K1200. V9.4 had already achieved this for most of them, but noise reduction had still lagged by 1/2 to 2 seconds depending on the image. At least one slider that doesn't use the GPU is instantaneous; I don't remember which one. While never slower, GPU fully enabled is as much as 10-20 times faster for some sliders. They are not making up for a slow CPU, as the system is a 6-core Xeon, 3.6/4.0 GHz.
The system currently has two 4K monitors though only one at a time is used for Lightroom. Both are calibrated. I plan to add a 5K monitor to see if Nvidia has fixed their Windows 5K support. Moving 75% more pixels should be somewhat slower but the larger color space would make it worthwhile.
Your results are not consistent with what most others are experiencing. Nor are they consistent with benchmarks such as those conducted by Puget Systems on dozens of video cards. Their benchmark can be downloaded for you to run on your system.
david_golding_1mvs2lwngpf9v • 162 Messages • 2K Points • 2 months ago
An inquiry (either not stated or I overlooked it): what display resolution are you using?
John_R_Ellis • Champion • 5.5K Messages • 97.3K Points • 2 months ago
"In my case, I am using LRc on a 1440p (2560x1440 x10bits)"
Adobe has stated that on displays smaller than 4K, the GPU will often provide little improvement and may slow things down. Its current FAQ says,
"Using a compatible GPU can provide significant speed improvements on high-resolution displays, such as 4K and 5K monitors."
It would be interesting to see your benchmark results on a 4K or 5K display.
(edited)
John_R_Ellis • Champion • 5.5K Messages • 97.3K Points • 2 months ago
"Nor are they consistent with benchmarks such as those conducted by Puget Systems on dozens of video cards."
The Pugetbench benchmark does very limited performance testing of Develop:
"Develop Module Auto WB & Tone (How long the image takes to update after applying the Auto Tone and White Balance)"
So it's not surprising that Alexandre is getting very different results. I don't think Puget's benchmark is sufficient to provide useful predictions of how well LR might run in real life on a particular graphics card.
david_golding_1mvs2lwngpf9v • 162 Messages • 2K Points • 2 months ago
Instead of assuming, although it would probably be a safe assumption given the skill set of the members involved, I will ask.

For the NVIDIA card, have you applied the maximum-performance (power) settings?
https://www.winhelp.info/boost-lightroom-performance-on-systems-with-nvidia-graphics-chip.html
I assume AMD has something similar.

And in Windows, is the power plan set to High Performance?
https://www.winhelp.info/boost-lightroom-performance-on-systems-with-nvidia-graphics-chip.html (Tip 8)
david_golding_1mvs2lwngpf9v • 162 Messages • 2K Points • 2 months ago
An additional inquiry, involving different expectations: lower display resolution.

For the Windows rig, where you can actually select a lower resolution (not just scaling): if you reduce to less than 4K (yes, accepting an expensive monitor running at lower resolution), how does LrC perform in your test?

And when you do, go bonkers with the adjustment brushes to try to slow LrC down.