Lightroom: GPU & Multiprocessor acceleration

  • 105
  • Idea
  • Updated 3 years ago
  • Implemented
  • (Edited)
It would be great if Lightroom 4 had GPU support for CUDA-enabled video cards, similar to the Mercury Playback Engine in CS5. That would really speed up performance! Every couple of seconds helps when you are editing thousands of files.

Saving 2 seconds between images comes out to an hour of saved time when you are editing 2000 images. I have 12 GB of RAM and a 12-core processor at my disposal, and sometimes I have 4 seconds between images.

Multiprocessor support would be great as well!

ken staab

  • 2 Posts
  • 0 Reply Likes

Posted 8 years ago


Diko

  • 79 Posts
  • 12 Reply Likes
In addition to GPGPU features, there is some advice for speeding up Lightroom that I found here:

http://feedback.photoshop.com/photosh...

rumovoneuropa

  • 1 Post
  • 1 Reply Like
Me again: please speed up importing and converting to DNG by processing multiple pictures at once. My CPU utilisation is always between 45 and 65%, averaging 50%. With 3 raw pictures at a time it would be 100%, I think. I have 8 GB of RAM (29% used overall), a quad-core Phenom II at 3.2 GHz, an SSD, and a USB 3.0 / UHS card reader, so there is certainly no bottleneck on the hardware side (aside from the two-core performance of the CPU).
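The parallel-import idea described here can be sketched in a few lines. This is a minimal illustration only: `convert_to_dng` is a hypothetical stand-in for the real RAW-to-DNG conversion, not anything Lightroom exposes.

```python
# Sketch of processing several RAW files at once instead of one after
# another. convert_to_dng is a hypothetical placeholder for the real
# RAW -> DNG conversion step.
from concurrent.futures import ThreadPoolExecutor

def convert_to_dng(raw_path):
    # Placeholder: a real converter would decode the RAW file and write
    # a DNG; here we just compute the output file name.
    return raw_path.rsplit(".", 1)[0] + ".dng"

def import_batch(raw_paths, workers=3):
    # With ~3 files in flight, a CPU sitting at ~50% utilisation on a
    # single file has headroom to approach 100%. (A real CPU-bound
    # converter would use processes rather than threads to sidestep
    # Python's GIL.)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(convert_to_dng, raw_paths))
```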

I'd also like to point to the HSA advantages of the new Kaveri CPUs from AMD (maybe Intel will adopt it in the future?). It doubles performance when decoding JPEG, for example. Maybe you are working on OpenCL support anyway, so perhaps you could implement it too ;). I, at least, would buy a Kaveri and the new Lightroom that supports it :).

http://www.extremetech.com/computing/...

Pascal Renauld

  • 2 Posts
  • 1 Reply Like
This reply was created from a merged topic originally titled
Optimize Lightroom using the graphic card capabilities.


I have Windows 7 64-bit, two 7200 rpm hard disks, an Intel Core i5, catalog and cache on the non-OS disk, a 27-inch wide-gamut display, an X-Rite i1 Pro calibration device, and an NVIDIA Quadro K600 graphics card. I do not understand why Lightroom does not use the graphics card's capabilities to improve how it works and give some slack to the CPU. The only ways to optimize Lightroom are, for the most part, to use less of the software's capabilities instead of programming it properly, or to upgrade to a Core i7. At least give us the option to use the card when we have a good one like the Quadro series.

Jason Dunn

  • 8 Posts
  • 3 Reply Likes
It's unfortunate that in the three years since this topic was started, Lightroom 5.5 is still pretty sluggish even on great hardware. I'm somewhat terrified of how bad it will get when I move to a newer DSLR later this year with images at 2x the resolution.

I'll note, though, that when I was using Lightroom on an older Macbook Air, it felt quite a bit faster than my Core i7 desktop, which surprised me. Is the Windows version somehow hobbled compared to the OS X version?

Eric Chan, Camera Raw Engineer

  • 627 Posts
  • 132 Reply Likes
Are you using a SSD on your Windows machine?

Jason Dunn

  • 8 Posts
  • 3 Reply Likes
Are you asking me, Eric? If so, yes: I have my OS on one SSD and my Lightroom catalog and all the images in that catalog on another SSD. I even created a 12 GB RAM drive this weekend and loaded my entire catalog into that. Slight speed increase in some functions, but the worst slowdowns are still there and seem to be CPU- and software-bound.

I can only describe it as... latency. It's like it takes the program 1-2 seconds to start processing the develop settings on an image before it will complete them and show you the image.

seanhoyt-dot-art

  • 315 Posts
  • 51 Reply Likes
My basic understanding is that performance isn't changing (or is getting worse) because they are optimizing at the same pace as they are adding great features. Really, these features are *almost* on par with PS. The more adjustments you apply serially to an image in the Dev module, the longer it takes for them to be applied EACH time you land on the image.

With 32-64GB of ram, couldn't LR pre-render a ton of raw files in the develop module using idle CPU cores while I do adjustments on a file? Or could I prerender a set then come back and work from RAM?

Or, can't you guys give me proxy DNGs like Smart Previews where I'm loading and executing on small files then applying to full DNG on export?

Rob Cole

  • 4831 Posts
  • 387 Reply Likes
Once upon a time, Lr kept the last 4 renderings for develop in what was called the "negative" cache - selecting any of the last 4 images in develop module was instant (no loading - ready in ~0.0 seconds). Now, it only keeps the last one. My sense is that it was a bother for them (kinda like concurrent rendering for slideshow), and so they nix'd it.

+1 for resurrection (with ability to do "look-ahead" instead of just "remember-behind"), with some good scheme for controlling ram consumption (currently Lr just uses virtual memory (disk as ram) when out of real memory, nearly always bringing the entire system down when that happens - not a good scheme in my opinion).
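The "negative cache" behavior described above amounts to a small LRU cache keyed by image. A minimal sketch, assuming a hypothetical `render` callback standing in for the raw pipeline (nothing here is Lightroom's actual code):

```python
# Minimal LRU "negative cache": keep the last N rendered images in RAM
# so revisiting any of them is instant; evict the least recently used
# when full. The render callback is a hypothetical stand-in for the
# develop-module rendering pipeline.
from collections import OrderedDict

class RenderCache:
    def __init__(self, capacity=4):
        self.capacity = capacity
        self._cache = OrderedDict()  # image_id -> rendered pixels

    def get(self, image_id, render):
        if image_id in self._cache:
            self._cache.move_to_end(image_id)   # mark most recently used
            return self._cache[image_id]        # cache hit: no re-render
        pixels = render(image_id)               # slow path: full render
        self._cache[image_id] = pixels
        if len(self._cache) > self.capacity:
            self._cache.popitem(last=False)     # evict least recently used
        return pixels
```

The "look-ahead" variant would simply call `get()` for the next few images from an idle background thread, bounding RAM by the same `capacity` knob.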

seanhoyt-dot-art

  • 315 Posts
  • 51 Reply Likes
Yeah, I'd like Adobe to stop catering to the lowest common denominator. I just put together a massive system that blows away the new Mac Pros and it's still not that impressive in LR. Make the min specs a top-end machine with 32GB+ RAM and optimize for the professionals who use this software to chew through literally 100k images a year, not the weekend-warrior crowd that wants to get by with Celeron + 1GB RAM Dell machines using 5400 rpm drives and 15" monitors. Sure, it would make the software cost more as they'd lose a great deal of their install base, but I'd pay thousands for software that actually was intelligent enough to utilize my hardware vs putting it to waste.

Michael Robertson

  • 17 Posts
  • 15 Reply Likes
Amen Sean. I've not commented on these forums for three years, and overall LR performance has waned. I'm routinely generating thousands of RAW files per day on many assignments and all my ongoing hardware investments can't compensate for LR's sluggish performance. All the wasted seconds add up, and clients want even faster turnaround (social media for starters).

Call it Lightroom Pro... charge more and I'll happily pay for it with no new features just 3X-5X performance.

Scott Martin

  • 199 Posts
  • 17 Reply Likes
Right on, Sean and Michael. I don't know what the solutions are, but I'd pay a lot more for an LR Pro as well. Importing, screen redraw, and fast switching between images in the Library and Develop modules without interim previews are pain points for me. I have super-fast cards, card readers, the latest hardware, SSDs, etc., and the bottleneck is LR. Capture One and Photo Mechanic have clear advantages in these areas - not that I want to use them, but they prove the potential for this to work better. Would love to provide further feedback and help test...

seanhoyt-dot-art

  • 315 Posts
  • 51 Reply Likes
I think one of the devs commented before about GPU support and how it would be hard to make it work with many systems. So.... have certified GPUs for the pro version of LR: I have dual R9 280x which are OpenCL beasts.

BTW... http://www.musemage.com/ <-GPU

Michael Robertson

  • 17 Posts
  • 15 Reply Likes
Sean to your point about volume. I've shot approx. 350,000 images since 2011, all RAW. I'd have no problem complying with a short list of certified hardware... it's that important.

LR began as a stealth development effort to address the workflow demands of high-volume shooting; that raison d'être seems to have been lost along the way. These threads stand as sorry testimony to the lack of progress in this critical area.

seanhoyt-dot-art

  • 315 Posts
  • 51 Reply Likes
I don't know, like I said, they've continually added new features such as perspective/lens correction which are actually useful. The adjustment brushes, clone tools are quite close to perfect though I'd rather have PS-like healing vs LR's version. That said, it seems like a PERFECT time for LR to really hit the 'optimize' button and dive deeper into parallelizing and look-ahead rendering like what you see in FCPx when you idle the mouse. (in FCPx my CPU and GPU are rarely idle but the UX is snappy). I mean, look at a typical 1000-shot export where you can export 100% in 10 minutes, 500/500 (parallel) in 5 minutes and 333/333/333 in 3 minutes. Sure, the 3-up split does practically lock up the CPU but I'm making coffee. This has been exposed and talked about for years.... give me the TURBO button!
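The "3-up split" export described here is essentially manual work partitioning: divide the queue into N chunks and run them concurrently. A sketch of that idea (with a hypothetical `export_one` standing in for exporting a single image):

```python
# Sketch of splitting a 1000-shot export into N parallel jobs, as the
# post above does by hand. export_one is a hypothetical stand-in for
# exporting a single image.
from concurrent.futures import ThreadPoolExecutor

def chunk(items, n):
    # Split items into n roughly equal contiguous chunks.
    size, rem = divmod(len(items), n)
    out, start = [], 0
    for i in range(n):
        end = start + size + (1 if i < rem else 0)
        out.append(items[start:end])
        start = end
    return out

def parallel_export(images, export_one, jobs=3):
    # Each worker exports one contiguous chunk; results come back in
    # the original order because Executor.map preserves input order.
    def export_chunk(c):
        return [export_one(img) for img in c]
    with ThreadPoolExecutor(max_workers=jobs) as pool:
        results = pool.map(export_chunk, chunk(images, jobs))
    return [r for batch in results for r in batch]
```

With 3 jobs this mirrors the post's arithmetic: three workers each take a third of the queue, so wall-clock time drops toward a third as long as the CPU has spare cores.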

jpresumido

  • 2 Posts
  • 3 Reply Likes
HoHoHo! So, it is very hard... HaHaHa. Learn from this company.
It seems that Adobe is struggling!

Ope Gato Cedo

  • 30 Posts
  • 4 Reply Likes
Actually, there are some simple improvements which could be implemented very easily. For example, the import of photos does not use all available resources.

Importing e.g. 1200 RAW images and converting them into DNG on a Mac Pro with 6 cores at 3.5 GHz and 32 GB of RAM, from an SD card into a RAID array... you would simply expect quick progress.

But Lightroom is slow. It doesn't make use of the RAM, and it doesn't make use of the CPU power either. It seems it processes each image in sequence instead of processing 8-16 images in parallel. It's frustrating to see that there is no actual bottleneck, just Lightroom processing the images sequentially. It could also start generating previews during the import, but no... the preview generation starts _after_ the import of all images.

- A big issue on import: while importing you cannot work in Lightroom, so this slowness actually blocks your workflow.
- Another big issue on export: it takes so much time to export e.g. 100 images. At least there it doesn't block your workflow.

If Lightroom would at least process multiple images in parallel, that would compensate for the lack of multithreading in the processing itself.
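The overlap this post asks for (previews being built while the import is still running) is a classic producer/consumer pipeline. A minimal sketch, where `import_image` and `build_preview` are hypothetical stand-ins for the real steps:

```python
# Sketch of overlapping import with preview generation: a background
# worker builds each preview as soon as its image lands, instead of
# waiting for the whole import to finish. import_image and
# build_preview are hypothetical stand-ins for the real steps.
import queue
import threading

def pipelined_import(paths, import_image, build_preview):
    done = queue.Queue()
    previews = []

    def preview_worker():
        while True:
            item = done.get()
            if item is None:          # sentinel: import is finished
                break
            previews.append(build_preview(item))

    worker = threading.Thread(target=preview_worker)
    worker.start()
    for p in paths:
        done.put(import_image(p))     # previews start while import continues
    done.put(None)                    # tell the worker to stop
    worker.join()
    return previews
```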

Scott Martin

  • 199 Posts
  • 17 Reply Likes
I completely hear you about the need for multiple import sessions. As we speak/type I'm importing ~2000 images and waiting to start another session of another 2000+ images. If I could just start them both it would be a huge time saver for me. As it is I need to stick around at the studio another 35 minutes so I can start the second session and then head home.

I know a lot of people that would like to be able to start one session with one set of parameters (file renaming, keywords, etc.) on a group of images, and then immediately start a second import session on a second group of images.

Your card reader could be a bottleneck, but either way multiple import sessions (from several card readers if necessary) could help!

Myron MacLeod

  • 5 Posts
  • 0 Reply Likes
This reply was created from a merged topic originally titled Nvidia Optimus on Laptop cannot use Dedicated GPU for Lightroom 5?.

I just purchased a powerful laptop with hopes of taking advantage of the dedicated GTX 860M GPU while using Lightroom 5, instead of the Intel CPU graphics. The laptop uses Nvidia Optimus technology and I have a feeling that Lightroom 5 isn't an application profile that is supported yet. Photoshop does appear to be supported in this situation, though.
Does anyone have insight into this issue? Below are the steps I've taken in Windows 8.1.

- Opened the Nvidia Control Panel
- Selected Manage 3D Settings
- Selected the Program Settings tab
- In the first drop-down field, Adobe Lightroom is not available (Adobe Photoshop is)
- Clicked 'Add' and selected lightroom.exe
- Switched the second drop-down field to "High-performance Nvidia processor"
- Clicked Apply
- Opened Lightroom 5; the dedicated GPU did not turn on

I applied these same steps for Adobe Photoshop and everything worked fine.

I spoke with an Adobe tech support rep and he mentioned that Lightroom hasn't implemented support for this yet. Why would Photoshop be supported and not Lightroom?

Windows 8.1
Intel i7-4720 cpu
Nvidia GTX 860m 4GB gpu
8GB DDR3 memory


SBP

  • 3 Posts
  • 0 Reply Likes
Vote for this: 
https://feedback.photoshop.com/photoshop_family/topics/lightroom-preload-in-develop-module

I think this is an easy solution for the 2-second delay that can be seen on every machine.