Lightroom/Camera Raw: Suggestions for selecting a Video Card

Video cards are expensive, and it would be nice to understand how much we should spend when we upgrade. So what is Adobe's statement of direction?
 
Sample question: will we benefit from an 8 GB card over a 4 GB card on a 4K monitor? What about on a dual-monitor system with two 4K displays?
 
For several years it has been true that any modest video card would match the fastest cards for Lightroom. I've personally experienced this, and my results confirm the benchmarks. However, now that the Texture slider and import/export make significant use of the card, is this still true?
 
Puget Systems found a huge improvement for the Texture slider between the fastest card (2080 Ti) and more modest cards. Yet on my old Quadro K1200 4 GB, this slider performs in real time on my 4K monitor: there is no delay, and I can watch the image change as I adjust the slider. By contrast, the Noise Reduction Luminance slider has about a three-second delay on Fuji X-T1 RAW files.
 
Decision example, ONLY for Lightroom and not other applications: I'm getting ready to replace my Quadro K1200 4 GB and am considering the Quadro RTX 4000 8 GB vs. the GTX 2080 Ti 11 GB. Ignoring batch operations and price, will I see any differences between these two cards over the next three years in 1) performance and 2) true 10-bit processing?
Posted by Bill

Eric Chan, Camera Raw Engineer
Official Response
Hi Bill,

Here are some things to keep in mind as you consider your GPU choices.

For a large display (4K and beyond), I definitely recommend more video RAM; 8 GB is better than 4 GB in this case. Lightroom caches a lot of data on the video card during interactive edits, and the bigger the screen, the more data it has to hold.
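
As a rough illustration of why screen size drives VRAM needs, here is a back-of-the-envelope sketch. The channel count, bit depth, and number of resident buffers are illustrative assumptions, not Lightroom's actual buffer layout; the point is simply that memory requirements scale with display area, and real usage runs higher still because of the caching described above.

```python
# Back-of-the-envelope VRAM estimate. The channel count, bit depth,
# and buffer count are illustrative assumptions, not Lightroom's
# actual buffer layout.

def buffer_mib(width, height, channels=4, bytes_per_channel=4, buffers=6):
    """Approximate MiB to keep `buffers` full-screen float buffers resident."""
    return width * height * channels * bytes_per_channel * buffers / 2**20

for name, w, h in [("1080p", 1920, 1080),
                   ("4K UHD", 3840, 2160),
                   ("dual 4K", 2 * 3840, 2160)]:
    print(f"{name:>8}: ~{buffer_mib(w, h):,.0f} MiB")
```

Under these assumptions, a dual-4K setup needs roughly eight times the buffer memory of a single 1080p display, which is why the extra headroom of an 8 GB card pays off on big screens.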

At present, Lightroom's GPU acceleration applies to interactive editing in Develop and to the Enhance Details feature. It does not apply to import/export, so if you invest in a strong GPU you won't see any changes to import/export performance, at least not with current versions of Lightroom. To improve batch preview generation and batch save (export) performance, your best option at the moment is a faster CPU with more cores (within reasonable limits); Puget Systems has extensive benchmarks in this area.
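
To see why core count matters here, note that each export is an independent, CPU-bound job, so the work spreads naturally across cores. Here is a minimal sketch of that pattern; `render_and_save` is a hypothetical placeholder for the per-image work, not a real Lightroom API:

```python
# Why batch export scales with CPU cores: each image is an independent,
# CPU-bound job, so jobs can run one per core in parallel.
# `render_and_save` is a hypothetical placeholder, not a Lightroom API.
from concurrent.futures import ProcessPoolExecutor
import os

def render_and_save(raw_path):
    # Placeholder for the real per-image work: decode, develop, encode, write.
    return raw_path

def batch_export(raw_paths):
    # One worker per core gives near-linear speedup until cores saturate.
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        return list(pool.map(render_and_save, raw_paths))
```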

As far as GPU choices for Develop performance go, we have generally found that this page's benchmark scores are representative of the relative performance gains we observe in our tests:

https://www.videocardbenchmark.net/directCompute.html

Both of the cards you mention (Quadro RTX 4000 and 2080 Ti) are listed there. One possible typo: you wrote "GTX 2080," whereas I think you meant "RTX 2080."
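
For a quick first-order comparison, you can divide the two cards' DirectCompute scores. The numbers below are placeholders only; substitute the current scores from the page above before drawing any conclusions:

```python
# First-order comparison from DirectCompute scores. These values are
# placeholders only -- look up the current numbers on
# videocardbenchmark.net before drawing any conclusions.
scores = {
    "Quadro RTX 4000": 8000,   # placeholder value
    "RTX 2080 Ti": 11000,      # placeholder value
}

ratio = scores["RTX 2080 Ti"] / scores["Quadro RTX 4000"]
print(f"Estimated relative Develop gain: ~{ratio:.2f}x")
```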

Also, be aware that at present, not all of the edit controls in Develop are GPU accelerated. For example, local corrections and spot adjustments are not currently GPU accelerated, so if you typically add a lot of these adjustments to your images and they are the primary performance bottleneck for you, a stronger GPU is not going to help, at least not right away.

Longer term, our direction is to accelerate as much of the processing pipeline as possible (as many sliders and tools as we can) and as many workflows as we can. So, despite the caveats noted above, be assured that GPU acceleration (and performance in general) is a very active area of development for us.