Adobe Photoshop Family
3 Messages • 92 Points • Wed, Jan 13, 2021 7:10 PM

Lightroom Desktop: Graphics card

I am looking to buy a new computer just to use for photography. A friend has one for sale, and it has everything on your list except the GPU with DirectX support. It shows: Intel(R) UHD Graphics 620. Will this work with Lightroom?

Responses

795 Messages • 9.5K Points • 6 days ago

Whose list?

 

  • As the GPU (currently) is only used by the Develop module, those sliders will lag a bit. The 620 is inadequate if you are using anything above an FHD monitor.
  • 16 GB of RAM is a good minimum; anything less is problematic. If you intend to use the machine for a few years, you'll want 32 GB.
  • The minimum processor should be the hyperthreading version of the i7.

3 Messages • 92 Points

Thank you. Any advice is appreciated. 

602 Messages • 11.1K Points

@bill_3305731 As pointed out in another post, your info about where the GPU is used is incorrect. The GPU comes into play in more areas than just the Develop module.

Lightroom Classic GPU FAQ

How does Lightroom Classic use the graphics processor?

When configured (Preferences > Performance), Lightroom Classic can use a compatible graphics processor (also called a graphics card, video card, or GPU) to speed up tasks of displaying and adjusting images in the Develop module, the Library module's Grid view, Loupe view, and Filmstrip. Enhance Details is also accelerated by the GPU. Using a compatible GPU can provide significant speed improvements on high-resolution displays, such as 4K and 5K monitors.

795 Messages • 9.5K Points

Technically true, but the use outside of the Develop module is trivial, at least on Windows. I've monitored all those non-Develop functions and can't get GPU usage above 8%, typically only 2-3%, and that just represents the GPU driving the monitor. In fact, just displaying a tab in Chrome or Edge uses more of the GPU. So if there is any GPU usage outside of Develop, at least on Windows, it is too low to be measured.
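For anyone who wants to reproduce that kind of measurement, here is a rough sketch of one way to read per-process GPU utilization on Windows 10+ from the same "GPU Engine" performance counters Task Manager displays. The typeperf tool and the counter path are standard Windows; the example PID and the parsing details are my own assumptions, not anything Adobe documents.

import csv
import subprocess

def gpu_utilization_for_pid(pid: int) -> float:
    """Sum the 'Utilization Percentage' of every GPU engine instance owned by one PID."""
    # Take one sample of every GPU Engine counter instance, in CSV form.
    out = subprocess.run(
        ["typeperf", r"\GPU Engine(*)\Utilization Percentage", "-sc", "1"],
        capture_output=True, text=True, check=True,
    ).stdout
    # Keep only the CSV rows; typeperf also prints status text around them.
    rows = list(csv.reader(line for line in out.splitlines() if line.startswith('"')))
    header, sample = rows[0], rows[1]   # row 0: counter paths, row 1: one sample
    total = 0.0
    # Column 0 is the timestamp. Instance names embed the owning PID
    # (e.g. ...pid_12345_..._engtype_3D), and one process owns several engines.
    for path, value in zip(header[1:], sample[1:]):
        if f"pid_{pid}_" in path and value.strip():
            total += float(value)
    return total

if __name__ == "__main__":
    LIGHTROOM_PID = 12345   # hypothetical; look up the real PID in Task Manager first
    print(f"GPU engine utilization for PID {LIGHTROOM_PID}: "
          f"{gpu_utilization_for_pid(LIGHTROOM_PID):.1f}%")

Run it while dragging sliders in Develop versus scrolling the Library grid and you can compare the numbers yourself.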

 

47 Messages • 632 Points

Bill, this was somewhat true in the past, but it is rapidly changing as Adobe adds functionality that relies on the GPU. As you noted below, 4K monitors can also have a huge impact, as can increasing megapixel counts as cameras evolve.

795 Messages • 9.5K Points

I tested before entering my prior response, so my statement is still accurate. I'm using 4K monitors. I agree with you that someday we'll see improvement outside of the Develop module, but currently it is invisible.

47 Messages • 632 Points • 6 days ago

Others may have a different opinion, but I don't think the built-in Intel GPU is powerful enough for serious LR work.

Also, Adobe's minimum requirements are more of a marketing spec than real-world requirements: yes, LR will run, but if your computer only meets the minimums, performance will suffer, especially if you're doing any serious work in LR or PS, and even more so if you have both apps open simultaneously.

3 Messages • 92 Points

Thank you. I appreciate your advice. 

795 Messages • 9.5K Points

@walter_thirion 

 

When my powerful Xeon desktop died, I used HP and LG laptops. Both machines had (different) Nvidia MX video cards, but Lightroom wouldn't run with either card activated, so I turned them off and relied on just the integrated GPU.

 

Was going from a fast Xeon with a medium-speed video card to relatively slow i7 machines without usable video cards noticeable? Yes. But Lightroom was still usable, especially on the calibrated internal FHD display. It was too slow to use on the external 4K monitor.

 

Everybody has their own definition of usable. As I'm not being paid for my work, I found all functions usable. 

 

A much bigger issue is that very few laptop displays, even after calibration, deliver high-quality color rendition. More do all the time, but they aren't cheap, and they tend to be on heavy gaming systems with less-than-stunning battery life.

  
