Lightroom Classic: Not using notebook's GPU...(?)


I have an HP Spectre x360 Convertible with a 4K display and a GeForce 940MX GPU. I can select it in Lightroom Preferences > Performance ("Use Graphics Processor"), and Lightroom seems to accept the card (i.e., no complaints or errors). By all evidence, the 940MX is also compatible with Lightroom according to Adobe.

Through the NVIDIA Control Panel, I've set Lightroom to use the high-performance GPU (not the integrated GPU), and it does show some load when I move the Lightroom window around on the Windows desktop, but Lightroom itself uses the CPU for everything, leaving the GPU unused.

When using Lightroom in any module (Develop, Grid, full-screen previews, etc.), the Task Manager shows all work being done by the CPU, and my GPU load and memory remain near zero. However, if I use the adjustment brush in the Develop module, I do see the GPU spike up to 20% or so, but I cannot tell whether that is Windows drawing work or Lightroom.

According to the Adobe GPU FAQ page (Adobe Lightroom GPU Troubleshooting and FAQ), I should see GPU work being done when opening a grid of previously uncached photos, but I only see the CPU at 100% and no GPU activity (same when flipping through photos in the Loupe or full-screen views).
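(Side note, for cross-checking Task Manager: Windows' per-process GPU numbers can be misleading, so a minimal sketch like the following — assuming the NVIDIA driver's nvidia-smi tool is on PATH; the helper names and the sample line in the docstring are my own, not from Adobe — can sample overall GPU utilization while Lightroom works:)

```python
import subprocess

def parse_gpu_sample(line: str) -> dict:
    """Parse one nvidia-smi CSV line like '17, 512' into utilization % and memory MiB."""
    util, mem = (int(x.strip()) for x in line.split(","))
    return {"util_pct": util, "mem_mib": mem}

def query_gpu() -> list:
    """Ask nvidia-smi for current utilization and memory use, one line per GPU."""
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return [parse_gpu_sample(line) for line in out.splitlines()]
```

Calling query_gpu() in a loop while scrolling the Grid should show a non-trivial util_pct if Lightroom is really using the 940MX.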

In general, the bottleneck is by far the CPU (i7-7500U 2.7 GHz, overclocked until too hot), but I thought more work would be offloaded to the GPU. Like everyone else, I want my Lightroom to perform faster, especially when switching between photos. I do generate 1:1 previews upon import and walk away while the CPU maxes out for 10 minutes, but it's still sluggish on my 4K monitor.

The Lightroom "system info" dump shows some oddities: "GPUDeviceEnabled: false"

Any suggestions on how to get more out of my GPU, or make Lightroom not a pain to use? I've tried a "clean install" of the graphics driver, and deleting the preferences file.

Thanks for your help!

Here's my system info dump and screenshot, if it helps.


Lightroom Classic version: 8.2.1 [ 1206193 ]
License: Creative Cloud
Language setting: en
Operating system: Windows 10 - Business Edition
Version: 10.0.17763
Application architecture: x64
System architecture: x64
Logical processor count: 4
Processor speed: 2.9 GHz
Built-in memory: 16266.1 MB
Real memory available to Lightroom: 16266.1 MB
Real memory used by Lightroom: 2790.9 MB (17.1%)
Virtual memory used by Lightroom: 2832.6 MB
GDI objects count: 683
USER objects count: 2279
Process handles count: 2135
Memory cache size: 3058.1MB
Internal Camera Raw version: 11.2.1 [ 159 ]
Maximum thread count used by Camera Raw: 3
Camera Raw SIMD optimization: SSE2,AVX,AVX2
Camera Raw virtual memory: 1495MB / 8133MB (18%)
Camera Raw real memory: 1498MB / 16266MB (9%)
System DPI setting: 240 DPI (high DPI mode)
Desktop composition enabled: Yes
Displays: 1) 3840x2160, 2) 1920x1200, 3) 1920x1200
Input types: Multitouch: Yes, Integrated touch: Yes, Integrated pen: Yes, External touch: No, External pen: No, Keyboard: Yes
Graphics Processor Info:
DirectX: NVIDIA GeForce 940MX (
Application folder: C:\Program Files\Adobe\Adobe Lightroom Classic CC
Library Path: C:\Users\username\Pictures\Lightroom\Lightroom Catalog-2.lrcat
Settings Folder: C:\Users\username\AppData\Roaming\Adobe\Lightroom
Installed Plugins:
1) AdobeStock
2) Facebook
3) Flickr
4) Nikon Tether Plugin
Config.lua flags: None
Adapter #1: Vendor : 10de
Device : 134d
Subsystem : 82c1103c
Revision : a2
Video Memory : 2010
Adapter #2: Vendor : 8086
Device : 5916
Subsystem : 82c1103c
Revision : 2
Video Memory : 80
Adapter #3: Vendor : 1414
Device : 8c
Subsystem : 0
Revision : 0
Video Memory : 0
AudioDeviceIOBlockSize: 1024
AudioDeviceName: Speaker/Headphone (Realtek High Definition Audio(SST))
AudioDeviceNumberOfChannels: 2
AudioDeviceSampleRate: 48000
Build: 12.1x4
Direct2DEnabled: false
GL_VERSION: 4.6.0 NVIDIA 430.39
GPUDeviceEnabled: false
OGLEnabled: true
GL_EXTENSIONS: .... a lot of misc extensions, probably not important
bitpusher5000hd

Posted 1 year ago
David Converse
I'm gonna make a wild guess that running three displays maxes out the GPU and it can't handle more? Try your experiment with one display and see what happens.
bitpusher5000hd
That sounded interesting, so I tried it out. I told Windows 10 NOT to extend my desktop onto the other screens, and then unplugged the other monitors from my notebook. Neither change made a difference in LR's GPU use.

Interestingly, the LR "system info" still showed 3 video adapters (same as I copied above) after unplugging all the cables and restarting LR, so I don't know if that matters. I'm not going to spend the time reinstalling LR to try to remove them from the "system info," as I'll end up plugging them back in anyway.

Here's a snapshot of the Task Manager while opening an old folder of un-cached photos... all CPU, no GPU.

John R. Ellis, Champion
"Interestingly, the LR "system info" still showed 3 video adapters (same as I copied above) after unplugging all cables,"

That's normal for a Windows system with two hardware adapters. The PCI vendor IDs of your adapters are:

Adapter #1: Vendor : 10de
NVIDIA Corporation

Adapter #2: Vendor : 8086
Intel Corporation

Adapter #3: Vendor : 1414
Microsoft Corporation

Unplugging the display cables doesn't remove the hardware or its drivers.
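(The ID-to-vendor mapping above can be sketched as a tiny lookup — the three entries are the vendor IDs from the dump; the note about the Microsoft adapter being the software Basic Render Driver is my assumption, consistent with its 0 MB of video memory:)

```python
# PCI vendor IDs seen in the system-info dump above; the full registry
# is the community-maintained pci.ids database.
PCI_VENDORS = {
    0x10DE: "NVIDIA Corporation",
    0x8086: "Intel Corporation",
    0x1414: "Microsoft Corporation",  # likely the software Basic Render Driver
}

def vendor_name(vendor_id_hex: str) -> str:
    """Map a hex vendor ID string (e.g. '10de') to a vendor name."""
    return PCI_VENDORS.get(int(vendor_id_hex, 16), "unknown vendor")
```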

Simon Chen, Principal Computer Scientist

Your system info reports that Lr is using "DirectX: NVIDIA GeForce 940MX (" for GPU acceleration. It's just that Lr does not yet utilize the GPU resource to the fullest extent possible.

Your performance issue could be that your 4-core laptop driving a 4K display is marginally underpowered. I am not sure about the type and size of the raw files you process, which could also make a big difference.