Lightroom Classic: Slower performance in 8.4 vs 8.3

  • Problem
  • Updated 2 months ago
  • Solved
I'm clearly seeing slower performance in the Develop module since upgrading to Lightroom 8.4 from 8.3. Many actions in Develop are slower, such as moving from image to image, changing exposure, adjusting white balance, etc.

GPU acceleration is enabled (though it doesn't seem to matter whether I toggle it on or off), but only the Intel UHD Graphics 630 is showing up instead of the Radeon Pro WX 4150 that I'd like Lightroom to use. And although "Use GPU for display" is checked, the second option, "Use GPU for image processing", is disabled.

It appears that the Radeon Pro WX 4150 isn't being detected by Lightroom.  I'm almost positive it was in 8.3 and prior versions.


How can I troubleshoot the slower performance in 8.4?

Dell 7730 laptop with 64 GB RAM.  Windows 10.0.17763.



Here's the system info from Lightroom:

Lightroom Classic version: 8.4 [ 201908011719-03751b60 ]

License: Creative Cloud

Language setting: en

Operating system: Windows 10 - Business Edition

Version: 10.0.17763

Application architecture: x64

System architecture: x64

Logical processor count: 12

Processor speed: 2.5 GHz

Built-in memory: 65293.4 MB

Real memory available to Lightroom: 65293.4 MB

Real memory used by Lightroom: 7619.5 MB (11.6%)

Virtual memory used by Lightroom: 7864.4 MB

GDI objects count: 760

USER objects count: 2303

Process handles count: 2800

Memory cache size: 18900.9MB

Internal Camera Raw version: 11.4 [ 273 ]

Maximum thread count used by Camera Raw: 5

Camera Raw SIMD optimization: SSE2,AVX,AVX2

Camera Raw virtual memory: 3747MB / 32646MB (11%)

Camera Raw real memory: 3749MB / 65293MB (5%)

System DPI setting: 96 DPI

Desktop composition enabled: Yes

Displays: 1) 1707x960, 2) 2560x1600, 3) 1152x2048

Input types: Multitouch: No, Integrated touch: No, Integrated pen: No, External touch: No, External pen: No, Keyboard: No



Graphics Processor Info: 

DirectX: Intel(R) UHD Graphics 630 (25.20.100.6519)







Application folder: C:\Program Files\Adobe\Adobe Lightroom Classic

Library Path: D:\[ LIGHTROOM ]\Catalog Main\Lightroom Main Catalog CC.lrcat

Settings Folder: C:\Users\mail\AppData\Roaming\Adobe\Lightroom



Installed Plugins: 

1) jb ListView

2) Loupedeck2

3) LR/Mogrify 2



Config.lua flags: None



Adapter #1: Vendor : 8086

Device : 3e9b

Subsystem : 8321028

Revision : 0

Video Memory : 128

Adapter #2: Vendor : 1002

Device : 67e8

Subsystem : 8321028

Revision : 0

Video Memory : fec

Adapter #3: Vendor : 1414

Device : 8c

Subsystem : 0

Revision : 0

Video Memory : 0

AudioDeviceIOBlockSize: 1024

AudioDeviceName: Speakers (Realtek USB Audio)

AudioDeviceNumberOfChannels: 2

AudioDeviceSampleRate: 48000

Build: 12.1x4

Direct2DEnabled: false

GL_ACCUM_ALPHA_BITS: 16

GL_ACCUM_BLUE_BITS: 16

GL_ACCUM_GREEN_BITS: 16

GL_ACCUM_RED_BITS: 16

GL_ALPHA_BITS: 8

GL_BLUE_BITS: 8

GL_DEPTH_BITS: 24

GL_GREEN_BITS: 8

GL_MAX_3D_TEXTURE_SIZE: 2048

GL_MAX_TEXTURE_SIZE: 16384

GL_MAX_TEXTURE_UNITS: 8

GL_MAX_VIEWPORT_DIMS: 16384,16384

GL_RED_BITS: 8

GL_RENDERER: Intel(R) UHD Graphics 630

GL_SHADING_LANGUAGE_VERSION: 4.50 - Build 25.20.100.6519

GL_STENCIL_BITS: 8

GL_VENDOR: Intel

GL_VERSION: 4.5.0 - Build 25.20.100.6519

GPUDeviceEnabled: false

OGLEnabled: true

GL_EXTENSIONS: GL_3DFX_texture_compression_FXT1 GL_AMD_depth_clamp_separate GL_AMD_vertex_shader_layer GL_AMD_vertex_shader_viewport_index GL_ARB_ES2_compatibility GL_ARB_ES3_1_compatibility GL_ARB_ES3_compatibility GL_ARB_arrays_of_arrays GL_ARB_base_instance GL_ARB_bindless_texture GL_ARB_blend_func_extended GL_ARB_buffer_storage GL_ARB_cl_event GL_ARB_clear_buffer_object GL_ARB_clear_texture GL_ARB_clip_control GL_ARB_color_buffer_float GL_ARB_compatibility GL_ARB_compressed_texture_pixel_storage GL_ARB_compute_shader GL_ARB_conditional_render_inverted GL_ARB_conservative_depth GL_ARB_copy_buffer GL_ARB_copy_image GL_ARB_cull_distance GL_ARB_debug_output GL_ARB_depth_buffer_float GL_ARB_depth_clamp GL_ARB_depth_texture GL_ARB_derivative_control GL_ARB_direct_state_access GL_ARB_draw_buffers GL_ARB_draw_buffers_blend GL_ARB_draw_elements_base_vertex GL_ARB_draw_indirect GL_ARB_draw_instanced GL_ARB_enhanced_layouts GL_ARB_explicit_attrib_location GL_ARB_explicit_uniform_location GL_ARB_fragment_coord_conventions GL_ARB_fragment_layer_viewport GL_ARB_fragment_program GL_ARB_fragment_program_shadow GL_ARB_fragment_shader GL_ARB_fragment_shader_interlock GL_ARB_framebuffer_no_attachments GL_ARB_framebuffer_object GL_ARB_framebuffer_sRGB GL_ARB_geometry_shader4 GL_ARB_get_program_binary GL_ARB_get_texture_sub_image GL_ARB_gl_spirv GL_ARB_gpu_shader5 GL_ARB_gpu_shader_fp64 GL_ARB_half_float_pixel GL_ARB_half_float_vertex GL_ARB_indirect_parameters GL_ARB_instanced_arrays GL_ARB_internalformat_query GL_ARB_internalformat_query2 GL_ARB_invalidate_subdata GL_ARB_map_buffer_alignment GL_ARB_map_buffer_range GL_ARB_multi_bind GL_ARB_multi_draw_indirect GL_ARB_multisample GL_ARB_multitexture GL_ARB_occlusion_query GL_ARB_occlusion_query2 GL_ARB_pipeline_statistics_query GL_ARB_pixel_buffer_object GL_ARB_point_parameters GL_ARB_point_sprite GL_ARB_polygon_offset_clamp GL_ARB_post_depth_coverage GL_ARB_program_interface_query GL_ARB_provoking_vertex GL_ARB_query_buffer_object 
GL_ARB_robust_buffer_access_behavior GL_ARB_robustness GL_ARB_robustness_isolation GL_ARB_sample_shading GL_ARB_sampler_objects GL_ARB_seamless_cube_map GL_ARB_seamless_cubemap_per_texture GL_ARB_separate_shader_objects GL_ARB_shader_atomic_counter_ops GL_ARB_shader_atomic_counters GL_ARB_shader_bit_encoding GL_ARB_shader_draw_parameters GL_ARB_shader_group_vote GL_ARB_shader_image_load_store GL_ARB_shader_image_size GL_ARB_shader_objects GL_ARB_shader_precision GL_ARB_shader_stencil_export GL_ARB_shader_storage_buffer_object GL_ARB_shader_subroutine GL_ARB_shader_texture_image_samples GL_ARB_shading_language_100 GL_ARB_shading_language_420pack GL_ARB_shading_language_packing GL_ARB_shadow GL_ARB_spirv_extensions GL_ARB_stencil_texturing GL_ARB_sync GL_ARB_tessellation_shader GL_ARB_texture_barrier GL_ARB_texture_border_clamp GL_ARB_texture_buffer_object GL_ARB_texture_buffer_object_rgb32 GL_ARB_texture_buffer_range GL_ARB_texture_compression GL_ARB_texture_compression_bptc GL_ARB_texture_compression_rgtc GL_ARB_texture_cube_map GL_ARB_texture_cube_map_array GL_ARB_texture_env_add GL_ARB_texture_env_combine GL_ARB_texture_env_crossbar GL_ARB_texture_env_dot3 GL_ARB_texture_filter_anisotropic GL_ARB_texture_float GL_ARB_texture_gather GL_ARB_texture_mirror_clamp_to_edge GL_ARB_texture_mirrored_repeat GL_ARB_texture_multisample GL_ARB_texture_non_power_of_two GL_ARB_texture_query_levels GL_ARB_texture_query_lod GL_ARB_texture_rectangle GL_ARB_texture_rg GL_ARB_texture_rgb10_a2ui GL_ARB_texture_stencil8 GL_ARB_texture_storage GL_ARB_texture_storage_multisample GL_ARB_texture_swizzle GL_ARB_texture_view GL_ARB_timer_query GL_ARB_transform_feedback2 GL_ARB_transform_feedback3 GL_ARB_transform_feedback_instanced GL_ARB_transform_feedback_overflow_query GL_ARB_transpose_matrix GL_ARB_uniform_buffer_object GL_ARB_vertex_array_bgra GL_ARB_vertex_array_object GL_ARB_vertex_attrib_64bit GL_ARB_vertex_attrib_binding GL_ARB_vertex_buffer_object GL_ARB_vertex_program 
GL_ARB_vertex_shader GL_ARB_vertex_type_10f_11f_11f_rev GL_ARB_vertex_type_2_10_10_10_rev GL_ARB_viewport_array GL_ARB_window_pos GL_ATI_separate_stencil GL_EXT_abgr GL_EXT_bgra GL_EXT_blend_color GL_EXT_blend_equation_separate GL_EXT_blend_func_separate GL_EXT_blend_minmax GL_EXT_blend_subtract GL_EXT_clip_volume_hint GL_EXT_compiled_vertex_array GL_EXT_direct_state_access GL_EXT_draw_buffers2 GL_EXT_draw_range_elements GL_EXT_fog_coord GL_EXT_framebuffer_blit GL_EXT_framebuffer_multisample GL_EXT_framebuffer_object GL_EXT_geometry_shader4 GL_EXT_gpu_program_parameters GL_EXT_gpu_shader4 GL_EXT_multi_draw_arrays GL_EXT_packed_depth_stencil GL_EXT_packed_float GL_EXT_packed_pixels GL_EXT_polygon_offset_clamp GL_EXT_rescale_normal GL_EXT_secondary_color GL_EXT_separate_specular_color GL_EXT_shader_framebuffer_fetch GL_EXT_shader_integer_mix GL_EXT_shadow_funcs GL_EXT_stencil_two_side GL_EXT_stencil_wrap GL_EXT_texture3D GL_EXT_texture_array GL_EXT_texture_compression_s3tc GL_EXT_texture_edge_clamp GL_EXT_texture_env_add GL_EXT_texture_env_combine GL_EXT_texture_filter_anisotropic GL_EXT_texture_integer GL_EXT_texture_lod_bias GL_EXT_texture_rectangle GL_EXT_texture_sRGB GL_EXT_texture_sRGB_decode GL_EXT_texture_shared_exponent GL_EXT_texture_snorm GL_EXT_texture_storage GL_EXT_texture_swizzle GL_EXT_timer_query GL_EXT_transform_feedback GL_IBM_texture_mirrored_repeat GL_INTEL_conservative_rasterization GL_INTEL_fragment_shader_ordering GL_INTEL_framebuffer_CMAA GL_INTEL_map_texture GL_INTEL_multi_rate_fragment_shader GL_INTEL_performance_query GL_KHR_blend_equation_advanced GL_KHR_blend_equation_advanced_coherent GL_KHR_context_flush_control GL_KHR_debug GL_KHR_no_error GL_KHR_texture_compression_astc_hdr GL_KHR_texture_compression_astc_ldr GL_NV_blend_square GL_NV_conditional_render GL_NV_primitive_restart GL_NV_texgen_reflection GL_SGIS_generate_mipmap GL_SGIS_texture_edge_clamp GL_SGIS_texture_lod GL_SUN_multi_draw_arrays GL_WIN_swap_hint WGL_EXT_swap_control
Leroy Schulz

Posted 2 months ago
Bill
Hopefully you just need a driver upgrade. Speaking from the experience of just one person and from reading the forums: laptop GPUs are problematic with Lightroom (personal experience), and ATI video cards seem to have more problems than Nvidia. Net: I'd check the ATI forums for a driver fix.
Leroy Schulz
Hi Bill, I just updated the driver but that did not solve the problem.
David Golding
As you can see from your posting at the forum site, it occurred to me that I was attempting to solve the wrong problem for you (the integrated graphics card), as opposed to the true problem: where did the GPU go?

See solution 4 in: https://helpx.adobe.com/lightroom-cla...
Leroy Schulz
Hi David, no problem.  I did look at solution 4 but that hasn't solved the problem.

Here's more info:

- I updated the Radeon Pro driver using the link that Edmund provided (below), which is a newer version than the one Dell lists for my computer.
- Lightroom still did not detect the Radeon Pro -- only the Intel.
- HOWEVER, I thought I'd disable the Intel adapter in Device Manager and when I did, Lightroom detects the Radeon Pro and allows me to use it for GPU processing (screenshot below).
- Unfortunately, that also disabled my multi-monitor setup so it's not a long-term solution.
- When I re-enable the Intel adapter, Lightroom again does not show the Radeon Pro as an option.

Any other ideas?


Edmund Gall
Well done, Leroy, on spotting the potential root cause regarding the Win 10 Graphics Settings for LR.

Regarding a future multi-monitor setup: I cannot recall where, but on another recent thread on this forum (reporting an issue with LR Classic 8.4 not spotting their high-performance eGPU), I recall someone stating that LR might latch onto whatever GPU is used for the main monitor when LR launches. If that's true, then I recommend:
  1. With LR closed, try forcing the external monitor (presuming it's attached to the high-performance GPU you'd prefer LR to use) to be the primary monitor (even if it means setting Windows to display only to that monitor/GPU temporarily).
  2. Once the external monitor is running as the main display, launch LR and check the Performance tab to confirm that the high-performance GPU is engaged.
  3. Once Step 2 is done, enable the laptop's built-in monitor as the secondary monitor in an extended desktop arrangement in Windows.
  4. With LR's main app window still on the external, primary monitor, re-perform Step 2 to see if LR kept its engagement with the high-performance GPU (in theory, it should, because LR is not displaying anything through the low-performance internal GPU).
  5. Move the main LR application window to the low-performance built-in monitor and re-perform Step 2. If LR now says it's using the low-performance GPU, then you'd know that in a pinch you'd need to move the LR main app window to the high-performance GPU's monitor to force its use.
  6. Finally, try moving the main LR app window to the low perf GPU/monitor and then force LR to keep a secondary app window open on the high perf GPU/monitor. The control for this appears on the lower left side above the film strip in LR, where you'd see two rectangles with a 1 - for your main monitor - and 2 - for your secondary monitor. You can control what that secondary window displays (e.g. Loupe, Survey, Grid). Now LR will be using both GPUs, but I have no idea which one it will use for advanced processing (or if the new controls provided with v8.4 would allow you to select different GPUs for different purposes by changing Use Graphics Processor from Auto to Custom).
Probably not ideal, and I have only a single-screen iMac setup so I can't test it myself, but hopefully it gives you some insights toward a potential solution. If the theory is right, then you can decide which monitor is best to host your main LR app window for your processing workflow.

If 8.4 currently only uses the GPU that its main window is displayed through, then hopefully in a future update Adobe enables the user to control via Preferences which GPU in a multi-GPU setup should be used for advanced processing (e.g. Develop changes, exports).

Cheers!
Ed
Leroy Schulz
Even with my three-monitor setup, with today's fix I'm seeing the performance that I was expecting to see.  So all is well.

Thanks again for taking the time to comment and provide suggestions.
Anshul Patel, Employee
Once the driver update is done, please go to "%appdata%\Adobe\CameraRaw\GPU\Adobe Photoshop Lightroom Classic", delete the file "TempDisableGPU", and see whether you are able to switch on "Use GPU for image processing".
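For anyone who wants to script Anshul's check, here is a minimal sketch (Python; the folder path is the one quoted above, and the helper function names are my own, not Adobe's):

```python
import os


def temp_disable_gpu_path(appdata: str) -> str:
    """Build the path of the TempDisableGPU marker file under %APPDATA%."""
    return os.path.join(
        appdata, "Adobe", "CameraRaw", "GPU",
        "Adobe Photoshop Lightroom Classic", "TempDisableGPU",
    )


def remove_marker(appdata: str) -> bool:
    """Delete the marker file if present; return True when a file was removed."""
    path = temp_disable_gpu_path(appdata)
    if os.path.isfile(path):
        os.remove(path)
        return True
    return False
```

On a real system you would call `remove_marker(os.environ["APPDATA"])`; if it returns False, the marker was never there (as turned out to be the case in this thread).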
Leroy Schulz
Hi Anshul, that folder is empty and a search of my system for "TempDisableGPU" comes up empty as well.
Edmund Gall
Hi Leroy! As provided by David Golding above, Adobe's GPU Troubleshooting & FAQ states that AMD GPU cards with version 17.4.4 drivers aren't supported on Windows. AMD released a new driver for your Radeon Pro WX 4150 on Dell Mobile running Windows 10 64-bit two weeks ago (14th Aug 2019). Check if you have this latest version installed: https://www.amd.com/en/support/professional-graphics/mobile-platforms/radeon-pro-wx-x100-mobile-seri...

Hope this helps...
Leroy Schulz
Hi Edmund, thanks for that suggestion.  I'm fairly certain that the Radeon Pro drivers I had installed were much more recent than v17, but I just tried installing the newest from the link you provided and unfortunately that did not solve the problem.  See my reply to David, above.

Any other ideas?
Leroy Schulz
Folks, I think I found the problem and the solution.

A while ago, in Windows 10 > Settings > System > Graphics Settings > Classic App, I set up an entry specifying that Lightroom should use the "high performance GPU". Of course.

Well, as I was double-checking that entry I noticed that Adobe must have changed the app folder when they renamed Lightroom from Adobe Lightroom Classic CC to just Adobe Lightroom Classic.

The Graphics Setting entry was set to \Adobe Lightroom Classic CC\lightroom.exe.

But I noticed that the executable is now located at \Adobe Lightroom Classic\lightroom.exe.  Note that the folder name no longer contains CC.

I deleted the entry for the previous folder name, created one for the new folder name, set it to High Performance, and now Lightroom detects and utilizes the Radeon Pro. It appears that the OS or Lightroom was getting confused by the changed folder name.
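The failure mode here is that Windows keys each per-app GPU preference to the executable's exact full path, so renaming the install folder silently orphans the old entry. A toy sketch of that lookup (Python; the dict stands in for the real Graphics Settings store, which this sketch does not touch, and the paths mirror this thread):

```python
# Windows stores per-app GPU preferences keyed by the exe's full path;
# a stale key no longer matches after the install folder is renamed.
prefs = {
    r"C:\Program Files\Adobe\Adobe Lightroom Classic CC\lightroom.exe": "high performance",
}


def gpu_preference(exe_path: str) -> str:
    """Return the stored preference for this exact path, else the default."""
    return prefs.get(exe_path, "system default")


old_exe = r"C:\Program Files\Adobe\Adobe Lightroom Classic CC\lightroom.exe"
new_exe = r"C:\Program Files\Adobe\Adobe Lightroom Classic\lightroom.exe"

# After the rename, the running exe no longer matches the stored key,
# so Lightroom falls back to the default (integrated) GPU:
assert gpu_preference(new_exe) == "system default"

# The fix from this thread: drop the stale entry, re-add one for the new path.
del prefs[old_exe]
prefs[new_exe] = "high performance"
assert gpu_preference(new_exe) == "high performance"
```

An exact-string key is unforgiving by design: even a one-character difference in the folder name ("Classic CC" vs "Classic") makes the entry invisible to the lookup.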

I suspect this explains why 8.4 seemed to be running slower for me, but I'll confirm this as I use Lightroom over the next couple of days.

Thank you all for your suggestions.
Leroy Schulz
I can confirm that after using Lightroom for the last few minutes to edit a shoot, the poor performance I was seeing with 8.4 has been resolved.

Thank you to everyone who commented.
Simon Chen, Principal Computer Scientist
Thanks, Leroy, for reporting what you found. It makes sense to me.

Lightroom does not pick and choose which GPU to use in a multi-GPU setup. It is up to the OS to tell Lightroom which GPU to use by default. When the Lightroom application path changes (which first happened with the Lr 8.0 release), one needs to update the list of applications that are allowed to use the high-performance GPU.

Leroy Schulz
With the hindsight benefit of knowing what was causing the problem, it makes sense.

Hopefully someone will add this point to the Lightroom troubleshooting page in case someone else runs into the same issue, though unless the product name changes again it may not be a frequent problem.

Cheers.