Lightroom: GPU & Multiprocessor acceleration

  • 105
  • Idea
  • Updated 3 years ago
  • Implemented
  • (Edited)
It would be great if Lightroom 4 had GPU support for CUDA-enabled video cards, similar to the Mercury Playback Engine in CS5. That would really speed up performance! Every couple of seconds helps when you are editing thousands of files.

Saving 2 seconds between images comes out to about an hour of saved time when you are editing 2,000 images. I have 12 GB of RAM and 12 processor cores at my disposal, and sometimes I have 4 seconds between images.
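A quick sanity check of that arithmetic (a throwaway Python calculation; the numbers are the ones claimed above):

```python
# 2 seconds saved per image, over a 2000-image edit.
seconds_saved_per_image = 2
images = 2000

total_seconds = seconds_saved_per_image * images
total_hours = total_seconds / 3600

print(f"{total_seconds} s saved, i.e. {total_hours:.2f} h")  # 4000 s saved, i.e. 1.11 h
```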

Multiprocessor support would be great as well!

ken staab

  • 2 Posts
  • 0 Reply Likes

Posted 7 years ago


Chris Cox

  • 20280 Posts
  • 815 Reply Likes
Lightroom already supports multiple processors. But many of the compression methods used in camera raw files do not lend themselves to parallel processing, which slows the pipeline.

A GPU is not a cure-all, and GPUs can cause more problems than they solve (especially where some drivers are concerned).

It would be best to just list the problems you're seeing and let the Lightroom engineers figure out the causes and the best solutions.

Chris Cox

  • 20280 Posts
  • 815 Reply Likes
Er, no -- the file formats are compressed in a way that does not allow good parallel processing. That has little to do with the implementation and a lot to do with the third-party file formats.

Eric Chan, Camera Raw Engineer

  • 617 Posts
  • 121 Reply Likes
This is correct. Most third-party raw formats have a linear compressed image data stream, so image decompression is necessarily serial. One exception is the Sony ARW format, for which we already have multicore parallel decode implemented in ACR 8.x / Lr 5.x.

On the other hand, DNG has a tiling option, and all DNG images currently written by Adobe software (e.g., DNG Converter, Convert to DNG on Lr import) are significantly faster to decompress because we can take advantage of multiple cores.
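Eric's point can be illustrated with a toy sketch (plain Python and zlib, nothing to do with Adobe's actual decoder): a single linear stream has to be decompressed front-to-back, but independently compressed tiles, as in DNG's tiling option, can each be handed to a separate core.

```python
# Toy illustration, not Adobe code: tiles compressed independently can
# be decoded in parallel; one linear stream cannot.
import zlib
from concurrent.futures import ThreadPoolExecutor

# Stand-ins for an image's tiles, each compressed on its own
# (this is what a tiled format makes possible).
raw_tiles = [bytes([i]) * 4096 for i in range(16)]
tiles = [zlib.compress(t) for t in raw_tiles]

def decode_tile(blob: bytes) -> bytes:
    return zlib.decompress(blob)

# Serial decode, as forced by a single linear stream.
serial = [decode_tile(b) for b in tiles]

# Parallel decode: no tile depends on another, so a worker pool can
# give one tile to each core (zlib releases the GIL while working).
with ThreadPoolExecutor() as pool:
    parallel = list(pool.map(decode_tile, tiles))

assert serial == parallel == raw_tiles
```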

bobobo1618

  • 5 Posts
  • 0 Reply Likes
Even if you don't do it for format loading, Sony had some great success with image (really video) processing using OpenCL http://on-demand.gputechconf.com/gtc/... :)

Diko

  • 77 Posts
  • 9 Reply Likes
Does that mean that if I work with DNG files (converted from CR2), this would improve image rendering and therefore editing speed (especially for heavy local adjustments)? Challenge accepted. Will come back with a verdict.

bobobo1618: video, as weird as it sounds, is about 5 times faster with Adobe. Images are another story.

Diko

  • 77 Posts
  • 9 Reply Likes
OK. Just checked it. Re-imported the files as DNG with 1:1 previews.

An example of the lag: after using Spot Removal (heal) more than, let's say, 3 times, the next application is SLOW. Better not to ask about the time it takes to sharpen eyes and/or fix the skin under them with the Adjustment Brush. There is noticeable lag even on raws from a Canon 40D, which is 10 megapixels.

Guys, you gotta be kiddin' me! Do I need to explain that I now work with 50 megapixels, and what that means to me?

My rig: i7-6700K @ 4 GHz, 32 GB RAM @ 2400 MHz; OS and scratch disk on an EVO Pro 840 500 GB SSD, plus a Caviar Black 1 TB for the catalog and the actual images; GTX 970; Win 10 x64. Current catalog set to a 3 GB video cache and an 88 GB raw cache.

PS: I have tested on the SSD; no change. Constant CPU utilization by LR whenever it calculates a local adjustment or similar (observed in Process Explorer).

What excuse do you have NOW???? Please let me know. 

Jeffrey Tranberry, I have been complaining about this, and about the lack of a custom shortcut keys feature, for quite some time already. I am considering migrating to Capture One; currently in the workflow adoption and testing phase.

Eric Chan, my issue is not with your department, I guess. Whoever works on the local adjustments should improve their performance dramatically!
(Edited)

ken staab

  • 2 Posts
  • 0 Reply Likes
Thanks for your reply. Does anyone know if Lightroom 4 will offer GPU acceleration?

Jeffrey Tranberry, Sr. Product Manager, Digital Imaging

  • 13932 Posts
  • 1653 Reply Likes
GPU support is not magic fairy dust that would make all aspects of Lightroom faster; there are things that GPUs simply aren't fast at. It would be better to tell us which things are slow and let the engineers find the correct technology/solution for the problem.

Rob Cole

  • 4831 Posts
  • 371 Reply Likes
Indeed. Some people have thought the thumbnail-scrolling jerkiness was due to the GPU, but it's not. One can grab between thumbs and scroll smoothly; it's just the scrollbar-initiated scrolling that's jerky.

BobSanderson

  • 13 Posts
  • 0 Reply Likes
"GPU Support is not magic fairy dust that would make all aspects of Lightroom faster"
Who teaches you guys how to handle customer recovery efforts?

Lee Jay

  • 990 Posts
  • 135 Reply Likes
Adobe isn't likely to release that type of information before release, but I sort of doubt it personally.

Assaf Frank

  • 147 Posts
  • 46 Reply Likes
Exporting big raw files (60 MP) takes time... is that a task for the CPU or the GPU?

Rob Cole

  • 4831 Posts
  • 379 Reply Likes
All CPU.

Kathrin Kohler

  • 2 Posts
  • 0 Reply Likes
This could be sped up by exporting several pictures of the batch at once: more threads, more use of the CPU, and no lags while reading from or writing to the disk, etc.
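Kathrin's idea, sketched in Python; `export_image` is a hypothetical stand-in for Lightroom's per-image export work, not a real API:

```python
# Export a batch several images at a time instead of strictly one
# after another, so the CPU and the disk are never idle together.
from concurrent.futures import ThreadPoolExecutor

def export_image(path: str) -> str:
    # Stand-in for the real work: demosaic, apply adjustments,
    # encode the JPEG, write it to disk.
    return path.replace(".dng", ".jpg")

batch = [f"img_{i:04d}.dng" for i in range(8)]

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(export_image, batch))

print(results[0])  # img_0000.jpg
```

Since real exports are CPU-bound, in practice the heavy lifting would have to run in native code that releases the GIL (as image libraries do) or in separate processes for the workers to genuinely overlap.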

Diko

  • 77 Posts
  • 9 Reply Likes
IMO, and in my experience, it's solely CPU.

Jeffrey Tranberry, Sr. Product Manager, Digital Imaging

  • 14190 Posts
  • 1787 Reply Likes
Disk time factors in there a bit, too, when exporting.

See related:

Lightroom: System Configuration Recommendations

Stefan Klein

  • 121 Posts
  • 67 Reply Likes
I find that LR does not react fast enough to slider adjustments. In comparison, Capture One with OpenCL enabled is really fast and smooth. So I can imagine the GPU could help LR as well. Am I wrong?

Chris Cox

  • 20280 Posts
  • 814 Reply Likes
Yes. A GPU is not a cure-all, and GPUs can cause more problems than they solve (@#%$@#%$@# drivers).

It would be far better to list the problems you're seeing and let the Lightroom engineers figure out the causes and the best solutions.

Stefan Klein

  • 121 Posts
  • 67 Reply Likes
Okay Chris, you want a concrete problem: LR reacts rather sluggishly if I drag any image-adjustment slider. It's just not smooth. And it gets worse if you have already made some (local) adjustments. It gets worse if you have your secondary monitor view enabled, maybe in Live mode, or zoomed in, or both. Panning is not smooth at all when zoomed in; zooming is not smooth either. Just about everything I see on screen is not really fast and smooth. I compare it to Capture One with OpenCL enabled, or to Photoshop's screen redraw; both are what I expect. Photoshop's screen redraw is really, really great, and it wasn't until CS4 that it got GPU support. I use Windows 7 64-bit, a Q6600 @ 2.8 GHz, 8 GB RAM, and a GTX 470. Not the latest and greatest, but not THAT slow either. But when I use LR, the program just does not feel snappy. It's fast in terms of finally rendering or exporting raws, no question. But the things that happen on screen feel slow: scrolling in Grid mode, arranging images in Survey mode (stuttering, compared to Bridge's preview panel), on-screen response to slider adjustments, panning, zooming. All of this just makes the program feel slow.
I can only make the comparison to Capture One, where many, many things speed up noticeably as soon as OpenCL is enabled.

Chris Cox

  • 20280 Posts
  • 814 Reply Likes
Thank you for defining the problem. Now the Lightroom engineers can look into it and see what is going on.

If in doubt: define the problem, and let the engineers figure out the best solutions. Suggesting a solution without a well-defined problem tends to get the suggestion ignored.

Walt Sorensen

  • 46 Posts
  • 4 Reply Likes
I agree that we need GPU acceleration in LR. I also agree the Mercury Playback Engine comparison is a good one, since MPE is a rendering engine. This will not save time for everyone, but there are many people who do lots of local adjustments, which are more rendering-intensive.

LR would benefit from GPGPU acceleration for anything rendering-intensive: local adjustments, noise reduction, lens-distortion correction, slider adjustments, 1:1 preview creation, etc. It should be possible to increase performance for those people by pushing the rendering to the GPU, as is done with MPE.

OpenCL would likely be preferred over CUDA, since more people would be able to benefit from an OpenCL-based GPGPU implementation. But since Adobe has already implemented one CUDA-based project, MPE, they might be able to use their CUDA skills to implement GPGPU in LR faster than if they attempted it with OpenCL.

sun gax

  • 5 Posts
  • 1 Reply Like
GPU (CUDA) acceleration for rendering image previews, exporting images, viewing the map, merging images into panos, making image adjustments, zooming in and out, playing video, etc., in Lightroom 4.

If the GPU can do something better than the CPU, then have it GPU-accelerated. In fact, it would be a good idea to have all of Adobe's products be GPU-accelerated!

I'd like to see Lightroom 4 make use of GPU processing, similar to the Mercury Playback Engine in Premiere Pro CS5.5.

GPU acceleration should be available for ALL CUDA-enabled GPUs.

I absolutely love GPU acceleration and the Mercury Playback Engine in Adobe Premiere Pro. It really helps to speed up real-time previewing of high-resolution footage.

I am sure that GPU acceleration would speed up any professional's workflow.

Here is a thread where many people ask for it in Lightroom 4:

http://forums.adobe.com/message/41647...

This reply was created from a merged topic originally titled
CUDA GPU Acceleration for Lightroom 4.

Chris Cox

  • 20280 Posts
  • 814 Reply Likes
Again, you're putting the solution ahead of the problem. Tell the Lightroom team where performance is a problem, and let them figure out the problems and best solutions.

(odds are that GPU usage would introduce more problems, and limiting acceleration to a single brand of GPU simply would not cut it)

Walt Sorensen

  • 46 Posts
  • 4 Reply Likes
Chris, I agree many people are just expecting GPU acceleration to be some kind of fairy dust that will suddenly make LR much faster. I think this is mostly due to the significant performance increase achieved in Premiere with GPU-accelerated rendering. Personally, I think there are several current performance issues in LR3 and the LR4 beta (still sluggish in LR5.3) that could benefit from GPGPU assistance, or at minimum from performance enhancements. Rendering of 1:1 previews is fairly slow, as is the conversion of raw files to DNG. Additionally, it is a known issue that as the number of local adjustments on an image increases, the performance of LR decreases due to the increased rendering load.

All of those issues could be addressed by GPGPU.

Michael Robertson

  • 17 Posts
  • 15 Reply Likes
My number one target for performance improvement would be preview generation, particularly in the Develop module. I don't want to see that spinning indicator (ever).

------------------Future related improvement ----------------------
Perhaps in LR5 you could give us an option to specify whether we want the GPU-accelerated process to build true Develop-module previews rather than Library previews. The latter is a consequence of LR's assumption of a linear workflow moving from Library to Develop. I (and, I suspect, others) would like the option to work efficiently in the opposite direction: I'd rather LR's processing cycles (on import) were building the Develop previews I'm ultimately going to need to make my processing-influenced selections. Or you could give us the UI customization facility to combine the best of the two modules to suit our individual workflows.

Lee Jay

  • 990 Posts
  • 131 Reply Likes
There are no Develop previews. There is the Camera Raw cache, which can be populated during the regular preview-building process, but that's it.

Michael Robertson

  • 17 Posts
  • 15 Reply Likes
That's exactly why I'm asking for them as a new feature in LR5, AND as a candidate for GPU acceleration.

John Doe

  • 2 Posts
  • 1 Reply Like
CUDA acceleration, while it isn't the be-all and end-all, would help with some of the mathematical functions. LR is still sluggish on my box, not as snappy as it should be. My Visual Studio 2010 is sluggish too, but that's understandable, as the debugger is hooked into every process. LR shouldn't be. Clicking on each image, the Loading spinner will sometimes appear for up to 2 seconds per image. With dual Xeon X5470 quad-core 3.33 GHz, 12 MB-cache processors, 32 GB of RAM, a Quadro FX3700, and 12 × 15K 300 GB SAS drives in a stripe, it shouldn't EVER show that indicator. But it does.

The ATTO benchmark shows drive read speeds over 750 MB/s and writes at 500 MB/s, and in frame testing on video I can get 600-1100 fps, so the hardware shouldn't be the bottleneck. In many of the imaging apps I've written (mostly for medical radiographic/DICOM work), large images (1.5 GB+ files) still render in 3-5 seconds, but my apps use the CUDA extensions for some of the calculations.

GPU acceleration could be useful... unless you get people with cheap GeForce cards with very few CUDA cores, where it will actually slow down rendering. Even with the FX380 Quadro, having CUDA doesn't mean it's better. But it would be nice to be able to enable the GPU and make our own decisions on whether it's helping or hurting!

Spinthma

  • 24 Posts
  • 1 Reply Like
LR4 is very slow compared to LR3; GPU support is not a nice-to-have but a must-have,
like my Nik plugins: they run fast in GPU mode and 10-25x slower without GPU acceleration, even using the 8 cores of my Mac Pro.
cheers
/Karl

Scott Martin

  • 197 Posts
  • 15 Reply Likes
Lightroom 4 is so much slower than its predecessor that it can really kill the creative process, even on a brand new, super-fast, loaded-up i7 Mac or PC. Is the GPU fully utilized? Speedy switching between the Library and Develop modules is particularly important. I'd put up with a larger RAM footprint for more responsiveness; I'd put up with just about anything for more responsiveness.

The difference in responsiveness between LR and other apps like Capture One, ACDSee and others is astonishing.

I'm not a developer and can't imagine what to suggest here, but if there are any unexplored avenues for optimizing LR's speed, it would be greatly appreciated!

This reply was created from a merged topic originally titled
Optimize Lightroom for Greater Responsiveness.

Ronald Nyein Zaw Tan

  • 33 Posts
  • 3 Reply Likes
I'd also like to add my vote for OpenCL support in future versions of LR 4.x. I currently also own Phase One Capture One Pro 6 and have a GPU card with OpenCL support.

I can professionally vouch that processing is faster with OpenCL enabled. Screen redraws and image updates are noticeably faster and spiffy.

I am happy with the current stable release of the final 4.1 version. Glad it is usable, unlike the previous versions.

Christoph B

  • 9 Posts
  • 4 Reply Likes
Hello,

I enjoy using LR. But there is still one thing lacking: performance, especially in the Develop module and when zooming...

So please speed up LR using:

- OpenCL/GPU acceleration. Capture One Pro 6 is doing this. If they can do it, Adobe can too!

- AVX support: Nik Software is using this, and processes got 1.5 to 2.6 times faster than before. So again: if Nik Software is capable of implementing this, Adobe should be able to as well!

Thanks, Adobe for hearing us.

Christoph

This reply was created from a merged topic originally titled
Lightroom: Please speed up Lightroom using OpenCL/GPU acceleration and/or AVX support!.

Spinthma

  • 24 Posts
  • 1 Reply Like
Adobe has already sped up CS6 with GPU support!

Chris Cox

  • 20280 Posts
  • 764 Reply Likes
The difference is that in Photoshop, displaying an image is just compositing and color conversion. In Lightroom or Camera Raw it means reading a file, demosaicing, applying a lot of color conversions, and possibly applying a bunch of filters and corrections. While Photoshop and Lightroom have a lot of math in common, they use it differently, and making Lightroom/ACR efficient on the GPU isn't nearly as easy as it sounds.

Kathrin Kohler

  • 2 Posts
  • 0 Reply Likes
But why, then, does comparable software get those massive boosts from OpenCL? They do the same tasks while being even less multithreaded, given the lesser effort that can be applied to the code. Another thing: I noticed there are 3 pictures rendered ahead, which I welcome. Could an option be implemented for how many previews are rendered ahead? Thanks for looking in here ;-)

Stefan Klein

  • 116 Posts
  • 60 Reply Likes
".....and making Lightroom/ACR efficient on the GPU isn't nearly as easy as it sounds."
I can imagine that Chris, but.....:
"We choose to go to the moon in this decade and do the other things. Not because they are easy, but because they are hard" :)

Chris Cox

  • 20280 Posts
  • 764 Reply Likes
Bad code can easily get a boost from OpenCL (or vectorization, or threading), but code that is already highly tuned will see much less benefit.

Lightroom/ACR has done the hard work and optimized its code, so GPU support doesn't provide as large a boost, at as low a cost, as it does for simpler and less optimized applications. Again, the Lightroom/ACR teams are trying to get the best performance available, by whatever means is available. But that task is not nearly as straightforward as you seem to imagine. And when it comes to using GPUs, the task is complicated by all the horrid video-card drivers out there.

Stefan Klein

  • 116 Posts
  • 60 Reply Likes
Chris, if doing the actual raw conversion on the GPU is so complicated, would it be as complicated to do only the image display with the help of the GPU? So that we can get the smooth and fast image display that Photoshop offers, with all the goodies like flick-panning, animated zoom, bird's-eye view, rotation, etc.?
IMO the whole image navigation in LR and ACR is really bad and slow compared to Photoshop.

Clint Steed

  • 6 Posts
  • 3 Reply Likes
In my experience, Lightroom (LR) 4.1 is still not optimized to the fullest. Increasing the LR 4.1 cache to 50 GB was a big help. No other graphics/video-intensive application I have taxes resources like LR 4.1.

NEF files converted to DNG in LR 4.1 can take up to twice as long as the NEF files to load in the Library and Develop modules. The DNG files also take 1.5 to 3 times longer to load in 4.1 than in LR 3.6.

Although LR 4.1 is much better than 4.0, there is still a delay when using the sliders in the Develop module that is not present in LR 3.6 with the same files; more so with DNG than NEF files.

Intel Core 2 Quad Q6700
8 GB RAM
Win7 x64 OS
Nvidia GeForce 9500 GS
2 × 1.5 TB HDs: OS on one, LR and photos on the other.

allen.conway

  • 10 Posts
  • 1 Reply Like
I would expect Adobe to keep improving the performance of their products by, for example, making better use of GPU capabilities, so I think this request is rather pointless.

Agzamov Dmitry

  • 2 Posts
  • 2 Reply Likes
Lightroom uses less than half the computing power of the PC because it doesn't use the GPU. That is not a solution; it is a PROBLEM! And the result of this problem is slowness.

The GPU, with its very fast RAM, is waiting for Lightroom to use it!

Axiom

  • 111 Posts
  • 9 Reply Likes
As I posted in another thread, if you actually disable the modules you don't use (and we're not talking about merely hiding them from the menus here), you may significantly increase Lightroom's responsiveness. It's no GPU solution like the one Capture One offers, but it definitely helps.

I just updated to 4.4 and used it for a day without any modifications; it showed zero improvement over 4.3 in performance and was in fact significantly slower than what I had grown used to with my modded 4.3.

Creating a "disabled modules" folder and moving the Book, Layout, Print, Slideshow and Web .lrmodule files into it let me get back to proper and snappy (for Lightroom) performance.

Note: if you need these features, then you're out of luck with this trick. But I don't need to make books or prints or web galleries from Lightroom; I use it to catalog images and metadata tags, make corrections, and export the images needed for production in software built for production, whatever the task may be.
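The file shuffle described above can be scripted; a minimal sketch in Python (the module file names come from the post, but the Lightroom install directory is an assumption that varies per machine, and moving the files back re-enables the modules):

```python
# Move unused Lightroom modules into a "disabled modules" folder.
# The install directory is machine-specific, so it is passed in.
import shutil
from pathlib import Path

UNUSED = ["Book.lrmodule", "Layout.lrmodule", "Print.lrmodule",
          "Slideshow.lrmodule", "Web.lrmodule"]

def disable_modules(lr_dir: Path) -> Path:
    """Move each unused .lrmodule file into lr_dir/'disabled modules'."""
    disabled = lr_dir / "disabled modules"
    disabled.mkdir(exist_ok=True)
    for name in UNUSED:
        module = lr_dir / name
        if module.exists():
            shutil.move(str(module), str(disabled / name))
    return disabled

# e.g. disable_modules(Path(r"C:\Program Files\Adobe\Adobe Photoshop Lightroom 4"))
```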

Diko

  • 77 Posts
  • 9 Reply Likes
I wonder if this is still true for LR 5.3; I really don't use those modules either.

And as I understand it, it is quite reversible if needed.

Currently my workstation is due for an upgrade, so I can't check this solution's effectiveness. :-(

Mike

  • 3 Posts
  • 4 Reply Likes
From my understanding, Nvidia has applications developed to leverage the power of the GPU to speed up specialized tasks. With the progression of resolution in cameras (e.g., the Nikon D800 and high-megapixel medium-format cameras), computer hardware seems to be lagging behind in all but the most cutting-edge consumer-accessible machines. Building Lightroom to take advantage of already-present hardware can lower the hardware barrier to performance for much of the consumer market. My computer was built on a Core i7 platform with 12 GB RAM and SSDs, and it struggles with 40-50 MP raw files. If progressively higher resolution is the future of multimedia, and hardware is the bottleneck, we need to find alternatives for achieving consistent and reasonable performance with current technologies.

This reply was created from a merged topic originally titled
Leveraging the GPU to assist in the processor intensive task of manipulating images in Lightroom.

Eric

  • 3 Posts
  • 2 Reply Likes
Pathetic that this was opened over 2 years ago... and LR5 still has no GPU acceleration! And it's still slow as &$/*"

jpresumido

  • 2 Posts
  • 3 Reply Likes
Both today and in the future, photos have to be edited as quickly as possible; that is what worries most professional photographers. Delivering work on schedule sometimes becomes a drama. Thanks to technological progress, we have very fast computers at our disposal today, so hardware is no longer a concern for us: computer hardware is at a very good level for photo editing. What happens in cases such as Adobe's Photoshop Lightroom is that there seems to be no willingness to keep up with the needs of those who live with client deadlines. Everyone knows that Lightroom has problems with delays and slowness in processing RAW files. I hope you consider these needs and provide a product of excellence, so professionals can take pride in working with Adobe products. Sorry for my English.

Brad Chatellier

  • 1 Post
  • 0 Reply Likes
My extreme sympathies to Chris Cox for all the non-listening users posting here and not following his simple instructions. As soon as I have a bona fide performance issue, I will post it here, and I promise not to demand you take a GPU path to fixing it.

Of course, with the Mac Pro 2013 due out any time, I wonder what steps Adobe will take to utilize it, if any, in Lightroom. It's a heck of a lot of power, and seems it will be a model of pro desktop design going forward.

Thanks Adobe for making awesome products (so glad I don't work there anymore...)

Diko

  • 77 Posts
  • 9 Reply Likes
Hi everyone,

I find every LR feature just great; however, I believe that it is crucial for all of us to get performance.

What LR is still missing:
- shortcut-key mapping as in PS (especially an option for mapping keys to custom presets)
- a performance boost, especially for retouching with LR tools

The latter was the topic of this discussion that began 2 years ago...

I found the following post from a dev (I guess):
http://forums.adobe.com/message/4129233 Post #17

"GPU accelerated screen redraw in PS was a significant effort and required many months of development and testing."

YEAH! Two years have passed since that post, we are on LR CC, and we are still awaiting it.

The explanation:
"Even if we did that, it wouldn't really help the user experience because our pipeline is different than in Ps (in Ps the pixels are effectively baked; in ours, we do all our rendering on-the-fly since the editing is parametric -- our bottleneck is NOT the display path!)."

If it is NOT, then please DO optimize the code performance-wise. I find it unacceptable to experience lag with 10 MP raw files on my i7 machine (3770K OC @ 4 GHz, equipped with 2x8 GB of fast 1600 MHz RAM and an SSD). That is 10 MP, not 20 MP; let us not mention any higher resolution.

Every time I retouch blemishes and skin with the adjustment brush I get that LAG. When I am editing a single shot, PS is my weapon of choice. But for some projects I am paid by the number of shots delivered; then the correct tool is LR (or an LR alternative, if this isn't fixed soon enough).

It is time for a new and improved LR CC with better performance. In the age of 20 MP minimum, medium format, and hi-res images as the industry standard, LR has brought the right tools. Now it is time to make those tools stop being sluggish.

Walt Sorensen

  • 46 Posts
  • 4 Reply Likes
I agree it's time for some significant improvements in performance.

Personally, I find the explanation from Adobe that "it wouldn't really help the user experience because our pipeline is different than in Ps... our bottleneck is NOT the display path!" completely laughable. It shows a complete lack of understanding of GPGPU capabilities, and of what was done in MPE, on the part of the person who wrote it.

So do GPGPU programming solutions do what we need, and do they do it faster than CPU solutions? Yes, and yes. We need on-the-fly rendering; GPGPU can do that (that is what MPE does). We need faster initial renderings for imported image previews; GPGPU can do that. We need fast database queries on large databases to get all the parametric information for an image; GPGPU can do that.

Yes, there are some disk-access bottlenecks affecting speed, but there are areas where GPGPU could provide significant benefits.

I believe the real reason LR hasn't gotten any GPGPU implementation is the lack of any well-developed OpenCL or CUDA libraries for Lua (the underlying language of LR). I doubt Adobe will make the effort to write an OpenCL or CUDA library for Lua so that they could put GPGPU features into LR.