Lightroom: exporting uses excessive memory

  • Problem
  • Updated 7 years ago
I've been experimenting with whole-catalog exports; none completes before running out of memory (consumption exceeds 90% after a few thousand photos), including the built-in hard-disk export.

Whole-catalog updates of other kinds (building previews, plugin operations...) do not consume extra memory; only the exports do.

Lightroom version: 3.5 [775451]
Operating system: Windows 7 Ultimate Edition
Version: 6.1 [7601]
Application architecture: x64
System architecture: x64
Physical processor count: 4
Processor speed: 3.4 GHz
Built-in memory: 7934.1 MB
Real memory available to Lightroom: 7934.1 MB
Real memory used by Lightroom: 505.0 MB (6.3%)
Virtual memory used by Lightroom: 596.0 MB
Memory cache size: 1015.0 MB
System DPI setting: 96 DPI
Desktop composition enabled: Yes
Displays: 1) 1920x1200, 2) 1920x1200

UPDATE 2011-10-03:
----------------------------
745-photo export (built-in hard-disk export service):
- preparing for export: memory went from 34% to 44%
- after exporting 1/3 of the photos (roughly 250), memory increased to 54%
- memory consumption stayed about the same for the duration of the export
- memory consumption did not drop after the export completed successfully
3714-photo export (built-in hard-disk export service):
- initial memory consumption: 33% (after a Lightroom restart)
- it took a *long* time before the initial "preparing for export" dialog box even came up - no memory increase
- after preparation completed, memory consumed = 50%
- it took a few minutes before the first photo was rendered, during which time memory consumption increased to 62%
- after export 10% complete (say 350 photos), memory consumption = 65%
- after export 20% complete (say 700 photos), memory consumption about the same
- after export 30% complete (say 1000 photos), average memory consumption increased slightly, but fluctuated more
- after export 50% complete (say 1800 photos), average memory consumption about the same (likewise at 80% & 90%)
- upon export completion, memory consumption dropped to 54%

Preliminary conclusion: not a true (progressive) "leak" per se - memory consumption holds steady once it makes it past the hump. The problem is over-consumption: a 10000-photo export is impossible with 8GB of RAM (for me). I suspect with 16GB it would make it to the plateau, and on to the promised land...
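A rough sanity check on the figures above (an assumption on my part: treating the reported percentages as fractions of the 7934 MB Lightroom says is available to it) suggests the preparation and early-export phases cost well under a megabyte per photo:

```python
# Back-of-the-envelope estimate from the 3714-photo export above.
# Assumption: percentages are fractions of the 7934.1 MB available.
total_mb = 7934.1
before, plateau = 0.33, 0.62   # fraction used before prep vs. after first renders
photos = 3714

per_photo_mb = (plateau - before) * total_mb / photos
print(round(per_photo_mb, 2))  # prints 0.62
```

At roughly 0.6 MB per photo of snapshot/cache overhead, a catalog of several thousand photos plausibly exhausts the headroom on an 8GB machine, consistent with the ~4000-photo ceiling observed.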

PS - I just tried a 5000-photo export and managed to reach the plateau (I think) at about 90% memory consumption. I canceled the export, however, since the computer was becoming unresponsive. So 4000 photos is about the limit for me with 8GB.

Rob
Rob Cole
Posted 7 years ago

Dan Tull, Employee
I suspect I know what might cause this. In order to ensure that the export operation produces files based on the settings at the time you initiated the export, it loads all those settings into memory before returning control.

That way, if you are still working on the same files and subsequently change them while the export is going on, you'll reliably get the same settings as when you hit export and not some non-deterministic result depending on whether you made further adjustments before or after that particular photo got exported.

That's what the preparing step is. It harvests the settings for all the photos in a modal state and then proceeds to chew through the exports. If you had a ton of local corrections, the number of photos' settings that would fit in memory would probably be lower.

If exporting many thousands of photos at once is a common enough workflow, we probably could be convinced to store that settings snapshot in a temporary location on disk instead of in memory. If the catalog had a guaranteed persistent history (instead of one that could be cleared), we could instead load just the keys for looking up the relevant settings state from the catalog, but that isn't how the schema is designed at the moment.
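A sketch of that disk-spool idea (illustrative Python with hypothetical names; Lightroom's actual implementation would of course differ): the "preparing" step streams one frozen settings record per photo to a temp file, and the export phase reads them back one at a time, so peak memory no longer scales with the number of photos.

```python
import json
import tempfile

def snapshot_settings(photos, get_settings):
    """Preparing step: write one JSON record per photo to a temp file
    instead of accumulating every settings snapshot in memory."""
    spool = tempfile.NamedTemporaryFile("w", suffix=".jsonl", delete=False)
    with spool:
        for photo_id in photos:
            record = {"id": photo_id, "settings": get_settings(photo_id)}
            spool.write(json.dumps(record) + "\n")
    return spool.name

def iter_snapshot(path):
    """Export step: stream the frozen settings back one photo at a time."""
    with open(path) as fh:
        for line in fh:
            yield json.loads(line)
```

Because each record is frozen at snapshot time, later edits can't affect the in-flight export, which preserves the determinism described above.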

DT
Lee Jay
"That's what the preparing step is. It harvests the settings for all the photos in a modal state and then proceeds to chew through the exports."

Does it do that on request for rendering previews as well?
Andrew Rodney
>If exporting many thousands of photos at once is a common enough workflow, we probably could be convinced to store that settings snapshot in a temporary location on disk instead of in memory.

That's the $64K question: whether it's worth the engineering time and money. I export out of LR a lot, but thousands of images (even many hundreds)? Never. If someone expects to do anything to thousands of images, you'd think they would understand it's going to take a fair bit of time.
Dan Tull, Employee
Regarding Lee Jay's question, I don't think preview rendering bothers to do this setup. It does a prescan to figure out what might be out of date, but I think it only retains the list of items to render. The only reason it even does that much is to make the progress bar more accurate.

The difference between the two is that preview rendering doesn't care which settings existed when it was invoked, only that when it is done everything is rendered with the latest settings.
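That difference can be sketched as follows (illustrative Python, not Lightroom's actual code): export freezes a copy of each photo's settings up front, while preview rendering keeps only the ids and looks up the latest settings at render time.

```python
import copy

# Hypothetical catalog: photo id -> current develop settings.
catalog = {"p1": {"exposure": 0.0}, "p2": {"exposure": 0.3}}

# Export: freeze a deep copy of each photo's settings at invocation time.
export_queue = [(pid, copy.deepcopy(s)) for pid, s in catalog.items()]

# Preview rendering: retain only the ids of stale items; the latest
# settings are read from the catalog when each preview is rendered.
render_queue = list(catalog)

# A later edit changes the preview result but not the frozen export.
catalog["p1"]["exposure"] = 1.0
```

The memory cost of the export approach is what the thread is about: the frozen copies all live in memory at once, while the render queue holds only ids.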
Rob Cole
Dan,

Thank you for chiming in.

It's not that big a deal to me personally. Now that I know about it, I can always break exports into chunks on the rare occasions when I do whole-catalog re-exports.

It's more of a problem for people who stumble into it and don't know what's going on, except that "it didn't work"...

Also, some of my plugin users report "over-consumption" when doing whole-catalog metadata updates (e.g. DevMeta & ExifMeta), yet I see no memory increase whatsoever when doing the exact same thing, so there are definitely some system-dependent differences in memory-consumption behavior...

PS - Whole-catalog exports are not that uncommon for people whose workflow includes keeping whole-catalog export trees synced and such...

The problem is not just the preparation step - memory consumption continues to climb during the first part of the export too. I have 8GB of RAM, and exports top out at about 4000 photos. I assume people with "only" 4GB would top out at a much smaller number, should their systems exhibit the same "behavior" that mine does.
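A minimal sketch of that chunking workaround (illustrative Python; `run_export` is a hypothetical stand-in for kicking off an export batch), sized so each batch's settings snapshot fits in memory:

```python
def batched(photos, size):
    """Yield successive fixed-size batches of a large photo list."""
    for i in range(0, len(photos), size):
        yield photos[i:i + size]

# e.g. with 8GB and a ~4000-photo ceiling, export in batches of 2000:
# for batch in batched(all_photos, 2000):
#     run_export(batch)   # hypothetical export call
```

The batch size is a guess one would tune down on machines with less RAM or catalogs with heavy local corrections.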

Rob
Dan Tull, Employee
Hmm. We could come up with much better ways to keep a whole catalog export tree in sync if enough people really want that. A sufficiently well written publish plugin would do a better job than stock export, for example.

As for the additional memory accumulation: if I had to guess, the growth after the export starts is the image cache for recently opened images. It could probably be argued that images opened for rendering during export shouldn't stay in the cache, since they're not all that likely to get re-used, but it shouldn't be overly harmful either (and when exporting multiple virtual copies of a single image, it would help a little).
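In miniature, a cache that lets export-time reads opt out might look like this (a generic LRU sketch, not Lightroom's actual cache):

```python
from collections import OrderedDict

class ImageCache:
    """Small LRU cache with an opt-out for one-shot (export) reads."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._entries = OrderedDict()

    def get(self, key):
        if key not in self._entries:
            return None
        self._entries.move_to_end(key)     # mark as recently used
        return self._entries[key]

    def put(self, key, value, transient=False):
        if transient:
            # Image opened only for export rendering: unlikely to be
            # re-used, so don't let it push interactive images out.
            return
        self._entries[key] = value
        self._entries.move_to_end(key)
        if len(self._entries) > self.capacity:
            self._entries.popitem(last=False)  # evict least recently used
```

The trade-off noted above still applies: with `transient=True`, exporting several virtual copies of one master would forgo the small win of a cache hit.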
Rob Cole
I haven't checked whether whole-catalog publishing has similar symptoms (I don't use publish services); maybe I will if I can find the time. But note that publish services have some limitations that exports don't:

- Publish services require one to initiate each service in turn, whereas all exports can be initiated in one click (using ExportManager).

- Also, collections do not support stacks, so smart collections will often include bottom feeders, and most publish services are not programmed to eliminate them (although DevMeta will filter based on stack position, should one know about it and be willing to use it)...

In the case of Jeffrey Friedl's tree publisher, it does not support top-of-stack detection, nor target-path depth control (elimination of extraneous parent folders in the target tree).

This is why I still use TreeSync export instead of the equivalent publish service.

I would not object to a slick Adobe solution for keeping usable copies (usable outside Lightroom, that is) at the ready, especially if the source folder structure were mirrored, or collection hierarchies were mirrored as folder structure (like jf's tree publisher does). Providing hooks in the SDK so plugins could do it might be sufficient...

PS - Thanks for educating me about the image caching...

Rob
Alan Harper
By accident, I tried to export (Export to Same Location) a number of files yesterday. (I thought one file was selected, but thousands were.) It was set up to open the exported files in Photoshop. I completely lost control of Lightroom and Photoshop - LR stopped responding, and I couldn't halt the process. I ended up force quitting. (I hate force-quitting database apps, as there are so many opportunities for corruption.)

I have not tried to replicate this, but it would be nice if one could quickly interrupt an export gone awry.

A
Rob Cole
Could you not click the 'X' to cancel? (upper left corner progress indicator).
Alan Harper
No progress indicator. It just hung for a few minutes, and then windows started opening in Photoshop. It was about 4 minutes after I hit cmd-opt-E that I killed Lightroom. Perhaps if I had waited longer, the progress indicator would have shown up. One problem is that Photoshop is brought to the front for every file, which adds confusion.