Lightroom: Assign metadata preset to thousands of photos: "Instant" memory leak (3.4.1)

Test setup: Windows XP Professional 32-bit SP3, 4 GB RAM, standard configuration (i.e. no /3GB switch), LR 3.4.1, catalog containing ca. 20,000 photos. Create a metadata preset with all fields set to some arbitrary values. Select all 20,000 photos. Assign the preset to all of them.

Result: Virtual memory consumption climbs steadily until, at around the 8,000th photo, the 2 GB "barrier" is reached. At this point, a) LR stops assigning metadata (the progress bar just stops), b) virtual memory consumption suddenly drops to ca. 300 MB, and c) LR freezes on the next user interaction (no CPU load, no apparent I/O activity) and can only be terminated via the Task Manager.

After restarting LR, the catalog seems to be intact (integrity is OK according to LR), but the metadata has not been assigned to any of the photos.

This clearly looks like a memory leak to me, because IMHO there is no reason why such a "sequential" operation should use more and more memory as it goes. NOTE: I did not test how LR behaves with only one or two fields set in the metadata preset. It would also be interesting to know how LR behaves on 64-bit systems or on the Mac.

I repeated the test using a *develop* preset, with a slightly different result: virtual memory consumption also climbs steadily, but more slowly, so LR was able to assign the develop preset to all 20,000 photos, reaching a peak of about 1 GB of virtual memory. Apparently, something is wrong here, too.

LRuserXY

Posted 7 years ago

Steve Sprengel, Champion

If none of the photos got their metadata changed, then it sounds like LR is doing everything as a single database transaction which was never committed at the end of the process, so the database was never actually updated. LR could presumably make each update its own transaction, so that photos would get updated one by one, but that would be very slow by comparison. Making the updates all-or-nothing also makes the operation more robust: there is no need to figure out what was updated and what wasn't in order to redo only the part that wasn't done, or undo the part that was.
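
Just to illustrate that trade-off, here is a rough sketch using Python's sqlite3 against a made-up "photos" table; this is not Lightroom's actual code or catalog schema, only the general pattern:

    import sqlite3

    # Made-up table for illustration; not Lightroom's real catalog schema.
    conn = sqlite3.connect("catalog.db")
    conn.execute("CREATE TABLE IF NOT EXISTS photos (id INTEGER PRIMARY KEY, copyright TEXT)")

    photo_ids = range(1, 20001)

    # Variant A: one big transaction. Fast, and all-or-nothing: if it is never
    # committed, none of the 20,000 rows change (matching what was reported above).
    with conn:  # commits on success, rolls back on error
        for pid in photo_ids:
            conn.execute("UPDATE photos SET copyright = ? WHERE id = ?", ("Example", pid))

    # Variant B: one transaction per photo. Each update is durable immediately,
    # but the per-commit overhead makes this far slower than Variant A.
    for pid in photo_ids:
        conn.execute("UPDATE photos SET copyright = ? WHERE id = ?", ("Example", pid))
        conn.commit()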

That said, this only explains why thinking of the update as many sequential operations rather than one large all-at-once operation may be incorrect; it does not explain why memory is exhausted and things blow up. That is still a problem. It's also possible that any alternative solution would be prohibitively slow.

To actually test whether there is a memory leak (memory that is no longer in use but cannot be reused either), do the same update in 4 groups of 5,000 or 5 groups of 4,000, one after the other. If 4x5,000 blows up at about the same place, partway through the second set, then there could be a memory leak. If it works OK, then there is a memory limitation on how many operations can be accomplished in a single transaction, where the choice of a single transaction was made for speed and robustness and likely won't change. I would guess the problem would also occur on a 64-bit system (Win/Mac), but you'd have more memory available before things blew up.
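
At the database level, that batched test would look roughly like this (same made-up table as in the sketch above; each group of 5,000 is its own transaction, so memory held by one transaction can be released before the next starts):

    # Batched variant of the sketch above: commit after every 5,000 updates.
    BATCH = 5000
    ids = list(range(1, 20001))

    for start in range(0, len(ids), BATCH):
        with conn:  # one transaction per batch of 5,000
            for pid in ids[start:start + BATCH]:
                conn.execute("UPDATE photos SET copyright = ? WHERE id = ?", ("Example", pid))
    # If memory still grows across batches, that points to a genuine leak rather
    # than a limit on how much a single transaction can hold.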

SQLite, the underlying database that LR uses, has sophisticated memory-allocation options, so it seems like the Adobe developers could cap its memory use at something well below the 2 GB 32-bit maximum, but some of the tuning may have been done to favor speed over absolute robustness. It's also possible that there is some bad memory allocation happening on XP that would be fine on 32-bit Windows 7.
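
For illustration, these are the kinds of caps I mean; whether these exact pragmas are available in the SQLite build that ships with LR 3.4.1 is an assumption on my part:

    # Illustration only: caps SQLite can expose via pragmas.
    conn = sqlite3.connect("catalog.db")
    conn.execute("PRAGMA cache_size = -65536")          # limit the page cache to ~64 MB (negative value = size in KiB)
    conn.execute("PRAGMA soft_heap_limit = 268435456")  # ask SQLite to keep heap use under ~256 MB (advisory, not a hard cap)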