Lightroom: xmp files lacking lots of information

Hi,

when letting Lightroom write all the picture settings to an XMP file, both the stacking and the collection settings are missing.

Basically, I'd expect to find all the work done on a picture in the XMP file (e.g. for use with other tools: if I put several pictures in a collection, I'd like to use that information, and the order of the pictures, from other programs).

Even worse, when making a virtual copy of a picture, its settings do not appear in any XMP file.
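
To illustrate what a script actually finds in such a sidecar, here is a minimal Python sketch (the file name is just a placeholder) that lists the namespaces a sidecar written via "Save Metadata to File" contains - typically the develop settings and general metadata, but nothing about collections, stacks or virtual copies:

```python
# Minimal sketch: list which namespaces Lightroom actually wrote into a sidecar.
# "IMG_0001.xmp" is a placeholder; any sidecar written via "Save Metadata to File" will do.
import xml.etree.ElementTree as ET
from collections import Counter

def namespaces_used(xmp_path):
    """Count the XML namespaces of all elements and attributes in an XMP sidecar."""
    counts = Counter()
    for _, elem in ET.iterparse(xmp_path):
        for name in [elem.tag] + list(elem.attrib):
            if name.startswith("{"):                 # "{namespace-uri}localname"
                counts[name[1:].split("}")[0]] += 1
    return counts

if __name__ == "__main__":
    for uri, n in namespaces_used("IMG_0001.xmp").most_common():
        print(f"{n:4d}  {uri}")
    # You will see the Camera Raw develop settings and general metadata namespaces,
    # but nothing describing collections, stacks or virtual copies.
```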

regards
Hadmut Danisch
Jeffrey Tranberry, Sr. Product Manager, Digital Imaging
Is this a problem or are you requesting additional functionality? It sounds like you'd like metadata about collections/stacks stored with the file.
Lee Jay
Collections, stacks, VCs, flags and Develop history do not appear in the XMP data. There are application interoperability reasons for this, but I think they should work through as much of this as possible.
Hadmut Danisch
Yes, I'd like to see metadata about collections and stacks (and their order) stored in the xmp files.

Whether this is a bug report or a wish-list item depends on the point of view, i.e. on the formal specification of the XMP file (if there is a precise one).

If the XMP file is intended to contain _all_ information about a picture file, including all work done on it, then this is a bug.

If the XMP file is supposed to contain only parts of it (why?), then this is a wish.

The reason I am asking: I currently (still) prefer Bibble5 over Lightroom, because Bibble5 also runs under Linux and has several functions missing from Lightroom. On the other hand, Lightroom seems to produce more reliable results and to have a more complete workflow. I am considering switching to Lightroom (I have a license).

A major problem with Lightroom seems to be that it is based on a central database that cannot be stored on a network share. This makes it impossible to work concurrently on a large collection of pictures because of the synchronisation problem. Bibble5 solves this with a so-called "File System Mode", where everything is stored in XMP files, so you only need to synchronize file systems with tools like rsync or unison. That works perfectly for working simultaneously on the same collection of pictures, as long as it is not the same pictures.

I tested whether the same is possible in Lightroom by writing out the XMP files after working and synchronizing beforehand. Unfortunately, Lightroom's XMP files are somewhat incomplete.
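
Just to make concrete the kind of sync I mean, here is a rough Python sketch of a one-way sidecar sync (the paths are placeholders; in practice rsync or unison does the same job more robustly):

```python
# Rough sketch of a one-way sidecar sync (what rsync/unison do, reduced to .xmp files).
# The source and destination paths are placeholders.
import shutil
from pathlib import Path

def sync_sidecars(src_root, dst_root):
    """Copy every .xmp file that is missing or newer in src_root into dst_root."""
    src_root, dst_root = Path(src_root), Path(dst_root)
    for src in src_root.rglob("*.xmp"):
        dst = dst_root / src.relative_to(src_root)
        if not dst.exists() or src.stat().st_mtime > dst.stat().st_mtime:
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)          # copy2 preserves the modification time
            print(f"updated {dst}")

if __name__ == "__main__":
    sync_sidecars("/home/me/pictures", "/mnt/server/pictures")
```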

Another point is that complete XMP files would allow reading (and changing) all settings, e.g. from scripts or other programs, from any copy of the file system tree.
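
As a small example of what I mean by scripting against sidecars - the file name is a placeholder, and a real tool would use exiftool or a proper XMP library instead of rewriting the XML naively - changing a star rating could look like this:

```python
# Sketch: read and change the star rating in a sidecar from a script.
# This only shows that the sidecar is plain, scriptable XML; the file name is a placeholder.
import xml.etree.ElementTree as ET

RDF_DESC = "{http://www.w3.org/1999/02/22-rdf-syntax-ns#}Description"
XMP_RATING = "{http://ns.adobe.com/xap/1.0/}Rating"    # rating attribute in the xmp namespace

def set_rating(xmp_path, stars):
    tree = ET.parse(xmp_path)
    desc = next(tree.getroot().iter(RDF_DESC))          # the rdf:Description carrying the settings
    print("old rating:", desc.get(XMP_RATING, "none"))
    desc.set(XMP_RATING, str(stars))
    tree.write(xmp_path, encoding="utf-8")              # note: may rewrite namespace prefixes

set_rating("IMG_0001.xmp", 4)
```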

It's a feature of Bibble5 that I miss in Lightroom, and it is currently what keeps me from switching.
john beardsworth
Writing more metadata back to the XMP is desirable, and I'd also add custom fields' metadata. However, I wouldn't justify the request on the basis of concurrent working over a network. Rather than a file system kludge, better to have multi-user network access to the database.
Hadmut Danisch
First of all, having all relevant data in the XMP file is what gives the XMP file its purpose.

In contrast, multi-user network access to the database does not always make sense. You're assuming a high-speed, low-cost local network, which is not always available.

It must be possible, for example, to take a notebook with a copy of the picture file system tree (or part of it) on a trip, perhaps to foreign countries without mobile coverage, and to work on the pictures there.

Then it should be enough to synchronize just the XMP files (or send them by email) over expensive or low-bandwidth lines.

It should also be possible to send some pictures together with all their settings to other people (e.g. by email) without having to export and import a complete Lightroom catalog. Or to let other people do the image processing, keywording and so on, and just send/sync back the XMP files.

This boils down to the central problem: Lightroom assumes one central database that is always online and available. That holds true for people sitting in agencies, but not for people travelling around (or preferring other operating systems) or working in distributed structures (e.g. freelancers).
john beardsworth
The database is Lightroom's great strength. When a network isn't available, offline workflows are addressed by File > Export as Catalog and Import from Catalog, which do things like replicate file structures and have intelligent ways of round-tripping work.

Your suggestion of emailing sidecars isn't right on three counts: you could send a catalogue just as easily; fortunately, not all file types have sidecars (e.g. DNG, TIF, JPEG); and thirdly, you'd have to transmit the much larger originals in the first place.

So while I agree with your basic request, concurrent working isn't a justification for it.
Hadmut Danisch
The way you describe it, the database does not sound like Lightroom's great strength; rather, the dependency on a central database sounds like its major weakness.

Exporting a catalog does not work that way, because it assumes that whoever works on the database knows exactly which pictures need to be exported.

For example, I can easily synchronize part of the tree from my server onto my notebook while I am at home, make some changes on the road, and then do a fast sync of the changed XMP files over UMTS, e.g. using the time on the train or at the airport for work.

Arguing that my suggestion is not right does not make sense, because that's exactly how I work right now with Bibble5. It perfectly fulfils my needs.

Transmitting the larger originals in the first place is therefore not much of a problem; the point is not having to transmit them again once the work is done.

It is also not about concurrent working (which can basically be done with a central database); it is about distributed working (which requires better synchronization methods than a central database or import/export of whole databases) and data exchange.

Here is an example:

Say we have a database with pictures A, B, C and D, and two workers with notebooks on the road (or myself with two separate notebooks). One changes A and B, the other works on C and D.

With import/export you get an extreme mess of files and versions. With sidecar files and a regular file synchroniser you solve the problem quickly and easily, and automatically, without having to import and export files manually (!).
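
Detecting the rare real collision (the same sidecar changed on both sides) is also cheap with plain files. A small sketch, assuming the time of the previous sync is known; the paths and the timestamp are placeholders:

```python
# Sketch: before merging two copies of the picture tree, check that no sidecar
# was changed on both sides since the last sync. Paths and cutoff are placeholders;
# a real setup would record the time of the previous sync.
from pathlib import Path

def changed_since(root, cutoff):
    """Relative paths of all .xmp files modified after the given timestamp."""
    root = Path(root)
    return {p.relative_to(root) for p in root.rglob("*.xmp")
            if p.stat().st_mtime > cutoff}

def conflicts(tree_a, tree_b, last_sync):
    """Sidecars touched in both trees since last_sync, i.e. true collisions."""
    return changed_since(tree_a, last_sync) & changed_since(tree_b, last_sync)

if __name__ == "__main__":
    last_sync = 1_700_000_000           # placeholder Unix timestamp of the previous sync
    for path in sorted(conflicts("/work/notebook1", "/work/notebook2"), key=str):
        print("conflict:", path)
```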
john beardsworth
You really are deploying some straw man arguments! And if you want to move the goalposts from concurrent working (the phrase you originally used) to distributed working, Export/Import are equally suitable and provide the level of controllability that's needed. Your Heath Robinson / Rube Goldberg workflow processes may work for geeks and can be made to work by normal users, but they're still ugly and still involve manual processes and knowing which files you're working on. Lightroom should be about doing the right thing.
Rob Cole
In case this point hasn't sufficiently landed:

Many of us would *really* like an xmp sidecar option for all file-types, as well as virtual copies.

...the more complete, the better...
TK
Hadmut, FWIW, I don't see you making any straw man arguments at all. I think your proposal makes perfect sense. I suggested something similar a while ago: I believe that XMP files or metadata in DNG files should retain edit histories. Dan Tull was supportive of the general idea and regards complete XMP files as a distributed catalogue backup.
john beardsworth
In relation to distributed or concurrent working, they sure are!

The same arguments that we covered in that thread apply here - these are areas where if the FR is ever implemented, users need control over each chunk of data that's being saved out. For example, if I sent pictures to another Lightroom user would I always want them to see my collections structure, revealing how I organise my work or other confidential information? And would they want to have my collections appear in their well-organised catalogue? The same would apply to stacking or what virtual copies I've made. So you need write and read options - or be happy with the unintended consequences.
Rob Cole
I don't see much difference between sending someone "complete" xmp files to read into a foreign catalog and sending them a catalog fragment to import into a foreign catalog - theoretically it would be the same info in a different wrapper. But some options for what goes out and what comes back in don't seem like such a bad idea, in either case...
Hadmut Danisch
@John Beardsworth: That's pure nonsense, and it does not get any better by using technical terms not understood by everyone in this forum.

Export/Import provably do not maintain a proper, synchronized data structure. A first problem is that they generate additional data sets, i.e. the exported files. So besides the heap of pictures I have, I would additionally have to deal with all those exported files and keep track of them. This will definitely fail (and waste additional disk space I might not have while on a trip).

A second problem is that Lightroom runs under Windows and Mac OS only. It requires someone to sit in front of the computer and move a mouse, and requires that particular computer with Lightroom installed to be turned on with someone logged in. This is a nightmare. It does not work with a central repository. For example, there are plenty of free/cheap disk space/backup providers out there, but nobody provides a free online version of Lightroom. The requirement that a central database be operated manually by a human through a graphical interface is a severe design failure.

A third problem is that this method does not deal with change sets. Imagine that I am at home on my personal network and synchronize my heap of 100,000 travel pictures onto my notebook. Then I am on a three-week trip with lots of time to spend on trains and airplanes, and I do some corrections and b&w conversions, add keywords and so on. How would I export from the database only the settings I have changed, and not all 100,000 pictures? And how do I import them into my central database while I am away from home and thus from that database?

Lightroom's structure of dancing around a central database on a Windows machine, which forces users to operate it centrally through a graphical interface and mouse, is based on assumptions that do not hold true.

I've changed the term from concurrent to the more precise special case of distributed, since someone in the comments above reduced "concurrent" to the special case where all concurrent entities have high-speed, low-cost access to a central database. Photography is not centralized work. Not anymore. It used to be, in the old days of chemical labs. But today we have notebooks and mobile phones, and we leave the office to take pictures and to communicate with others.

Geeks: using one of the common tools to keep a directory tree of regular files in sync between computers (or between computers and online storage providers) is definitely simpler, more comprehensible, more foolproof and more robust than dealing with an unstructured collection of exports that require manual interaction and most probably lots of manual corrections and conflict resolution.

This discussion sounds somewhat silly, by the way, because Lightroom basically already has the required functions, i.e. writing XMP files and reading them back in by synchronizing directories with the database. So what's the point of this discussion? Denying that Lightroom already has it? It's just that it is incomplete, inconsistent and poorly implemented.

But I think we can stop the discussion right here. I'd rather stay with Bibble5 than switch to Lightroom and squeeze my work through inconsistent databases. The guys from Bibble Labs have recognized, addressed and solved the problem; the guys from Lightroom haven't. Bibble5 solves the problem, Lightroom doesn't. Sorry to say it, but it's really that simple.
Rob Cole
Hadmut - You may not be able to convince everyone, but you've convinced me.

Networked Lr with true-sync... - may be a while off, but syncing via xmp would be a short reach.

I'm hoping Adobe will remedy this come Lr4, or at least augment the SDK so a plugin author could implement the solution (it's "almost" doable now as a plugin, but would be severely limited due to omissions...)

Is this really a deal breaker for you?

Rule 5.2 - Enjoy other software if Lightroom not quite cutting it...

Rob
john beardsworth
@Hadmut - I'm not using technical terms or jargon, and your reply is really too long to go through countering or agreeing with every single point.

Note, I am not talking about simple Export/Import but specifically used the terms "Export as Catalog" and "Import from Catalog". These don't need to generate additional originals - they can, if wanted, but don't need to.

"I would additionally have to deal with all those exported files and to keep track with them." No, Export as Catalog/Import from Catalog can work without passing extra copies of the file back and forth. The process just needs a slightly more user-friendly appearance / metaphor / name to help people appreciate its elegance.

"Second problem...". Is Win + Mac only really a problem? Not sure what you're on about here, but a database is a huge advantage, not a "severe failure".

"Third problem..." Export/Import is actually a lot smarter than you seem to think! In your example I just do a smart collection based on the edit date, export those items as a catalogue (without the originals), and import that to the master. But there are a few easy ways to do it without playing with sidecars.

I just don't think your concurrent/distributed workflow is a reason for the FR. LR offers better ways to work.

John
Rob Cole
Hadmut - it does sound like you may not fully appreciate how to use Catalog Export / Import. It's just one file, not a bunch of files, and it could essentially contain just the changes you'd have made on the road, for example - not a whole catalog, and no photo files... I'm a big proponent of complete xmp for all file types plus virtual copies, for a variety of reasons, including the ones you've brought up. But it may still be possible for you to accomplish your objectives without too much trouble using catalog export / import. I recommend making sure you understand how that works before making your final judgement... I mean, I'm not invested either way, I'm not an Adobe/Lightroom defender, and I don't always see eye to eye with John Beardsworth, to make an understatement, but I do think in this case he's brought up some points worthy of a bit more consideration.
Hadmut Danisch
@Rob: This is exactly the problem. Or rather, it is a bunch of problems.

First: it is _one_ file. So you always run into the collision problem for the whole export, not just for single pictures.

Second: it is an _additional_ file. I do not just have my file system tree of pictures, I also get a large heap of these export files, which I have to keep ordered and organized.

Third: it requires manual interaction and thus causes a lot of overhead. File tree syncing is cheap and automated, I'll do it anyway, and most business machines do it automatically without any user interaction. In contrast, a Lightroom export does not do anything automatically at all. Someone must sit in front of a Windows machine, start it, use the mouse, and do the import manually.

Fourth: file syncing is foolproof. If you have two machines with the same files, you know they are synchronized (or you can see where they are not). If you have two machines with Lightroom databases, you never know whether they are in the same state or where they differ. Even if you have exactly the same contents, database files are never identical.

Fifth: it is not about accomplishing my objectives "without too much trouble using catalog export/import". I need to accomplish my objectives without having to start Windows, run Lightroom manually, check all those things, and still never be sure whether everything happened as expected, just for my daily notebook sync/update.

Don't you understand that having to start Windows and operate Lightroom interactively with the mouse for every single synchronisation is itself the misconception, no matter how smart that synchronisation might be?

You need a way to synchronize
a) automatically, without user interaction
b) reliably and in a way that can be verified (see the sketch after this list)
c) fast and over cheap/low-traffic lines
d) without mid-air collisions and conflicts when several people (or just me, working at home and on my notebook) change different files
e) against dumb storage like hard disks, USB sticks or online storage, without needing running instances of Windows and Lightroom
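
On point b), verification becomes trivial once everything lives in plain files. A minimal sketch, with placeholder paths, that compares checksums of the sidecars in two copies of the tree:

```python
# Sketch for point b): verify that two copies of the tree hold identical sidecars
# by comparing checksums, so a sync can be checked without running Lightroom.
# The paths are placeholders.
import hashlib
from pathlib import Path

def sidecar_manifest(root):
    """Map relative path -> SHA-256 of every .xmp file under root."""
    root = Path(root)
    return {str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in root.rglob("*.xmp")}

def report_differences(root_a, root_b):
    a, b = sidecar_manifest(root_a), sidecar_manifest(root_b)
    for rel in sorted(set(a) | set(b)):
        if a.get(rel) != b.get(rel):
            print("differs or missing:", rel)

report_differences("/home/me/pictures", "/mnt/server/pictures")
```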

The whole concept of needing a proprietary, interactive, licence-bound program with a graphical interface and human interaction just to synchronize machines - made worse by the fact that Lightroom does not allow catalogs on network shares - is broken by design.

Furthermore, it is based on the obsolete idea of standalone fast PCs or workstations, whereas today we have a client/server, central-storage, distributed-work model.

And the more comments I read in this thread, the more I suspect that this is intentional, to increase the number of Lightroom licences customers need.

And for those who want to tell me that this is wrong or would not work: why does it work so well with Bibble5, then?
Rob Cole
Fair enough Hadmut.

I just wanted to make sure you understand how the catalog export/import works so at least you are making an informed decision.

Your points are noted...
john beardsworth
How long/how much have you been using Lightroom? You really do need to look a lot closer at how File > Export as Catalog works, and at its Import from Catalog counterpart. They're the key to slick and controllable multi-machine workflows.

And for an example of a straw man argument... "If you have two machines with Lightroom databases, you never know whether they are in the same state or where they differ." Of course you do - you just have to use your brain! By which, for instance, I mean naming one master.lrcat and the other laptop.lrcat. So not a lot of mental effort.

John
TK
Hadmut, FWIW, I see your points and fully agree. Manual fiddling with exporting and importing (sub-)catalogues is just too cumbersome.

You could have added that

a) LR catalogues are huge. They often compress to 10% of their size using WinZip. If you compress them, further steps are introduced. If you don't, they take up a lot of space and require a lot of bandwidth to transfer.

b) LR catalogues are not (not yet?) a paragon of stability. Now and then they become corrupt. Losing edits this way is not necessary if edits are saved to XMP files. Lee Jay once wrote: "I don't trust the catalogs or backups of the catalogs as I've been burned by them going bad on me several times (Dan saved me once). I'm not willing to lose the work I do between catalog backups which means I really want a reliable backup after each image edit, which is totally impractical using the catalog backup technique. As of now, I've given up entirely on backing up the catalogs and only backup the XMP data, which is image-by-image, basically instant and has saved me a couple of times when a catalog went bad inexplicably (I would have lost perhaps ten hours of work each time going back to a catalog backup)."
john beardsworth
Well, that is Lee Jay's way of working, where he doesn't use a range of LR features because they don't get written to xmp! Database corruption is very rare indeed, and people are best advised to back up their catalogues in the confidence that they are reliable!
Lee Jay
My entire catalog went corrupt for no apparent reason whatsoever the other day. LR wouldn't even open. I had had no crashes or problems of any sort.

I used a catalog backup from a few days before, but of course it didn't include anything I had done in LR since it was made. I sync'd and everything was back just how I had it, exactly because I use xmp the way I do and don't use any features not supported inside xmp. If I hadn't done that, I would have lost several days of work.
john beardsworth
I wouldn't have lost any, as I back up on exit, and I use all those features you can only dream of ;)
Lee Jay
If LR had a backup file management system (you know, keep the most recent three, one from a week ago, and one from a month ago or something like that), I might consider backing up on exit. As it stands, that would be a nightmare for me.
Hadmut Danisch
...and if a catalog is broken, all is lost. XMP files, in contrast, are much more robust: they are not updated inside a complicated file structure with indexes and the like, they are just plain files, and if one is corrupt, only the edits of a single picture are affected, not the whole database.
john beardsworth
"and if a catalog is broken, all is lost. "
Another straw man? Proper catalogue backups mean all your work is backed up and easy to restore. And only raw files have sidecars - other file formats contain embedded XMP and are not "just plain files".
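
For what it's worth, embedded XMP is still reachable from scripts as well; a rough sketch (the file name is a placeholder, and error handling is minimal) that pulls the XMP packet out of a JPEG's APP1 segment:

```python
# Rough sketch: pull the embedded XMP packet out of a JPEG, to show that
# non-raw files carry their metadata inside the file rather than in a sidecar.
# The file name is a placeholder; error handling is minimal.
import struct

XMP_MARKER = b"http://ns.adobe.com/xap/1.0/\x00"

def embedded_xmp(jpeg_path):
    with open(jpeg_path, "rb") as f:
        if f.read(2) != b"\xff\xd8":                   # SOI missing: not a JPEG
            return None
        while True:
            marker = f.read(2)
            if len(marker) < 2 or marker[0] != 0xFF or marker[1] == 0xDA:
                return None                            # EOF or start of scan: no XMP found
            (length,) = struct.unpack(">H", f.read(2))
            payload = f.read(length - 2)
            if marker[1] == 0xE1 and payload.startswith(XMP_MARKER):   # APP1 with XMP
                return payload[len(XMP_MARKER):].decode("utf-8", "replace")

xmp = embedded_xmp("IMG_0001.jpg")
print(xmp[:200] if xmp else "no XMP packet found")
```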
Hadmut Danisch
May I just ask what role "john beardsworth" is playing here?

Is he related to Adobe? Or is he a forum troll? What is his intent in continuously denying other people's needs?

What's the point of offering LR users a forum for feature requests and improvements if they have to battle with third-party people in pointless discussions?

(Whatever or whoever he is, he seems to have an extremely limited point of view and understanding of computer technology, and although I respect his opinion as appropriate for his workflow, it's absolutely irrelevant to my point of view, both in my private photographic use and in my professional work. So I consider his comments insubstantial, and as jamming this forum.)
john beardsworth
I'm a user of the software just like you and "a well-respected member of the Lightroom community" (from a recent thread here). My concise counter-arguments are because your longwinded posts ignore important aspects of Lightroom and demonstrate ignorance of how best to use it. Just because I pull apart your arguments is not a reason to throw insults.
Victoria Bampton - Lightroom Queen, Champion
I'll vouch for the fact that John is not a troll and has been a well-respected member of the community for a long time.

Though it may feel like the discussions have been pointless, as a result of John's questioning and counter-arguments you've raised a workflow situation that we don't come across very often - combining multiple users of the same catalog and a remote location making import/export catalog a less useful option - and we wouldn't have fully understood your reasons for wanting this feature from your initial post.

That kind of additional information is useful - if Adobe can't or won't include all data in xmp for one reason or another, understanding your reasons behind wanting that feature may provide an alternative solution.
Victoria Bampton - Lightroom Queen, Champion
The limitations of xmp are as designed; however, I do agree there is an argument for including all available data.

So bringing this back on track, it seems to me that the main argument you're making is not only about xmp holding all available data, but also for Lightroom to work as a file browser rather than a database. Would that be accurate, or have I missed something in amongst all this chatter?
Lee Jay
Victoria,

Long ago (about Beta 4 prior to 1.0), I suggested a working mode that was browser-like. It would work like this. You'd "browse" to a folder, and those images would be automatically imported into a temporary collection (possibly even in a temporary catalog). Autowrite would be turned on for those images automatically, and you'd do your thing. If you browsed away to another folder, the collection would be removed, but of course it could come back quickly if you browsed back because of the XMP data. Essentially, this is how RSE/RSP worked, and that's the tool I was using before Lightroom. I felt this approach would ease the introduction of the catalog concept and allow "quick looks" and "quick editing sessions" where those techniques would be appropriate.
Victoria Bampton - Lightroom Queen, Champion
Yeah, I could see how something along those lines could be useful in specific situations. How would you imagine it working if someone then wanted to add those photos to the full catalog? Or wanted to drag/drop between folders, treating it like a full-blown file browser?
Lee Jay
If someone wanted to do a transfer, the simplest thing would be to just do an import - no new UI required. I was trying to keep this simple.

If people want to do the whole folder-to-folder and other file management things, I'd require them to import and do it as we do now.
Hadmut Danisch
@Victoria: "File browser" is not really accurate, but it gives an idea of what I am expecting; it is just the wrong way to look at it.

To be more precise in programming terms: the storage engine that stores the settings for each picture should not be strictly bound to SQLite, but should be encapsulated and object-oriented, i.e. with a separation of programming interface and implementation, allowing different implementations of the same thing.

Then SQLite (as it is used right now) would be one implementation and XMP files another, both fulfilling the same task.
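
Purely as an illustration of that split - this is not Lightroom's actual code, and all names are made up - a minimal sketch of one interface with two interchangeable back ends:

```python
# Illustration only, not Lightroom's actual code: one storage interface,
# two interchangeable back ends (catalog-style database vs. sidecars).
import json
import sqlite3
from abc import ABC, abstractmethod
from pathlib import Path

class SettingsStore(ABC):
    """The interface the application programs against; it never sees the back end."""
    @abstractmethod
    def save(self, image_path, settings): ...
    @abstractmethod
    def load(self, image_path): ...

class SqliteStore(SettingsStore):
    """Catalog-style back end: one central database file."""
    def __init__(self, db_path):
        self.db = sqlite3.connect(db_path)
        self.db.execute("CREATE TABLE IF NOT EXISTS settings (image TEXT PRIMARY KEY, data TEXT)")
    def save(self, image_path, settings):
        self.db.execute("REPLACE INTO settings VALUES (?, ?)", (image_path, json.dumps(settings)))
        self.db.commit()
    def load(self, image_path):
        row = self.db.execute("SELECT data FROM settings WHERE image = ?", (image_path,)).fetchone()
        return json.loads(row[0]) if row else None

class SidecarStore(SettingsStore):
    """File-system-style back end: one sidecar next to each image
    (JSON here for brevity; the real thing would of course be XMP)."""
    def save(self, image_path, settings):
        Path(image_path).with_suffix(".xmp").write_text(json.dumps(settings))
    def load(self, image_path):
        sidecar = Path(image_path).with_suffix(".xmp")
        return json.loads(sidecar.read_text()) if sidecar.exists() else None
```

The application would only ever call save/load; choosing between catalog mode and sidecar mode would then be configuration, not a rewrite.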

Your point of view ("file browser") is only user-interface oriented. By first and main profession I am not a photographer but an "Informatiker" (computer scientist + engineer + consultant), especially in IT security. That's why I focus so intensively on keeping machines synchronized in case of loss/theft/damage. The idea is not to turn LR into a file system browser. It is not meant to change LR's appearance or usage at all.

It is meant to give LR an optional second/alternative storage engine which behaves differently, e.g. allows much better synchronization between several machines belonging to the same user or to different users.

I usually synchronize my portable computers with my home server whenever I connect them in my home office. Since I use plenty of different programs, it would simply be unfeasible if every single program expected me to manually perform imports and exports, both on the server and on each notebook, to keep my machines in sync. Whoever believes that this manual export/import by mouse is a good way to achieve synchronisation has never had his machines in a good state. There must be a general way to synchronize several machines quickly and automatically, no matter where you actually changed your data (and no matter whether you were writing software, writing a book or working on your pictures). And it is provably not trivial to keep several machines in sync if you always produce export files. What if you have two apartments and two notebooks? Export and import every change four times? How much time would you have to waste?

The second aspect is that, as a professional security consultant, I have a different view. I always strongly recommend that my customers install software on their portable computers to a) keep them encrypted and b) keep them automatically synchronized whenever the notebook is attached to the desk, to minimize the impact of theft or other kinds of loss.

The third aspect is that, because I had two apartments for a long time and travelled a lot, I had to learn to use several machines at the same time and to have work done on one machine available on the others. This works with file system syncs and online storage, but not by having to start every single program to manually export/import things. Where would that end? Thirty exports every month?

The fourth aspect is that, for technical and professional reasons, I do not use Windows as my primary operating system, but Linux (that's why I started with Bibble 5). For the photographic applications that don't run under Linux, I have one Windows installation in a virtual machine and another on a small machine at home that I use over RDP. And my central storage servers are all Linux machines. I simply don't have the time to waste starting a virtual Windows machine and a second Windows PC, starting LR on both, and operating them with a mouse just to perform my daily sync. Both Windows instances use Linux as their underlying storage, either through virtualization or Samba shares. So both storage locations must be able to sync without a running LR instance.

The fifth aspect is that, for the very same reason, every professional photo agency should store pictures on central servers with backups, not on local machines. Servers do not run software like LR; they just offer storage, dumb storage. So it must be possible to synchronize against any storage device, not just against running LR software.

So, to repeat myself: I do not want to turn LR into a different program, change its appearance, or change its usage. All I want is for the storage engine to offer a choice between the old one and a more robust, less collision-prone one that can sync against plain storage and has a more precise notion of state.

And, by the way, Bibble 5 had pretty good reasons to offer both options, storing in local databases or storing in file system mode. Users can choose between the two (and even use them together). I've been using that for about a year, and it works well, simply and reliably. And yes, since Bibble5's file system mode does not use a central database but takes the files 'as they are, with their sidecars', you can actually use it as a sort of file browser, if that's the term and point of view you prefer. You see almost no difference in usage between database and file system mode. But I can always sync part of my image archive to my notebook for working outdoors, or take pictures onto the notebook and sync them into my central archive - without needing to run Bibble5 or even have it installed on the central machine. I can burn images with their sidecars onto CD-ROM or put them on USB pen drives and work on them without needing to import them into a local database. It makes work so much easier.
Hadmut Danisch
What is "Bridge" ?

I actually did not intend to view offline photos, but that's a good point. I'd actually see three different approaches how to deal with it, and I'd suggest to leave it to the user to choose between:

- No method to watch offline files
- Include a thumbnail in the xmp (which makes sense only if the xmp is still online, which I could not really imagine at the moment)
- Use a regular database as a underlying cache. If LR keeps its code really clean, that it is really simple to build an abstract storage engine consisting of two others, which writes always in both, and reads from the available if one is missing. Would make searching much faster than traversing the file system each time. Actually that's not too far from what LR does right now with file system synchronization against it's database.
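
Again purely as an illustration (building on the interface sketch above; the two back ends only need save/load methods), the mixed-mode engine could be as small as this:

```python
# Sketch of the "mixed mode" engine: write to both back ends, read from the
# sidecars and fall back to the local database cache when the files are offline.
# The back ends are any objects with save()/load() methods, as in the earlier sketch.

class MixedStore:
    def __init__(self, sidecars, cache_db):
        self.sidecars = sidecars          # XMP sidecar back end (authoritative)
        self.cache_db = cache_db          # local database used as cache/backup

    def save(self, image_path, settings):
        self.sidecars.save(image_path, settings)
        self.cache_db.save(image_path, settings)

    def load(self, image_path):
        try:
            settings = self.sidecars.load(image_path)
        except OSError:                   # volume offline, drive unplugged, ...
            settings = None
        return settings if settings is not None else self.cache_db.load(image_path)
```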

So I'd prefer basically three modes of storage:

- Database only, with manual writing of XMP (as it is right now)
- XMP mode only
- Mixed mode with XMP preference, using a database as a cache and backup for the offline case (where "offline" and "missing because deleted" are distinct things)

The more I think about it, the more I see the function of that mixed-mode database in fast searching/caching rather than in offline reading. E.g. with 200,000 pictures in a large file system tree, it would really take time to find all pictures with a particular keyword or to maintain a collection.

I think having the choice between these three modes would be good.
Lee Jay
Bridge is the file browser/organizer that comes with Photoshop CS.
Hadmut Danisch
Photoshop CS? You mean that incredibly cheap software, almost for free?
Victoria Bampton - Lightroom Queen, Champion
Mmmmmmm, that mixed mode was the kind of scenario I was wondering about, and possibly more likely to make the cut. Not a complete departure from the current setup, but extending the xmp spec to contain all (or at least more) information and improving the metadata synchronization with those xmp files to give a more intelligent sync, or something along those lines.
Hadmut Danisch
But it should be possible to do that without polluting the regular databases. Maybe have a separate database used for that purpose only, which is subject to cleanup of expired entries.