Lightroom: A real plugin architecture to make TIFF files unnecessary

  • 101
  • Idea
  • Updated 3 years ago
  • (Edited)
My number one feature request would be a real plugin architecture for Lightroom. It's just kind of nuts that to use any sort of plugin or external application, I have to create a separate TIFF file. The TIFF files break workflow by losing all the history of adjustments applied to the RAW before creating the TIFF, they're a pain to manage, and they take up a large amount of space on disk.

Perhaps Lightroom could let plugins create mask overlays or something like that, which would integrate with the develop history. But anything that would avoid the necessity of creating a TIFF file would be welcome.
Eric

  • 1 Post
  • 0 Reply Likes

Posted 7 years ago

Butch_M

  • 291 Posts
  • 112 Reply Likes
I totally agree ... Not sure how easy it would be to integrate the capability to offer parametric, rather than pixel-editing-based, plugins ... But it sure would streamline the workflow for many tasks ...

The addition of the improved NR features in ACR 6/LR 3 has drastically reduced the number of derivative files ... maybe even thousands per year, as I shoot a considerable amount of sports in less than ideal conditions ...

It would be great if third-party plugins could offer the same abilities ...
Rob Cole

  • 4831 Posts
  • 382 Reply Likes
This would definitely open some doors and provide some advantages.

It's obviously possible, since other raw/parametric editors are doing it. I'm not sure how well it fits into Lightroom's design or Adobe's vision, but the benefit to users is clear.

Rob
Lloyd Roseblade

  • 4 Posts
  • 1 Reply Like
Yes. I like this idea a lot.
Rory Hill

  • 242 Posts
  • 35 Reply Likes
I have been waiting for this since the first betas. I hope Adobe wants this to happen. If they do not, it really makes a mockery of the whole parametric editing concept, since you have to leave LR.
Rob Cole

  • 4831 Posts
  • 382 Reply Likes
Rory - Don't you think the term "mockery" is a little "dramatic" :-}

Still, I think I see your point - if you have to edit non-parametrically/destructively at any stage, it breaks the parametric/non-destructive editing chain...

The good news: even traditional destructive pixel editors can be made "parametric", in some instances. If it's possible to define a formula for taking input and producing output, and that formula can be saved as "parameters", then so-called destructive editors can be chained into the parametric pipeline. Users will need to understand that these stages will not be as efficient as the native Lightroom stuff, since there is no consolidation of settings...

Anyway, Adobe can certainly invent the solution once they set their minds to it. It'll be interesting to see if/when it happens...

Note: This paradigm already exists for exporting: plugins can define "filters" which take an image file and produce an image file that is passed to the next "filter" in the chain. There has been surprisingly little use of this feature to date (I'm not sure why - except that the documentation is confusing), but the idea can certainly be extended to the develop module - preferably with an option to bypass the intermediate files and pass the image data in RAM...
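The export "filter" chain described above can be sketched abstractly like this (a minimal sketch: the function names are hypothetical stand-ins, not the actual Lightroom SDK API, and the "filters" just copy files instead of processing pixels):

```python
import shutil
import tempfile

def sharpen_filter(in_path):
    # Hypothetical filter: takes an image file, produces a new image file.
    # A real filter would transform pixels; here we just copy the file.
    out_path = tempfile.mktemp(suffix=".tif")
    shutil.copy(in_path, out_path)
    return out_path

def watermark_filter(in_path):
    # Another stand-in filter with the same file-in/file-out contract.
    out_path = tempfile.mktemp(suffix=".tif")
    shutil.copy(in_path, out_path)
    return out_path

def run_filter_chain(source_path, filters):
    # Each filter's output file becomes the next filter's input,
    # mirroring the export filter chain described above.
    path = source_path
    for f in filters:
        path = f(path)
    return path
```

The point of the sketch is the contract: every link accepts a file and emits a file, so links can be composed in any order without knowing about each other.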

Rob
Rory Hill

  • 242 Posts
  • 35 Reply Likes
"Rory - Don't you think the term "mockery" is a little "dramatic" :-}"

Okay Rob, I'll grant you that. ;-)

I guess it is the disappointment leaking out. I love Lightroom and want it to be the best it can be. Early on in the Lightroom development I mistakenly interpreted the promised SDK to be "all inclusive". Given that it is impossible for Adobe to meet everyone's needs, their history with Photoshop plugins, and their contempt for what Aperture calls plugins (which are not parametric), I felt that a rendering-pipeline plugin architecture would be necessary and forthcoming.

Either Adobe does not want to implement this, or they are waiting for the rendering pipeline architecture in LR to stabilize before implementing an SDK. I'm hoping for the latter. Being basically an optimist, I think LR4 would be an excellent time.

To any Lightroom engineers listening in, I did not mean to disparage your work, for which I have the greatest respect. LR is a joy to use.
Rob Cole

  • 4831 Posts
  • 382 Reply Likes
The more I think about this, the more doable it sounds to me.

I mean, ultimately, Lightroom ends up creating an RGB bitmap in the develop module. I would think inserting a pixel-editing step or a plugin image-transformation step would be a matter of finalizing the RGB bitmap and passing it to the plugin. All subsequent Lightroom edits would then operate on the new image data.

I don't mean to make this sound trivial - it's not. But it's very, very doable...

And this would be parametric/non-destructive by definition: all plugins or image editors in the chain would be required to take their input and create their output on demand, given the settings (or parameters) that are set for them.

PS - I understand your emotion/disappointment/frustration...

R
Lee Jay

  • 990 Posts
  • 135 Reply Likes
"Either adobe does not want to implement this or they are waiting for the rendering pipeline architecture in LR to stabilize..."
Tom Hogarty has commented directly on this:

http://blogs.adobe.com/lightroomjourn...

"Photographers would still like to see image processing plug-ins in Lightroom and I agree with them."

It seems clear that the problem is not one of intent.
Rob Cole

  • 4831 Posts
  • 382 Reply Likes
Regarding: http://blogs.adobe.com/lightroomjourn...

The journal-blog looks like a great resource (sorta like a forum, except Tom Hogarty is willing to do more than make a cameo appearance there).

From what I gleaned in that article, it seems Adobe is getting stuck, and maybe they shouldn't be. In other words, plugins don't need access to raw data, just RGB in / RGB out. Although one could conceivably do this with true plugins (meaning UI and everything else via SDK) by extending the SDK with functions like GetRGBImage (for getting input), displayRGBImage (for displaying intermediate results), and SetRGBImage (for final plugin output), I would think a better approach would be to allow full-fledged external applications as imaging plugins. I say better, because it could be done immediately - I'm talking about Lightroom development AND the development of the plugins.

With this approach, the only thing Lightroom has to do is:

- Be able to invoke the plugin app for user setup, pass it image data, and retrieve image data out (for parametric plugin edit setup).
- Be able to invoke the plugin without the UI, once already set up, and pass it image data and retrieve output (for re-rendering).

Not only would this be easy and quick for Adobe to do, but it would allow 3rd parties to harness existing technology immediately with very little fuss - no need to rewrite from scratch to conform to an all-new environment, maybe just a tweak for the source of input/output...
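The two invocation modes above can be sketched as a minimal protocol (all names are hypothetical; a real plugin app would be a separate process with a real UI, and the "transform" here is toy math):

```python
class ExternalImagingPlugin:
    """Hypothetical external-app plugin: Lightroom stores only the
    parameters; pixels are regenerated on demand (non-destructive)."""

    def __init__(self, name):
        self.name = name
        self.params = {}

    def setup_interactive(self, rgb_in):
        # Mode 1: invoke the app with its UI so the user chooses
        # settings; return (params, rgb_out). We fake the UI choice here.
        self.params = {"strength": 0.5}  # stands in for user input
        return self.params, self.render(rgb_in, self.params)

    def render(self, rgb_in, params):
        # Mode 2: headless re-render from stored params, e.g. when
        # Lightroom needs to rebuild the pipeline output later.
        strength = params.get("strength", 0.0)
        # Toy transform: nudge every channel value by the strength.
        return [min(255, int(v + strength * 10)) for v in rgb_in]
```

The key property is that `render` with the saved parameters reproduces exactly what the interactive session produced, which is what makes the step re-renderable instead of baked in.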

Bottom line: Postponing development of image access for plugins/plugin-apps in favor of some super-integrated, all-new environment is too much work and will make us wait too long - for Lightroom imaging plugin support, as well as for the imaging plugins themselves. In other words, this is a case where a "compromise" solution may actually be "better" all around than going all out...
Lee Jay

  • 990 Posts
  • 135 Reply Likes
That's basically what we have now, Rob. It's not non-destructive, and if you wanted to go back and make adjustments at the LR/raw level, it might mess up the work of the plugin massively.
Rob Cole

  • 4831 Posts
  • 382 Reply Likes
When you say it's what we have now, do you mean the external editor interface that forks a new TIFF? I'm not sure what you mean. Presently that's the only way for a 3rd party to influence image processing in Lightroom, and it leaves a lot to be desired - thus this topic.

If you look at how NX2 works, it does this:

First, there is an optimized process that folds basic settings in with the raw to produce an RGB image, then passes that RGB to optional next steps in a chain. Each link of the chain has settings, performs a transformation, and outputs the RGB to the next link. Yes, previous steps may influence subsequent results - that's the good news and the bad news all rolled into one, and is the reason every single step includes a histogram and shadow/highlight recovery: to keep the image data bounded for the next stage.

It's totally non-destructive. Raw data enters, RGB data emerges, and the settings in between determine the final outcome.
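That kind of chain can be sketched as follows (a toy model, not NX2's actual architecture: the "demosaic" and stage functions are placeholder math, and only the list of (stage, settings) pairs is ever stored):

```python
def demosaic(raw):
    # Stand-in for the optimized raw -> RGB conversion at the head of
    # the chain (toy scaling, not real demosaicing).
    return [min(255, v // 4) for v in raw]

def exposure_stage(rgb, settings):
    # One link of the chain: has settings, transforms RGB, outputs RGB.
    gain = settings.get("gain", 1.0)
    return [min(255, int(v * gain)) for v in rgb]

def render(raw, chain):
    # Non-destructive: only (stage, settings) pairs are persisted; the
    # RGB data is recomputed from the raw on every render, so changing
    # an upstream setting just means rendering again.
    rgb = demosaic(raw)
    for stage, settings in chain:
        rgb = stage(rgb, settings)
    return rgb
```

Nothing is baked in: edit any settings dict in `chain` and the next `render` call produces the new result from the original raw data.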
Lee Jay

  • 990 Posts
  • 135 Reply Likes
That's really no different than what we have now with passing data to PS as a smart object and then using all the tools, filters and plugins available there, just as described in Tom's blog post. I think the team would rather go a step further and allow plugins directly into the raw pipeline somewhere, even if it's at only one or two places. However, I think that's actually pretty difficult to do.

As I've said before, this team seems to prefer to fix problems thoroughly rather than quickly. I've used this example before, but I and many others wanted them to fix the orange-reds problem. They did, but they also revised the DNG spec, came out with all-new profiles, came out with camera-matching profiles, and provided a tool users can use to make their own profiles. They did the same with lens corrections. It seems to be their way, so I'm not sure if they'd seriously consider a processed-RGB-only approach such as you described above, especially since we sort of already have it. Perhaps I'm wrong, I don't know.
Rob Cole

  • 4831 Posts
  • 382 Reply Likes
I mean, if work in prior stages affects the input to, and hence the output of, a plugin, that's almost the definition of non-destructive. Destructive means baked in (results saved in RGB concrete), and non-destructive means "a recipe for transforming input to output", where the output varies with the input, as dictated by the settings.
Rob Cole

  • 4831 Posts
  • 382 Reply Likes
There are some BIG differences between what I'm talking about and what we have now. Yet only SMALL changes would be required, because all of the existing pieces more or less already exist - they just need some glue to hold them together.

Presently, if you edit a smart object in Photoshop, you still end up forking a large tif that ends up as a separate photo in Lightroom. This is the biggest part of the problem the OP wants resolved, and so do I.

As I see it, Lightroom can (and should) implement two flavors of 3rd party imaging support:

1. The ability to feed image data to a resident app and use its output in Lightroom, without forking a new photo. The app only stores parametric settings - no image data - thus non-destructive, with a small data footprint.

2. An SDK that supports resident (non-modal) plugins, in the panels, with native look & feel, for BOTH regular plugins and imaging plugins.

In my estimation, type 2 is WAY more work than type 1. So it seems to me, unless Adobe has more development resources up their sleeve than I think, it makes a lot of sense for them to do type 1 in Lr4, since it gets the most bang for the buck by far, and then reserve type 2 for Lr5, since it would require a lot more work.

If it were an either-or deal, and Adobe/users were dead set on type 2, I'd say "so be it". But, the two are not mutually exclusive at all, and I think having a big box of imaging plugin-apps available within a few months of releasing Lr4 should be a compelling motivation. These would be available quickly because existing apps could be massaged in relatively minor ways, instead of having to be re-written.
Rob Cole

  • 4831 Posts
  • 382 Reply Likes
I mean, what is a raw pipeline anyway?

Lightroom first has to convert raw data to RGB, and I don't imagine any plugins ever getting involved in this. Maybe I misunderstand. Assuming I'm correct, when one says "raw pipeline" they really mean "RGB pipeline", where the distinction is in the type of data flowing at that point. And presently, Lightroom's algorithm is an optimized thing that consolidates the various settings to minimize re-rendering. This is the reason Lightroom is faster than NX2 at rendering photos from scratch that have a multitude of adjustments applied (NX2 re-renders at every stage instead of consolidating settings across stages).
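The consolidation point can be illustrated with a toy example where the stages happen to be composable gain adjustments. An arbitrary plugin transform generally could not be folded together like this, which is the crux of the problem being discussed:

```python
def render_per_stage(rgb, gains):
    # NX2-style: one full pass over the pixel data per stage.
    for g in gains:
        rgb = [v * g for v in rgb]
    return rgb

def render_consolidated(rgb, gains):
    # Lightroom-style: fold the settings together first, then make
    # a single pass over the pixels, regardless of how many
    # adjustments were applied.
    total = 1.0
    for g in gains:
        total *= g
    return [v * total for v in rgb]
```

Both produce the same output, but the consolidated version touches the pixels once. An opaque RGB-in/RGB-out plugin forces the per-stage behavior at its position in the chain.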

So, are people really expecting plugins whose settings can be consolidated with pre-existing Lightroom settings, or other plugins, or post-existing settings? I don't think so, or at least I would not attempt this. If this is what Adobe has in mind, then I bow to the masters and will watch with awe from the sidelines...

But I personally can't imagine anything other than RGB-in/RGB-out for plugins, at least not in this decade. I assume that's how Photoshop plugins work, although I have no experience with them under the hood. I.e., Camera Raw transforms raw to RGB, and Photoshop (proper) & plugins take it from there...

?
Lee Jay

  • 990 Posts
  • 135 Reply Likes
What if the plug-in has something like control points, and those control points are based on the X-Y location where they are selected? So you select one. Now LR warps the image and/or crops the image so that that X-Y location is in a totally different X-Y location on exit from the pipeline. Wouldn't the control-point plugin need to know that? Of course it would. Okay, so you need to put it in the pipeline before any warping or cropping. Now isn't it in the pipeline itself? Okay, maybe there could be some sort of two-way channel: the plug-in could pass its X-Y location to the pipeline and LR could pass back something about that point pre-warp. But that would only work for a single point. What about a path? What about a complex path? Wouldn't that really, really need to be in the pipeline before warping? And what if the plug-in wants to do more complex stuff, like white-balance gradients or doing its own warping or something else? Wouldn't that possibly need to be in another place in the pipeline? Frankly, it makes my head hurt, and I wouldn't be surprised if the Camera Raw engineers are in some similar pain, but I'm not sure of a way around this that gets us more power than we have now with external editors without giving some sort of access inside the raw imaging pipeline.
Rory Hill

  • 242 Posts
  • 35 Reply Likes
Okay, now my head is hurting too. I imagine the plugin point(s) in the pipeline would have to be constrained, and also the type of edits allowed. Your earlier point about the LR team's thoroughness is very well taken.
Rob Cole

  • 4831 Posts
  • 382 Reply Likes
Rory said: "I imagine the plugin point(s) in the pipeline would have to be constrained, and also the type of edits allowed."

I'm not sure why the edits couldn't be anywhere downstream of raw conversion, and any edit types allowed.
Rob Cole

  • 4831 Posts
  • 382 Reply Likes
You make some good points, Lee Jay. But I'd say: screw the warping. Do all your warping before you feed the data to your other plugins. Or if you re-warp, then redo your downstream plugins if necessary... I mean, Adobe went all out allowing us to add dust-spot removal to our photos, then follow up with lens corrections that maintain the positions of the dust spots. While very ambitious and impressive, it was also a lot of work, and a source of problems, and you have people recommending workflows to avoid the extra performance penalty. Don't get me wrong - this was a really cool thing they did, and is one reason so many other things did not get done in Lr3.

On the other hand, cropping definitely needs to be handled. But that's easy enough to do by simply supplying the crop position and dimensions to the plugin so the control-point coordinates can be adjusted. That way, control points stay in position despite recropping.

Summary: Your point is well taken: plugins with position-sensitive settings will be wonky if previous pipeline stages warp the image. Cropping compensation, however, is child's play.
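The crop compensation described above might look like this (a sketch with assumed coordinate conventions - top-left origin, crop given as left/top offsets plus width/height - not any real SDK's API):

```python
def compensate_control_point(point, crop):
    # Map a control point from full-image coordinates into the cropped
    # image's coordinate space. crop = (left, top, width, height).
    # Returns None if the point falls outside the crop rectangle.
    x, y = point
    left, top, width, height = crop
    nx, ny = x - left, y - top
    if 0 <= nx < width and 0 <= ny < height:
        return (nx, ny)
    return None
```

A simple offset like this is all a plugin needs to keep its control points anchored when the user recrops; warping, by contrast, has no such trivial inverse.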

And, one could extend this argument to other things. Plugins that are designed to transform the hue of purple would be wonky if preceding stages already transformed the hue of purple... And tone...

But, I mean that's just how things are: If you crank up the fill-light, then you may need to increase the black-point too...

So if you do things that adversely affect a downstream plugin, you may have to go tweak the downstream plugin - that's life in the photo biz...

Bottom line: If Adobe is determined to "do it right", where "do it right" means tight integration and interaction control, we may not see imaging plugins for a long, long time.

Personally, I'd prefer they throw us a bone in the mean time... woof-woof.

R
Sean McCormack, Champion

  • 31 Posts
  • 19 Reply Likes
You'll see them when it's right, Rob, as Lee Jay has said. And your own reasoning is exactly why. It's part of the Mark Hamburg ethic of giving more than you asked for.
You talk about the RGB-out/in line, but there are two aspects to the process pipeline: settings applied to a preview, and then those settings applied to the raw to create the export. With export it's a matter of applying the sum of the settings and off we go. Working with Develop previews is a tad trickier. So we have plugin X that does something. We do a bit in Lightroom, then use plugin X, then back to normal settings, then we need to tweak X again... etc., etc. Each time we go between the normal pipeline and the plugin, we have to make sure Lightroom understands the plugin's effect on the file and interacts with it correctly, passing back and forth between LR and the plugin. Now throw in five or six plugins that interact, as well as Lightroom's own processing, and you've got the potential for the computer grinding to a halt trying to keep up - like with mixing a lot of spotting with Lens Correction.

I certainly would love plugins internal to the pipeline, but I want it to be right.
Rob Cole

  • 4831 Posts
  • 382 Reply Likes
Well, I suppose it suffices at this point just to say: "I hope they don't wait too long before engineering a smaller, smoother seam between the existing editing capabilities in Lightroom and the big separate TIFF."

R
Rob Cole

  • 4831 Posts
  • 382 Reply Likes
Note: If you use Photoshop plugins with smart objects from Lightroom, there is no compensation that ripples across plugin boundaries when altering the position of pixels by tweaking lens correction in Camera Raw (as an example), or when warping is done by a plugin - so you have all the same issues there now that I'm suggesting would be acceptable in Lightroom too.

And there is no reason there couldn't be a capability to define multiple freeze points. Instead of a Camera Raw cache that caches the RGB image at first birth only, it could be cached later in the pipeline, at multiple points: if you aren't trying to back up too far, it's fast; if you go back to ground zero, it may take a while to re-render through a bunch of plugins. If the ACR cache maintained cached versions such that rendering could be done incrementally when there is a cache hit, it could be the key to a multi-stage pipeline that includes imaging plugins, where re-rendering from scratch would not normally be needed.
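The multiple-freeze-point idea could be sketched as a cache of RGB snapshots at each stage boundary, keyed by the settings of every stage up to that point (a toy model: it keys on settings only, assumes fixed transforms, and a real cache would need eviction and on-disk storage):

```python
class StageCache:
    """Hypothetical multi-point render cache: snapshot the RGB data at
    each stage boundary so a settings tweak deep in the chain only
    re-renders the stages after the last unchanged snapshot."""

    def __init__(self):
        self.snapshots = {}

    def _key(self, stages, depth):
        # Cache key = the settings of every stage up to this depth.
        return tuple(repr(settings) for _, settings in stages[:depth])

    def render(self, raw, stages):
        # Resume from the deepest snapshot whose upstream settings are
        # unchanged; only re-run the stages after it.
        rgb, start = None, 0
        for depth in range(len(stages), 0, -1):
            snap = self.snapshots.get(self._key(stages, depth))
            if snap is not None:
                rgb, start = snap, depth
                break
        if rgb is None:
            rgb = [v // 4 for v in raw]  # stand-in raw -> RGB conversion
        for depth in range(start, len(stages)):
            transform, settings = stages[depth]
            rgb = transform(rgb, settings)
            self.snapshots[self._key(stages, depth + 1)] = rgb
        return rgb
```

Tweaking only the last stage's settings then hits the snapshot taken just before it, so the raw conversion and all upstream plugins are skipped.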
Sean McCormack, Champion

  • 31 Posts
  • 19 Reply Likes
Smart Objects are not the same, Rob. Once you create a pixel layer on top of one, it makes no difference what you do to the smart object; the effect is hidden.
I know what you're getting at, but every change after using a plugin will have to reference both the prior Lightroom settings and whatever interaction was done via the plugin. I simply don't believe it's as cut and dried as you make out.

I do agree that more use could be made of the Camera Raw Cache though.
E.g. store the most recent develop preview. Check that it's still current; if so, use the recent preview, and if not, use the original info to create a current one.
Rob Cole

  • 4831 Posts
  • 382 Reply Likes
Well, I hope Adobe can do something about this, preferably sooner rather than later, whether it's cut-and-dried or not...

+1 vote - better develop-mode caching.

Over-n-out,
Rob
Braden gunem

  • 21 Posts
  • 1 Reply Like
I wish I could use Photoshop features like Content-Aware Fill on an image in Lightroom without creating an extra TIFF.

Maybe it's time for a "pro version" of Lightroom, or just better integration of Photoshop features if they are installed side by side.
Sravan Nerella

  • 4 Posts
  • 0 Reply Likes
To me, adding the smart-filters functionality from Photoshop, with a layer mask for the effects, would be more than sufficient for most users. This way you can apply most effects quickly in Lightroom. Once you need different effects in different parts of the image, then you go to Photoshop. That gives a clear differentiation between Lightroom and Photoshop. Photoshop files are big and huge because they have layers. I want the simple controls that Viveza or Color Efex or other filter applications give you, in Lightroom, without the necessity of doing TIFFs.

I actually think this is a different request from this one, so I created a new idea thread for it:
http://feedback.photoshop.com/photosh...
Aron Schmukle

  • 21 Posts
  • 10 Reply Likes
Hi there,
I am a photography-loving sound engineer.

In audio production, with tools like Avid's Pro Tools, real-time plug-ins have been working for many years. Today you can add up to ten plug-ins per track, all having an impact on each other. Sessions growing to 64 tracks and more are very common, even in the native version of the program, where only the computer's own processors do the calculating. There is also a program version powered by additional dedicated DSPs that do nothing but process audio plug-ins; track counts there go up to 256. Not to mention what else is going on under the hood, like automation of every single parameter, real-time time-stretching and pitch-shifting, and much, much more.
Please don't get me wrong: I am not a programmer and don't want to underestimate what is going on under Lightroom's hood. I love Lightroom for what I can do with it and appreciate the work of Adobe's engineers a lot.
To make a long story short, I want to suggest that it might be worthwhile for Adobe's engineers to have some sort of knowledge exchange (chat, coffee, lunch...) with Avid (or others) about their plug-in architecture and 3rd-party implementation. Why not learn from each other, since they are not in competing markets?

Regards,
Aron
Helmut Hudler

  • 4 Posts
  • 0 Reply Likes
I use Photomatix as an external editor, and it can cope with nearly all RAWs and also with DNGs.
Jon Rista

  • 4 Posts
  • 4 Reply Likes
I wholeheartedly throw in my vote for this. It is very frustrating having to break the parametric pipeline to do "advanced" editing like debanding with Topaz DeNoise or something like that. Not to mention that some things are best done on the original digital signal rather than on the post-rendered RGB output.

To Aron Schmukle's point, it would be even better if plugins could utilize the GPU to distribute compute-intensive tasks. Advanced, modern wavelet-based denoising and debanding, for example, would be ideal candidates to be powered by a GPGPU (or a few, as may be the case with an SLI system). I've seen prototypes of some advanced noise-removal algorithms, and when they have enough horsepower the results are phenomenal... when run directly on the RAW.

The age of workflow-disruptive editing via Lightroom and a diversity of tools should really come to an end. A proper plugin architecture, with the ability to hook into the non-destructive parametric pipeline - and even offer real-time configurable plugin settings in the Develop panel, for that matter - is most desperately needed.
Julian Z

  • 37 Posts
  • 0 Reply Likes
I also would love to see such a feature.

It could be implemented rather simply for a start:
1. LR creates a temporary TIFF file at the resolution it requires (i.e. preview size).
2. LR calls the external tool with selectable parameters.
3. LR reads the TIFF file back in and processes it further, or writes a JPEG to the cache.
4. LR deletes the temporary TIFF file.

That feature could be disabled in the Develop module, so that only in the Library view could we add (or remove) such tasks for an image. On view, export, and print, those tasks would have to be processed.
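The four steps above can be sketched as a single round trip (a minimal sketch: `tool_cmd` stands in for a hypothetical external editor that rewrites the TIFF in place, and the render/read callbacks are placeholders for Lightroom's side of the exchange):

```python
import os
import subprocess
import tempfile

def apply_external_tool(render_preview, tool_cmd, read_image):
    # Hypothetical round trip: tool_cmd is invoked as
    # `tool <args...> <tiff_path>` and is expected to rewrite the
    # TIFF in place before exiting.
    fd, tiff_path = tempfile.mkstemp(suffix=".tif")
    os.close(fd)
    try:
        render_preview(tiff_path)                           # 1. write temp TIFF
        subprocess.run(tool_cmd + [tiff_path], check=True)  # 2. call the tool
        return read_image(tiff_path)                        # 3. read result back
    finally:
        os.remove(tiff_path)                                # 4. delete temp file
```

Because the temporary file lives only for the duration of the call, nothing extra accumulates in the catalog - which is exactly the workflow improvement being requested.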