Adobe Photoshop Family

1 Message • 1.1K Points • Thu, Mar 31, 2011 2:35 PM

Lightroom Classic: A real plugin architecture to make TIFF files unnecessary

My number one feature request would be a real plugin architecture for Lightroom. It's just kind of nuts that to use any sort of plugin or external application, I have to create a separate TIFF file. The TIFF files break the workflow by losing all the history of adjustments applied to the raw before the TIFF was created, they're a pain to manage, and they take up a large amount of disk space.

Perhaps Lightroom could let plugins create mask overlays or something like that, which would integrate with the develop history. But anything that would avoid the necessity of creating a TIFF file would be welcome.

Responses

322 Messages • 7.5K Points • 10 years ago

I totally agree ... Not sure how easy it would be to integrate the capability to offer parametric, rather than pixel-editing-based, plugins ... But it sure would streamline the workflow for many tasks ...

The addition of the improved NR features in ACR 6/LR 3 has drastically reduced the number of derivative files ... maybe even by thousands per year, as I shoot a considerable amount of sports in less than ideal conditions ...

It would be great if third-party plugins could offer the same abilities ...

4.5K Messages • 76.3K Points • 10 years ago

This would definitely open some doors and provide some advantages.

It's obviously possible, since other raw/parametric editors are doing it. I'm not sure how well it fits into Lr's design or Adobe's vision, but the benefit to users is clear.

Rob

4 Messages • 154 Points • 10 years ago

Yes. I like this idea a lot.

248 Messages • 4.1K Points • 10 years ago

I have been waiting for this since the first betas. I hope Adobe wants this to happen. If they don't, it really makes a mockery of the whole parametric editing concept if you have to leave LR.

4.5K Messages • 76.3K Points • 10 years ago

Rory - Don't you think the term "mockery" is a little "dramatic" :-}

Still, I think I see your point - if you have to edit non-parametrically/destructively at any stage, it breaks the parametric/non-destructive editing chain...

The good news: even traditional destructive pixel editors can be made "parametric", in some instances. If it's possible to define a formula for taking input and producing output, and that formula can be saved as "parameters", then so-called destructive editors can be chained into the parametric pipeline. Users will need to understand that these stages will not be as efficient as the native Lightroom stuff, since there is no consolidation of settings...

Anyway, Adobe can certainly invent the solution once they set their minds to it. It'll be interesting to see if/when it happens...

Note: This paradigm already exists for exporting: plugins can define "filters" which take an image file and produce an image file that is passed to the next "filter" in the chain. There has been surprisingly little use of this feature to date (I'm not sure why - except that the documentation is confusing), but the idea can certainly be extended to the develop module - preferably with an option to bypass the intermediate files and pass the image data in RAM...
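For the curious, the file-in/file-out chaining idea is simple to sketch. This is just an illustration of the concept, not the actual Lr export-filter SDK (which is Lua-based); the `sharpen`/`watermark` filters here are hypothetical stand-ins that only copy the file:

```python
import os
import shutil
import tempfile
from pathlib import Path

def chain_filters(source, filters):
    """Pass an image file through a chain of file-in/file-out filters.

    Each filter receives the previous stage's output path plus a fresh
    scratch path to write to; the final path is the export result.
    """
    current = Path(source)
    for apply_filter in filters:
        handle, scratch = tempfile.mkstemp(suffix=current.suffix)
        os.close(handle)
        out = Path(scratch)
        apply_filter(current, out)   # filter reads `current`, writes `out`
        current = out
    return current

# Toy "filters" that just copy the file; real ones would transform pixels.
def sharpen(src, dst):
    shutil.copy(src, dst)

def watermark(src, dst):
    shutil.copy(src, dst)
```

The appeal of the scheme is that each stage only needs to understand "file in, file out" - which is also its cost, since every stage round-trips through an intermediate file.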

Rob

248 Messages • 4.1K Points • 10 years ago

"Rory - Don't you think the term "mockery" is a little "dramatic" :-}"

Okay Rob, I'll grant you that. ;-)

I guess it is the disappointment leaking out. I luv Lightroom and want it to be the best it can be. Early on in Lightroom's development I mistakenly interpreted the promised SDK to be "all inclusive". Given that it is impossible for Adobe to meet everyone's needs, their history with Photoshop plugins, and their contempt for what Aperture calls plugins (which are not parametric), I felt that a rendering-pipeline plugin architecture would be necessary and forthcoming.

Either Adobe does not want to implement this, or they are waiting for the rendering pipeline architecture in LR to stabilize before implementing an SDK. I'm hoping for the latter. Being basically an optimist, I think LR4 would be an excellent time.

To any Lightroom engineers listening in, I did not mean to disparage your work, for which I have the greatest respect. LR is a joy to use.

4.5K Messages • 76.3K Points • 10 years ago

The more I think about this, the more doable it sounds to me.

I mean, ultimately, Lightroom ends up creating an RGB bitmap in the develop module. I would think inserting a pixel-editing step or a plugin image-transformation step would be a matter of finalizing the RGB bitmap and passing it to the plugin. All subsequent Lightroom edits would then operate on the new image data.

I don't mean to make this sound trivial - it's not. But, very, very doable...

And this would be parametric/non-destructive by definition: all plugins or image editors in the chain would be required to take their input and create their output on demand, given the settings (or parameters) that are set for them.
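That on-demand idea is easy to picture in code. A toy sketch (8-bit pixel lists and made-up `exposure`/`contrast` stages, nothing to do with Lr's real internals): only settings are stored, never intermediate image data, so re-running the chain reproduces the output whenever anything upstream changes.

```python
def render(source_pixels, chain):
    """Re-render from the source on demand: each stage is a
    (transform, params) pair, so the 'edit' is just a recipe."""
    data = source_pixels
    for transform, params in chain:
        data = transform(data, **params)
    return data

def exposure(pixels, stops=0.0):
    """Brighten by full stops, clipping at 8-bit white."""
    return [min(255, int(p * 2 ** stops)) for p in pixels]

def contrast(pixels, amount=1.0):
    """Scale distance from middle gray, clipping to the 8-bit range."""
    return [min(255, max(0, int(128 + (p - 128) * amount))) for p in pixels]

# The whole "edit" is this settings list; tweak a param and re-render.
chain = [(exposure, {"stops": 1.0}), (contrast, {"amount": 1.2})]
```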

PS - I understand your emotion/disappointment/frustration...

R

946 Messages • 13.8K Points • 10 years ago

"Either adobe does not want to implement this or they are waiting for the rendering pipeline architecture in LR to stabilize..."
Tom Hogarty has commented directly on this:

http://blogs.adobe.com/lightroomjourn...

"Photographers would still like to see image processing plug-ins in Lightroom and I agree with them."

It seems clear that the problem is not one of intent.

4.5K Messages • 76.3K Points • 10 years ago

Regarding: http://blogs.adobe.com/lightroomjourn...

The journal-blog looks like a great resource (sorta like a forum, except Tom Hogarty is willing to do more than a cameo appearance there).

From what I gleaned in that article, it seems Adobe is getting stuck, and maybe they shouldn't be. In other words, plugins don't need access to raw data - just RGB in / RGB out. One could conceivably do this with true plugins (meaning UI and everything else via the SDK) by extending the SDK with functions like GetRGBImage (for getting input), displayRGBImage (for displaying intermediate results), and SetRGBImage (for final plugin output)... but I would think a better approach would be to allow full-fledged external applications as imaging plugins. I say better because it could be done immediately - I'm talking about Lightroom development, AND the development of the plugins.

With this approach, the only thing Lightroom has to do is:

- Be able to invoke the plugin app for user setup, pass it image data, and retrieve image data out (for parametric plugin edit setup).
- Be able to invoke the plugin without the UI, once already set up, pass it image data, and retrieve output (for re-rendering).

Not only would this be easy and quick for Adobe to do, but it would also allow 3rd parties to harness existing technology immediately with very little fuss - no need to rewrite from scratch to conform to an all-new environment; maybe just a tweak for the source of input/output...
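To make the two invocation modes concrete, here is a rough sketch of the glue from Lightroom's side. The command-line flags are entirely hypothetical - each plugin app would define its own interface - but the split between "setup with UI" and "headless re-render from saved settings" is the whole idea:

```python
import subprocess

def plugin_command(app_path, mode, in_file, out_file, settings_file):
    """Build the command line for an external plugin app.

    mode "setup":  launch with UI so the user picks settings; the app
                   writes both the rendered output and a settings file.
    mode "render": headless re-render from previously saved settings.
    All flags here are hypothetical, for illustration only.
    """
    flag = "--setup" if mode == "setup" else "--headless"
    return [str(app_path), flag,
            "--in", str(in_file),
            "--out", str(out_file),
            "--settings", str(settings_file)]

def run_plugin(app_path, mode, in_file, out_file, settings_file):
    """Invoke the plugin app and wait; raise if it exits non-zero."""
    cmd = plugin_command(app_path, mode, in_file, out_file, settings_file)
    subprocess.run(cmd, check=True)
```

Since the settings file is the only thing Lightroom stores, the edit stays parametric: any upstream change just triggers another "render"-mode call.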

Bottom line: Postponing development of image access for plugins/plugin-apps in favor of some super-integrated, all-new environment is too much work, and will make us wait too long - for Lightroom imaging-plugin support, as well as for the imaging plugins themselves. In other words, this is a case where a "compromise" solution may actually be "better" all around than going all out...

946 Messages • 13.8K Points

That's basically what we have now, Rob. It's not non-destructive, and if you wanted to go back and make adjustments at the LR/raw level, it might massively mess up the work of the plugin.

4.5K Messages • 76.3K Points • 10 years ago

When you say it's what we have now, do you mean the external editor interface that forks a new TIFF? I'm not sure what you mean. Presently that's the only way for a 3rd party to influence image processing in Lightroom, and it leaves a lot to be desired - thus this topic.

If you look at how Nx2 works, it does this:

First, there is an optimized process that folds basic settings in with the raw to produce an RGB, which is then passed to optional next steps in a chain. Each link of the chain has settings, performs a transformation, and outputs the RGB to the next link. Yes, previous steps may influence subsequent results - that's the good news and the bad news all rolled into one, and it is the reason every single step includes a histogram and shadow/highlight recovery: to keep the image data bounded for the next stage.

It's totally non-destructive. Raw data enters, RGB data emerges, and the settings in between determine the final outcome.

946 Messages • 13.8K Points

That's really no different than what we have now with passing data to PS as a smart object and then using all the tools, filters and plugins available there, just as described in Tom's blog post. I think the team would rather go a step further and allow plugins directly into the raw pipeline somewhere, even if it's at only one or two places. However, I think that's actually pretty difficult to do.

As I've said before, this team seems to prefer to fix problems thoroughly rather than quickly. I've used this example before, but I and many others wanted them to fix the orange-reds problem. They did, but they also revised the DNG spec, came out with all-new profiles, came out with camera-matching profiles, and provided a tool users can use to make their own profiles. They did the same with lens corrections. It seems to be their way, so I'm not sure they'd seriously consider a processed-RGB-only approach such as you described above, especially since we sort of already have it. Perhaps I'm wrong, I don't know.

4.5K Messages • 76.3K Points • 10 years ago

I mean, if work in prior stages affects the input to, and hence the output of, a plugin, that's almost the definition of non-destructive. Destructive means baked-in (results saved in RGB concrete); non-destructive means "a recipe for transforming input to output", where the output varies with the input, as dictated by the settings.

4.5K Messages • 76.3K Points • 10 years ago

There are some BIG differences in what I'm talking about versus what we have now. Yet only SMALL changes would be required, because all of the existing pieces more or less already exist - they just need some glue to hold them together.

Presently, if you edit a smart object in Photoshop, you still end up forking a large TIFF that ends up as a separate photo in Lightroom. This is the biggest part of the problem the OP wants resolved, and so do I.

As I see it, Lightroom can (and should) implement two flavors of 3rd party imaging support:

1. The ability to feed image data to a resident app, and use its output in Lightroom, without forking a new photo. The app only stores parametric settings - no image data - thus non-destructive, with a small data footprint.

2. An SDK that supports resident (non-modal) plugins, in the panels, with native look & feel, for BOTH regular plugins and imaging plugins.

In my estimation, type 2 is WAY more work than type 1. So it seems to me, unless Adobe has more development resources up their sleeve than I think, it makes a lot of sense for them to do type 1 in Lr4, since it gets the most bang for the buck by far, and then reserve type 2 for Lr5, since it would require a lot more work.

If it were an either-or deal, and Adobe/users were dead set on type 2, I'd say "so be it". But the two are not mutually exclusive at all, and I think having a big box of imaging plugin-apps available within a few months of releasing Lr4 should be a compelling motivation. These would be available quickly because existing apps could be massaged in relatively minor ways, instead of having to be rewritten.

4.5K Messages • 76.3K Points • 10 years ago

I mean, what is a raw pipeline anyway?

Lightroom first has to convert raw data to RGB, and I don't imagine any plugins ever getting involved in this. Maybe I misunderstand. Assuming I'm correct, when one says "raw pipeline" they really mean "RGB pipeline", where the distinction is in the type of data flowing at that point. And presently, Lightroom's algorithm is an optimized thing that consolidates the various settings to minimize re-rendering. This is the reason Lightroom is faster than NX2 at rendering photos from scratch that have a multitude of adjustments applied (NX2 re-renders for every stage instead of consolidating settings across stages).
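The consolidation point is worth a toy example. Assuming exposure-style stages whose settings compose (a made-up example, not Lightroom's actual math), consolidating folds several stages into a single pass over the pixels, while the per-stage approach walks the image once per stage - same result, more work:

```python
def consolidated(pixels, stops_list):
    """Fold all exposure settings into one factor: a single pass."""
    factor = 2 ** sum(stops_list)
    return [p * factor for p in pixels]

def per_stage(pixels, stops_list):
    """NX2-style: re-process the whole image once per stage."""
    for stops in stops_list:
        pixels = [p * 2 ** stops for p in pixels]
    return pixels
```

The catch, of course, is that consolidation only works when the stages' math is known to compose - which is exactly what an arbitrary third-party plugin can't promise.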

So, are people really expecting plugins whose settings can be consolidated with pre-existing Lightroom settings, or other plugins, or post-existing settings? I don't think so - or at least I would not attempt this. If this is what Adobe has in mind, then I bow to the masters and will watch with awe from the sidelines...

But I personally can't imagine anything other than RGB-in/RGB-out for plugins, at least not in this decade. I assume that's how Photoshop plugins work, although I have no experience with them under the hood. i.e., Camera Raw transforms raw to RGB, and Photoshop (proper) & plugins take it from there...

?

946 Messages • 13.8K Points

What if the plug-in has something like control points, and those control points are based on the X-Y location where they were selected? So you select one. Now LR warps and/or crops the image so that that X-Y location ends up in a totally different X-Y location on exit from the pipeline. Wouldn't the control-point plugin need to know that? Of course it would. Okay, so you need to put it in the pipeline before any warping or cropping. Now isn't it in the pipeline itself?

Okay, maybe there could be some sort of two-way exchange: the plug-in could pass its X-Y location to the pipeline and LR could pass back something about that point pre-warp. But that would only work for a single point. What about a path? What about a complex path? Wouldn't that really, really need to be in the pipeline before warping? And what if the plug-in wants to do more complex stuff, like white-balance gradients or its own warping or something else? Wouldn't that need to be in yet another place in the pipeline?

Frankly, it makes my head hurt, and I wouldn't be surprised if the Camera Raw engineers are in some similar pain. But I'm not sure of a way around this that gets us more power than we have now with external editors, without giving some sort of access inside the raw imaging pipeline.
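The "two-way" idea amounts to the pipeline exposing an inverse of its geometric transforms. A sketch with a toy scale-and-offset warp (real lens and perspective warps are far messier, which is where this gets painful; the function names are made up):

```python
def make_affine(sx, sy, tx, ty):
    """Return (forward, inverse) mappings for a toy scale+offset warp.

    forward: pre-warp pipeline coords -> warped preview coords
    inverse: lets a point the user picked on the warped preview be
             expressed back in pre-warp pipeline coordinates.
    """
    def forward(point):
        x, y = point
        return (x * sx + tx, y * sy + ty)

    def inverse(point):
        x, y = point
        return ((x - tx) / sx, (y - ty) / sy)

    return forward, inverse

# A path is just every vertex pushed through the same mapping - which
# is exactly why complex paths multiply the bookkeeping.
def map_path(path, mapping):
    return [mapping(p) for p in path]
```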

248 Messages • 4.1K Points

Okay, now my head is hurting too. I imagine the plugin point(s) in the pipeline would have to be constrained, and also the type of edits allowed. Your earlier point about the LR team's thoroughness is very well taken.

4.5K Messages • 76.3K Points

Rory said: "I imagine the plugin point(s) in the pipeline would have to be constrained, and also the type of edits allowed."

I'm not sure why the edits couldn't be anywhere downstream of raw conversion, and any edit types allowed.

4.5K Messages • 76.3K Points • 10 years ago

You make some good points, Lee Jay. But I'd say: screw the warping. Do all your warping before you feed the data to your other plugins. Or, if you re-warp, then redo your downstream plugins if necessary... I mean, Adobe went all out allowing us to add dust-spot removal to our photos, then follow up with lens corrections that maintain the positions of the dust spots. While very ambitious and impressive, it was also a lot of work and a source of problems, and you have people recommending workflows to avoid the extra performance penalty. Don't get me wrong - this was a really cool thing they did, and it is one reason so many other things did not get done in Lr3.

On the other hand, cropping definitely needs to be handled. But that's easy enough to do by simply supplying the crop position and dimensions to the plugin so the control-point coordinates can be adjusted. That way, control points stay in position despite recropping.
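The crop compensation really is just an offset. A tiny sketch, assuming the plugin stores control points in full-image coordinates (the helper names are made up):

```python
def to_crop_coords(point, crop_origin):
    """Translate a full-image control point into crop-relative coords,
    so it stays anchored to the same pixels when the crop changes."""
    (x, y), (cx, cy) = point, crop_origin
    return (x - cx, y - cy)

def to_full_coords(point, crop_origin):
    """Inverse translation: crop-relative back to full-image coords."""
    (x, y), (cx, cy) = point, crop_origin
    return (x + cx, y + cy)
```

Store each point once in full-image coordinates; on every recrop, recompute the crop-relative position from the new crop origin.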

Summary: Your point is well taken - plugins with position-sensitive settings will be wonky if previous pipeline stages warp the image. Cropping compensation, however, is child's play.

And, one could extend this argument to other things. Plugins that are designed to transform the hue of purple would be wonky if preceding stages already transformed the hue of purple... And tone...

But, I mean that's just how things are: If you crank up the fill-light, then you may need to increase the black-point too...

So if you do things that adversely affect a downstream plugin, you may have to go tweak the downstream plugin - that's life in the photo biz...

Bottom line: If Adobe is determined to "do it right", where "do it right" means tight integration and interaction control, we may not see imaging plugins for a long, long time.

Personally, I'd prefer they throw us a bone in the mean time... woof-woof.

R

Champion • 29 Messages • 1.3K Points • 10 years ago

You'll see them when it's right, Rob, as Lee Jay has said. And your own reasoning is exactly why. It's part of the Mark Hamburg ethic of giving more than you asked for.

You talk about the RGB in/out line, but there are two aspects to the process pipeline: settings applied to a preview, and then those settings applied to the raw to create the export. With export, it's a matter of applying the sum of the settings and off we go. Working with Develop previews is a tad trickier.

So we have plugin X that does something. We do a bit in Lightroom, then use plugin X, then go back to normal settings, then we need to tweak X again... etc., etc. Each time we go between the normal pipeline and the plugin, we have to make sure Lightroom understands the plugin's effect on the file and interacts with it correctly, passing data back and forth between LR and the plugin. Now throw in five or six plugins that interact, as well as Lightroom's own processing, and you've got the potential for the computer grinding to a halt trying to keep up - like when mixing a lot of spotting with Lens Correction.

I certainly would love plugins internal to the pipeline, but I want it to be right.