Advice on which Mac would best suit editing (Mac Mini vs. iMac vs. waiting)

  • Question
  • Updated 3 days ago
So, I asked a question about an eGPU that didn’t get a ton of attention. With Apple’s announcement yesterday, the question becomes even more relevant, and now I’m wondering about my options.

My wife and I want to buy a desktop to be our home family computer, and now I’m debating among three options. We currently have our own laptops, and I have a 2018 13” rMBP with Quad core but no discrete GPU, so I don’t necessarily have to rush into a decision anytime soon.

1. The 2017 iMac
Pros: user-upgradeable RAM, 5K display, discrete GPU
Cons: older CPU with no hexacore option, slightly slower RAM and SSD

2. The 2018 Mac Mini + eGPU
Pros: new hexacore CPU, slightly faster RAM and SSD, upgradeable RAM (though possibly more difficult to access than the iMac's)
Cons: no internal discrete GPU, adding an eGPU pushes the cost above the iMac's, and a 5K display would cost another $1300

3. Wait for an updated iMac
This could be the better solution, but we honestly don't know when it would be released or what it would feature. In particular, would it follow suit with the iMac Pro and have RAM that requires a technician to upgrade? My hope is that we'd see an update by spring.

I wanted to get some advice, particularly on what would be best for running Lightroom and Photoshop, since those are the primary tools that warrant the more powerful computer. In particular, does the eGPU provide sufficient support for LR? I kind of assume more cores are generally more important than the GPU. But is it worth waiting?

Jon Anscher

  • 179 Posts
  • 22 Reply Likes
  • undecided

Posted 2 weeks ago


David Converse

  • 440 Posts
  • 126 Reply Likes
I wouldn't worry so much about this. Any recent Mac will be just fine. The eGPU won't be much help and the built-in GPU (even the integrated one) should work well.

At some point you are spending a lot of money for incremental upgrades.

Jon Anscher

  • 179 Posts
  • 22 Reply Likes
That’s good to know, and it has crossed my mind in a specific way. For instance, I’ve considered buying only 32GB of RAM for a Mac Mini now and waiting for prices to drop before upgrading to 64GB. I really would love to speed up LR, particularly when culling photos and when stitching large panoramas. Based on what you and dmeephd have said, though, it sounds like the hardware isn’t the main limiting factor on LR speed.

I do know (as I mentioned below) that LR slows down when I have two external displays plugged in. So I would hope an eGPU would help by taking that burden off the machine’s integrated or discrete GPU.

Jaroslav Bereza

  • 808 Posts
  • 187 Reply Likes
What about a PC instead of a Mac? Upgrades are cheaper and you get more performance for the same price.

dmeephd

  • 214 Posts
  • 49 Reply Likes
With macOS, there are only a few eGPU options available (compared to Windoze; but really, Windoze?  Never!), and Lightroom is not designed to take advantage of them, though some VR programs and games are, and supposedly Photoshop is too (but I have not been able to confirm that).

The key is to get as much RAM as you can afford for both the CPU and the GPU.  Choose clock rate over number of cores, as Lightroom was never designed to use symmetric multiprocessing across multiple cores.  RAM and clock rate are the Name of the Game.

Look to respected third-party RAM suppliers like OWC for larger memory modules: buy the smallest configuration from Apple with your Mac, then buy the larger modules and sell the original Apple RAM back to OWC.  (I bought a MacPro with a 3.5GHz clock, six cores, two 8GB GPUs, and 16GB of RAM.  I sold that RAM to OWC when I bought 128GB from them.  The RAM turned the MacPro into a rocket, except with LR 7.5 and LR8.)

Jon Anscher

  • 179 Posts
  • 22 Reply Likes
That’s disappointing to hear that LR8 was still slow even on your Mac Pro configuration.

The eGPU front is where I lack clarity. I’ve noticed that Lightroom slows down even further when I have both my 2560x1440 Thunderbolt 2 Cinema Displays plugged into my laptop. My hope with the eGPU is that if it were running my displays, that would free the integrated graphics card for crunching photos in LR and Photoshop. But to be clear, are you saying that likely would not be the case?

I do know that it would be better to have a discrete GPU for that purpose, so the question really comes down to this: which is more valuable, the faster clock speed and CPU of the new Mac Mini, or the discrete graphics card and larger graphics RAM of the iMac? Or would it be better to wait and get both when the new iMac comes out?

I’m surprised to hear you say that cores do not matter. Everything I’ve read up to this point, including Adobe’s own recommendations for speeding up LR, has said that more cores are better. Is that just marketing speak, then? I guess the fact that your Mac Pro didn’t speed up LR, or didn’t speed it up much, is probably a good indicator of what David said: at some point the difference is incremental, at least as far as LR goes.

Part of me is also thinking about video editing. Although I don’t do a lot of video work, it is something I toy with here and there, and when I have kids, I have a feeling I’ll want to do more of it. Or maybe just feel compelled to.

Either way, iMac or Mac Mini, I’d be looking at getting 64GB of RAM and, as you said, buying it third party (OWC is my favorite place for that as well, and Crucial doesn’t seem to have the right memory in stock at the moment). I’ve yet to see what it takes to install new RAM in the Mini. I’ve read that Apple doesn’t consider it user-replaceable, but I’m not sure whether it’s on par with the iMac Pro in difficulty or more along the lines of replacing SSDs in older MacBooks.

dmeephd

  • 214 Posts
  • 49 Reply Likes
Difficult to say, as I never ran any formal tests with metrics.  I was running LR6 at the time, having been put off by all of the bugs reported for LR7, and that version improved dramatically during importing, which is the most process-intensive part of my workflow, as my shoots tend to produce a minimum of 1,000 raw files of 50MB or more.

I can state that an import which took over an hour in LR6 with 16GB dropped to less than 10 minutes when the RAM was increased to 128GB.

Unfortunately, it all went south when I 'upgraded' (and I use that term very loosely) to LR7.5 (and later LR8) as Adobe in their infinite wisdom decided to have import AND preview building run in parallel—Lightroom simply isn't built for that.  It does not support multi-cores worth a damn.  Now an import of similar size takes over an hour.  (A 13-day trip to the Toltec ruins in Mexico and Guatemala yielded over 10k images; that import took over four hours before a single image appeared, and then overnight to finish building the previews.  Sick.)

I have proven this by skipping preview building entirely (standard or 1:1) during an import: the import itself runs much faster, but it has never approached LR6's speed.

Whether Adobe fixes this or not, the bottom line is go for the highest clock speed and the most RAM you can afford.  This hasn't changed since I was awarded my Ph.D. in Electrical and Computer Engineering, and according to Moore, it isn't likely to.

Unless, of course, Adobe decides to truly support multi-core CPUs.

dmeephd

  • 214 Posts
  • 49 Reply Likes
Sorry, I was typing before your second response appeared.

Multi-core support and Adobe—marketing BS.  Period.  Check out their patent applications.  (Hint: they don't exist.)

Dual monitors can be problematic, even with multiple GPUs.  I run three 34" LG curved displays, making sure that each one is connected to the MacPro via a separate Thunderbolt cable to its own Thunderbolt connector.  (The MacPro has six Thunderbolt ports, BTW.)

Using In-Site to watch their performance, one GPU idles and the other works like a racehorse.  Adobe cannot even support multiple GPUs.  Pitiful.  Cores do matter, just not with Adobe products.  There are video editing products out there from others which take full advantage of multi-core CPUs.  Perhaps Adobe is just too lazy or incompetent?

I was thinking of toying around with Virtual Reality (VR), so I read damn near everything I could find regarding VR support on the Mac.  First, VR needs an eGPU.  Second, there are only two or three eGPUs supported by the Mac (BlackMagic, and I forget the others at the moment), but in all cases they need Thunderbolt 3 for the bandwidth, and the MacPro trashcan does not have TB3.  However, the upshot of my research was that virtually nothing, no pun intended, as far as software goes can currently take advantage of an eGPU on the Mac.  Perhaps this will change, but I'm not holding my breath waiting for Adobe.

My next upgrade is to replace the 1TB SSD in the MacPro with the 2TB SSD from OWC.  It has nearly three times the read/write rate of the Apple SSD, gives me twice the capacity, and at a net $689 (after selling the 1TB SSD back to OWC), it sounds like a deal.

I came to this conclusion this past weekend when I noticed that the preview cache, which lives on the SSD, was gobbling up a lot of disk space.  (I run LR from a dedicated 8TB G-Tech Thunderbolt drive.)  I am hoping that the extra space and the faster transfer rates will offset the lag I noticed as the cache grew larger and available disk space dropped to under 100GB.
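Since the preview cache grows quietly until disk space gets tight, a short script can keep an eye on it. The sketch below is a generic directory-size check; the `.lrdata` path shown is an assumption and will vary with your catalog's name and location.

```python
import os

def dir_size_gb(path):
    """Recursively sum the sizes of all files under `path`, in gigabytes."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            full = os.path.join(root, name)
            if os.path.isfile(full):
                total += os.path.getsize(full)
    return total / 1e9

# Hypothetical default location; adjust to wherever your catalog lives.
cache = os.path.expanduser(
    "~/Pictures/Lightroom/Lightroom Catalog Previews.lrdata")
if os.path.isdir(cache):
    print(f"Preview cache: {dir_size_gb(cache):.1f} GB")
```

Running this occasionally (or from a scheduled job) shows whether the cache, rather than your photo library, is what is eating the SSD.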

Jon Anscher

  • 179 Posts
  • 22 Reply Likes
Dmeephd, thanks for your detailed response. You know, I’m kicking myself for not thinking of this earlier. Partly it’s because I’d have to buy another Thunderbolt 3 to Thunderbolt 2 adapter, as well as some sort of hub to plug everything in (my two hard drives and the power can certainly share a single Thunderbolt 3 port), but I did have the two displays daisy-chained. I forgot that there are two DisplayPort streams, one on each side of the MacBook Pro.

Of course, one of the other advantages of buying an iMac would be the built in display.

dmeephd

  • 214 Posts
  • 49 Reply Likes
Right.  The TB3-to-TB2 adapter is not strictly bidirectional, so it doesn't work for those of us whose Macs have only TB2.  Sigh.

I have had my displays daisy-chained in the past, primarily on my MacBook Pro (late-2013), and I hoped that single porting each one would improve performance with LR and Photoshop.  No dice.  Not with Adobe.

AutoCAD, on the other hand, runs like a champ and uses both GPUs equally, so it can be done if the developer gives a hoot.

Victoria Bampton - Lightroom Queen, Champion

  • 4426 Posts
  • 1637 Reply Likes
There is no single magic answer, I'm afraid.

Multiple cores make a difference for things like preview building, exporting, merges, etc. But many things have other constraints that prevent Lightroom from maxing out the CPU.

Some things are limited by disk transfer speeds, so it's no good having a super fast CPU if the disks can't feed it the data quickly enough.
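A rough way to check whether a given drive can keep the CPU fed is to time a large sequential read. The sketch below is only illustrative: it writes its own scratch file, so OS caching will inflate the number; on a real drive you would point it at a cold, multi-gigabyte file instead.

```python
import os
import tempfile
import time

def sequential_read_mbps(path, chunk=8 * 1024 * 1024):
    """Read `path` front to back in large chunks; return throughput in MB/s."""
    start = time.perf_counter()
    total = 0
    with open(path, "rb") as f:
        while True:
            data = f.read(chunk)
            if not data:
                break
            total += len(data)
    return total / (1024 * 1024) / (time.perf_counter() - start)

# Scratch file for demonstration only; in practice, measure a large
# existing file on the drive holding your catalog or raw files.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(32 * 1024 * 1024))  # 32 MB of incompressible data
print(f"{sequential_read_mbps(tmp.name):.0f} MB/s")
os.remove(tmp.name)
```

If the measured rate is far below the drive's rated speed, the disk (or its cable/port) is a more likely bottleneck than the CPU.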

Extra RAM is useful because it means more data can be held in fast memory instead of reading off the disk. And additional spare fast disk space for caches is useful when skipping around, rather than viewing photos sequentially.

GPU helps most with high res screens, like those beautiful 5K screens. But it's still early days for eGPUs.

Lower screen resolution makes the biggest difference for interactive performance in Develop, because there are simply fewer pixels to compute for each change.

The moral of the story is fairly simple - spread your bets. Don't go all-out on a single hardware component that you hope will help; spread the cash around a bit.

To see where you can make improvements, I'd keep a close eye on what you're maxing out on your current MBP.
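For the CPU side of that monitoring, macOS's Activity Monitor works, but a tiny script can log whether the machine is actually CPU-bound while Lightroom churns. A minimal sketch using only the standard library (`os.getloadavg` is Unix-only, so this works on macOS and Linux but not Windows):

```python
import os
import time

cores = os.cpu_count()
for _ in range(3):
    one_min, _five, _fifteen = os.getloadavg()
    # Load persistently above the core count suggests a CPU bottleneck;
    # if Lightroom feels slow while load stays low, look at disk or RAM.
    print(f"1-min load average: {one_min:.2f} (cores: {cores})")
    time.sleep(1)
```

Run it during an import or a panorama merge: if the load average never approaches the core count, spending more on CPU cores is unlikely to help.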