Originally published at: https://boingboing.net/2018/03/30/mac-os-update-adds-support-for.html
I’m going to work on a Mac for the next couple of years. I’m not going to like it. But more graphics power? Well, why not. External? WTF.
Can someone please shove the whole idea of a non-modular PC up the arse of those engineers?
So, YES! GODDAMN FINALLY.
Unfortunately, as of now, no NVIDIA cards are supported.
Does anybody know if developers have to enable eGPU support in their apps, or does it come “free” with 10.13.4?
Official Apple doc here: https://support.apple.com/en-us/HT208544
Would love to hear some stories from the trenches on how this works out for people. This is about a week before NAB, so no shocker there in terms of timing in some ways.
May I ask what you will be using your Mac for?
And why you aren’t going to like it?
Mostly the usual office stuff. The staple crunching is GIS. Some CAD, quite possibly. A little VR imaging work would be nice, but I doubt I’ll have the time. And I hope I can do some stats in R. Depends on the projects.
Not my choice, but I can adapt. The sysadmin says it’s much easier for them to do the administration and migration to new machines.
The whole “ecosystem” thing (don’t get me started on the appropriation of the term for IT bullshit…) gets on my nerves a bit, TBH.
We’ve been using eGPUs with cylinder Mac Pros since last spring, specifically the Akitio Node Thunderbolt 3 enclosures with NVIDIA GTX 10xx cards for 3D animation/rendering work in Maya and Cinema4D. It’s been a pain, requiring terminal-level prodding and poking, plus video-driver juggling, to get things running and keep them stable, but it’s been worth it. Having the AMD option is a nice addition, but I don’t think eGPUs will be taken seriously by those Mac graphics professionals who have held on this long until NVIDIA is brought back into the fold. My company is lucky to have someone tasked with making the damned things work, but that defeats most of the reason for sticking with macOS this long.
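For anyone curious what that “terminal-level prodding” looks like in practice, here’s a rough sketch of the sanity checks we run after cabling up the enclosure. This isn’t an official procedure, just what has worked for us; the “node” string is specific to our Akitio hardware, and exact output varies by OS version.

```shell
# Does macOS see the Thunderbolt enclosure at all?
# (The Akitio Node should show up in the Thunderbolt device tree.)
system_profiler SPThunderboltDataType 2>/dev/null | grep -i "node" || true

# Which GPUs has macOS actually loaded drivers for?
# An NVIDIA card only appears here once the web drivers are installed
# and match the exact OS build number.
system_profiler SPDisplaysDataType 2>/dev/null | grep -iE "chipset|vendor" || true

# Did the NVIDIA web-driver kext load?
kextstat 2>/dev/null | grep -i nvidia || echo "no NVIDIA kext loaded"
```

If the card vanishes after a point release, it’s almost always the driver/build mismatch; that’s where the driver-juggling comes in.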
Yes, this is the kind of thing I wouldn’t recommend in a production environment typically, unless it’s officially supported by Apple and app developers. At least we have that now, and ideally supporting it will be much easier. I work in post-production tech, and obviously many environments are Mac on the client side, so I will very much be keeping my eyes on this.
I’m going to be cynical and speculate this is more about GPU-accelerated bitcoin mining…
To answer this question with your BBS handle: Why bother? Laptop and desktop Macs are not what you would want to build a bitcoin-mining rig around. They use data centers for that these days, from what I gather…
…In the meantime, the aging-but-still-muscular 8-core Mac Pro on my desk at work gradually built up a pile of peripherals, including two webcams and a USB 3 adapter that became unsupported by successive OS upgrades.
They work fine on the tiny Linux computer sitting next to it, though.
Because GPUs in laptops get HOT, which isn’t great for the GPU or the laptop system board in general. Also see this from earlier today, which is a Windows laptop that hooks up to an external GPU.
I think it’s a pretty neat option. You get some portability but can plunk down in a stable space and get your Borderlands fix.
I figured they would release this along with the trashcan Mac Pro, which I will say I called pretty well ahead of its release. I’m thinking utility may have been limited prior to TB3, that and the fact that Apple seems to only remember pros once in a (long) while these days. Well, better late than never.
A part of me thinks “awesome!! Maybe I don’t have to replace my iMac now just to have a GPU that Blizzard will still support.” Then the other part of me thinks “well, WTF. I have to buy an external GPU for this, as opposed to Apple just letting me buy a modular iMac?!”
It’s very frustrating. Apple has made some stupid decisions these past 8-10 years when it comes to desktops. And I as a customer am losing patience.
I’m not sure if you saw this? They seem to have acknowledged their mistakes, but it remains to be seen what this “modular Mac Pro” will be. Obviously it could leverage eGPU now that it’s supported; the question is whether it will have at least a couple of PCIe slots in addition to external “modularity.” We shall see. I would think they will show it at WWDC and release it at the end of the year, similar to last year’s iMac Pro schedule, but it could take longer.
Yeah, high-end game cards/graphics workstation cards can suck 250-300 watts. You couldn’t cram enough of those dinky 1" muffin fans into a laptop to prevent fried thighs, not to mention battery life measured in minutes.
Oh, I know all that. But I don’t exactly have faith that they will fix their mistakes. Course correction is not a strong suit of corporations.
I tend to forget that we are cramming everything into those two centimetres thickness.
My private “notebook” indeed looks like one of the old laptops. (It’s a Toughbook. I killed an IBM ThinkPad and decided they were not sturdy enough for proper field work.) And it is nearly modular.
At least I can swap some components… Not the GPU, though.
Well, I’ll say this:
Apple irks me sometimes, for sure. Lack of pro hardware at top of that list probably.
macOS is far from perfect.
OHMYGOD it’s better than “the Windows situation.”
I have a Windows desktop rig, and man – I wish there was a line of sick Mac Pros and sick Apple pro laptops. Because yeah, ugh, Windows is still far behind in my opinion.
Amen. My home machine is a 2008 Mac Pro 8 core tower. It’s quite happy running a pair of NVIDIA GTX 980s, SSDs, etc. It’s now ten years old, and just as useful as the 2015 machine at work. I’ll probably buy a used 2012 tower for home in the next year or so, and simply shift useful bits to the new frame.