Originally published at: https://boingboing.net/2020/06/23/apple-to-switch-mac-lineup-to.html
…
So, every single (Intel-based) application I own will no longer work on the newest Macs. Because that’s what I want to do: buy the same product, over and over again. So, if they can’t force you to pay for a cloud subscription, they just tear away the actual hardware platform. But, thanks again, Apple.
Does a company like Parallels love this or hate this? Sure, they get to do another full release that they charge heftily for (after charging for every macOS release), but at what point do their customers storm the barricades and turn Parallels into an autonomous collective where all the decisions of the presiding officer have to be ratified at a special bi-weekly meeting by a simple majority in the case of purely internal affairs?
Also, I don’t care about the ‘ideological purity’ of all-USB slots. I just don’t want my laptop to crash to the floor if I trip on the cord, which is what the MagSafe connector used to provide.
Yes @beschizza, you will notice a serious change… like not a single third-party software suite anymore.
Adobe has had issues getting their suite of products to run on the ARM-based hardware. I am sure other companies will not be quick to re-develop for a new architecture. We finally had streamlined hardware profiles with Intel and AMD; and OF FUCKING COURSE Apple has to come in swinging their junk around and fuck it all up like a bull in a china shop.
heaven forbid we have global standards.
@deltaecho EXACTLY.
As a developer, this basically means I have to switch to Windows (or Windows/Linux dual-boot). No more running Windows in a VM at reasonable speeds to compile Windows software. Docker? Also problematic.
[rant]I’ve been a Mac guy since 1985. They totally lost the plot when they created iTunes and shifted focus from making good products to making (crazy) money on selling digital “stuff” you don’t own. I don’t know if most people remember, but their hardware and software used to be rock solid![/rant]
P.S. And what’s with the obsession with the thinness of a laptop? I don’t care how thin it is, I care how heavy it is.
Exactly. Good rocks are hard to break. Good rocks don’t make rock salesmen rich. – Somebody on The Flintstones
End-stage capitalism has figured out that there is less profit in allowing that. Newer and different must surely be better, right? Umm… no, far from always, in my experience!
I figured out a long time ago that if I found a product I liked and that worked well, I should buy a couple more (if I could afford it - and yeah, I’m very lucky and mostly I can) and stash them away for the day the first one wore out. Because when that day came, the manufacturer would have replaced it with a newer, different version that either ceases to meet my needs or has function/feature bloat, or with a range of replacements that needs expert insight to figure out which one is closest, requires the fewest trade-offs (it’s never zero) or is least likely to be a disappointment.
And yeah, get the fuck off my fucking lawn too!
And Apple are among the worst offenders when it comes to removing current/old functionality in order to run to the new shiny. It is only ever possible to have a love/hate relationship with Apple.
I’m looking forward to all the Mac products being an unrepairable mess that will last the standard 3 years before needing to be replaced.
Good thing I stopped buying Apple products back in 2008.
But yay! Let’s get excited about vertical monopolies!
P.S.: Also, it means that hackintoshes will stop being able to update in about 4 to 5 years.
“To support old and new apps, Apple will use Rosetta 2, integrated emulation software, to enable ARM-based Macs to run Intel code. In the prior PowerPC to Intel transition, Apple used Rosetta to let PowerPC apps run with performance compromises on new Intel machines, but Apple says the performance should be much faster for Intel apps running on ARM Macs.”
If they didn’t make changes to their API, porting most applications should be very easy. Only the most heavily optimized (or badly written) software may cause problems. It’s a bit like recompiling Linux applications for ARM or IBM POWER: they mostly just work, sometimes with lower performance.
Having said that, they will probably lose many professional users over this move.
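The “recompiling mostly just works” point can be sketched in a few lines. This is a hypothetical illustration, not anyone’s real build system: portable code needs no architecture branching at all, and only hand-optimized paths (here a made-up `simd_backend` selector) care which CPU they landed on:

```python
import platform

def simd_backend() -> str:
    """Pick a (hypothetical) optimized code path by CPU architecture.

    Portable code never needs branching like this -- which is why most
    applications 'just work' after a straight recompile, and only the
    hand-tuned hot paths need per-architecture attention.
    """
    machine = platform.machine().lower()
    if machine in ("x86_64", "amd64"):
        return "sse2"      # x86-specific vector path
    if machine in ("arm64", "aarch64"):
        return "neon"      # ARM-specific vector path
    return "scalar"        # safe fallback for anything else

print(simd_backend())
```

Run on an Intel Mac this prints `sse2`; on an ARM Mac, `neon`; everywhere else it falls back to the plain scalar path.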
Aside from any changes Apple makes, I suspect that this will be the kicker. It won’t change things for the core of committed OSX-supporting developers; but it sounds like Apple is willing to risk hideously non-native iOS apps becoming the new shoddy Electron app garbage; so long as they are developed as iOS apps and go through the store, unlike the shoddy Electron app garbage.
I can see why they would take that step, but it will not be without its downsides.
Apple to keep Intel at Arm’s length: macOS shifts from x86 to homegrown common CPU arch, will run iOS apps
Apple is introducing a new binary compilation target for apps in its Xcode developer tool called Universal 2 that bundles native code for Apple Silicon and Intel x86_64. It’s also offering Rosetta 2, a translation layer to allow legacy x86_64 code to run on the upcoming Arm-based Macs. Rosetta 2 will translate x86_64 code on installation or on-the-fly in the case of browsers using JIT-compiled JavaScript or Java.
A new virtualization layer is also in the works for running Linux VMs and Docker. Few details were provided during the WWDC video event, apart from Apple hardware SVP Johny Srouji demonstrating the launch of an Apache web server from a Linux VM via the command line. Apple is also contributing patches to various open source projects like Chromium, Node, and V8 so they will run on Apple silicon.
To help developers convert their macOS apps to Arm-based chips, Apple has introduced the Universal App Quick Start Program, which provides support in the form of documentation, forums, beta versions of macOS Big Sur, Xcode 12, and the “limited use” of Arm-based hardware for app testing – a Mac mini with an A12Z Bionic SoC, equipped with 16GB of memory and a 512GB SSD.
Developers in 31 countries are eligible to apply and, if accepted, must pay $500. “Limited use” means the hardware must be returned to Apple within a year of acceptance. The kits start shipping this week.
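The “Universal 2” format described in the excerpt above is the old fat/universal Mach-O container with an arm64 slice added. As a rough sketch (the header layout and CPU-type constants come from Apple’s public `mach-o/fat.h` headers; the two-slice data built here is synthetic, not a real binary), you can list the architectures bundled in such a file by reading its fat header:

```python
import struct

FAT_MAGIC = 0xCAFEBABE  # universal (fat) binary magic, big-endian
CPU_TYPES = {0x01000007: "x86_64", 0x0100000C: "arm64"}

def fat_archs(data: bytes) -> list:
    """Return the architecture names bundled in a fat Mach-O header."""
    if len(data) < 8:
        return []
    magic, nfat = struct.unpack_from(">II", data, 0)
    if magic != FAT_MAGIC:
        return []  # not a fat binary (could be a thin, single-arch Mach-O)
    archs = []
    for i in range(nfat):
        # each fat_arch record is 20 bytes: cputype, cpusubtype,
        # file offset, slice size, alignment (all big-endian uint32)
        cputype, = struct.unpack_from(">I", data, 8 + i * 20)
        archs.append(CPU_TYPES.get(cputype, hex(cputype)))
    return archs

# Synthetic two-slice header like a Universal 2 app would carry:
hdr = struct.pack(">II", FAT_MAGIC, 2)
hdr += struct.pack(">IIIII", 0x01000007, 3, 0x4000, 0x1000, 14)  # x86_64 slice
hdr += struct.pack(">IIIII", 0x0100000C, 0, 0x8000, 0x1000, 14)  # arm64 slice
print(fat_archs(hdr))  # ['x86_64', 'arm64']
```

On a real Mac, `lipo -archs <binary>` does the same job; the point is that a Universal 2 build ships both slices in one file, and the loader picks whichever matches the machine.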
Ah, Mac PowerPC running FORTH under the hood. I remember coding in those days…
Then they tried to compete with Dell with the same x86 chips but a fancier exterior. BIOS development has not been fun but at least they tried to maintain a “UNIX-like” feel.
Now they’re going ARM? Ha! This will be fun.
My 2012 i7 MacBook still does anything I need it to do. When it dies, I have no idea what I’ll buy. I don’t like the new laptops, especially since you can’t change the HD or RAM. Maybe I’ll just have a decade where I take a computer break.
That’s really not how it works. Not saying this is a good thing or a bad thing, but the transition will likely be seamless for most people. Apple has done this twice before and is very good at it. In the switch from 68k to PPC and PPC to Intel, they had seamless emulation layers (Classic and Rosetta, respectively) that worked so well you usually didn’t know what type of app you were running. These emulation layers ran for many years so developers had tons of time to migrate.
The experience is nothing like Parallels or other heavy-handed virtual machine emulation. You generally cannot tell which app is running on which layer until years in, when you start to get notices about the compatibility layer going away and which apps haven’t been updated.
The usual victims in this are niche-use-case apps that aren’t well supported. This can be really bad for some people, no question. I lost the app that updates my old TV remote control in the Intel switch, that sort of thing. I needed a new remote anyway, but something like engineering or scientific tools going away can be worse for people.
I don’t recall ever paying to update an app during these switches. Not saying it never happens, but I didn’t notice any. Developers (of which I am one) do fat-binary updates that include the new executable compiled in. Code changes are rarely required.
Again, not saying this change is good or bad, just saying the evidence suggests it won’t be a big deal logistically for users. I have no interest in pro or anti Apple conversations. I use their stuff at home but not at work and I think they do some things well and some things poorly. Architecture migration is something they are actually really good at.
In the switch from 68k to PPC and PPC to Intel, they had seamless emulation layers (Classic and Rosetta, respectively)
Classic wasn’t for transition from 68k to PPC, it was for transition from the “Classic” Mac OS (System 1 through OS 9) to the new Unix-y OS X, on the same hardware architecture. Classic was more akin to WINE letting you run Windows software on Linux.
Since when has Apple ever given a flying chip about backward compatibility?
Their devotees know that you just drop last year’s piece of hardware like a 2-year-old who sees a new shiny toy, and max out those credit cards!
Yesterday no longer exists. NEW SHINY TOY
I remember when the switch from PPC to Intel chips broke everything we had at the time. I expect this will be no different.
Sigh. I’m tired of trying to keep up with tech. I no longer have the money to keep up with Apple. My iMac is from late 2015, so it will need to be replaced at some point. When that time comes, I have no idea what I’ll do.
All I can think about is this old man who lived a few houses up from me when I was in my mid-20s who constantly needed help with his Windows 3.1 computer. I’m going to become that old man and the cycle will be complete.
Wow.
Cuz yeah, running a fully software-emulated version of the x86/64 hardware layers is TOTALLY not gonna hammer performance into the dirt, right?
Yeesh. I didn’t realize the official Apploid arm-waving was quite that brisk o.o’ .
Are you joking or did the PowerPC somehow implement Forth…?
They also aren’t the only people playing with Arm this way. There’s already an Arm version of Windows, with the same sort of emulation Apple are talking about. Several of the Surface products are running on Arm processors. Which is probably how you get Microsoft so readily on board: they’ve kinda already done the work.
Two things you can take away from that are that the compatibility doesn’t work nearly as well as pitched, and that it’s not keeping up with x86 in performance contexts. Maybe Apple pulled off a thing dozens of companies have been shooting for, for 20 years, but I kinda doubt it.