Reviewers impressed by ARM MacBooks


I’m confused


Apple really couldn’t have made it clearer that they do not care to support capital-G Gaming, ever. They will probably never support Nvidia hardware again (because Nvidia clearly means to become the PC platform), they deliberately spurned the opportunity to make Apple TV into a games console, and they only grudgingly allow external GPUs for their ultra-expensive desktop Macs because a handful of professional users have reasons to want it.

They are very into GPUs in general, and I’m sure their own GPUs will be technically competitive, but I wouldn’t ever mistake that for Apple trying to court game development the way Microsoft does. That market is defined by supporting someone else’s latest hardware, which is the opposite of what they’re about.

I assume that’s snark, but iOS devices have always been subjectively fast, and over the last couple of years they’ve pulled ahead objectively, too; Intel doesn’t make an x86 chip as fast as what’s in the current range of iPhones. So for people who don’t have a problem using Apple devices, it’s a descriptive comparison.


Despite the incredible performance improvements of the RTX 30X0, the RX 6X00 might be the winner.

(The top end cards are at very different price points, so it’s difficult to compare.)


And that’s why, on second thought, tempting as it may seem, there’s no way in hell I’m going to buy a Mac Mini.


Good to hear. I’ve been waiting to upgrade my 2014 Air until the keyboard was fixed; now I may just fully take the plunge to the M1.

Hasn’t that been the case for years? The 2019 Pro only had two.


My 15" Pro has four.

I typically only use two, but it’s nice to not have to grope for the USB hub when I want to do something simple like plug in a thumb drive.


Only half snark!

To me “works like an iPad” means a computer is useful for everyday browsing, office software and light games, but wouldn’t hold up for more GPU or CPU-intensive uses.

The idea that the reviewer meant “like an iPad” in that it stands out amongst its peers honestly hadn’t occurred to me! But it makes sense.


Well, this M1 stuff suggests AMD will also be losing Apple as a customer in the near future. Though actually, if Apple has to keep using x86 chips in its Mac Pro line for a few years, they might switch to AMD for that.

But what I meant about Nvidia was not so much about market share as their apparent future plans. Graphics cards are getting less and less commodified, and increasingly the high-performance PC market is about “GPGPU”, where Nvidia’s proprietary CUDA platform is becoming the clear leader. You can already see specialist STEM software that may not care if you’re running Windows or Linux, but requires Nvidia hardware.

They keep trying to make their own platform by other routes, too – it’s kind of obvious they mean to be the next Intel or possibly the next Wintel.


Boing Boing has been running on ARM for a while now for databases, and we’ll move our webservers over shortly. Those servers use far less power, making them cheaper to run, with no real drawbacks besides availability.

x86’s time is finally up, and it was the relentless drive for ultraportables (phones) with better power consumption and performance that did it in.

For one narrow class of “gamer”, that may be true, but not for everyone. I game, daily, on my 5k iMac, and I met my partner through gaming, so we would hardly fail to fit the mold of a “gamer”. My two primary gaming platforms are my Mac and my console nowadays, and it’s been a long time since I felt any lack of choice or options in either regard.

This is an interesting line of thinking, something along the lines of x86 being the performance king of compute forever, too. If you look at the trajectory of integrated GPUs over time, I think you’ll find that discrete components are getting close to having the exact same issues that x86 vs ARM have run into. There is a point where performance will be “good enough” with IG that this entire paradigm will be moot. While Intel and AMD will probably crank out some boutique parts to ensure ARM doesn’t take the performance crown away entirely, it won’t matter - the benefits to everyone else will be obvious enough that the transition is going to happen regardless. Boutique gamers who want liquid-cooled systems will still exist (and more power to them), they just aren’t going to drive the gaming industry the way they have in the past.

The avalanche has already started; it’s too late for the pebbles to vote.


There’s a first time for everything. This may be the first time I’d choose a MacBook over a Thinkpad.


Very possibly, but I’m pretty sure the M1 chip is exactly the kind of thing Nvidia plans for high-end gaming PCs to be based on in the near future (i.e., a GPU with some ARM cores). When people buy those machines, it’s already pretty much the graphics card they’re paying for, and the x86 chip is just an accessory, like the hard disk or the USB ports. With the right support from Microsoft, no one will care whether it’s even there any more. Apart from Intel.

Speaking of Intel, they of course make lots of other chips, including (I think) the Thunderbolt hardware in Macs. I just noticed today that the M1 Mac Mini has only two Thunderbolt ports, down from four, and I wonder if there’s a story in that.

On EC2, or do you actually buy physical ARM servers?

Yes, I basically meant the sort of gaming that involves a lot of RGB LED strips and Axe body spray.


EC2, Graviton2s:
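For anyone running a mixed fleet like this, the architecture is encoded in the EC2 instance type name itself — Graviton families carry a “g” in the letters after the generation digit (m6g, t4g, c6gn). A minimal sketch of that heuristic; the helper name and the pattern are mine, not anything from Boing Boing’s actual tooling:

```python
import re

def is_graviton(instance_type: str) -> bool:
    """Heuristic check for Graviton (ARM) EC2 instance types.

    Graviton families have a "g" among the letters that follow the
    generation digit (m6g, t4g, c6gn, m6gd). x86 families like m5 or
    c5n do not -- and the GPU "g5" family doesn't trip this, because
    its "g" comes before the digit, not after.
    """
    family = instance_type.split(".")[0]          # "m6g.large" -> "m6g"
    m = re.match(r"^[a-z]+\d+([a-z]*)$", family)  # capture post-digit letters
    return bool(m) and "g" in m.group(1)
```

Handy for tagging billing exports or picking the right container image per host, though checking `uname -m` on the instance itself is the authoritative answer.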


Re: the webcams … I conducted an informal poll of my coworkers and clients this morning as to their satisfaction with the webcam quality on their existing Mac laptops. Out of the 15 people I asked, 14 of them were happy with the current quality and didn’t see a need for higher resolution cameras. In one person’s words: “It’s enough that I can see that I’m talking to so-and-so. I don’t want 4K… or even HD. I don’t want to see the pores on their face.”

A number of them did express a desire for a camera that did a better job at handling tough exposures, especially backlit scenes.


The low resolution certainly helps with computing those dynamic background inserts :wink: Personally I use a phone as an external camera anyway; the angle from the top of the laptop’s display really does me no favours.


Hilariously, I’ve noted the same thing - I recently updated to an external webcam because the lighting in my home office is terrible, and I needed better low light performance.

I accomplished that, but immediately upped the “touch up my appearance” options and quickly realized why Hollywood adds makeup to everyone :wink:


Over the years I’ve experienced only a few really big jumps in computing power:

  • 8-bit computing to i386 in 1991
  • 200 MHz to 1+ GHz in 2001
  • Spinning hard drives to SSD in the 2012 MacBook Air

This new platform may be the next big jump.

But Apple’s reputation is still marred by the 2016-2019 generation of MacBooks, which are literal garbage thanks to the faulty keyboard, clown-sized trackpad, and poor thermal management. They have some work to do to earn my trust back.

The dumb ideas from 2016 are still around. Apple needs to ditch the touchbar, and they need to give their users at least one USB-A port so they can charge their stuff without a dongle.


I have a 2019 16" MBP. It is, without question, the best laptop I’ve ever owned, and the first real replacement for my 2015 MBP that I really enjoyed.

Of course, I still have a 2011 11" MBA I carry into datacenters. The new M1 Air may well replace that and will feel like a quantum leap in performance to boot.

No thank you. The physical ESC key was needed, but I’ve modded my touchbar interface in iTerm2 and don’t want to go back at this point. :slight_smile:


I’ve used a lot of Thinkpads, and plenty of them were clunkers. It seemed like they’d alternate between rock-solid and problematic models. I mean, there was one model that we were still using as a console server in the lab 10 years later, and the next model very literally fell apart from the stress of opening and closing the cover every day. Most people returned theirs to corporate IT in a bag. I had to, and I am generally pretty easy on laptops.

I do miss the Thinkpad keyboard, though, which has been a problem on the 2013 MacBook Pro I have yet to replace.


You don’t need any dongles if you replace the cables. A C to micro-B cable is less than $5; the latest iPhones all ship with C to lightning. My new keyboard has a type C port. Microsoft and Nintendo both have gone over to type C for their gamepads. Type A is quickly going the way of PS/2 and RS-232 ports; nice to have for backwards compatibility on a desktop but not something I want to waste space on for the computer that I carry around in my backpack.

I carry:

  • tiny GaN-based 60W power supply
  • C to lightning for my phone
  • C to C for my laptop, iPad, Switch, and most accessories
  • C to micro-B for legacy devices
  • Multiport dongle with type A, HDMI, and passthrough charging, which I never use, just for emergency… PowerPoint presentations, I guess?

Everything can charge from the same brick, with fast charging where applicable. This is a huge improvement over the days of enormous Magsafe bricks and USB type A.