Reviewers impressed by ARM MacBooks

Originally published at: https://boingboing.net/2020/11/17/reviewers-impressed-by-arm-macbooks.html

4 Likes

Apple really couldn’t have made it clearer that they do not care to support capital-G Gaming, ever. They will probably never support Nvidia hardware again (because Nvidia clearly means to become the PC platform), they deliberately spurned the opportunity to make Apple TV into a games console, and they only grudgingly allow external GPUs for their ultra-expensive desktop Macs because a handful of professional users have reasons to want them.

They are very into GPUs in general, and I’m sure their own GPUs will be technically competitive, but I wouldn’t ever mistake that for Apple trying to court game development the way Microsoft does. That market is defined by supporting someone else’s latest hardware, which is the opposite of what they’re about.

I assume that’s snark, but iOS devices have always been subjectively fast, and over the last couple of years they’ve pulled ahead objectively, too; Intel doesn’t make an x86 chip as fast as what’s in the current range of iPhones. So for people who don’t have a problem using Apple devices, it’s a descriptive comparison.

6 Likes

Despite the incredible performance improvements of the RTX 30X0, the RX 6X00 might be the winner.

(The top end cards are at very different price points, so it’s difficult to compare.)

2 Likes

And that’s why, on second thought, tempting as it may seem, there’s no way in hell I’m going to buy a Mac Mini.

4 Likes

Good to hear. I’ve been waiting to upgrade my 2014 Air until the keyboard was fixed, now I may just fully take the plunge to the M1.

Hasn’t that been the case for years? The 2019 Pro only had two.

2 Likes

My 15" Pro has four.

I typically only use two, but it’s nice not to have to grope for the USB hub when I want to do something simple like plug in a thumb drive.

3 Likes

Well, this M1 stuff suggests AMD will also be losing Apple as a customer in the near future. Though actually, if Apple has to keep using x86 chips in its Mac Pro line for a few years, they might switch to AMD for that.

But what I meant about Nvidia was not so much about market share as their apparent future plans. Graphics cards are getting less and less commodified, and increasingly the high-performance PC market is about “GPGPU”, where Nvidia’s proprietary CUDA platform is becoming the clear leader. You can already see specialist STEM software that may not care if you’re running Windows or Linux, but requires Nvidia hardware.
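
To make that concrete, here’s a toy sketch of mine (not from any particular package) of the kind of code that quietly hard-wires an Nvidia dependency: it doesn’t care whether the host runs Windows or Linux, but the kernel only compiles and launches on CUDA-capable, i.e. Nvidia, hardware.

```python
# Illustrative only (my example, not from any specific package): a tiny SAXPY
# kernel via Numba's CUDA target. The Python is OS-agnostic, but compiling and
# launching the kernel requires a CUDA-capable Nvidia GPU and driver.
import numpy as np
from numba import cuda

@cuda.jit
def saxpy(a, x, y, out):
    i = cuda.grid(1)            # absolute thread index
    if i < out.size:
        out[i] = a * x[i] + y[i]

x = np.arange(1_000_000, dtype=np.float32)
y = np.ones_like(x)

d_x = cuda.to_device(x)                  # host -> device copies
d_y = cuda.to_device(y)
d_out = cuda.device_array_like(x)

threads = 256
blocks = (x.size + threads - 1) // threads
saxpy[blocks, threads](np.float32(2.0), d_x, d_y, d_out)  # fails outright without Nvidia hardware

out = d_out.copy_to_host()
```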

They keep trying to make their own platform by other routes, too – it’s kind of obvious they mean to be the next Intel or possibly the next Wintel.

2 Likes

Boing Boing has been running on ARM for a while now for databases, and we’ll move our webservers over shortly. Those servers use far less power, making them cheaper to run, with no real drawbacks besides availability.

x86’s time is finally up, and it was the relentless drive for ultraportables (phones) with better power consumption and performance that did it in.

For one narrow class of “gamer”, that may be true, but not for everyone. I game, daily, on my 5K iMac, and I met my partner through gaming, so we certainly fit the mold of “gamers”. My two primary gaming platforms are my Mac and my console nowadays, and it’s been a long time since I felt any lack of choice or options in either regard.

This is an interesting line of thinking, something along the lines of x86 being the performance king of compute forever, too. If you look at the trajectory of integrated GPUs over time, I think you’ll find that discrete components are getting close to having the exact same issues that x86 vs ARM have run into. There is a point where integrated graphics will be “good enough” that this entire paradigm will be moot. While Intel and AMD will probably crank out some boutique parts to ensure ARM doesn’t take the performance crown away entirely, it won’t matter - the benefits to everyone else will be obvious enough that the transition is going to happen regardless. Boutique gamers who want liquid-cooled systems will still exist (and more power to them), they just aren’t going to drive the gaming industry the way they have in the past.

The avalanche has already started; it’s too late for the pebbles to vote.

10 Likes

There is a first for everything. This may be the first time I’d choose a Macbook over a Thinkpad.

1 Like

Very possibly, but I’m pretty sure the M1 chip is exactly the kind of thing Nvidia plans for high-end gaming PCs to be based on in the near future (i.e., a GPU with some ARM cores). When people buy those machines, it’s already pretty much the graphics card they’re paying for, and the x86 chip is just an accessory, like the hard disk or the USB ports. With the right support from Microsoft, no one will care whether it’s even there any more. Apart from Intel.

Speaking of Intel, they of course make lots of other chips, including (I think) the Thunderbolt hardware in Macs. I just noticed today that the M1 Mac Mini has only two Thunderbolt ports, down from 4, and I wonder if there’s a story in that.

On EC2, or do you actually buy physical ARM servers?

Yes, I basically meant the sort of gaming that involves a lot of RGB LED strips and Axe body spray.

3 Likes

EC2, on Graviton2 instances.
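
For anyone curious what the switch actually involves, here’s a rough sketch (the region and AMI ID below are placeholders, not the actual setup): Graviton2 is just another instance family, so “moving to ARM” is mostly a matter of picking an arm64 AMI and a *6g instance type.

```python
# Rough sketch with boto3; the AMI ID and region are placeholders, not real config.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Sanity-check that the m6g family really is arm64 (Graviton2).
resp = ec2.describe_instance_types(InstanceTypes=["m6g.large"])
print(resp["InstanceTypes"][0]["ProcessorInfo"]["SupportedArchitectures"])  # ['arm64']

# Launch a single instance from an arm64 AMI.
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical arm64 AMI ID
    InstanceType="m6g.large",         # Graviton2-backed general-purpose type
    MinCount=1,
    MaxCount=1,
)
```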

4 Likes

The low resolution certainly helps with computing those dynamic background inserts :wink: Personally I use a phone as an external camera anyway; the angle from the top of the laptop’s display really does me no favours.

3 Likes

Hilariously, I’ve noticed the same thing - I recently upgraded to an external webcam because the lighting in my home office is terrible, and I needed better low-light performance.

I accomplished that, but immediately upped the “touch up my appearance” options and quickly realized why Hollywood adds makeup to everyone :wink:

5 Likes

Over the years I’ve experienced only a few really big jumps in computing power:

  • 8-bit computing to i386 in 1991
  • 200 MHz to 1+ GHz in 2001
  • Spinning hard drives to SSD in the 2012 MacBook Air

This new platform may be the next big jump.

But Apple’s reputation is still marred by the 2016-2019 generation of MacBooks, which are literal garbage thanks to the faulty keyboard, clown-sized trackpad, and poor thermal management. They have some work to do to earn my trust back.

The dumb ideas from 2016 are still around. Apple needs to ditch the touchbar, and they need to give their users at least one USB-A port so they can charge their stuff without a dongle.

2 Likes

I have a 2019 16" MBP. It is, without question, the best laptop I’ve ever owned, and the first real replacement for my 2015 MBP that I’ve actually enjoyed.

Of course, I still have a 2011 11" MBA I carry into datacenters. The new M1 Air may well replace that and will feel like a quantum leap in performance to boot.

No thank you. The physical ESC key was needed, but I’ve modded my touchbar interface in iTerm2 and don’t want to go back at this point. :slight_smile:

3 Likes

I’ve used a lot of Thinkpads, and there were plenty of clunkers among them. It seemed like they’d alternate between rock-solid and problematic models. I mean, there was one model we were still using as a console server in the lab 10 years later, and the next model quite literally fell apart from the stress of opening and closing the cover every day. Most people returned theirs to corporate IT in a bag. I had to, and I am generally pretty easy on laptops.

I do miss the Thinkpad keyboard, though; the keyboard has been a weak point on the 2013 MacBook Pro I have yet to replace.

2 Likes

You don’t need any dongles if you replace the cables. A C to micro-B cable is less than $5; the latest iPhones all ship with C to Lightning. My new keyboard has a type C port. Microsoft and Nintendo both have gone over to type C for their gamepads. Type A is quickly going the way of PS/2 and RS-232 ports; nice to have for backwards compatibility on a desktop, but not something I want to waste space on for the computer that I carry around in my backpack.

I carry:

  • tiny GaN-based 60W power supply
  • C to lightning for my phone
  • C to C for my laptop, iPad, Switch, and most accessories
  • C to micro-B for legacy devices
  • Multiport dongle with type A, HDMI, and passthrough charging, which I never use, just for emergency… PowerPoint presentations, I guess?

Everything can charge from the same brick, with fast charging where applicable. This is a huge improvement over the days of enormous Magsafe bricks and USB type A.

3 Likes

Yah, it really doesn’t make sense for Apple to court PC gamers. Apple is an integrated hardware company, not a GPU maker or a software company like Nvidia and Microsoft. To support gamers, you need something modular and performant above all other concerns, which is in direct opposition to what almost all non-gamers want (as evidenced by voting with their wallets for quiet, thin, light laptops again and again).

Compared to the mainstream, gamers are a small market nowadays. Not like in the 1990s, when they drove almost all PC hardware innovation. Today the big money is in catering to everyone else, so it makes no sense for Apple to bother with that niche.

3 Likes

Color me impressed, actually. I’ll wait for barefeats.com to review the Rosetta 2 emulation (especially the Adobe CC apps), but wowee wow, those numbers look good. I listened to Marques’s review and, though preliminary, it was encouraging.

Imagine a new iMac: a giant 27" iPad, Pencil 2 compatible, about 1 cm thick. The MagicStand costs an extra $500 but allows you to tilt and swivel, etc. =`P A 16-core CPU and another 16 cores of GPU, up to 32 GB of SoC memory (see ya, upgradeability!). Since the Mini has 2 USB-A and 2 USB-C/USB4/TB3 ports, one might expect a few more of each? Or maybe not, to give Gen 2 some runway… A decent webcam is wishful thinking as well. Or how about a portable MacPad in the 20" range?

OK just take my damn money. And bring a towel for the drool.