Many years ago, I had a number of programs that came with keyboard overlays [1] or big diagrams [2] of which key combinations were mapped to which functions. I remember thinking how great it would be if keyboards could change their labels to match the program you were using. Then after GUIs became a thing and we were using a mouse to click on images of knobs, dials, sliders, and scrollbars, I thought it would be great if we had a controller like a keyboard, but one that could actually show the relevant controls, so that instead of keys you might have sliders and dials you could actually touch, depending on what you were doing.
Having now used touchscreens, I realize it was a dumb idea. But a single touch strip…and removing keys to make room for it? That seems even dumber.
This guy uses ‘hacker’ and ‘open’ in ways they never were meant to be used.
Apple finally makes a laptop without proprietary ports. Hurrah. Just like ALL other laptop manufacturers. Except it lacks ports you might also want, like a headphone port or an ethernet port. But what the hell does this have to do in any way with ‘open’ hardware?
Nothing.
And then the random ‘hacker’ reference … wtf? A real hacker would like some other ports, maybe an integrated FPGA. But this weird definition of ‘hacker == USB-C port’ is just … well, at the very least very strange.
So you won’t be able to charge and run the machine off a severely underpowered adapter, but you will be able to slow the drain, and you can charge it at a turtle’s pace. A 12W power adapter (5V @ 2.4A) should be able to charge the 15" MBP from near flat to full overnight. Granted, that’s assuming the adapter really puts out 12W and the cabling and interconnects don’t slow down the charging either. Even the iPad can be charged with puny power adapters, just slow af.
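To put rough numbers on it (and these are assumptions, not measurements: the 15" MBP battery is around 76 Wh, and I’m guessing something like 85% end-to-end charging efficiency), the back-of-envelope math lands at roughly 7–8 hours from near flat to full:

```python
# Back-of-envelope charge-time estimate; both figures below are assumptions.
battery_wh = 76.0        # approx. 15" MBP battery capacity, Wh
adapter_w = 5.0 * 2.4    # 12 W adapter: 5 V at 2.4 A
efficiency = 0.85        # guessed losses in cable, conversion, charge circuitry

hours = battery_wh / (adapter_w * efficiency)
print(f"Near flat to full: ~{hours:.1f} hours")  # ~7.5 hours, i.e. overnight
```

That only works out if the machine is asleep or off; awake and working, a 12W source won’t keep up with the draw.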
I was thinking of the generic Targus ones at CVS; they’re about $40 and have all sorts of tips. I’m not sure they have USB though – but they will soon, in any case.
Who uses ethernet much anymore? I do… I’m a rarity. I see all the Windows users in my office simply using WiFi, and I’m here with a dongle because I get my work done much faster with it.
As for the headphone jack, it’s there. You’re thinking of the phone. Who uses a phone for things that need audio anyway? Are you one of those old people? Hell, from what I understand, Millennials don’t even listen to music anymore; they send emoji lyrics and that conveys all the needed emotions.
There was an article somewhere that explained that it is actually the Intel chipset that imposes these limits. I’ve used 16GB for several years and it’s actually overkill for most things. Unless I were doing video editing (errrr… much higher end than I’m doing now as a hobbyist), in which case I’d PROBABLY want to move to a desktop. Then again, Final Cut has been heavily optimized recently and folks are doing 5K editing on these with no problem. Unless they drop into Premiere… in which case it sucks once again. As a former programmer, I know it’s easy to keep upping the specs on your applications without doing any optimization and hoping that Moore’s Law keeps up with your compiler needs. Sadly, I use Premiere (we have a site license at my university, which means I get it even though this isn’t my job!)…
Interesting! That might be the case, honestly. The source of my post was a comment from (supposedly) Phil Schiller himself, but he might have been putting a positive spin on a hardware limitation. I agree that 16GB is a hefty amount of RAM for most people, but in a way, it’s more about the message/image a company is conveying to its pro users: instead of letting you easily pop RAM into your laptop, like we used to, we’re going to put a hard limit on it and weld the suckers in.
As a supposed Fanboi… I’d be remiss if I didn’t admit that even back in the day when I used to upgrade the RAM, quite often the RAM I put in was not kosher with the boards. There was a time when Apple actually disabled RAM that didn’t follow spec – months after it had been working, albeit with SOME crashes and instability that we’d argue was still better than being a Windows LUser.
I got burned by RAM too many times… and it would be nice to have it upgradable, but at this point in my life, I can afford to upgrade every other year. Folks without this luxury may not be so happy. SUPPOSEDLY one of the big holdups on the hardware update was that Apple was waiting for a better mobile chipset for their portables. And they have been threatening to build their own if Intel doesn’t get it together. The fact of the matter is, they own some of the best chip designers out there now (almost entirely through acquisition) and have been using this expertise to have their iOS chips customized by the bigger folks and to license back the improvements.
Either way, I have to say my machines are far more stable with everything glued/soldered down. There are tradeoffs, but I like the fact that these machines are pretty much appliances these days. I can also understand why folks with much more time on their hands HATE this idea. I miss the days when I could hack my machines and swap in 3rd-party parts, even in a laptop (I remember a GPU designed for one laptop years ago that required minor surgery but made games so much nicer). So I understand why people want this… I just don’t get why they can’t imagine someone might want what we have, without acting like their heads are going to explode because we are apparently just that stupid.
Quite honestly, when it comes to laptops, I totally agree, in that they’re increasingly being built (not just by Apple) as essentially solid-state, tightly-engineered devices with very little wiggle room for incompatibility. I have no desire to futz around with microscopic screws and pry-bars.
Desktops, however, should at least have the option to upgrade and open up. I still adore my increasingly-creaky 2008 Mac Pro, specifically designed for easy upgradability, with its four hard drive bays, RAM slots, and expansion cards open and accessible. Compare that with the iMac, where everything is soldered in. Or the new Mac Pro, which is a lovely little soda can of a desktop machine but frankly more than I need for design work. I know I’m an old fart, but I have absolutely no idea what to replace my machine with when it eventually dies.
I replaced a very old Mac laptop (Core2Duo era) with a 2008 XServe. It’s been a treat!
There’s room for RAM, room for disks, room for (some) cards… it’s a lovely thing.
Except that it EATS power. And the RAM costs a goddamn fortune. And it’s huge. And heavy. And loud.
But besides all that it’s great.
This is the ONE thing I get really angry at Apple for having gotten rid of. The XServes were great. Now I have to make do with a virtual server running OS X Server for some of my needs. Nowhere near as nice as having one that just plugs in and works. Hell, the standard virtualizing software doesn’t even allow this without a hack…
I’ve always wondered what Apple uses for their data warehousing needs. In the past I’ve run MOSTLY Linux (Red Hat back in the day when I ran a business) and had an OS X server to deal with internal infrastructure.