Apple removes iFixit's repair manuals from App Store

“Today we celebrate the first glorious anniversary of the Information Purification Directives.”

It’s been a while, but I don’t think a well-researched, well-built hackintosh made with good, reliable parts is going to need any sort of ongoing maintenance if you don’t keep changing components or updating OS X.

That said, if you do want to run the latest OS X and you don’t need constant component upgrades, maybe it would make more sense to just get a Mac anyway? It’s not like a one-machine boycott would make much of a difference apart from personal ideological satisfaction, particularly if you keep using Apple’s software.

Depending on your needs, it’s possible that one of the more mainstream Linux distros like Mint or Ubuntu might work. They’ve come a long way in terms of compatibility and UI in the last few years. The huge user base helps.


Since I don’t have a paid Apple developer account, I can’t say with certainty what the wording is, hence my hedge. However, if you use your developer account to enter into an NDA for unreleased Apple hardware, and it’s the same account you use to publish apps, shutting down your account for violating the hardware NDA will by definition have an impact on your ability to use your account to publish apps. iFixit should have known this (in fact they appear to have intentionally thrown caution to the wind, based on their own blog post), and there’s frankly nothing controversial about it. It’s the same principle that locks you out of your entire Steam library when you get banned for cheating in TF2. The punishment for cheating isn’t explicitly that you can’t play your other games, but it’s an obvious effect of being banned for the thing you did do.

Now, granted, if you weren’t actually cheating in TF2 then the punishment is overly severe, but that’s not even remotely the case in this metaphorical instance.

If going back to Windows makes you uncomfortable for usability reasons, *nix will be a bit of a nightmare. If it’s because “Microsoft is about as ethically dubious as Apple,” the learning curve of *nix will be tolerable (since, sadly, there are no other alternatives). Windows 10 is okay, as long as you’re willing to take the time to turn off all the snooping (it took me about four hours to catch all the hidden snooping options).

Depending on your needs and technical abilities, I’d say just give Linux Mint a chance. And build your own PC; you’ll save money in the long run. This is what made me give up on OS X: not being able to cheaply update my hardware.

But then again, if you need polished, well-supported, professional software, you’re pretty much stuck with the big two. Virtualization still has a pretty big overhead. It can work for quick things, like testing pages, but you need a behemoth to run big software, and even then things might break for inexplicable reasons. Also, if you go this route, pay attention to what CPU you get, as some midrange ones don’t fully support the hardware virtualization extensions. Dual booting works, but depending on how much Windows-dependent software you need, you might end up neglecting *nix, since switching can be a bit of a hassle. I pop in and out of “work mode” as the need or whim takes me, so I can’t really dual boot, and the software I use sucks in a VM, so, sadly, I’m forced to use Windows.
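Since that CPU caveat trips people up: a quick way to check for the virtualization extensions before buying into the VM route is to look at the CPU flags. This is a minimal sketch assuming a Linux host with `/proc/cpuinfo`; Intel’s VT-x shows up as the `vmx` flag and AMD-V as `svm` (note the flag can still be disabled in the BIOS even when the CPU has it).

```shell
# Look for hardware virtualization flags in /proc/cpuinfo.
# "vmx" = Intel VT-x, "svm" = AMD-V. If neither appears, VMs fall back
# to much slower software emulation (or the feature is off in the BIOS).
if grep -qE 'vmx|svm' /proc/cpuinfo 2>/dev/null; then
    echo "hardware virtualization supported"
else
    echo "no hardware virtualization flags found"
fi
```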

As for mobile hardware, get something as close to stock Android as you can, preferably a Nexus device or a “Moto”. Crapware, vendor crap (I’m looking at you, Samsung), and evil, lazy carriers are 90% of what’s wrong with Android. The pure Android experience is pretty good: a bit janky, but not bad. And it doesn’t need iTunes to eat my computer whole, which is a good thing.

Either that, or Apple is an ethically dubious behemoth that displays an increasingly worrisome level of control over the content we have access to. Today it is this (which might be benign); yesterday it was the drone app, which was purely dickheaded. Even if this is benign, it still doesn’t address the fact that Apple locks its hardware up and tries to keep people from modifying or repairing their property. Intentionally. So you’re forced to buy more of their crap, instead of being able to fix something or upgrade it. Disposable hardware needs to die.

I’ve kept, basically, the same computer for 8 years now, and it is fully up to date. You can’t do that with Apple, because Apple would rather I drop a grand every other year on their PCs. Worse, they try to train people to be happy with this. And, even worse, it works.

Also, trusting a faceless, uncaring, major corporation to be the sole curator of content is goddamn scary. We’re being trained that this is good (by Apple, by Amazon, and increasingly by Microsoft), which doesn’t really bode well for the future. If this happened on Android, you could sideload it. On Apple, the customer gets to suffer.

Iff. Mathematically and legally speaking.

You absolutely can do this with Apple.

My experience is that their hardware, in general, tends to be well-made enough to keep working for quite a long time, and they keep supporting older stuff for a decent while compared to most other branded computers.

The just-released OS X, for instance, supports quite a few 2007-2008 models, and I’ll bet those will run better than the majority of the “made for Vista” PCs of the same vintage would run Windows 10 after eight years of hard service.

Edit: It seems some older models will actually get better performance and battery life with the OS update because of software optimization. You’re doing planned obsolescence wrong, Apple!


Bullshit. I had a PowerMac G5 for 4 years, and an Intel iMac for 5 years. I could have kept the G5 for much longer, but decided to move to an Intel machine because there was better software availability, and I could run a VM of Windows at better-than-awful speeds for the things I didn’t have Mac equivalents for. The iMac had a hard drive failure that I just didn’t want to deal with fixing, because I’d been wanting a laptop for a while by then, and it seemed as good a time as any to make that change (though I did keep going on the iMac for several more months by booting from an external Firewire drive). My brother is still using an iMac from the same era as mine, which means he’s had it for about 8 years now, and he’s never even upgraded the RAM from its starting 512 megs. I’m running a first-generation (2013) retina display MacBook Pro now, and I have no plans to upgrade to a new machine for the foreseeable future. My wife is running a 2010 MacBook Pro whose only long-running hardware problem is fan death, and I think that’s more to do with our cat than Apple.

Not only that, but every Mac anyone in my family has ever owned (with the exception of the G5) is still capable of running the latest OS (well, my brother would have to buy more RAM, but that’s doable). If Apple were so dead-set on me buying a new machine every other year, why would they still be supporting iMacs and MBPs they made in 2007? PC hardware nowadays is advancing at a much slower rate compared to what it was doing in the 90s, so micromanaging my machine to keep it “up to date” isn’t nearly as necessary as it used to be, and based on my extensive time using self-built PCs running Windows before switching to Macs, the upgrades I did were so infrequent as to be indistinguishable from buying all-new hardware anyway.

I agree with you that Apple’s attitude towards the App Store and what content they allow in it (versus their much broader policies for music, books, TV shows, etc.) is problematic and needs to be updated to better reflect the things people are doing with applications now. But unlike the drone app removal the other day, iFixit’s developer account being banned for violating an NDA when they knew that was the likely punishment when they posted their teardown is beyond unproblematic. Let’s absolutely have a big, raucous, philosophical debate over what Apple should do to bring its app store policy in line with its open-ended book and other media store content policies. But please, don’t die on iFixit’s hill.

Well, actually… I’m writing this from a 2011 Macbook Pro.

My experience is that Macbooks tend to last a year or two longer than the typical Windows laptop… Roughly the same with iPhones as long as you settle into them and don’t get on the treadmill. Yeah, they release a new model every year, but everyone else does too. You just have to have a decent immune system…

That’s false equivalence - you can’t compare something which is a legitimate exercise of a legal right to a move which, as you say, is indeed quite dickish. The fact that Apple took this action in the iFixit case neither adds to, nor subtracts from, their dickishness coefficient. Condemn them for the wrong they do, sure. But everything doesn’t become dickish just because Apple does it.


Their hardware is largely normal off the shelf hardware.

Erm… This is an Apples-to-oranges thing. I have a computer from 2008 that can’t run anything new (it was built as an HTPC, on an extreme budget), but my old gaming PC would still be slightly better than average. Up until two years ago I was still running my (sub-$1k) old gaming PC with no issues. My girlfriend’s computer is from roughly this time frame, and is a crappy Dell (with the addition of a hand-me-down GPU), and it runs Windows 10 just fine. My mom’s PC is even older, and it runs Windows 7 Pro well enough for her needs. I don’t need to worry about official support; it runs or it doesn’t. Further, any of these would run even better with a fresh Linux install, especially something lightweight.

Also, the core of my last PC was from 2004. Yes, 2004. Over the years I added bits and replaced bits, but never replaced the whole computer. A new video card, some RAM, at the extreme a new mobo and CPU, upgraded some cards to match new technologies… My PC was basically the Ship of Theseus. This computer STILL has some components from 2004, and others are floating around in my girlfriend’s computer, my best friend’s computer, and my mom’s computer.

I suppose it depends on your needs. I quit my Apple stuff because I play video games, and I hit a wall where I couldn’t do it anymore thanks to crappy integrated graphics. On my PC I’d just open it, stick in a new mid-range GPU, and be good to go. On the Apple, there was nothing I could do. I also had a Mac Mini, which I DID upgrade (from mediocre to piss-poor), but it was a goddamn ordeal, and the upgrade options were practically nil. I also had an iBook, which was awesome until the HDD died, which was the biggest damn pain to replace (disassemble the whole thing to replace a $50 HDD?). Hell, upgrading its RAM was a pain, unless I wanted to give Apple $200 for $50 of generic RAM.

On my computer, I just remove x, insert new x, and turn it on. I don’t care if it’s supported; I can prolong the life of my computer indefinitely. Especially if I’m running an AMD CPU, since they support the same socket for generations (Intel, not so much).

I should have known my hyperbole would come back to bite me in the ass, this being the internet and all. Sure, things can last over two years, and do. And often this is good enough. But it isn’t as good as being able to systematically upgrade everything as needed.

Phones are the worst, and not just iPhones. All phones are landfill technology, which is a horrible idea.

Sure, it was their legal right, but basically the customers suffer because of it. It renders a bit of information unavailable for arbitrary reasons, as far as the users are concerned. It is a symptom of the problems of central control, of the “walled garden”. It isn’t that Apple does it; it’s that Apple has the ability to do it in the first place.

I don’t mean to come off as anti-Apple hardware/software. If it works for you, fine. If you like it, good. It isn’t my business, nor should my opinion matter to you. I’m just stating my opinion because this is an Apple story. If there is a Microsoft or Google story, I will rage on how piss-poor they are as well. The world of computers generally depresses me: we had so much optimism for them in the 70s-90s, and we let all of their potential be squandered by a handful of monolithic, amoral corporations.

Well, there are arbitrary cutoffs, and then there are cutoffs based on anticipated performance. Have you ever tried running OS X 10.5 on a single-core PowerPC?

If iFixit had waited until the end of this month to buy an Apple TV at retail and tear it down, then this wouldn’t have even been an issue, and Apple wouldn’t have done anything (for proof, consider that iFixit has been tearing down iPhones for how many years now without Apple doing anything to them?). The problem is fundamentally the fact that they tore down a piece of equipment that they didn’t own because it was loaned to them and essentially under embargo due to the NDA.

With regard to Apple’s walled garden, I really honestly think that it will eventually come down. Not necessarily because it’s untenable, but because if you follow the evolution of iOS since 1.0, it’s been on a constant path of growing steadily more open. I’ve maintained for a few years now that it’s more likely that iOS will gain more permissive Gatekeeper-style functionality than it is for OS X to lose it, and the pattern that iOS 8 and 9 have established seems to continue pointing in that direction. I think Apple is just being extremely conservative with how they open things up, because 30 years of personal computing experience has taught them that opening things up in the wrong way in the interest of giving the user control makes the entire machine vulnerable to malicious actors, and your average person just isn’t able to maintain a 100% perfect line of defense against them.

Just look at how much trouble Android has had with security (not to mention battery life and performance) as a result of its much more permissive underpinnings. Large projects within Android 5.0 and 6.0 are efforts to put the cat back in the bag in various ways, like selective app permissions and timer coalescing. iOS is essentially Apple building a more fundamentally secure platform from the ground up. This year they removed the requirement that you pay for a developer account in order to be able to run apps that you build on your own devices, which effectively opens the gate to open-source software being distributed in source form for users to compile and install on their own if they want.

In the meantime, Apple is definitely worthy of criticism over their App Store policies when they do something boneheaded like banning the drone strike app, or insisting that there’s no place for games to make political commentary (“go write a book”, they tell you), but that’s just not at all comparable to what happened in iFixit’s case.


Eh, I was specific: MacBooks, not desktops. Whom have you actually seen upgrading the video card on a laptop?

You’re lucky if a phone lasts a couple of years. But the 4s is still being supported, and I see people still carrying it around. So at least some of them last a good 4-5 years. Which is the reason I shifted from Android, quite frankly: a long-term support structure means I can use a phone for much longer. I found my Androids barely lasting a year…

The reasons aren’t arbitrary; there’s a clear contract, which was wilfully violated. Forget for a moment that it’s Apple; between any two companies, if there’s an agreement that one violates, the other takes action. Period. There are other things we can find fault with Apple for, and I do that myself. But not in this case.

As far as I know, Apple have never taken action against iFixit or anyone else for posting teardowns of their technology. I don’t think they would have had a problem with their doing it once the product was released. Their problem seems primarily to be with the breach of contract. So, when you say “It is a symptom of the problems of central control, of the ‘walled garden’,” well, it really isn’t. There are other symptoms, but this isn’t it.


You’re right. And for this, we should be blaming iFixit, not Apple, in my opinion. iFixit did this knowing full well that the likely outcome was that they would get their developer account banned. If they didn’t like it, they could have easily waited a month to get a retail version instead.

I used my Samsung S3 for nearly 3 years, till I broke it trying to replace the glass on the front (note: heat guns are very good at melting digitizers). Official firmware support dried up, but I stopped using the official Samsung firmware about 6 months into owning the phone anyway. That’s one of the nice things about Android: the fact that there are 3rd-party developers building their own versions of Android for devices. At least, for the popular devices.


I got frustrated with the 3rd-party firmware that often doesn’t handle half the hardware on the device, and the stock firmware that doesn’t get updated. At some point, if you want a well-supported device, you have to pay for it. I came to the conclusion that paying a large amount and getting a device I had to spend weeks tinkering with was worth less to me than just paying a large amount and getting a device I don’t have to tinker with.

Not that I’m against tinkering, you understand, but I have limited time for it, and I’d prefer to spend it on my desktop (custom built, running Kubuntu) and the rest of my projects, than on my phone…


I found that with my S3, it was very well supported by the third-party devs. I’ve since moved on to Chinese phones, which are significantly cheaper but with much poorer third-party support (and first-party support, for that matter). But my tinkering fun is pretty much exclusively tied to my phone, which is my primary computing device these days, so I don’t mind at all. If anything, it’s kind of inspired me to start looking into doing some ROM development myself, though I have much to learn. iOS just feels too constrictive to me (though I do have an iPhone for my work phone).


This topic was automatically closed after 5 days. New replies are no longer allowed.