As much as it kills me to say it, my next upgrade is going to be a PC, and I have been a very happy Mac user for 15+ years. I could always deal with the non-upgradeable CPU/GPU, although I didn’t like it. But the idea that I can’t even pop in more RAM, or that I’d have to peel off adhesive and spend hours carefully dismantling a glued-together machine (voiding my warranty) just to replace a hard drive, which, like the brakes in a car, is a perishable part, or a notebook with a non-user-replaceable battery, is a non-starter for me.
Apple had it right for a long time. A fair compromise of keep-it-simple-stupid with the ability to get under the hood as well. But I feel like they’ve gone whole hog into non-serviceable, disposable machines. And on the software side, I’m not interested in a complete overhaul of the operating system annually. I just (sort of) got comfortable with Mavericks, got all my apps updated and (sort of mostly) working, and now they expect me to start over? No thanks.
In a year or so I’ll need to upgrade this 2009 iMac, and will be building a PC. Haven’t decided if I’ll hackintosh or just run a dual-boot Linux (for day-to-day use)/Windows (for commercial applications) setup. Can any PC users tell me whether someone like me, who likes to limp a machine along for 10 years with upgrades, would be happy with a PC now? (I’ve gotten 7-8 years out of iMacs, and then passed them to others who are still using them to this day.) My last experience was with XP in the early 2000s, and aside from a lot of maintenance, housekeeping, and annoying pop-up system messages, it wasn’t too bad.
Well, pretty much all the worries about voiding your warranty by upgrading are exactly the same for PCs – if you are talking about buying a pre-built machine from HP, Dell, or the like. Of course that isn’t the case if you build the machine yourself – but that’s only because there is no warranty to void in the first place.
As for operating systems, Windows 7 isn’t that bad if you could put up with XP. Windows 8 is a mistake, and even Microsoft admits it. Still, if your work involves anything technical you’ll probably want a UNIX environment – that’s really why I like Macs today: they are basically friendly UNIX machines. I had no interest in them pre-OS X. That being said, Linux distributions like Ubuntu and Mint approach OS X levels of friendliness these days.
I’ve got a 2009 20-inch iMac, and Apple’s current offerings are aggravatingly expensive. You can pay through the nose for the privilege of upgrading your memory (a good midlife upgrade) by getting a 27-inch iMac. But by default it comes with a rusty spinning disc (and possibly a slowish video chipset). Upgrading to an SSD (which is a no-brainer in terms of latency) can only be done at the factory and costs extra… I can imagine that the Mac Pro will last 5 years with a couple of midlife upgrades, but it is $3000.
If one doesn’t want to pay for the privilege of a screen, one can get a Mac mini, but the video chipset is somewhat lacking (and, like everything else, can’t be upgraded).
And we all know that the new experience you get with your expensive upgrade is always better than what you had, right? Who didn’t love Vista? And that’s exactly the reason people are staying on Win 7. Hasn’t Microsoft pretty much admitted that every other version of their OS is a horrible catastrophe by skipping 9 altogether and jumping straight to 10? Or is it because they feel like they’re behind OS X 10.10? LOL
Good luck with that. Yes, you will be able to upgrade almost anything if you buy a desktop box. But other than that you are in pretty much the same boat. Also, while the PC market gets all sorts of upgrades every year, to really do more than limp after a few years you are going to be investing in a lot of new hardware inside that box. By year ten you will probably have replaced almost everything, because if you think that PC component manufacturers and Microsoft are all about compatibility… then you really have been out of the PC game for a long time. You will find that the fancy new component you want won’t work with your motherboard. Or it works, but something else in the system just can’t stand to have it on board. I’m running a 2009 13" MBP with an SSD from Other World Computing on Yosemite. The only thing I can’t do is AirPlay video. Big system hogs like Photoshop are slower than on a brand new MBP, but I don’t do much of that anymore. For everyday use it’s actually faster than the day I got it, thanks to the SSD.
A week ago I bought a new PC from HP. Yes, it has USB3 and a variety of other new hardware interfaces.
But it also came standard with the same parallel and serial interface connectors that were on the original IBM PC 30+ years ago. And that were on the Apple II before that. And on other computers before that. It uses the same power cord and headphone connectors used by my Apple II and equipment before that. (Granted the standard serial port changed from 25 pins to 9 pins around 1987, but a $2 adapter fixed that.)
The 1979-vintage Epson MX-80 printer that I used with my Apple II would still plug into my new PC today. The printer driver for it is still included in Windows 8.1. The new PC has a drive bay for an optional 5.25 inch floppy drive, should I need to read discs from the original IBM PC.
But after the Apple II and III series came the Mac. Different connectors. Forget using your old printer, even with an adapter. And they’d change the connectors on each Mac that came after. Heck, they even pulled that garbage with different microphone connectors for a while.
It’s a money grab. A way to force people to buy overpriced cables, adapters, and often new peripherals.
Nobody got an automatic update and suddenly discovered they were now running Vista. Periodic patch updates do not change what OS you’re running.
They are skipping Windows 9 to avoid a potential problem with some legacy software that checks for Windows 95 and/or 98. Google it if you actually want the whole skinny. It was well covered in the news.
Nevertheless, a lot of high-def TVs and other equipment did not come with HDCP, and most consumers were never warned that their new equipment would be obsolete within months.
Heck, most companies selling the stuff didn’t know about it. In 2006 I got a PC that came with Windows Media Center, an HD-DVD drive, and one HD movie. The movie wouldn’t play, because I had spent the money for a higher-end 24" 1080p monitor - a big deal back then. And of course it didn’t support HDCP.
Like many new standards, HDCP wasn’t fully nailed down. A lot of early HDCP-compatible devices had trouble talking to each other.
And there were a great many high-end video cards from all the big brand names sold as being HDCP compliant. Except that they didn’t support it at all: while the chipsets supported HDCP, the rest of the card didn’t.
Hey, check this out. I was digging around in this pile of shit and I found a huge diamond. So I took it back to my shop to have a closer look. At first it seemed beautiful, but I noticed, under high magnification, that it has several flaws. You need to be an expert to see them, but they are there. Also, the color choices are not so great. Tomorrow I’m going back to digging in the shit. At least I know what to expect.
Good for you, I guess? If you have devices that require a parallel port, or a serial port, or a SCSI port, or a PS/2 port or a microchannel slot or an 8-track deck or a 5 1/4" floppy drive, and you can find a machine that will support those things, then that’s awesome.
But the majority of us do not. My need for legacy interfaces expired with the legacy hardware that required them, and I’m happy that my current computers don’t waste the space on those antiquated interfaces.
At any rate, I’m not sure what this has to do with my statement about being conscious of the risks inherent in firmware and OS updates, and planning accordingly.
I am using Windows 2000 and I haven’t had update problems for many years. Pity it does not have a 5 1/4" floppy drive. I also find 9-pin serial and 25-pin parallel ports amazingly useful. As for cars, mine has a carb and a distributor and can be repaired roadside if necessary.
I still want a parallel port for my HP LaserJet 4. This is a printer that was sold in huge numbers on maintenance contracts to hospitals, utilities, etc. They’ll still be operating flawlessly - and toner cartridges will still be available - beyond the heat death of the universe.
And I need a serial port for my (radio) scanner. I don’t see any need to throw out other perfectly good devices just because I got a new computer.
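For what it’s worth, part of the appeal of the parallel port is how dumb it is: on a box that still has one, printing plain text to a LaserJet 4 is just a matter of writing bytes to the device. A minimal sketch in Python, assuming a Linux box where /dev/lp0 is the first parallel port (on Windows you would open "LPT1" instead):

    # Minimal sketch: print plain ASCII on a parallel-port LaserJet by
    # writing raw bytes to the port device. The LaserJet 4 prints plain
    # text in its default PCL mode, so no driver is needed for this.
    # /dev/lp0 is an assumption (the usual first parallel port on Linux).
    with open("/dev/lp0", "wb") as printer:
        printer.write(b"Hello from a 20-year-old printer.\r\n")
        printer.write(b"\x0c")  # form feed, so the page actually ejects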
It’s not that manufacturers should be forced to include the ports or should make sacrifices to include them. It’s that including them is trivially easy for anything larger than a netbook.
Yes, backups are necessary. But updates have gone from a Big Deal - something you set aside time to do and shut everything down first - to something that happens automatically in the background, unprompted, and often. And it’s not just the OS, but Flash, Java and other apps updating automatically in the background, unprompted, and often. The idea that you need to stop what you’re doing for safety, and do a backup on the spot, is no longer realistic.
And the idea that an update can remove functionality (beyond say, SSL3 being turned off by default for security reasons) is ridiculous.
You can get USB Serial adapters, although in my experience they aren’t as good as a real serial port. They tend to fall over when you try to push big hunks of data through them–they’re basically only good for serial consoles.
Real serial ports and parallel ports are the reason I keep an old Linux laptop around.
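If you’re stuck with a USB adapter anyway, turning on hardware flow control sometimes keeps it from choking on larger transfers. A rough sketch with pyserial; the device name and baud rate are only examples, so adjust them for your adapter:

    # Sketch: read from a USB-serial adapter with hardware (RTS/CTS)
    # flow control enabled, which helps keep cheap adapters from
    # dropping data. Port name and baud rate are examples
    # (e.g. COM3 on Windows, /dev/ttyUSB0 on Linux).
    import serial

    ser = serial.Serial(
        port="/dev/ttyUSB0",   # example device node
        baudrate=115200,       # example speed
        rtscts=True,           # hardware flow control
        timeout=1,             # seconds; read() returns what it has
    )

    try:
        while True:
            chunk = ser.read(4096)   # read up to 4 KB at a time
            if chunk:
                print(chunk.decode("ascii", errors="replace"), end="")
    finally:
        ser.close()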
This is not true. In Windows 7 at least, the default behavior for important updates is to give you a pop-up warning, and if you don’t dismiss it, a little while later (15 minutes? an hour?) it closes all your programs, throws away any unsaved documents, force-kills anything that won’t shut down on its own, and reboots to run the update. If you dismiss it, it just comes up again later. If you’re away from your machine, you can’t even do that. It’s insane.
You can, by digging around in the registry or obscure security settings, disable the automatic reboot and extend the nag frequency to 24 hours, but IIRC you can’t do any better than that without completely disabling Windows Update. (I may be wrong on the details; I jumped through the hoops to knock that shit out years ago on all my machines.)
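For anyone who wants to jump through the same hoops, the values I remember live under the Windows Update AU policy key. A sketch using Python’s winreg module, run from an elevated prompt; the value names (NoAutoRebootWithLoggedOnUsers, RebootRelaunchTimeout) are my best recollection of the Automatic Updates policy settings, so double-check them against Microsoft’s documentation before trusting this:

    # Sketch (Windows 7 era): set the Automatic Updates policy values
    # that stop the forced reboot and stretch the re-prompt interval
    # to 24 hours. Run from an elevated (administrator) Python.
    # Key path and value names are assumptions; verify before relying on it.
    import winreg

    KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU"

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
        # Don't reboot automatically while someone is logged on.
        winreg.SetValueEx(key, "NoAutoRebootWithLoggedOnUsers", 0,
                          winreg.REG_DWORD, 1)
        # Re-prompt for the restart at most once every 1440 minutes (24 h).
        winreg.SetValueEx(key, "RebootRelaunchTimeoutEnabled", 0,
                          winreg.REG_DWORD, 1)
        winreg.SetValueEx(key, "RebootRelaunchTimeout", 0,
                          winreg.REG_DWORD, 1440)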
You don’t have to automatically install Windows updates, you know. In fact, it might be a good idea to turn automatic updates off, as once in a great while, updates will break your OS. You don’t have to “completely disable Windows Update.” You can set it to tell you when updates are available, not to download or install them automatically.
(Besides, you want to be in control of when updates are installed, and get to choose which ones are installed, don’t you?)
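If you’d rather set that with a script than by clicking through the Control Panel, the same AU policy key has (as far as I know) an AUOptions value, where 2 means notify before downloading anything. Another small winreg sketch, with the same caveat that the key path and value name are from memory:

    # Sketch: switch Automatic Updates to "notify, but let me decide"
    # (AUOptions = 2, i.e. notify before download). Run elevated.
    # Key path and value name are assumptions; verify before relying on it.
    import winreg

    KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU"

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "AUOptions", 0, winreg.REG_DWORD, 2)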
In response to other posters:
I like old computer equipment. I still have a PC sitting on the shelf that has a 5 1/4" floppy drive installed, just in case I ever want to reinstall MS-DOS 5.0 and Windows 3.1 from the 5 1/4" floppies I got them on. (OK, truth is it’s actually about time to clean off the shelves in here…)
When my wife upgraded to Windows 7 from Vista, the installer left Vista on her PC (i.e. she can dual-boot, although she never does). If someone could give me a rational argument about what was so bad about Vista, I’d like to hear it. Every time I bring it up, it looks and feels very similar to Win7.
More on topic, Verizon pushes out updates to Android that install without asking. There’s probably a setting there somewhere to turn that off, but nothing’s broken so far.