And that is why I use a Mac. When I was a total computer novice back in the '80s, I couldn't understand the logic of going to the START button when I needed to SHUT DOWN a PC. Whatever you think of Macs and Steve Jobs, the one thing he knew was that design and user-friendliness matter. I don't ever remember using a PC and thinking, 'this machine wants to help me do something easily and efficiently.'
Windows didn't have a Start button until 1995. Really, though, they should have either labeled it "Menu" or just used an icon (which they started doing in Vista).
Windows 95*: Click the Start button, same as you do for everything else. At the bottom of the menu, select Shut Down. Confirm that you really do want to shut down.
Mac: Click on the icon of a face. Near the bottom of the menu, select Shut Down. Confirm that you really do want to shut down.
OMG you’re right! Macs are SO MUCH more intuitive! Steve Jobs is a god!
There are things that Windows really does do poorly, and Macs really do do better. But most of the favorite examples that Apple fans trot out are just… dumb.
*As pointed out by GoatCheezInfrno, previously you'd have done everything through the File menu.
And probably have done it using keyboard shortcuts.
BTW, remember when Windows let you do macros easily?
The keyboard shortcuts never really went away, by the way: Alt-F4 closes the currently-focused program. If the focus happens to be on the desktop, it closes Windows. (Why F4? For consistency across international keyboard layouts.)
Now THOSE, I miss. There are all sorts of third-party replacements, but, being third-party, they have to be reinstalled whenever you switch machines; I really miss having something built in.
Unless you want the actual latest version of something that isn't in the package manager yet, and probably won't be for another year, so you have to go download the source tarball and run the makefile.
Not a massively onerous task, though, is it?
I agree - these days. Back in 2003, when BG wrote this email? Not so much. And even now, the anarchy and decentralization of Linux can make installing a less-than-common package feel not unlike BG’s experience.
(Let’s say you use Ubuntu, but want to install a program that’s only been packaged for SUSE. Please forward me the email you write afterward.)
As long as you know what -dev packages to install, along with the development tools. Heaven help you if the dependencies you need aren't up-to-date enough to work, as then you have to build them from source, too. You might also want to put them in /usr/local (fortunately, most packages default to this when building from source) so as not to scribble over libraries that other existing packages depend on.
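To make the rabbit hole concrete, here's roughly what that build-from-source dance looks like. This is a sketch, not a recipe: `foo-1.2` and `libfoo-dev` are made-up names standing in for whatever tarball and -dev headers you actually need.

```shell
# Illustrative only: "foo-1.2" and "libfoo-dev" are made-up names.
# First, the build tools and -dev headers the tarball expects
# (package names vary by distro; this is the Debian/Ubuntu spelling):
sudo apt-get install build-essential libfoo-dev

# Then the classic autotools dance:
tar xzf foo-1.2.tar.gz
cd foo-1.2
./configure --prefix=/usr/local   # stay out of the package manager's /usr
make
sudo make install
```

And if `./configure` complains that libfoo is too old, you get to repeat the whole sequence for libfoo first.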
I’ve been down that rabbit hole before (and yes, I use Linux desktops as my daily drivers).
The problem gets worse when you use a distro that prizes stability (like RHEL-based, or Debian Stable). Getting an up-to-date Nextcloud instance running on CentOS 7 can be quite the chore.
The first version of Synaptic was released in 2003. Even before then you had apt and yum, so even though it wasn't a GUI, the process was still straightforward: `yum search movie maker`, then `yum install movie-maker`.
By 2003 we were well past the point of RPM hell, as long as you had a solid Internet connection and weren't trying to install a new package on a machine that had not been updated in two years and had no Internet access.
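For what it's worth, the Debian/Ubuntu side of the same era looked much the same. A sketch, with an illustrative search term and a made-up package name:

```shell
# Search the package index, then install a match
# ("some-video-editor" is a placeholder, not a real package):
apt-cache search "video editor"
sudo apt-get install some-video-editor
```

The friction BG hit wasn't the command line; it was when the package you wanted wasn't in the index at all.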
Windows still has usability issues. The last two feature updates pretty much broke notifications.
Not to mention the Outlook desktop client, a product Microsoft wants people to pay to use, yet it is so obtuse that I wouldn't consider it worth it for free.
And there are so many of them! You just can’t have too many package management systems.
I’d rather have too many than not enough.
Yes, but getting caught in a package manager fight is no fun.