HP's new logo a hit

Yeah, but flip this one and it’s just a minimalist hand flipping you the bird

22 Likes

This is somewhat true, but not completely. I think Apple has the exact opposite issue with the iMac. They are valuing function more than anything. As a result it is a great all-in-one computer for standard household use. It can do the overwhelming majority of what most people do: desktop computing, web surfing, multimedia, data storage, minor home photo and video editing. All of it completely in its wheelhouse.

But it just cannot hold up for any gaming or major design/video/photo work (not because of the OS, since you can still use Boot Camp) because of those damn integrated, non-upgradeable graphics cards. The desktop RAM is way more than adequate. The hard drive space is fine. The processors are great, the same as their Windows PC counterparts… but the graphics card/chip lags behind the industry standard. And you can argue that when the models are released they are on par, but of all the components of a computer it is the graphics processor that most quickly becomes obsolete. Not allowing it to be upgraded or swapped out in some form is simply a huge mistake on their part, IMO.

So, I agree. Most tech companies have begun to value form over function, and all too often it is a terrible practice for the consumer, but the exact reverse issue is just as terrible for the consumer too.

4 Likes

It is a cool logo, but the “old” one still has a personality, a vintage feel. It is like the Ford oval: who would change that?
That’s the kind of logo some companies revive decades later and people just love. Keeping the cool logo for premium products only seems a good choice.

1 Like

Exactly. “Vintage” isn’t necessarily a good association to create when you’re trying to market something as a cutting-edge piece of technology.

3 Likes

Every last Apple cable and adapter is a “premium” product. As in proprietary and overpriced.

My new HP PC has - in addition to modern industry-standard HDMI, DVI, USB 3, and other ports - the same parallel and serial ports that I had on my Apple II more than 30 years ago.

Apple has spent those 30 years changing to an entirely new set of proprietary ports every few years, over and over and over again, to ensure that the suckers have to buy new “premium” cables and peripherals with every new major device.

3 Likes

Pretty sure that USB-C will end that stranglehold. Shame that Amazon’s been bought off, though.

1 Like

While the rest of the cell phone and tablet industry went to micro-USB, Apple not only went with a proprietary connector for iPhones and iPads, but switched to a different proprietary connector a few years later. They’ll likely embrace USB-C for desktop and laptop devices, but not for iOS devices.

1 Like

As a connector, micro-USB enjoys the same four-dimensional topology as its larger sibling. Lightning doesn’t.

2 Likes

Like I said, interesting to compare and contrast. For better or worse Apple has been much more successful at convincing consumers to spend a bunch of money on their products. One of the reasons people perceive Apple products as more “valuable” is that Apple has their shit together in terms of branding and marketing.

Meh, it’s okay-ish, I guess.
Reminds me a bit of the old Fiat logo. Fun fact: the people who designed this new HP logo, Moving Brands, are also responsible for the dreadfully pretentious new logo of DeviantArt, which looks like the sad mutated love child of a fascist party symbol and a cactus.

4 Likes

Apple’s not just switching connectors to irritate people or make them buy new cables; their updates to connectors are done for very specific reasons or to adopt/promote industry standards. When it was introduced, FireWire was a huge step up from USB. Lightning was a big step up from that: all-digital, hot-swappable, reversible, faster, cheaper. The new cables they’re switching to now can also carry high-res audio and 3D signals. They’ve always been good about offering adapters and supplying free cables with products, as well.

3 Likes

There’s some truth to part of what you wrote, but…

Seriously…? You honestly believe this?

It’s like saying that the Republican Party has adopted Donald Trump to promote reasoned, measured civil discourse and policy. It’s all about steering people AWAY from industry standards.

And all through the evolution of the Mac they kept switching port designs, and not because they were better. I was once handed a Mac that I couldn’t connect to any known monitor - and the local Apple store didn’t have a connector that would make it work. Heck, I knew someone in the '90s with three Macs requiring three different microphones. (My new HP also uses the same microphone connector as my Apple II 35 years ago, and cassette players long before that.)

3 Likes

BP’s new logo a hit

4 Likes

Yes, I believe that they switched connectors to improve performance and to adopt new standards. Do you believe they were changing them just to irritate people for fun? You genuinely believe that Jony Ive is just trolling Apple users? Okay.

I used a variety of Macs all through the 80s and 90s. The only one that didn’t use a standard microphone port was the G4 Cube, which experimented with eliminating it in favor of USB microphones. The experiment didn’t last and they went back to the standard port immediately after.

ah, the good old days.

Frankly, any non-PC-compatible computer from a certain era was likely to have its own standard.

And you couldn’t exactly plug an EGA monitor into a VGA card, could you?

4 Likes

Nonsense. They switched connectors so that they could sell new overpriced printers, cables, and other peripherals.

For the rest of the industry, the standard parallel and serial ports worked just fine for 30 years after Apple stopped using them. Sure, better standards like SCSI and USB came along, but that didn’t stop them from still including parallel and serial ports.

FireWire may have had some improvement for video, but virtually no one is outputting video from their iPads. And if they wanted to, one of my other tablets from the same era has both micro-USB and micro-HDMI in less space than one FireWire port.

It’s all about some on-paper-only advantage for the users that they’ll never see in reality, in order to lock them into expensive peripherals.

3 Likes

Unfortunately, vendors continue to do their best to snatch defeat from the jaws of victory by writing unbelievably lousy EDID tables, squandering the relative sanity of ‘how about we just have an i2c ROM that says what the monitor can do, rather than everyone choosing random resistance values between arbitrary pins to mean inscrutable things?’
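
For anyone wondering what that i2c ROM actually holds: at minimum it’s a 128-byte EDID block with a fixed header, a packed vendor ID, timing descriptors for the preferred mode, and a checksum. Here’s a rough Python sketch of decoding the interesting bits, assuming you’ve already read the raw block (on Linux the kernel exposes it under /sys/class/drm; the connector name varies per machine); offsets follow the standard EDID 1.x layout:

```python
# Minimal EDID base-block decoder (sketch). Assumes `block` is the 128-byte
# base block already read from the display, e.g. on Linux:
#   block = open("/sys/class/drm/card0-HDMI-A-1/edid", "rb").read()
# (connector name varies per machine).

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def decode_edid(block: bytes) -> dict:
    if len(block) < 128 or block[:8] != EDID_HEADER:
        raise ValueError("not a valid EDID base block")
    if sum(block[:128]) % 256 != 0:
        raise ValueError("EDID checksum failed")  # the lousy tables often fail right here

    # Manufacturer ID: three 5-bit letters ('A' = 1) packed big-endian into bytes 8-9.
    mfg = (block[8] << 8) | block[9]
    vendor = "".join(chr(((mfg >> shift) & 0x1F) + ord("A") - 1) for shift in (10, 5, 0))

    # First detailed timing descriptor (bytes 54-71) describes the preferred mode.
    dtd = block[54:72]
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)

    return {"vendor": vendor, "preferred_mode": f"{h_active}x{v_active}"}
```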

On the plus side, quite a few devices don’t pay nearly enough attention to the fact that a giant, complex kernel driver is always listening on an externally accessible i2c bus that gets plugged into untrusted hardware all the time and that most people don’t even know is there… Nothing could possibly go wrong. At all.

2 Likes

That’s pretty cool, but look at the logo that comes on an Aorus laptop:

It’s an eagle making a fist!

1 Like

They didn’t use FireWire on iPads. By the time the iPad was released, FireWire was only used as an industry standard by video professionals, and even then it wasn’t very popular anymore. FireWire was only used on the older iPods, back when they did equal duty as music devices and portable hard drives.

If your argument boils down to “Apple should’ve stuck to 30-year-old parallel & serial ports instead of using newer, faster, more efficient ones!” then I’m afraid you’re one of the very few people in the world with that belief.

2 Likes