PC shipments decline sharply, again

“XYZ is dead” is dead! Long live recursion!

4 Likes

It will be. What we need is some kind of wearable microdisplay with decent resolution. Possibly a knockoff of something like what the soldiers have. Or a variant on CastAR.

For desktop or on-the-go use, the interface should not be obstructive, and the position of the virtual screen should be adjustable (or flip it down/up). That should not be too difficult. The display will likely be some LCD or DLP (no OLED; I don’t think they would like being driven hard enough to compete with the brightness of a sunlit landscape), so a variable-brightness backlight can be used with ease.
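A hypothetical sketch of the variable-brightness part (the function, thresholds, and lux values are all mine, purely for illustration): read the ambient light sensor and map lux onto a backlight duty cycle, log-scaled because perceived brightness is roughly logarithmic:

```python
# Hypothetical auto-backlight curve: map ambient lux to a 0..1 duty cycle.
# Log-scaled, since eyes respond roughly logarithmically to brightness.
# All thresholds here are illustrative guesses, not measured values.
import math

def backlight_duty(ambient_lux, min_duty=0.05, max_duty=1.0,
                   dark_lux=10.0, sunlit_lux=100_000.0):
    """Return a 0..1 backlight duty cycle for the given ambient light."""
    lux = min(max(ambient_lux, dark_lux), sunlit_lux)
    t = math.log(lux / dark_lux) / math.log(sunlit_lux / dark_lux)
    return min_duty + t * (max_duty - min_duty)

for lux in (5, 300, 10_000, 120_000):  # night, office, overcast, direct sun
    print(f"{lux:>7} lux -> {backlight_duty(lux):.0%} backlight")
```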

Using a large high-res panel, as the Oculus Rift does with a phone display, will be difficult due to the size and the optics needed for non-immersive use.

Glass was a meager 640×360 pixels. (Bleh. We need more.)

Then there was the case of underinformed nitwits with more loudness than vision, armed with the oh-so-clever “glasshole” moniker, who spoiled the fun for everybody else and delayed the development of wearable AR by months if not years.

NEC’s 4K DLP display chips are 1.38", which, while a bit big for a wearable device, may be doable if mounted on e.g. a hard hat. There are 0.69" chips for 2K. I have no idea how much size the additional optics will add, or what the necessary thermal design will add.
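Quick back-of-envelope on those chip sizes (assuming 16:9 active areas and that “2K” means 1920×1080; both assumptions are mine, not from a spec sheet): both work out to the same ~8 µm pixel pitch, so the 4K part is essentially four of the 2K one in area:

```python
# Back-of-envelope pixel pitch from chip diagonal and resolution.
# Assumes the active area matches the pixel grid's aspect ratio (16:9).
import math

def pixel_pitch_um(diagonal_inches, h_px, v_px):
    """Approximate pixel pitch in micrometers."""
    diagonal_px = math.hypot(h_px, v_px)
    return diagonal_inches * 25400 / diagonal_px  # 25400 um per inch

print(f'4K on 1.38": {pixel_pitch_um(1.38, 3840, 2160):.1f} um')  # ~8.0 um
print(f'2K on 0.69": {pixel_pitch_um(0.69, 1920, 1080):.1f} um')  # ~8.0 um
```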

If the device can handle standard HDMI, you can feed it from a desktop computer or even a smartphone. Then, with e.g. a finger-mounted touchpad and voice control, you have a wearable interface compatible with existing software.

Should be within reach of a determined wealthy-enough DIYer or a small-scale company.

1 Like

Who needs “oomph” these days, whatever that is? If I have a big task, I put it on a server anyway. I end up wearing my laptop on my back for quite a few hours per week; frankly, all else being equal, I’d take portability over “oomph” without a second thought.

Maybe I’m an odd case (by popular standards): 95%+ of my useful cycles go to numerical computation; the compilation isn’t that demanding most of the time, but the runtime and memory usage can be high. I really don’t care that much whether something takes 3 hours instead of 2 hours. I just put it on the stack and come back to it when it’s finished. If it’s really important, I fire up an AWS cluster and it’s done in 15 minutes. The only thing you really need “oomph” for is media work or modern AAA video games, which are, imho, dreck for the most part. I just play a few roguelikes and occasionally retro games or emulators. All this could run on a 10-year-old machine with few problems.

Admittedly, the unserviceable black-boxiness is crappy, and it disturbs me. But on the other hand, it’s not like there aren’t plenty of user-serviceable computers out there anyway.

I effectively replaced my PC a few weeks ago, but it wouldn’t show up on those stats. I only bought a new motherboard and processor, everything else still has plenty of working life.

But when I open my DAW, I have 8 little processor indicators, and they all stay well within optimal range! Yay!

Windows 8.1 is pretty good, and I’ve heard good things about 10 as well… but 2 good versions of Windows in a row??? I’ll believe that unicorn when I see it; everyone knows Microsoft only makes every other OS usable. :smile: Good thing I also use OS X and Linux.

Vista v1 was a POS…it did get quite a bit better on the second iteration. XP was always solid for me from day one, that was a great OS for the day.

They are pretty amazing, aren’t they?

SSD and RAM make that gap much much smaller. Especially the SSD.

True, the hardware video encode/decode makes a big difference for video.

Oh look, they’re saying the thing again.

3 Likes

Anybody who needs decent rendering speed for large documents written in lousy markup languages on higher-resolution screens?

ALLES TURISTEN UND NONTEKNISCHEN LOOKENPEEPERS!
DAS KOMPUTERMASCHINE IST NICHT FÜR DER GEFINGERPOKEN UND MITTENGRABEN! ODERWISE IST EASY TO SCHNAPPEN DER SPRINGENWERK, BLOWENFUSEN UND POPPENCORKEN MIT SPITZENSPARKEN.
IST NICHT FÜR GEWERKEN BEI DUMMKOPFEN. DER RUBBERNECKEN SIGHTSEEREN KEEPEN DAS COTTONPICKEN HÄNDER IN DAS POCKETS MUSS.
ZO RELAXEN UND WATSCHEN DER BLINKENLICHTEN.

2 Likes

Yeah, my hard drive is one of those Fusion models. Much faster.
As for RAM: oh look, I have an 11-gig cache, and my computer’s having a damned difficult time finding a use for the last four gigs.

Maybe I should take up Photoshop.

He has 24 gigs? NOM NOM NOM.

The screen’s the biggest draw. Nothing like pulling up a high-resolution scan of some ancient book and seeing every last detail.

2 Likes

“ancient book”…that’s a funny way of spelling porn! :smile:

The screens are stunning; even regular Apple displays are pretty amazing, and the 5K one is almost too good for my poor eyes. I find most Apple displays to be better than most of the alternatives on the market. Bright, crisp, true colors, etc.

Why not both?

2 Likes

Do you mean web browsing? I haven’t had any trouble web browsing on an Intel Core i5, or an i3. I think it would be just fine on even older CPUs, but I haven’t really tested it. If it were a problem, I’d imagine that adblock and various browser extensions/downgrades would help.

Could you elaborate? Again, in my experience, the relative bulk of the computer has been more irritating than any ostensible slowness of processor. My bottleneck is usually RAM, but RAM expansion has been a stumbling block since… forever.

Like I said, speed has been an issue for me. I use LibreOffice. I have to deal with documents written for Word, and drawings with hundreds of groups and thousands of objects, which means dealing with freezes, basic functions breaking, etc.

But accessibility, and ability to open old documents, are usually more important than speed for me.

Others have said part of what I came here to say: there isn’t a reason like there once was to constantly upgrade. First of all, I build my own computers, but anyway: these days I only upgrade when a new version of Sid Meier’s Civilization comes out and inevitably needs more RAM or a better graphics card. I’m on a CPU that’s around 5 years old, and it works just fine for everything I need.

I find that because of my photography and videos, the cloud isn’t going to work for me until we all have Google Fiber speeds everywhere all the time. I need to have large hard drives and if I’m going to be stuck attached to them, I may as well have the power and upgradability of a desktop.

I’m also a bad capitalist. I have an original Nook and original Nexus tablet (that someone gave me when they upgraded). I will use them until they die. I do not like the non-desktop constant upgrade treadmill.

3 Likes

Not enough typewriter ribbon to run Civ V.

3 Likes

To a smaller degree, yes. To a larger degree, it is things like converting/scaling images, converting videos, and processing graphics files. It gets annoying when the times add up during batch processing. Speculatively, computer vision (though I don’t play with that one yet).
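For the image-scaling part, the cheapest win is often to stop running the batch serially and fan it out across cores instead. A minimal sketch, assuming Pillow is installed; the folder names are placeholders:

```python
# Minimal parallel batch-resize sketch, so per-image times stop adding up
# serially. Assumes Pillow; "originals" and "scaled" are placeholder names.
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path
from PIL import Image

SRC, DST = Path("originals"), Path("scaled")

def scale(path, max_px=1600):
    img = Image.open(path)
    img.thumbnail((max_px, max_px))  # resizes in place, keeps aspect ratio
    out = DST / path.name
    img.save(out)
    return out.name

if __name__ == "__main__":
    DST.mkdir(exist_ok=True)
    with ProcessPoolExecutor() as pool:  # one worker per core by default
        for name in pool.map(scale, sorted(SRC.glob("*.jpg"))):
            print("done:", name)
```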

A large surplus of power is also good for thermal reasons. These days, CPUs often run throttling-hot at sustained full load. A higher-spec chip used at 20% of what it can do will likely run way cooler than a lower-spec one that has to be pushed to 100%.
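Rough intuition for why: dynamic CPU power scales roughly as C·V²·f, and DVFS lowers voltage along with clock, so power falls off superlinearly as you back off. A toy comparison (every number below is invented for illustration):

```python
# Toy dynamic-power comparison: P ~ C * V^2 * f.
# A bigger chip coasting at a low clock (and the lower voltage DVFS allows)
# can draw less than a smaller chip running flat out. Numbers are made up.
def dynamic_power(c_eff, volts, freq_ghz):
    return c_eff * volts**2 * freq_ghz

big_coasting = dynamic_power(c_eff=1.5, volts=0.8, freq_ghz=1.2)
small_pegged = dynamic_power(c_eff=1.0, volts=1.2, freq_ghz=3.0)

print(f"big chip at ~20% effort: {big_coasting:.2f} (arbitrary units)")
print(f"small chip at 100%:      {small_pegged:.2f} (arbitrary units)")
```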

RAM is always a bottleneck.

These days, another annoyance is the flash memory. Android does not prioritize graphics operations, so things can get sluggish-feeling if the flash gets slow (see the mmcqd process taking a long time); fstrim helps only somewhat.

It is possible this is a dip in advance of Windows 10. But it is also true that most PCs from 5 years ago are so overpowered that there isn’t any compelling reason to upgrade.

http://blog.codinghorror.com/the-pc-is-over/

About the only upgrade you’d notice in the last 5 years is moving from a spinning-rust HDD to an SSD – and if you haven’t done that yet, holy crap, hurry up and get your SSD on! The difference is not subtle.
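If you’d rather measure than take my word for it, a crude random-read timer shows the gap (point TEST_FILE at any existing multi-gigabyte file on the drive; results are only indicative, since OS caching flatters repeat runs):

```python
# Crude random-read benchmark: HDD seeks cost milliseconds, SSD reads don't.
# TEST_FILE is a placeholder path; use any large file on the drive to test.
import os, random, time

TEST_FILE = "bigfile.bin"
READS, BLOCK = 500, 4096

size = os.path.getsize(TEST_FILE)
with open(TEST_FILE, "rb", buffering=0) as f:
    start = time.perf_counter()
    for _ in range(READS):
        f.seek(random.randrange(0, size - BLOCK))
        f.read(BLOCK)
    elapsed = time.perf_counter() - start

print(f"{READS} random 4K reads in {elapsed:.2f}s "
      f"({elapsed / READS * 1000:.2f} ms average)")
```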

Even for gaming, CPUs are rarely the bottleneck any more. Bear in mind that the PS4 and XBone have 8-core AMD Jaguar CPUs, roughly Atom-class in performance. x86, sure, but extremely average: what you would see in the very, very cheapest PCs from 2-3 years ago. (The latest Atom CPUs are a bit faster.)

Anyways, the war is over and smartphones – and to some extent tablets – won. Everything else is footnotes. It remains to be seen how much PC sales will contract over time; they might stay somewhat stable for a while.

3 Likes

Hardly – manufacturers staged a war to push products that are merely cheaper for them to make, while not being “better” at anything.

2 Likes

Yes, they tricked people into buying a billion smartphones. Those dicks!

1 Like

With nVidia offering things like the Shield tablet, yeah. I think outside of very heavy-duty computing we will have a smartphone-sized base unit that we can plug into desktop shells for keyboard-and-mouse interaction. I still like my proper PC, though.

1 Like