ah, okay. i think most graphics processing is just intrinsically CPU-intensive and saturates the memory bus, not a matter of poor markup language. i didn’t believe it either until i did some (very basic) machine learning of images. i agree; it takes longer than i’d like. i guess it’s because, for a linear increase in quality, you incur a quadratic increase in image size since they are two-dimensional. (and actually it can be even worse because preserving fidelity means forsaking lossy compression.)
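to make that quadratic-growth point concrete, here's a tiny sketch (the resolutions and the `raw_size_mb` helper are just my own illustration, not anything from a real library): doubling the linear resolution quadruples the uncompressed memory footprint.

```python
# illustrative only: uncompressed RGB image size grows quadratically
# with linear resolution, since pixels = width * height.
def raw_size_mb(width, height, bytes_per_pixel=3):
    """Uncompressed size of an RGB image in megabytes."""
    return width * height * bytes_per_pixel / 1e6

for scale in (1, 2, 4):
    w, h = 1000 * scale, 1000 * scale
    print(f"{w}x{h}: {raw_size_mb(w, h):.0f} MB")
# each doubling of width/height quadruples the size: 3 MB, 12 MB, 48 MB
```

and that's before you forgo lossy compression for fidelity, which keeps you at these raw sizes instead of a fraction of them.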
i mean, yes, these operations are conceptually simple until you are dealing with gigapixel files and trying to make sure that the output doesn’t look like total crap. even a passable blur (by modern standards) can be a non-trivial thing to get running smoothly. anyway, that’s why i listed “media work” as an exception.
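for anyone curious why even a blur is non-trivial at scale: a naive (2r+1)×(2r+1) box blur does O(r²) work per pixel, but a box blur is separable into a horizontal and a vertical 1D pass, O(r) each. here's a toy sketch of the separable trick (pure-Python on a grayscale list-of-rows, purely illustrative — real pipelines use vectorized or tiled implementations for gigapixel files):

```python
# toy sketch: separable box blur. a naive 2D kernel is O(r^2) per pixel;
# two 1D passes get the same result in O(r) per pixel.
def box_blur_1d(row, r):
    """1D box blur with radius r, clamping indices at the edges."""
    n = len(row)
    out = []
    for i in range(n):
        window = [row[max(0, min(n - 1, j))] for j in range(i - r, i + r + 1)]
        out.append(sum(window) / len(window))
    return out

def box_blur_2d(img, r):
    """Separable 2D box blur: blur each row, then each column."""
    rows = [box_blur_1d(row, r) for row in img]
    cols = [box_blur_1d(list(col), r) for col in zip(*rows)]
    return [list(row) for row in zip(*cols)]
```

even this simple version glosses over the hard parts — edge handling that doesn't ring or darken borders, gamma-correct averaging, cache-friendly tiling — which is where "doesn't look like total crap" actually lives.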
i do keep forgetting that some people manipulate data by hand. how tedious.
Gamers. Electronic musicians. Which you mentioned yourself, though with some disdain; I happen to do both.
Running games on servers in the cloud was tried, and failed both technically and as a business model. Running audio apps in the cloud is completely unworkable for reasons of latency.
Tablets, while they can do a surprising amount, are still quite limited in terms of multitasking, running multiple instances of software synths and effects, etc. And both tablets and laptops are impractical when it comes to upgrades and peripherals. If you’ve got an audio interface, a couple of MIDI keyboards, a Maschine, some hardware synths and stompboxes and a couple of guitars – or a hot video card and a racing wheel or Oculus or whatever – or both – there’s really no point in connecting them all to a laptop.
I have done some musical work on just a cheap laptop and pair of headphones, but it was limited in scope. (Sometimes technical limits bring creative gains, but sometimes they stand in the way of creativity.)
And software development is much more comfortable with a nice dual-monitor setup, a mechanical keyboard and a comfy chair than it is on a laptop, even if your IDE and compiler are in the cloud.
All this said, my PC and even my graphics card are 4 years old and still going fairly strong. I just replaced the hard drive with a 1TB SSD, and I’m not planning to replace the machine itself anytime soon. I’m not really limited in terms of audio processing (I could occasionally use more inputs and outputs on my audio interface, but I love its very low latency). I do have to be a bit cautious about graphics settings in Dirt Rally, but it still looks pretty good and runs at around 50 FPS.
When I worked at the university, oh so long ago now, we always knew when the physics department’s number-crunching jobs started on Friday afternoons: the mainframe’s response time went up considerably as the jobs spun up. They would usually crunch numbers all weekend, and I’m willing to bet that even with today’s improved computing power, the datasets have grown to match.
It can be smart to use a CLI over your phone, but this doesn’t IMO make a phone a smartphone.
I used Linux as an example, but I forgot that iPhones use BSD. The crucial thing is that the user is the admin, controlling the phone. If you don’t control the file system, it’s not really your phone at all.
ah, yes. sorry, your wording had the whiff of innuendo.
anyway, this is another one of those places where the lower-power designs shaddack was pooh-poohing pay off. i can max out two cores without noticeable (to me) fan noise. above that, yeah, it’s noticeable, but it doesn’t bother me. as usual, there’s a lot of people bickering about how quiet macbook pro fans really are, but in my experience they are pretty damned quiet.
The iPad Air 2 is amazing, too – so fast, near low-end laptop speed, with 2GB RAM and a triple-core CPU. It just needs iOS 9 for better split-screen support and running multiple apps at once. I am very much looking forward to the rumored 13-inch iPad Plus, too.