Awwww… [throws screenshot away, walks off sulkily]
There are some juicy quotes in there.
It’s like you took a bunch of small-business accountants and told them they were going to be designing multi-billion dollar corporate tax shelters in the Seychelles.
Suddenly they feel alive, they feel free. They’re right at the top of Maslow’s hierarchy of needs, self-actualizing on all cylinders. They don’t want to go back.
I tend to keep a lot of windows open for long periods of time, and I noticed a while back that BoingBoing pages seemed to be using a shit ton of CPU time, even just sitting minimized in the dock. Having more than a few open would result in noticeable drag on whatever else I was doing. Frustrating, because this is exactly the kind of site whose pages end up in my “read and/or send to someone else later when I have time” category.
Which url? And which browser?
From where I’m standing, boingboing’s pages seem positively well behaved compared to, e.g. MacRumors forums (which can consume more than 1.5 GB for a single page)
What a great talk - it covers many good points, including the problems with today’s heavily hyped “cloud”. The internet sucks these days. Let’s make the Internet great again. And get off my lawn.
ETA: I hit the “reply” button at the bottom of the page, not sure why it replied to that message specifically.
“Before the comments begin, I will cop to Boing Boing being just as guilty as many of the examples cited by Ceglowski.”
Too bad nothing can possibly be done about it…
Isn’t that part of the culture of zines, though? To embrace an alternative to the mainstream, and not just to apologize for not having a fancy offset printer?
So that’s how the world ends…technically dynamic but hanging in eternal gridlock, a frozen mandala riddled with malware and choked with pop-ups…a confounded God impotently mashing ctrl alt del and screaming at His once lightspeed Creation.
If you open that tweet in a browser, you’ll see the page is 900 KB big.
That’s almost 100 KB more than the full text of The Master and Margarita, Bulgakov’s funny and enigmatic novel about the Devil visiting Moscow with his retinue (complete with a giant cat!) during the Great Purge of 1937, intercut with an odd vision of the life of Pontius Pilate, Jesus Christ, and the devoted but unreliable apostle Matthew.
For a single tweet.
Which, if you think about it, is a perfect description of Twitter: mostly bloat and scant substance.
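That bloat-to-substance ratio can actually be measured. A rough stdlib-only sketch (the example page and the whole “ratio” framing are mine, not anything from the talk) — it strips the markup and scripts from an HTML string and reports what fraction of the bytes is readable text:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects only the human-readable text of an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0  # nesting depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def substance_ratio(page: str) -> float:
    """Fraction of the page's bytes that are readable text rather than markup."""
    parser = TextExtractor()
    parser.feed(page)
    text = "".join(parser.chunks).strip()
    return len(text.encode()) / len(page.encode())

page = ("<html><head><script>track(me);</script></head>"
        "<body><p>For a single tweet.</p></body></html>")
print(round(substance_ratio(page), 2))  # about 0.21 - even a tiny page is mostly markup
```

Run against a real 900 KB tweet page, the ratio would be far worse than this toy example.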
The designers of pointless wank like that Facebook page deserve the ultimate penalty.
They should be forced to use the Apple hockey puck mouse for the remainder of their professional lives.
Hear, hear! Let the pucking of Facebook commence.
Out of an abundance of love for the mobile web, Google has volunteered to run the infrastructure, especially the user tracking parts of it.
Such altruism! Why I get warm fuzzies whenever I think about…hold on, I have to close a video ad for some cancer charity scam.
We’re in a stupid situation where ads make huge profits for data carriers and ad networks, at the expense of everyone else.
Yeah, and one guess who has a stranglehold on the infrastructure we’re stuck using.
It costs a lot less to pay for a couple freelance journalists and a web designer than it does to film a sitcom. So why is it unthinkable to force everyone back to a successful funding model that doesn’t break privacy?
It’s not, but adtech profiteers are accustomed to siphoning off the excess like ticks bleeding a mule. The only way you’re going to dislodge them is to shave the mule, burn the ticks and extract them with sharp legislation. But their con dovetails with data plan providers who make both you and the publisher pay for all the worthless advertising bandwidth, at least until the bubble bursts. Getting paid twice for the same bandwidth is a nice little gig. Those providers are going to stand vigil over the mule until the whole system comes crashing down, because they want to suck out the maximum profit from that tax. So data providers and ticks will throw as much of that bloodmoney at legislators as needed to keep the extortion flowing, because legislators are cheap dates and the alternative is having to make an honest living!
This is no way to live. We’re not animals!
The point is that assumptions about complexity will anchor your expectations, and limit what you’re willing to try. If you think a ‘real’ website has to live in the cloud and run across a dozen machines, a whole range of otherwise viable projects will seem unprofitable.
Therein lies the one nugget of hope. If small publishers and their web developers take this to heart and make their business models profitable, they’ll become a valuable long-term post-bubble asset to the data providers who will need someone to generate traffic after the adtech industry collapses.
Yah. I was going to point out all the pot-calling-kettle web design choices that have sprung up on Boing Boing over the last five years and incrementally gotten under my skin. But Cory’s admission of guilt pretty much undercuts those gripes.
Short of shrugging our shoulders, palms up in exasperation, not much we can do, eh?
Defensively, I partitioned the code so that time-critical processes could run on a different server from the presentation logic. That’s where I left it. I suspect that by now a simple form has a megabyte of bloat, but they haven’t been in touch for two years, so I guess everything is just fine. Maintenance? We spit on your stupid maintenance! Frameworks and multiple languages on a page are the future.
I write all my browser client code in Lisp, which is sent down and transcoded into an intermediary proprietary version of Perl 4, before that is compiled into a combination of jQuery and YUI functions. And of course no recursion is used (outside of Lisp, of course), so I can eat up six, eight gigs of memory per page?
Isn’t that… Isn’t that how we are supposed to do it these days?
This must be why I instinctively read about current events on Wikipedia on my phone. The fault is mostly in ‘news’ sites, which seem to want to monetize every square millimeter of their pages and eat up my monthly allotment of JS code blob downloads.
My own website is built of raw, inscrutable HTML that renders in milliseconds.
But they can do something about it!
They can call everybody who bothers to complain about it a bunch of atypical troglodytes! There, problem solved.
My project pages are written in a task-specific language vaguely similar to wikipedia markup. The server is grossly underutilized so they are parsed in real time. They could be easily rendered to static HTML, though.
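A renderer for that kind of lightweight markup really doesn’t need much machinery. A minimal Python sketch — the rules here (''bold'', [url label] links, blank-line paragraphs) are invented for illustration, not the actual task-specific language described above:

```python
import html
import re

def render(markup: str) -> str:
    """Render a tiny wiki-like markup to static HTML.

    Invented rules, for illustration only:
      ''text''     -> <b>text</b>
      [url label]  -> <a href="url">label</a>
      blank line   -> new paragraph
    """
    paragraphs = []
    for para in re.split(r"\n\s*\n", markup.strip()):
        p = html.escape(para, quote=False)  # keep '' intact for the bold rule
        p = re.sub(r"''(.+?)''", r"<b>\1</b>", p)
        p = re.sub(r"\[(\S+) ([^\]]+)\]", r'<a href="\1">\2</a>', p)
        paragraphs.append(f"<p>{p}</p>")
    return "\n".join(paragraphs)

print(render("''Hello'' from a [https://example.com tiny] renderer."))
```

Run it once per edit instead of once per request and you’re serving static HTML.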
The proliferation of huge frameworks is a pain. Like driving a trailer truck where a humble SUV would do comfortably.
Edit: Browsing the web in elinks from a terminal is often way faster and more pleasant than using a full-graphics browser. Does wonders for e.g. news sites.
Also, pictures. The JPEGs are often immensely and unnecessarily big. Shrinking them significantly can be as simple as

mkdir pub; for i in *.jpg; do jpegtopnm "$i" | pnmtojpeg > "pub/$i"; done

which re-encodes every image at pnmtojpeg’s default quality without noticeably impairing it, and puts the publication copies in the pub directory.
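Before re-encoding anything, it can help to see which images are actually the problem. A stdlib-only Python sketch (the 200 KB threshold is an arbitrary assumption on my part):

```python
import os

def oversized_images(directory: str, limit_kb: int = 200):
    """Return (filename, size_kb) for JPEGs larger than limit_kb, biggest first."""
    hits = []
    for name in os.listdir(directory):
        if name.lower().endswith((".jpg", ".jpeg")):
            kb = os.path.getsize(os.path.join(directory, name)) / 1024
            if kb > limit_kb:
                hits.append((name, round(kb)))
    return sorted(hits, key=lambda pair: pair[1], reverse=True)

# List the worst offenders in the current directory.
for name, kb in oversized_images("."):
    print(f"{name}: {kb} KB")
```

Feed the survivors of that list through the jpegtopnm pipeline above.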
Puffy, in his capacity as the OpenBSD mascot, would probably resent having the pufferfish made into a symbol of bloat and inscrutable complexity.
elinks is OK, although a lot of pages won’t even load their content in it anymore. Best-case scenario is a few oddball text-only sites that load all their data quickly.
BB runs really well with AdBlock, Ghostery, and NoScript all set up to remove the extra cruft and screen distractions. But, as with a hooker, I wouldn’t touch this site (or any other) without appropriate protection.
Needs more Haskell and Hadoop.
So much Smalltalk, now let us go FORTH!
I thought Boing Boing had a case of MUMPS.