Originally published at: https://boingboing.net/2017/12/26/latency-why-typing-on-old-com.html
…
Fascinated that the NeXT Cube had terrible keyboard latency, almost unique in its era. Typing latency drives me batty, but it’s one of those things that is hard to pin down when it’s on the margins of acceptability. 150 ms is close to unusable for music (anyone who uses MIDI keyboard controllers will recoil in horror from that sort of latency).
I wonder if anyone noticed at the time? “There’s just something I hate about NeXT and I can’t quite put my finger on it.”
EDIT: I know Symbolics’ latency was even worse, but if you’re typing on a Symbolics keyboard it’s already plenty wrong anyway.
100 ms is 0.1 seconds. That’s DEFINITELY a noticeable amount of time. That’s roughly as quick as most people can click a stopwatch on and off. I can’t imagine having 0.2 seconds of delay on my computer.
The 300ms latency for the Symbolics 3620 just goes to show how Lisp machines were way too advanced for their own good. It’s pretty much the same case with the NeXT Cube.
This is a pet peeve of mine, the lack of immediate feedback from digital systems. With the old electro-mechanical systems, you pressed a button and, immediately, something happened. Now, you press it, nothing seems to happen, you wonder whether you have pressed it hard enough, or whether it didn’t take, you press it again, and it reacts to the two pressings. “First provide feedback” should be the prime directive of user interface design. Instead of that, now it seems that “check back with the mothership” takes precedence. I hate it.
Your concerns are very important to us here at The Mothership™. Your peeve has been received and will be promptly dropped into the waste bin. Please allow between 3 and ∞ business days for a response.
Part of the secret sauce of clicky mechanical keyboards is that they provide the illusion of immediate feedback.
Yup. I used a friend’s Fire TV Stick over Christmas, and the latency on its remote was infuriating; the fact that the D-pad immediately skipped episodes if I hit left or right, when that function was already part of the OSD, was bullshit.
Either have track-skip buttons, or don’t. Don’t put them on the screen as well as setting contextual buttons on the remote for something that’ll badly interrupt your viewing experience.
“Latency: why typing on old computers just feels better”
Message from your carpal tunnels: “We beg to differ.”
Ordinary amounts of typing latency don’t really bother me. I know what is going to show up anyway, and even if I make a mistake I can tell and correct it by feel. It’s the long tail of latency that bothers me, when the computer thrashes and latency can shoot up over a second. Mouse latency is more annoying, since that is a UI that requires visual feedback to operate. If I click a mouse button and it takes 100 ms for the menu to open, I hate it: I can’t decide where to start moving the mouse until I can see exactly where the system decided to place the menu. The little latency-hiding animations don’t help much.
I went to work for a company in 1982 that made bank teller systems, based on what essentially was an IMSAI 8080, but we’d put a Z-80 in it. The code was pure assembly language, and ran silly fast. The tellers knew what the key sequences were for common transactions and the keyboard/screen updated as fast as they could hit the keys.
A decade later when we were “encouraged” to use PC-R-US hardware and Microsoft Windows, we lost more than a few accounts and got tons of complaints because the new systems were so slow that a teller would hit the keys for a deposit transaction, same speed as before, and then find themselves looking at an error screen because they’d typed faster than the system could handle.
When “Hello World” went from <100 B in assembler to 25 MB in Visual Studio …
That’s Microsoft quality right there for ya!
The modern graphical user interfaces have lots of software to render the characters. Systems before these GUIs had hardware character generators: give it ASCII and it put pixels in place on the screen. How much of a factor is software font rendering?
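For anyone who never met one, a character generator was essentially a ROM lookup table: the ASCII code selects a fixed bitmap and the dots go straight to the screen, with no outline rasterization, hinting, or anti-aliasing in between. A minimal Python sketch of that idea (the 8×8 glyph data here is made up for illustration, not taken from any real ROM):

```python
# Sketch of a hardware character generator: ASCII code -> fixed bitmap.
# Each byte is one 8-pixel row of the glyph; no font rendering in software.
# (Illustrative glyph for 'A'; real character ROMs differ.)

FONT_ROM = {
    ord('A'): [0x18, 0x3C, 0x66, 0x66, 0x7E, 0x66, 0x66, 0x00],
}

def draw_char(ch):
    """Emit the fixed bitmap for one character, row by row."""
    for row_bits in FONT_ROM[ord(ch)]:
        print(''.join('#' if row_bits & (0x80 >> col) else '.'
                      for col in range(8)))

draw_char('A')
```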
Maybe that was caused by Display PostScript? It always seemed like a silly idea to me, like using MS Word as your GUI.
Did some non-scientific testing of this on my desktop (i5-4950, GTX650Ti, running Arch Linux) by capturing slo-mo video with my phone (the phone claims it is 120 fps, video properties say 117.70 fps - so granularity of about 8.5 ms).
On a GPU-composited desktop terminal emulator window, the interval between keypress and the letter appearing on screen is 9–11 video frames, or 76.5–93.5 ms.
On a bare virtual terminal screen (Ctrl+Alt+F2), the interval is 3–5 frames, or 25.5–42.5 ms.
The two-frame variation is accounted for by the screen refresh rate of 60 Hz: if the input is registered right as a new frame is being prepared, it’ll be shown immediately; otherwise it has to wait up to 16.7 ms to show up.
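Spelling out the frame-counting arithmetic above as a small Python sketch, using the same numbers:

```python
# Convert counted slow-mo video frames into latency bounds.
camera_fps = 117.70                  # what the captured video file reports
frame_ms = 1000.0 / camera_fps       # ~8.5 ms per captured frame

def latency_range(min_frames, max_frames):
    """Latency bounds (in ms) implied by a counted frame range."""
    return min_frames * frame_ms, max_frames * frame_ms

print(latency_range(9, 11))   # GPU-composited terminal: ~76.5 to ~93.5 ms
print(latency_range(3, 5))    # bare virtual console:    ~25.5 to ~42.5 ms

# The ~2-frame spread is roughly one 60 Hz display refresh (16.7 ms),
# i.e. about two camera frames of jitter depending on where in the
# refresh cycle the keypress lands.
```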
Either way, this amount of latency doesn’t seem to bother me. I tried typing some longer texts on the bare terminal (that I generally only use to deal with emergencies) and didn’t find it any more satisfying than typing on the desktop.
Chrome on my tablet takes roughly the length of the Paleocene.
It was the beginning of DLL Hell. You might need just one small function from a library, but you had to link and load into memory something massive to use what you needed. I used to give Linux/BSD credit for not making that mistake, but then they fell into the same trap with shared libraries. Yeah, I know why, but the granularity means you’re always taking up memory with stuff nothing will ever actually use.
And nobody seems to be taught small/efficient/fast in classes now. No, the answer to “It’s not running fast enough” is NOT a faster processor, more memory, or a bigger disk. When you’re programming for an embedded device that doesn’t even have a disk, just a flash drive that you better not abuse, you can’t get sloppy.
I wonder if having at least some of the userbase coming from terminals dialed in to someone else’s timesharing system helped hide how bad that is. Even pathological local consoles beat a couple of hops over dial-up.
I learned the “Rule of Threes” as I was learning UX design. It says that a system that responds in under 300 ms will seem “fast enough” for most uses. (Think “clicking Submit”, not typing characters.) Anything that takes over 3 seconds will be read by its users as the system “thinking”. But systems that take between 0.3 and 3.0 seconds to respond just seem frustratingly sluggish.
As far as typing characters, clicking a button, or other UI responsiveness goes, you want to keep it faster than 8 ms, which is a fairly fast human’s typical reaction time. (A drag racer can get this down to under 2 ms, but they are not typical humans, and you can waste a lot of money without perceptible gain trying to achieve that speed.)
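Purely as an illustration, the 0.3 s / 3 s thresholds from the “Rule of Threes” above could be written down as a tiny classifier (hypothetical names, nothing from the original comment):

```python
# Hypothetical encoding of the "Rule of Threes" thresholds:
# under ~0.3 s feels fast enough, over ~3 s reads as "thinking",
# everything in between just feels sluggish.

def perceived_responsiveness(response_seconds: float) -> str:
    if response_seconds < 0.3:
        return "fast enough"
    if response_seconds > 3.0:
        return "thinking"
    return "frustratingly sluggish"

for t in (0.05, 0.8, 5.0):
    print(f"{t} s -> {perceived_responsiveness(t)}")
```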
Is the Apple //e design open sourced yet? Maybe we should fork it and add internet capability. Also add a keyboard with mechanical key switches and a decent layout. Then call it “Blockchain” et voilà! Profit!