8 weird punctuation marks that faded from the English language

I am less tolerant of numbered footnotes. I have read too many books/articles where the superscript numerals are just too light to easily distinguish the number (exacerbated by italic numerals or typefaces that lack true superscript numerals). Whereas asterisk, double asterisk, dagger, and double dagger are generally more than enough, unless a technical book is referring to multiple footnotes*.

*and the reading habit is generally different.

1 Like

No disagreement on the dagger as an aesthetic choice; it just seemed relevant to its status within or without the language that it is essentially an aesthetic choice, much like fonts and text sizes, rather than a character with no functionally equivalent substitute. The substitution may or may not be a good idea for the readability or attractiveness of the text, but you can make it; unlike with periods, exclamation points, or the like.

1 Like

I never noticed how quickly my eyes zip around a page until I started reading to my kids. I’ll subconsciously dash to the end of the paragraph to notice “Lyra said mournfully” to be able to put it both in Lyra’s voice and a mournful tone. And if a line of speech ends with an interruption, my eyes have to have already started the next line so I know how I’m going to interrupt it.

I discovered my eyes are usually reading about a sentence ahead of the one my mouth is currently saying.

The biggest issue, though, is that when I’m tired and my eyes are on the next line, I’ll often substitute the name of whoever is mentioned next: “We’ve got to leave now, Pan!” said Ms Coulter, or something, and the kids will scream out in protest, “You mean Lyra!!”

Anyway, it was just something I noticed I subconsciously do, and now that I’ve noticed it I realized my eyes zip around all the time even when I’m reading to myself.

2 Likes

There are a lot of these alternatives to US-ASCII. But ASCII was first, actually ASA X3.4-1963, and these are derived from ASCII with substitutions. And from the very beginning it was expected that @ (ASCII 4/0) would be replaced with regional variations.

ISO 646 clarified how these substitutions worked.
§, É, Ó, é, à, Š, Ð, ¡, or ´ instead of @
ß, °, ÷, and others (too many to list) instead of ~
¤, ¥, or ï instead of $
£, ç, é, ù, §, », or Γ instead of #

Additionally, in Japan, JIS C 6220:1969 and JIS X 0201 substitute ¥ for \

There are quite a few ASCII symbols, even some that C-like languages depend on, that were very unreliably encoded in the 7-bit character sets used across the globe, especially in C’s heyday.

I used to work IT at a Japanese company’s US office. Having Windows machines show “C:¥>” when I dropped to the command-line kind of blew my mind the first day.
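To make the mechanism concrete, here is a toy Python sketch, not a real codec, with abbreviated, illustrative glyph tables: the bytes are identical, and only the glyph assigned to each 7-bit code point changes between national variants.

```python
# Toy sketch: the same 7-bit code points rendered with different national
# glyph tables. The tables are abbreviated and illustrative, not a complete
# ISO 646 implementation.
ASCII_GLYPHS = {0x23: "#", 0x24: "$", 0x40: "@", 0x5C: "\\", 0x7E: "~"}

# JIS X 0201 swaps in the yen sign for 0x5C and an overline for 0x7E.
JIS_X_0201_GLYPHS = {**ASCII_GLYPHS, 0x5C: "¥", 0x7E: "‾"}

def render(data: bytes, glyphs: dict) -> str:
    """Render 7-bit bytes using the given glyph table."""
    return "".join(glyphs.get(b, chr(b)) for b in data)

prompt = b"C:\x5c>"                      # 0x5C is the ASCII backslash
print(render(prompt, ASCII_GLYPHS))      # C:\>
print(render(prompt, JIS_X_0201_GLYPHS)) # C:¥>
```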

(EDIT: I bumped the send button before I was finished; had a bit more to go.)

4 Likes

It’s definitely nonstandard, but there is the option of the Bioware sentiment prefix, (I think) first used for HK-47 in KoTOR (slightly odd in that case, since that droid was supposed to have considerable protocol-droid capabilities, in addition to the murder), and then considerably elaborated for the Mass Effect Elcor.

‘Informal’ footnotes, I said. Generally for only a few notes, and generally not for citations, but for clarification or context in brief documents.

After double or triple asterisks, something else may be needed that works like an asterisk, and won’t be mistaken for punctuation. The daggers still work, and Word still has them.

1 Like

Indeed, it was once the practice on typewriters to use “dead keys” that did not advance the carriage for diacritics. When computers came along, this led to some debate as to whether combining diacritics should be entered as character-then-diacritic or diacritic-then-character, e.g. do you type a¨ or ¨a to get ä? Implementations varied.
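For what it’s worth, Unicode eventually settled on the character-then-diacritic order for combining marks. A quick check with Python’s standard unicodedata module shows the decomposed and precomposed forms compare unequal until you normalize them:

```python
import unicodedata

decomposed = "a\u0308"   # LATIN SMALL LETTER A followed by COMBINING DIAERESIS
precomposed = "\u00e4"   # LATIN SMALL LETTER A WITH DIAERESIS

print(decomposed == precomposed)                                 # False: different code point sequences
print(unicodedata.normalize("NFC", decomposed) == precomposed)   # True: NFC folds the pair into one code point
```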

I used to work with a library catalog that displayed combining “dead key” diacritics as the letter of the key that was used to enter them. So the book Gödel, Escher, Bach showed up as Ghodel, Escher, Bach. Keep in mind that the catalog records themselves used a character set that included the diacritics, but not their precomposed forms, to reduce the number of bits per letter. But somewhere between downloading the records from OCLC and displaying them to users via a VT-100 emulation, the diacritics were converted to a letter. And not a random letter, but the one that was used to enter the diacritic, which makes me think it happened during the record-downloading step.

3 Likes

Yes; I wasn’t clear enough about my meaning. I mentioned Unicode just so anyone unfamiliar with the symbols could track them down. The common mathematical symbols for AND and OR predate both Unicode and C. I think they were developed in the 19th century by Boole or possibly De Morgan. These days students are more likely to first encounter De Morgan’s Laws in the set theory version using union and intersection, but De Morgan also (initially?) stated them in their logic version using AND and OR.

I get that C had the problem of a limited character set. But its choice of symbols would have looked weird to anyone familiar with the mathematical logic symbols. Not that I can find any better candidate for XOR in the limited character set available to the developers.
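For readers who haven’t met them, these are the ASCII stand-ins in question, shown here in Python, which inherited the spellings from C:

```python
# The ASCII operator spellings C settled on (and Python inherited):
# & for bitwise AND, | for bitwise OR, ^ for bitwise XOR, standing in for
# the mathematical ∧, ∨, and ⊕.
a, b = 0b1100, 0b1010
print(f"{a & b:04b}")   # 1000 - bits set in both operands
print(f"{a | b:04b}")   # 1110 - bits set in either operand
print(f"{a ^ b:04b}")   # 0110 - bits set in exactly one operand
```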

1 Like

I notice all the comments are keyboard related. I have been using the caret to signify “insert this extra text here” since junior school in the 1960s. All the desks had inkwells, and we had dip pens made of a stick with a nib stuck on the end.

2 Likes

I’ve seen the multi-daggers in a recent science paper, so they’re literally part of cutting-edge technology.

3 Likes


I was wondering if reading a lot of older texts had warped my notion of it appearing in recently published work, so I’m glad I’m not hallucinating.

1 Like

That was presumably before e-mail became a thing in Germany.

Back in the early 1980s our printer was a dinky little Centronics 737 9-pin dot-matrix printer. This dealt strictly with 7-bit ASCII, and if you wanted to print German text on it you would have to turn the printer off and flip a bunch of DIP switches on the inside (no sending ESC sequences or anything). That would cause square brackets etc. to be printed as German umlauts – and “@” as “§”. It was just as well that on the Apple II at the time we used to program in BASIC, not C!

2 Likes

In German, the word Paragraph is so inextricably linked to the law that it is used in everyday language as metonymy[1]. Everyone, whether they have interacted with the legal sphere or not, knows the word and its attendant symbol. Even small children.

Note the difference when googling “symbol for the law” in English and German.


  1. e.g. Paragrafenreiter = “Paragraph rider”, for an overly fastidious, letter-of-the-law person ↩︎

6 Likes

I hope not, and not just for code (though I mostly use Python these days, so…)

As a writer, I appreciate the semicolon for how it “feels” different when linking two complete sentences; it’s like they’re waltzing together on the page. It’s more linking than a period. A full stop is separating, dividing.

It’s also vital for complex lists. He ate red, juicy apples; firm, green grapes; and the words he spoke in haste, an embarrassing moment he would not forget.

And hands off the colon: the fragment joiner, abused by me often online.

6 Likes

The use of tilde to indicate approximation is common in mathematics.

One of the local breweries makes a decent Belgian Tripel called Interrobang.

5 Likes

When I did maths we used ≏ for approximate; it seems to have disappeared, but it still exists in Unicode.

Huh. You’re sure you don’t mean ≃ or ≈ instead? ≏ is supposed to be difference between. :confused:

U+224F
Hover over the symbol.

≈ is also indicated as approximation.

Weird. Hovering shows it links to approximation, but ≏ doesn’t actually show up on that page. Wiktionary and Unicodepedia call it difference between, anyway.
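If anyone wants to check without relying on a hover link, Python’s unicodedata module reports the official character names directly:

```python
import unicodedata

for ch in "≏≃≈":
    print(f"U+{ord(ch):04X}  {ch}  {unicodedata.name(ch)}")

# U+224F  ≏  DIFFERENCE BETWEEN
# U+2243  ≃  ASYMPTOTICALLY EQUAL TO
# U+2248  ≈  ALMOST EQUAL TO
```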

2 Likes