That’s a “light” /s, not an acerbic one; you can tell from the particular slant of the slash.
For me, it stems from the experience that novice users have administering their OS. An Ubuntu instance can be installed relatively easily and set up to run a browser, giving you an 80% functional computer. Then it comes time to configure peripherals - the printer being the main one, from what I’ve seen - and novice progress stops at that point.
Sure. But in that case (a Roman Empire-like collapse - most technology still intact - or a more total collapse with a recovery within a few decades) it’s not just that there’ll be a lot more CD-R and DVD-R drives around. It’s that there will be vastly more data discs around, because they’re a common standard around the world.
How many Rosetta Disks are there? How many different Rosetta Disks - with different subject matter - are there? Even if you make a few hundred copies and scatter them around the world, CD-Rs, DVD-Rs, and their drives will still be far more common.
In a total back-to-the-stone-age-for-centuries collapse (this seems VERY unlikely without humanity becoming extinct altogether), the one-in-a-million still-working CD-R and DVD-R drives will likely outnumber the Rosetta Disks.
In a REALLY long-term collapse, Rosetta Disks have the advantage of being microscope-readable. But that makes them readable only about a century earlier than optical discs would be. And their far smaller numbers mean a lesser chance of any surviving (or being recognized for what they are).
I think micro-etching is very interesting as a storage medium for long-term document archives like the Barbarastollen. Microfiche is believed to have a life span of 500 years when stored in a controlled environment. The etched disks, made of a stable and rust-proof metal, have nearly unlimited longevity.
For medium-term archival purposes, widely scattered cheap digital copies are probably easier to recover and use.
You wouldn’t want to pick any deliberately perverse formats; obviously, saner is better to the degree that space allows. But even fairly primitive techniques should at least be able to tell you that something was purposefully written there, even if deciphering it eludes you. Al-Kindi was, obviously, a really sharp guy, so his results probably aren’t typical; but he had worked out the use of frequency analysis back in the 9th century. More sophisticated statistics came later, and information theory is comparatively new; but a sanely chosen digital format should be fairly clearly and intriguingly neither random nor merely decorative based on frequency analysis alone.
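To make that concrete, here’s a minimal sketch of the kind of symbol counting a future reader with only basic statistics could do: tally byte frequencies and compute entropy, which separates purposeful writing from both noise and mere decoration. The sample data and the Python wrapper are just my illustration, not anything an actual archive format would specify.

    # Sketch: byte-frequency statistics distinguish text, decoration, and noise.
    import math
    import os
    from collections import Counter

    def byte_stats(data: bytes):
        """Return (Shannon entropy in bits/byte, number of distinct byte values)."""
        counts = Counter(data)
        total = len(data)
        entropy = -sum((n / total) * math.log2(n / total) for n in counts.values())
        return entropy, len(counts)

    samples = {
        # Purposeful writing: skewed symbol frequencies, entropy well below 8 bits/byte.
        "english text": ("the quick brown fox jumps over the lazy dog " * 200).encode("utf-8"),
        # Mere decoration: a short repeating motif, very low entropy.
        "repeating pattern": bytes([0xAA, 0x55] * 4000),
        # Noise: nearly uniform frequencies, entropy close to 8 bits/byte.
        "random bytes": os.urandom(8000),
    }

    for name, data in samples.items():
        entropy, distinct = byte_stats(data)
        print(f"{name:18s} entropy={entropy:5.2f} bits/byte, distinct symbols={distinct}")

Running that shows the text landing in between the two extremes - a handful of heavily reused symbols, neither uniform noise nor a trivially repeating ornament - which is roughly the signal Al-Kindi-style frequency counting would flag as “someone wrote this on purpose.”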
Doesn’t much matter if The Future only cares about chipping the archival glass into crude spear-points in order to fight off the radroaches that come out of the wastes to carry off the weak and unwary every night; but that’s not a data encoding problem.