That’s certainly true, but consider how difficult it’s already been to patch machines that already use automation to receive and install patches. Also, your “18% of 1996 Hondas” analogy works better as “We decrypted one automotive control unit that grants access to 18% of all cars.” Updating that control unit could be as simple as rebuilding your DH parameters (à la @renke’s openssl dhparam 2048, a process likely too difficult for many/most end-users), or it could be as hard as contacting the company that builds your [accounting/communications/graphics] software package and persuading them to update the DH parameters used in their software (not that you’d know to do that anyway).
tl;dr it’s “as simple as” telling the internet users and providers of the world to update their webservers and any software that uses DH. Which isn’t easy at all, despite sounding soooper-duper easy to do.
EDIT
Absolutely agreed. However, it’s my impression that the tools to manage big software changes have gotten much better over the years, so I hope it’s that much easier for devs to get patches into the hands of admins and laypeople alike.
Also agreed, implicitly, that we need to be better about actively supporting strong encryption measures for public and private use. I’ve always built certs/keys and such with a longer-than-recommended key length; for giggles I’ve gone as high as 8192 bits, IIRC, although I can’t recall whether that caused any glitches.
At the very least, the Snowden revelations dangled a carrot for programmers to be better about incorporating encryption where necessary (consider Blackphone, WhatsApp, etc.), and I think that will continue to trickle down to the bog-standard user who has little or no interest in doing anything complex behind the scenes of their OS.
I’m not a number theorist, but I gave this a little thought and the answer seems to be yes. Diffie-Hellman, and other schemes like RSA, are based on variations of Fermat’s little theorem, which, glibly stated, says that a^n mod p cycles in n with period dividing p-1.
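The cycling is easy to see with a toy example (a small prime for illustration, nothing cryptographic):

```python
# Toy demo of Fermat's little theorem: for prime p and a not divisible by p,
# a^(p-1) ≡ 1 (mod p), so the sequence a^n mod p repeats with period dividing p-1.
p, a = 13, 2

one_cycle = [pow(a, n, p) for n in range(1, p)]
print(one_cycle)           # one full cycle of residues, ending in 1
print(pow(a, p - 1, p))    # → 1, by Fermat's little theorem

# The sequence is periodic: shifting the exponent by p-1 changes nothing.
assert all(pow(a, n, p) == pow(a, n + (p - 1), p) for n in range(1, 50))
```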
Now, in Diffie-Hellman, both communicants share a number g and a prime p, and each generates a random private exponent, e and d respectively. Some things happen, and both end up computing a shared key K = g^(ed) mod p.
There is no requirement that e and d be relatively prime to p-1. However, if they are not, then the space of possible values for K may be significantly smaller. (The size of that space will be a proper divisor of p-1.)
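A small sketch of that shrinkage (toy numbers, purely illustrative): with p = 13 and a generator g, the subgroup generated by g^e has exactly (p-1)/gcd(e, p-1) elements.

```python
from math import gcd

p, g = 13, 2   # 2 is a generator mod 13: its powers hit all of 1..12

for e in range(1, p):
    # Possible values of K = (g^e)^d mod p, over all exponents d
    keyspace = {pow(g, e * d, p) for d in range(1, p)}
    assert len(keyspace) == (p - 1) // gcd(e, p - 1)
    print(f"e={e:2d}  keyspace size={len(keyspace)}")

# Whenever e shares a factor with p-1 = 12, the keyspace size is a proper
# divisor of 12 (6, 4, 3, 2, or even 1 when e = 12).
```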
For some primes, p-1 might have a large number of distinct small prime factors. (For example, factorial primes: PrimePage Primes: Factorial primes.) Using any of those primes would be a poor choice. The best primes to use would seem to be those of the form 2q+1 for some other prime q.
I don’t know how the standard list is formed, but I’d bet they are all of that form.
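Primes of that form are the so-called “safe primes”: when p = 2q+1 with q prime, p-1 = 2q has no small odd factors, so the only possible subgroup sizes are 1, 2, q, and 2q. A quick brute-force sketch to list the small ones:

```python
def is_prime(n):
    # Trial division; fine for small n
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

# Safe primes: p = 2q + 1 where q is also prime
safe = [p for p in range(3, 200) if is_prime(p) and is_prime((p - 1) // 2)]
print(safe)   # [5, 7, 11, 23, 47, 59, 83, 107, 167, 179]
```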
For IPsec, the 1024-bit prime (group 2) is defined in RFC 2409:
6.2 Second Oakley Group
[…]
The prime is 2^1024 - 2^960 - 1 + 2^64 * { [2^894 pi] + 129093 }.
The longer standard primes are calculated similarly and are defined in RFC 3526. No idea whether they are identical to the primes used in TLS and other applications using DH key exchange.
Just double-checked that number, since the pi in there threw me. (The brackets seem to indicate the floor function. I’m assuming this was calculated by analytic number theory… or eldritch magick, either one.) The result is:
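The check can be reproduced in plain Python (a sketch: pi is computed with Machin’s formula via the decimal module, and primality is tested with probabilistic Miller-Rabin rather than anything rigorous):

```python
from decimal import Decimal, getcontext

getcontext().prec = 330   # 2^894 * pi has ~270 decimal digits; leave slack

def arctan_inv(x):
    """arctan(1/x) for integer x, by the alternating Taylor series."""
    eps = Decimal(10) ** -325
    power = Decimal(1) / x
    total = power
    k = 0
    while abs(power) > eps:
        power /= x * x
        k += 1
        term = power / (2 * k + 1)
        total += -term if k % 2 else term
    return total

# Machin's formula: pi = 16*arctan(1/5) - 4*arctan(1/239)
pi = 16 * arctan_inv(5) - 4 * arctan_inv(239)

# RFC 2409, Second Oakley Group:
# p = 2^1024 - 2^960 - 1 + 2^64 * { [2^894 pi] + 129093 }
floor_pi = int(pi * Decimal(2) ** 894)   # [...] = floor for positive values
p = 2**1024 - 2**960 - 1 + 2**64 * (floor_pi + 129093)

# By construction, the top and bottom 64 bits are all ones
assert p >> 960 == 2**64 - 1
assert p % 2**64 == 2**64 - 1

def miller_rabin(n, bases=(2, 3, 5, 7, 11, 13, 17, 19, 23, 29)):
    """Probabilistic primality test; overwhelming confidence at these sizes."""
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in bases:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

# p is prime, and (p-1)/2 is prime as well, i.e. it's a safe prime
assert miller_rabin(p) and miller_rabin((p - 1) // 2)
print(hex(p))
```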
This is explained in RFC 2412, Appendix E. I’m no mathematician, so I can’t fully summarize it, but the gist is that the high and low 64 bits of these primes are all ones, which speeds up modular reduction with both the “classical remainder algorithm” and the “Montgomery-style remainder algorithm”.
Electric drill and wire cutters: kill the power, then drill straight through the lens into the circuit boards.
If you’re really going for it, drill a hole in the top, fill it with thermite, light a long magnesium fuse, and get back.
So you end up trying every number that could be a Mersenne prime and ignoring all other primes. That soon means you need ridiculous jumps in digit count to get to the next usable prime, and the attacker only has to try a much smaller list for any given digit length.
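A quick sketch of just how sparse Mersenne primes are (the primality check is probabilistic Miller-Rabin, which is dependable in this range):

```python
def probably_prime(n):
    """Miller-Rabin with fixed small bases; reliable for the sizes used here."""
    if n < 2:
        return False
    small = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    for p in small:
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in small:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

# Exponents n <= 127 for which 2^n - 1 is prime: the gaps widen fast.
exps = [n for n in range(2, 128) if probably_prime(2**n - 1)]
print(exps)   # [2, 3, 5, 7, 13, 17, 19, 31, 61, 89, 107, 127]
```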