I (very, very, very) strongly doubt that the security of these devices is even close to adequate; but the fact that ‘dumping the firmware was trivial!’ doesn’t seem like a problem (indeed, it’s arguably a virtue).
You don’t want reflashing the device to be trivial; and you don’t want modifying any voting/state data temporarily stored on the device to be trivial; but if the integrity of the election depends on the secrecy of the firmware, the security model is so doomed that you should consult Ripley’s First Law of Security and start from scratch.
Preventing firmware dumping is a popular DRM/copy-protection move, because in those contexts secrecy is valuable and user knowledge of exactly what the system is doing is not treated as a virtue (quite the contrary). It is also sometimes treated as ‘more secure’ in general, on the assumption that there are definitely flaws, probably serious ones, in the firmware, and that making it harder to dump will delay their discovery by forcing the attacker to black-box the device (subject to its input limits, rate limiting/lockout, etc.) rather than analyze the firmware at leisure on a system of their choice. But if your ‘security’ involves fatalistic acceptance of the fact that your software is broken, it isn’t good enough for this job.
If anything, in a situation where the device must not have been tampered with, the question is not “How can we keep someone from dumping the firmware?” but “How can we be sure that even maliciously reprogrammed devices cannot falsely return the ‘correct’ firmware when asked to dump the firmware they are currently running?” Even if the software is not open in a licensing sense, ‘security’ that depends on it being vaguely inconvenient to get a look at the flaws is pitifully inadequate (worst case, having someone tear the chip down, gate by gate, under a microscope isn’t that expensive, even if more polite dumping mechanisms have been disabled). And if you can’t verify that the hardware is actually running the software it should be, it doesn’t matter how solid the software you think you are using is, because you might be using different software and not know it.
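To make the problem concrete, here is a minimal sketch (hypothetical device classes, not any real attestation protocol) of why simply challenging a device to prove its firmware fails: even a freshness nonce doesn’t help if the tampered device simply keeps a pristine copy of the audited image around purely to answer challenges with.

```python
import hashlib
import os

GOOD_FIRMWARE = b"\x00" * 1024  # stand-in for the audited firmware image
EVIL_FIRMWARE = b"\xff" * 1024  # stand-in for a malicious replacement

class HonestDevice:
    """Runs the audited firmware and attests to it truthfully."""
    def __init__(self):
        self.running = GOOD_FIRMWARE

    def attest(self, nonce: bytes) -> bytes:
        return hashlib.sha256(nonce + self.running).digest()

class MaliciousDevice:
    """Runs tampered firmware, but keeps a pristine copy of the
    audited image solely to answer attestation challenges."""
    def __init__(self):
        self.running = EVIL_FIRMWARE
        self.pristine_copy = GOOD_FIRMWARE

    def attest(self, nonce: bytes) -> bytes:
        # Hashes the copy it is NOT running.
        return hashlib.sha256(nonce + self.pristine_copy).digest()

def verify(device) -> bool:
    """Challenge-response check: a fresh random nonce defeats replayed
    answers, but not a device that stored the genuine image."""
    nonce = os.urandom(32)
    expected = hashlib.sha256(nonce + GOOD_FIRMWARE).digest()
    return device.attest(nonce) == expected

print(verify(HonestDevice()))     # True
print(verify(MaliciousDevice()))  # True -- the check is fooled
```

Software-only attestation schemes try to close this gap with timing bounds or memory-filling tricks, but with nothing more than the device’s own word to go on, the verifier cannot distinguish these two devices.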
Unfortunately, robust verification that a device is, in hardware and software, exactly what it should be, rather than modified, backdoored, built with partially compatible counterfeits, etc., is considered Hard. (DARPA has a strong interest in the problem, since the DoD depends heavily on being able to source parts, often for hardware no longer in wide civilian use, which makes it vulnerable to counterfeits, and it has plenty of opponents who would love to slip it some backdoors; this has ensured more research, but has not changed the problem’s status as ‘Hard’.) It is especially hard if you want a reasonably cheap and nondestructive test, which you do in this case.