UN wants to ban killer robots but US won't comply

Originally published at: UN wants to ban killer robots but US won't comply | Boing Boing


Next up, civilians wanting to own and stockpile them but absolutely never, ever use them to hurt other people. That would be silly; they just want to keep them and polish them and look at them and perhaps, sometimes, fantasise about stopping a Bad Guy with a Killer Robot.


Killer Robots don’t need food, shelter or vaccinations, so I’m not surprised that the US is gravitating toward their use.



The Future of Life Institute recently released this video to try to sway Americans to support the convention.


It would be a shame to outlaw killer robots. We have a long history of killer robot ownership in the United States, and millions of responsible killer robot owners enjoy hunting and other killer robot sports safely. After all, if we criminalize the ownership of killer robots, only criminals will have killer robots.

Donate generously to the NKRA.


oh, I’m sure we’re completely safe…


Where do the killer drones, embraced so lovingly by our last four presidents (at least), fall in the spectrum of killer robots?


If they’re not autonomous they’re not being addressed by this particular UN killer robot convention. Most of those killer drones are remotely operated by humans, but I suppose the U.S. wants to keep its options for cutting labour costs open.


“If I want my killer robot to buy guns then let him. It’s my second amendment right!”


Does anyone know how the proposal intends to parse out the difference between the deathbots we all imagine from science fiction and the (common, downright ubiquitous among people who can afford them) weapons systems that are fired by humans but have independent terminal guidance (potentially including re-targeting if the original target is lost)?


What do you mean by re-targeting?

I believe that existing systems that require a human to hit a button before firing mostly aren’t covered by this proposal, but if you’re talking about a system where someone commands a drone to drop a bomb on a specific car, then the car is lost so the computer says “oh well, might as well bomb this sketchy looking wedding party instead,” that probably would get banned.


I was thinking along the lines of more or less any fire-and-forget missile with not-terribly-sophisticated IR guidance (presumably radar and optical image-recognition guidance could also qualify, depending on how much detail the guidance system is given about the target prior to launch and how aggressively it is configured to risk false positives in the attempt to avoid false negatives), which, while not necessarily explicitly designed to do so, may pick up a different IR source to chase if the original target successfully evades, is destroyed, or pops a flare while other aircraft in the vicinity are producing a good IR signal.

It would also cover more explicitly semi-autonomous things like the CBU-97, where the operator specifies the target for the munition dispenser, but the 40 submunitions independently sweep the area they are dispersed over with IR and laser sensors, looking for armored vehicles. Each one either fires an explosively formed penetrator (while still descending, in the hopes of taking advantage of the typically weak top armor) if it detects something that looks right, or self-destructs shortly before hitting the ground if it does not detect anything.


Today’s Reith Lecture was on this subject.


I wonder if there are exemptions for already existing indiscriminate killer robots: nuclear missiles (once they’re launched, there is no recall, as that would be a weakness), land mines (Trump ordered their reintroduction after years of treaty-enforced bans; the US currently has about 3 million in inventory), and no doubt many other weapons systems hidden in black budgets and/or used by the many mercenaries hired by the government.

This topic was automatically closed after 5 days. New replies are no longer allowed.