"Warfare is increasingly guided by software. Today, armed drones can be operated by remote pilots peering into video screens thousands of miles from the battlefield. But now, some scientists say, arms makers have crossed into troubling territory: They are developing weapons that rely on artificial intelligence, not human instruction, to decide what to target and whom to kill.
As these weapons become smarter and nimbler, critics fear they will become increasingly difficult for humans to control — or to defend against. And while pinpoint accuracy could save civilian lives, critics fear weapons without human oversight could make war more likely, as easy as flipping a switch."
Wednesday, November 12, 2014
Fearing Bombs That Can Pick Whom to Kill - NYTimes.com
For a bigger-picture drone perspective, see Book Review: Analyzing the Drone Debates: targeted killing, remote warfare, and military technology by James DeShaw Rae (LSE Review of Books)