U.S. Military Wants Drones that Detect ‘Adversarial Intent’
September 28th, 2011

Oh sure.
Via: Wired:
Perhaps the idea of spy drones already makes you nervous. Maybe you’re uncomfortable with the notion of an unblinking, robotic eye in the sky that can watch your every move. If so, you may want to click away now. Because if the Army has its way, drones won’t just be able to look at what you do. They’ll be able to recognize your face — and track you, based on how you look. If the military machines assemble enough information, they might just be able to peer into your heart.
…
The Pentagon isn’t content to simply watch the enemies it knows it has, however. The Army also wants to identify potentially hostile behavior and intent, in order to uncover clandestine foes.
Charles River Analytics is using its Army cash to build a so-called “Adversary Behavior Acquisition, Collection, Understanding, and Summarization (ABACUS)” tool. The system would integrate data from informants’ tips, drone footage, and captured phone calls. Then it would apply “a human behavior modeling and simulation engine” that would spit out “intent-based threat assessments of individuals and groups.” In other words: This software could potentially find out which people are most likely to harbor ill will toward the U.S. military or its objectives. Feeling nervous yet?
“The enemy goes to great lengths to hide his activities,” explains Modus Operandi, Inc., which won an Army contract to assemble “probabilistic algorithms th[at] determine the likelihood of adversarial intent.” The company calls its system “Clear Heart.” As in, the contents of your heart are now open for the Pentagon to see. It may be the most unnerving detail in this whole unnerving story.
“It was terribly dangerous to let your thoughts wander when you were in any public place or within range of a telescreen. The smallest thing could give you away. A nervous tic, an unconscious look of anxiety, a habit of muttering to yourself–anything that carried with it the suggestion of abnormality, of having something to hide. In any case, to wear an improper expression on your face… was itself a punishable offense. There was even a word for it in Newspeak: facecrime…”
– George Orwell, 1984
I believe that under the current DOD ROEs, autonomous drones are permitted to fire on enemy personnel without human intervention as long as the drone says “He’s coming right for us!” first.
I don’t mean to go all crazy, but wasn’t this the exact problem with Skynet? It detected a potential threat and acted to remove it with a preemptive strike.
The really frightening thing is that this is an entirely logical approach. Of course you want automated systems. Of course you want to preempt your enemies. Of course you want to detect hidden threats.
Each step is sensible and justified… and one step closer to the edge.