The military will lead in robotic development 2010-Dec-29 at 16:27 PST

Posted by Scott Arbeit in Blog.

And that’s OK.

It makes sense. The U.S. military has the most to gain from using robots on the battlefield… every robot that does a job a human would otherwise do keeps a man or woman out of harm’s way.

From War Machines: Recruiting Robots for Combat, by John Markoff, 27-Nov-2010:

And while smart machines are already very much a part of modern warfare, the Army and its contractors are eager to add more. New robots — none of them particularly human-looking — are being designed to handle a broader range of tasks, from picking off snipers to serving as indefatigable night sentries.

Three backpack-clad technicians, standing out of the line of fire, operate the three robots with wireless video-game-style controllers. One swivels the video camera on the armed robot until it spots a sniper on a rooftop. The machine gun pirouettes, points and fires in two rapid bursts. Had the bullets been real, the target would have been destroyed.

“One of the great arguments for armed robots is they can fire second,” said Joseph W. Dyer, a former vice admiral and the chief operating officer of iRobot, which makes robots that clear explosives as well as the Roomba robot vacuum cleaner. When a robot looks around a battlefield, he said, the remote technician who is seeing through its eyes can take time to assess a scene without firing in haste at an innocent person.

There is a risk that sending robots rather than humans to do the fighting lowers the barrier to entry for war. But our main threat (and therefore our main area of military focus) over the next 40 years is going to be terrorists, not other nations, and most terrorists won’t be able to afford these robots.  Of course, they have suicide bombers, which enable some sophisticated kinds of attacks that aren’t otherwise possible.

Better they bomb a bunch of robots, for now.

One thing I wonder about… when the military has a significant number of these kinds of robots, and AI grows up, someone will mix that peanut butter and chocolate and we’ll have some military-specific robots that are capable of performing some sophisticated decision-making.  When that happens, will we still be so cavalier about sending them into battle, when we know that they’re exhibiting identifiable signs of intelligence?  And will we eventually have to establish a command structure in the military that includes AI-based commanders?

I’m guessing that, at some point in the next 40 years, we’ll have artificial intelligence capable of holding an officer’s rank, probably even General or Admiral.

P.S. Sorry I’ve been away… working on the book!  I’ll try to do both now… walk and chew gum.  I can do this.

In the SysAdmin world, a medal for NOT firing your weapon 2010-May-16 at 17:27 PDT


From Hold fire, earn a medal, by William H. McMichael, 12-May-2010:

U.S. troops in Afghanistan could soon be awarded a medal for not doing something, a precedent-setting award that would be given for “courageous restraint” for holding fire to save civilian lives.

The proposal is now circulating in the Kabul headquarters of the International Security Assistance Force, a command spokesman confirmed Tuesday.

“The idea is consistent with our approach,” explained Air Force Lt. Col. Tadd Sholtis. “Our young men and women display remarkable courage every day, including situations where they refrain from using lethal force, even at risk to themselves, in order to prevent possible harm to civilians. In some situations our forces face in Afghanistan, that restraint is an act of discipline and courage not much different than those seen in combat actions.”

In the SysAdmin force, which will be deeply embedded in civilian areas everywhere it deploys, military personnel face difficult decisions every day around the use of force.  Highlighting the importance of restraint in that decision-making process is a natural and simple evolution in how these troops are trained.  I look forward to seeing the first of these presented.

As for the objection raised in the article (“The enemy already hides among noncombatants, and targets them, too. The creation of such an award will only embolden their actions and put more American and noncombatant lives in jeopardy.”): our troops already face these decisions every day.  Nothing about the creation of this award makes their lives more difficult than they already are.

The military learns quickly from its own mistakes 2009-Dec-30 at 22:16 PST


From Army History Finds Early Missteps in Afghanistan, by James Dao, 30-Dec-2009:

“A Different Kind of War,” which covers the period from October 2001 until September 2005, represents the first installment of the Army’s official history of the conflict. Written by a team of seven historians at the Army’s Combat Studies Institute at Fort Leavenworth, Kan., and based on open source material, it is scheduled to be published by spring.

Though other histories, including “In the Graveyard of Empires” by Seth G. Jones and “Descent Into Chaos” by Ahmed Rashid, cover similar territory, the manuscript of “A Different Kind of War” offers new details and is notable for carrying the imprimatur of the Army itself, which will use the history to train a new generation of officers.

As always, the military is forced to adapt quickly to unexpected circumstances, and it’s good to see its own evaluation of what it did and what had to change to succeed.  I look forward to reading at least some of it.  My expectation for our efforts in Afghanistan in 2010 is that our men and women in the military will have a fair bit of success, mostly because we’ve already done it the wrong way, and now we know how to do it better.  This book is the proof.

We need this kind of deep and honest self-reflection as we adapt the military to the long-term “winning the peace” initiatives we’ll be sure to take on in the next 40 years or so.

The New York Times’ copy of this document is here.