Wizards of Oz

"Life is fraughtless ... when you're thoughtless."

29.7.07

Future Warfare?

Mountain Runner has provocatively asked how robots fit into 21st century warfare, and what impact they have on perception management, counterinsurgency, and reconstruction.

As a science advisor for the Department of the Navy (in my previous career), I saw unmanned systems offer a compelling promise of "security without risk". After all, the greatest limitation in modern systems engineering is making a platform hospitable for humans. Remove the human from the fighter aircraft, watch the envelope predicted by John Boyd's Energy-Maneuverability Theory grow quadratically -- and get a platform that can sustain 25-g turns while evading even the most advanced surface-to-air missiles.

Today MQ-9 REAPER unmanned aerial vehicles, armed with HELLFIRE missiles, roam the skies over Iraq and Afghanistan. Their pilots sit comfortably at an Air Force base in the U.S. homeland. Minimal risk, minimal U.S. casualties, and all is well, right?

Except that any power that chooses to trade its hardware for the adversary's lives is no longer conducting a "Just War". In particular, the notion of "proportionality" in conducting a just war is defeated -- and, more worrisome, the insurgency is incentivized to grow.

And what of advances in artificial intelligence, or A.I.? What if we develop sensor grids that can pass the Turing Test and demonstrate the capacity for independent thought and action? (The U.S. Navy already does this to a lesser degree aboard its AEGIS cruisers and destroyers: the SPY radar system has rigid "rule sets" to detect and engage threats, like anti-ship cruise missiles.)
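To make the "rigid rule set" idea concrete, here is a toy sketch of rule-based threat classification. Every field, threshold, and rule below is invented for illustration; this bears no relation to the actual AEGIS/SPY doctrine, only to the general shape of it: human-authored rules applied in fixed priority order, with no learning or judgment.

```python
from dataclasses import dataclass

@dataclass
class Track:
    speed_mps: float      # observed speed, meters/second (hypothetical field)
    range_km: float       # distance from ship (hypothetical field)
    closing: bool         # heading toward the ship?
    iff_friendly: bool    # answered the identification (IFF) query?

def classify(track: Track) -> str:
    """Apply rigid, human-authored rules in strict priority order."""
    if track.iff_friendly:
        return "ignore"
    if track.closing and track.speed_mps > 250 and track.range_km < 40:
        return "engage"   # profile matches a fast inbound threat
    if track.closing and track.range_km < 100:
        return "warn"     # challenge before committing
    return "monitor"

# A fast, close, inbound, unidentified track trips the engage rule:
print(classify(Track(speed_mps=300, range_km=25, closing=True,
                     iff_friendly=False)))  # → engage
```

The point of the sketch is what it *lacks*: the system never weighs context or consequences. Whatever matches the rule fires, which is exactly why delegating engagement decisions to such rule sets raises the moral questions discussed above.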

The technology is emerging to allow the U.S. to project power without endangering its citizen soldiers -- akin to Rome's outsourcing of risk and security in the latter days of Empire. Mountain Runner's question is provocative because it identifies the core issue: not technology, but rather the perception of that technology and its moral implications.


15 Comments:

At 29/7/07 16:51 , Blogger Dan tdaxp said...

Except that any power that chooses to trade its hardware for the adversary's lives is no longer conducting a "Just War". In particular, the notion of "proportionality" in conducting a just war is defeated -- and, more worrisome, the insurgency is incentivized to grow.

Are you sure about this?

Hardware is merely a form of technology -- and training, doctrine, and cohesion are other technologies. So saying that superior hardware makes a war unjust would sound like saying that superior doctrine makes a war unjust.

 
At 29/7/07 19:49 , Blogger deichmans said...

Dan,

Consider how the adversary -- or, more importantly, the undecided citizen in the combat zone -- sees the situation. Which force seems more courageous: A force that chooses to delegate security operations to robotic systems, or a less-technical force that is in close personal contact with the battlespace?

This is more than a mere matter of doctrine. The moral dimension outweighs the technical and the doctrinal, and war is ultimately fought on a moral level. (Note that I am still pondering your latest postings in the D5GW series that run counter to that notion.)

For an adversary who possesses cultural propensities that are tribal in nature, removing yourself from the battlefield and sending in your hardware in its place risks showing a fundamental lack of courage.

For the sake of "Justness", making war 'bloodless' for one side only is more consistent with Militarism and Prescriptive Realism (Realpolitik) than with the principles of Just War.

 
At 30/7/07 08:51 , Blogger Dan tdaxp said...

Shane,

A force that chooses to delegate security operations to robotic systems, or a less-technical force that is in close personal contact with the battlespace?

Answer: a force that applies an economically correct mix of capital and land to the production of war. The mix depends on both the local environment and the host economy.

This is more than a mere matter of doctrine. The moral dimension outweighs the technical and the doctrinal, and war is ultimately fought on a moral level. (Note that I am still pondering your latest postings in the D5GW series that run counter to that notion.)

Those who wish to define the Long War primarily as a "moral conflict" should prepare themselves for Qaedist chic, in exactly the same way that our prior Long Wars produced Communist chic and Dixie chic. (What's the difference between this and this? Just which system of tyranny is fashionable.)

For an adversary who possesses cultural propensities that are tribal in nature, removing yourself from the battlefield and sending in your hardware in its place risks showing a fundamental lack of courage.

Tribal societies are based on long-term power and financial relationships. Courage in this context is a means, not an end. The Sunni Arab tribes that suddenly embrace us and oppose al Qaeda are neither courageous nor cowardly: they are being wise.

For the sake of "Justness", making war 'bloodless' for one side only is more consistent with Militarism and Prescriptive Realism (Realpolitik) than with the principles of Just War.

The best war, of course, is bloodless (or nearly so) for both sides. But do not confuse the battlefield with the war.

 
At 30/7/07 11:18 , Anonymous Anonymous said...

As Hassan al-Turabi said, you always need "your blood in the battlefield".
Warfare 2050


http://warfare2050.blogspot.com/2007/07/semper-fi-wizard-of-oz.html

 
At 31/7/07 12:34 , Blogger MNFamilyHistorian said...

Proportionality has nothing to do with who (or what) is doing the fighting. It's solely based on the effects of attacking a particular target.

 
At 1/8/07 12:56 , Blogger deichmans said...

Nathanm,

Proportionality also has an implicit component of parity with respect to risk. Zeroizing your own risk while maximizing your opponent's is incompatible with _Jus in Bello_.

 
At 1/8/07 17:29 , Blogger Jay@Soob said...

One should also consider the ease of waging war with intelligent non-human weaponry and the desensitizing effect it may well have.

Personally, I find much of the "Just War" philosophy ridiculous. That aside, the easier we make committing war, the more likely we are to use it instead of more intelligent and less drastic measures. Furthermore, such security has a tendency to breed complacency and an even dimmer vision of the future (if there can be a vision of the future).

We possessed all the stealth technology on the planet and still, 9/11.

 
At 2/8/07 05:55 , Blogger Dan tdaxp said...

Proportionality also has an implicit component of parity with respect to risk. Zeroizing your own risk while maximizing your opponent's is incompatible with _Jus in Bello_.

Why?

 
At 2/8/07 06:14 , Blogger deichmans said...

Soob,

Good points, esp. the notion of "desensitizing" people to conflict. (Aside: Do you remember the original Star Trek series' episode "A Taste of Armageddon"? In order to preserve their cities and infrastructure, battles were fought by computers -- and people whom the computer declared "dead" had to walk into disintegration chambers.)

As for "Just War" philosophy and its descendants, like the Geneva Conventions: they may be ridiculous, but they form a rule set of the Core that I believe is essential.


Dan,

Proportionality has to be a two-way street. While autonomy may seem entirely "proportional" to one side, the recipient will see instead that their lives are being equated to hardware. Win the battle, but lose the war of perception.

 
At 2/8/07 08:11 , Blogger D Blair said...

This comment has been removed by the author.

 
At 2/8/07 08:19 , Blogger D Blair said...

Hello, I found this blog from a posting on Thomas P.M. Barnett's weblog & have enjoyed reading the post. I came across this news bit about a robot carrier deck:

Robot Carrier

I guess it was only a matter of time before they started making them.

D. Blair

 
At 4/8/07 06:20 , Blogger Dan tdaxp said...

If I can rephrase: why does managing risk invalidate Jus in Bello? Is there a critical region of acceptable risk (say, .3 to .7) outside of which a war is fought unjustly? How is this morally derived?

 
At 5/8/07 14:50 , Blogger deichmans said...

David, Thx for the "Robot Carrier" link -- I suppose it was inevitable (though I suspect Maverick would disagree... :-).

Dan, I submit that there is a big difference between "risk management" and "risk avoidance". Managing risk is moral; completely avoiding risk while denying your adversary the same is immoral.

 
At 7/8/07 09:41 , Anonymous Anonymous said...

Shane, you may be interested in a survey I just posted on robots. I'd appreciate your thoughts. Click here for a quick survey on robots. It is by no means comprehensive, but it should be informative as it queries a discrete area of their use.

 
At 8/8/07 18:04 , Anonymous Anonymous said...

Shane,
It's been years since I've seen 'Turing Test'... Lovely. There are strong-AI people who believe the average thermostat has exactly three 'thoughts': too cold, just right, and too hot. I commented over at MountainRunner too. But I completely agree with your take on risk management vs. avoidance, as well as your point on 'our side' being too removed or 'insulated' in the minds of the people in-theater. While certainly useful tools, UAVs, for example, run counter to the type of engagement that is most beneficial strategically -- they're necessarily divisive on all levels.
Isaac
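The strong-AI quip above (often attributed to John McCarthy) is that a thermostat's entire "mental life" is three states. As a purely illustrative sketch, the whole thing fits in one hypothetical function:

```python
def thermostat_thought(temp: float, setpoint: float, band: float = 1.0) -> str:
    """The thermostat's complete repertoire of 'thoughts':
    exactly three, per the strong-AI claim."""
    if temp < setpoint - band:
        return "too cold"
    if temp > setpoint + band:
        return "too hot"
    return "just right"

print(thermostat_thought(15.0, 20.0))  # → too cold
```

Which is the point of the joke: if this counts as thought, "passing the Turing Test" and "having rule sets" are very different bars.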

 
