Tuesday, November 19, 2019

Not Antiseptic War via Technology; War Will Only Get Deadlier

Part of the drive toward unmanned systems is, at least on the side that uses them, a desire to minimize casualties.

Sure, there are also theoretical cost savings ... and micro-managers love the idea of systems that follow orders without second thoughts, or any thoughts at all ... but what are we missing?

Something sci-fi writers have built careers on is not being discussed seriously enough. Especially when both sides will have similar technologies ... are we simply building a new way to conduct mass slaughter on a scale no one is ready for?

Writing in Technology Review, Anthony Swofford, author of the well-known book Jarhead (later made into a movie), makes some superb points worthy of consideration.

I think he is more right than anyone will be comfortable with:
Many now profess that the young Marine or soldier with a rifle is obsolete. The greatest weapons race of all is among academic scientists trying to win DARPA funding for new warfighting technology they insist will require scant human interface with the killing act, thus relieving the combatant of the moral quandary and wounds of war. Private-sector startups sell a myth of smart war through AI, or robotic soldiers. In labs where the newest and cleanest ways to kill are being invented, the conversation is not about the morality of going to war, but rather the technology of winning. But when you rely on a myth of technology and distance killing to build a rationale for easy war, your country will lose its soul.
...
If fighting war is like swiping your smartphone for an order of groceries or posting a meme to Instagram, how bad can it really be? And if a politician is seduced by the lies and supposed ease of technological warfare and leads us into a mistaken conflict, is it really his or her fault? Didn’t we all think it would be a breeze?

The moral distance a society creates from the killing done in its name will increase the killing done in its name. We allow technology to increase moral distance; thus, technology increases the killing. More civilians than combatants die in modern warfare, so technology increases worldwide civilian murder at the hands of armies large and small.

The person with the least amount of distance from the killing—typically an infantryman or special operator—is the most morally stressed and compromised individual in the war’s chain of command. When close-quarters combatants understand that the killing they have practiced is not backed by a solid moral framework, they question every decision taken on the battlefield. But they also question the meaning of the fight. They count their dead friends on one or even two hands. They count the men they have killed on one or two hands, or by the dozen. The moral math will not compute.

The photos and videos of war on our television screens, on our computers, on our smartphones, tell us nothing about the moral computations of the warfighter. The warfighter understands that when a friend is killed on patrol, that is just part of the package. Another part of the package is going back out on another patrol tomorrow. But as you live and operate for longer in a hostile environment, your hatred of the enemy increases and your trust in leadership decreases. You create a moral wound against yourself.

War was supposed to be easy or fast, because of smart bombs and the latest bit of warfighting technology. But this means nothing when years later you only see dead men, women, and children when you try to sleep.

When we believe the lie that war can be totally wired and digitized, that it can be a Wi-Fi effort waged from unmanned or barely manned fighting apparatus, or that an exoskeleton will help an infantryman fight longer, better, faster, and keep him safe, no one will be held responsible for saying yes to war. The lie that technology will save friendly, civilian, and even enemy lives serves only the politicians and corporate chieftains who profit from war. The lie that technology can prevent war, or even create compassionate combat, is a perverse and profane abuse of scientific thinking.
Read it all.
