
Mental Stress And Drone Warfare

bibliophile20
Mental Stress And Drone Warfare
There was a very interesting (and, in some of its implications, highly disturbing) article in GQ last week regarding drone warfare and the stress on the pilots when they're sitting thousands of miles away, essentially executing people with missile strikes at no risk to themselves. [url=http://www.gq.com/news-politics/big-issues/201311/drone-uav-pilot-assass... Link[/url]. One point that I personally found very disturbing went as follows:
Quote:
In 2011, Air Force psychologists completed a mental-health survey of 600 combat drone operators. Forty-two percent of drone crews reported moderate to high stress, and 20 percent reported emotional exhaustion or burnout. The study’s authors attributed their dire results, in part, to “existential conflict.” A later study found that drone operators suffered from the same levels of depression, anxiety, PTSD, alcohol abuse, and suicidal ideation as traditional combat aircrews. These effects appeared to spike at the exact time of Bryant’s deployment, during the surge in Iraq. (Chillingly, to mitigate these effects, researchers have proposed creating a Siri-like user interface, a virtual copilot that anthropomorphizes the drone and lets crews shunt off the blame for whatever happens. Siri, have those people killed.)
(Emphasis added) So I was discussing this point with one of my EP players, and we got to musing on responsibility, blame, and stress if and when you have semi- and fully-autonomous artificially intelligent drones out there with independent decision-making capabilities. Where do blame and responsibility land when the machine is capable of making its own decisions about whom to kill? With the programmer? With the superior officer who gave it the orders? With the drone itself? Thoughts?

"Democracy is two wolves and a lamb voting on what to have for lunch. Liberty is a well-armed lamb contesting the vote." -Benjamin Franklin

Smokeskin
Are we talking morally, or psychologically?
bibliophile20
Smokeskin wrote:
Are we talking morally, or psychologically?
Either. Both. :)

"Democracy is two wolves and a lamb voting on what to have for lunch. Liberty is a well-armed lamb contesting the vote." -Benjamin Franklin

Arenamontanus
Hehehe... just the kind of question I am working with.

In the case of psychological blame, it is likely sticky: if you feel you deserve blame, you will feel it, no matter how many steps lie between your action and the eventual result. It can be weakened through various tricks (one blank shot per execution, training to dehumanize the enemy), but the Siri trick is not going to work much unless the system starts to feel like an autonomous agent. At which point you have a very dangerous drone.

The machine responsibility issue is fairly straightforward if the machine is a full moral agent like an EP AGI: it is morally responsible for its actions, since it can consider what it is doing and decide to do something else. An AGI can decide that an order was not lawful and say no. But what if it is a dumb AI with no real choice in what it is doing, yet with flexible autonomous action patterns? Here there is no moral agency, so responsibility must lie elsewhere.

I am arguing in a paper that systems like these are moral proxies: they act on behalf of others, and those people are responsible for letting loose the proxy and its attempts to approximate their intentions. There might be some shared responsibility between the programmer trying to implement whatever rules of engagement there are, the officer deciding certain drones are "safe enough", and the drone operator giving commands. In many ways it is like having a dangerous animal - the animal is acting on its own, but you are responsible for letting it off the leash.

http://blog.practicalethics.ox.ac.uk/2013/06/cry-havoc-and-let-slip-the-...
http://blog.practicalethics.ox.ac.uk/2013/08/would-you-hand-over-a-moral...
Extropian