An argument against using weapons of war

Autonomous weapons are extraordinarily complicated in design, perhaps even more so than current weapons. How will such weapons be able to hold their fire or take prisoners?


What issues complicate the development of autonomous weapons? One of the fundamental ethical problems associated with computer-aided autonomous weapons is software failure.

In addition to making more decisions independently of human input, current autonomous weapons have grown increasingly complex and may be more difficult to employ than conventional human-guided weapons.

The more complicated weapons systems become, the greater the risk of failure from programming bugs or judgment errors. Such failures could lead to critical mistakes in the midst of battle, such as mistaken identification or misinterpreted data, and the consequences of such errors are drastic.

Less than a decade ago, the Navy's highly sophisticated Aegis radar system, designed to determine the friend-or-foe and military-or-civilian status of nearby aircraft, led the USS Vincennes to shoot down a civilian Iranian Airbus, killing all 290 passengers on board.

The incident served to prove that, despite the increased identification capabilities of a system such as Aegis, more advanced weapons technology may still lead to error-prone judgments.

In addition to the issue of software failure, the complex design of autonomous weapons technology poses the even more practical concern of whether it is possible to build such advanced systems at all. Current technology is "smart" at best, but not truly autonomous. Laser-guided bombs and cruise missiles may be able to find their own way to a predetermined target, but whether they will ever be able to decide selectively when and where to fire is another question.

Despite growing optimism about applying artificial intelligence to high-tech weapons systems, history gives reason to doubt these efforts. In the 1980s, most defense technologists greeted Ronald Reagan's proposed Strategic Defense Initiative, or "Star Wars" program, with similar wide-eyed hope, only to find after repeated investigations and reports that the program could not be designed and implemented because of its sheer complexity.

Also problematic was the issue of testing such a defense system: nothing short of actual nuclear war could adequately determine the effectiveness of the technology.

Difficulties of this kind also surround the creation of autonomous weapons for the battlefield. Even if the technology were possible, how could designers find a proven means of testing its accuracy and effectiveness? Another practical question concerns the cost of building such systems. As of now, the military is not convinced:

"And because of the expense, the uniformed services cannot try them out enough to build up confidence in them" (DeMeis). An even larger ethical argument against the use of autonomous weapons is the moral implication behind the very idea of a man-made sentient killing machine. This theme provides the all-too-familiar backdrop to the film Terminator and its sequel Terminator 2, in which futuristic human-designed robots are sent out to kill living targets.

Although this kind of technology remains fictional, greater decision-making ability on the part of the weapon itself only creates a greater potential for machines to possess the capacity to murder.

In the words of one author, "These weapons will be the first killing machines that are actually predatory, that are designed to hunt human beings and destroy them" (Warner). Even in its current development of the Tomahawk missile, the Navy still hesitates to give its weapon system the independent ability to identify, decide, and fire. This hesitation shows that the moral problem of equipping a weapon with the power to make aggressive decisions on its own is too troubling even for the military itself.

According to Gerald O. Miller, Technical Director of the Navy Cruise Missile Program Office, war-related judgments on whether or not to attack are still best kept in human hands. Although computer-guided weapons may be able to identify and fire at the right targets in the right locations, will they also possess the consciousness to recognize signs of surrender?

Given the numerous possible responses an opposing military may communicate to an attacking force, the "fire-and-forget" philosophy fails to account for surrender conditions under which a human operator would abort the attack. No reasonable amount of technical design, it seems, could equip autonomous weapons with the ability to discern enemy surrender and subsequently hold fire.

In the words of one writer, "Computers, of course, do as people tell them. The hard part is for people to foresee all circumstances and write instructions to handle all circumstances optimally" (Lemmons).

Furthermore, with weapons designed to select and fire, the option of taking captives alive is precluded.