U.N. official calls for study of ethics, legality of unmanned weapons

pkpatriotic

U.N. official calls for study of ethics, legality of unmanned weapons
By Patrick Worsnip
Sunday, October 24, 2010; 12:05 AM


UNITED NATIONS - A United Nations investigator called on the world body Friday to set up a panel to study the ethics and legality of unmanned military weapons - an apparent reference to U.S. drones that have targeted suspected Islamist militants.

In a report to the U.N. General Assembly human rights committee, Christof Heyns said such systems raised "serious concerns that have been almost entirely unexamined by human rights or humanitarian actors."

"The international community urgently needs to address the legal, political, ethical and moral implications of the development of lethal robotic technologies," said Heyns, the U.N. special rapporteur on extrajudicial executions.

It was the second time this year that a U.N. official has brought up the issue. In June, Heyns's predecessor, Philip Alston, called for a halt to CIA-directed drone strikes on al-Qaeda and Taliban suspects in Afghanistan and Pakistan.

Alston said that killings ordered far from the battlefield could lead to a "PlayStation" mentality. The CIA contested his findings, saying - without confirming it carried out the strikes - that its operations "unfold within a framework of law and close government oversight."

Heyns, a South African law professor, said Friday that there was a need to discuss responsibility for civilian casualties, how to ensure that the use of robots complied with humanitarian law, and standards for developing the technology involved.

He added that the United Nations should take a lead on the issue, and he urged Secretary General Ban Ki-moon to convene a group of national representatives, human rights experts, philosophers, scientists and developers to promote a debate on the legal and moral implications of robotic weapons.

Among the issues it should study is "the fundamental question of whether lethal force should ever be permitted to be fully automated," he added.

- Reuters
 
Today's "robotic" weapons have NO autonomy. They operate (and the attack decision is made) under human supervision.

As for the "PlayStation" mentality, that concept was born when the first long-range artillery was invented, or the ICBM. People well away from the destruction decide to fire, and push a button or turn a key.

Fully autonomous weapons (true robots) with no human in the loop - I can't see them being given the authority to attack any time soon. It's too prone to error.
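To make the human-in-the-loop point concrete, here is a minimal sketch in Python (purely illustrative; the names and data fields are invented, not any real system). The targeting software can only nominate a track; there is no code path from "detected" to "fired" that bypasses the operator.

# Illustrative only: a human-in-the-loop release gate.
# The targeting software can propose, but never fire on its own.

from dataclasses import dataclass

@dataclass
class TrackReport:
    track_id: str
    classification: str   # e.g. "vehicle", "person", "unknown"
    confidence: float     # 0.0 - 1.0, as estimated by the targeting software

def operator_confirms(report: TrackReport) -> bool:
    """A human supervisor reviews the track and makes the attack decision."""
    answer = input(f"Engage track {report.track_id} "
                   f"({report.classification}, confidence {report.confidence:.2f})? [y/N] ")
    return answer.strip().lower() == "y"

def release_weapon(report: TrackReport) -> None:
    # Placeholder for the actual release command.
    print(f"Weapon released on track {report.track_id}")

def engagement_loop(reports):
    for report in reports:
        # The software only nominates; authority to attack stays with the human.
        if operator_confirms(report):
            release_weapon(report)
        else:
            print(f"Track {report.track_id}: held, no release")

Whatever the real implementations look like, the design principle is the same: the automation handles flight and sensing, but the release decision does not belong to it.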
 
^^
But that too can happen, provided there are frameworks of action for certain defined scenarios.
Artificial intelligence can be created, provided there are more capabilities for analyzing and identifying a true target. Perhaps later these drones could be so accurate that they would hit only a defined target and not civilians, but to get there one has to go a long way through R&D!
 
We are proud to be a major non-NATO ally, yet we are the ones hit by drone attacks.
Shameful for all of us.
 
^^
But that too can happen, provided there are frameworks of action for certain defined scenarios.
Artificial intelligence can be created, provided there are more capabilities for analyzing and identifying a true target. Perhaps later these drones could be so accurate that they would hit only a defined target and not civilians, but to get there one has to go a long way through R&D!

Remember the movie "2001: A Space Odyssey"? HAL 9000 was a computer with true AI. The movie was made in 1968, so it was predicting HAL 9000-level intelligence within 33 years. Researchers have been trying ever since.

Sure, you can make expert systems, and software can be pretty advanced, but I just cannot see your scenario (allowing software to authorize autonomous weapons release) happening without a tremendous breakthrough in the field of artificial intelligence.
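For a sense of what an "expert system" amounts to, here is a toy sketch in Python (my own illustration, not any fielded system): a handful of hand-written rules over a few features. It can narrow down candidates, and it has to abstain the moment the rules do not clearly apply - which is exactly why you would not hand it the release decision.

# Toy rule-based classifier in the "expert system" style.
# Hand-written rules over a few features; it abstains when no single rule fires cleanly.

RULES = [
    # (condition, verdict)
    (lambda f: f["emits_radar"] and f["speed_kmh"] > 400, "hostile aircraft"),
    (lambda f: f["near_road"] and f["length_m"] > 6 and f["armed"], "military vehicle"),
    (lambda f: f["near_dwelling"] and not f["armed"], "civilian"),
]

def classify(features: dict) -> str:
    verdicts = {verdict for condition, verdict in RULES if condition(features)}
    if len(verdicts) == 1:
        return verdicts.pop()
    # No rule fired, or the rules disagree: the system has no basis for a decision.
    return "unknown - refer to a human analyst"

# An armed pickup parked beside a house matches none of the rules cleanly.
print(classify({
    "emits_radar": False, "speed_kmh": 0, "near_road": True,
    "length_m": 5.5, "near_dwelling": True, "armed": True,
}))  # -> "unknown - refer to a human analyst"

Every real-world edge case needs another hand-written rule, and the rules say nothing about intent, context or proportionality. That is the gap between expert systems and the kind of AI that could be trusted with a weapons-release decision.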
 
