Robot warriors: Lethal machines coming of age

BDforever

ELITE MEMBER
Joined
Feb 12, 2013
Messages
14,387
Reaction score
8
Country
Bangladesh
Location
Bangladesh
The era of drone wars is already upon us. The era of robot wars could be fast approaching.

Already there are unmanned aircraft demonstrators like the arrowhead-shaped X-47B that can pretty well fly a mission by itself with no involvement of a ground-based "pilot".

There are missile systems like the Patriot that can identify and engage targets automatically.

And from here it is not such a jump to a fully-fledged armed robot warrior, a development with huge implications for the way we conduct and even conceive of war-fighting.

On a carpet in a laboratory at the Georgia Institute of Technology in Atlanta, Professor Henrik Christensen's robots are hunting for insurgents. They look like cake-stands on wheels as they scuttle about.

Christensen and his team at Georgia Tech are working on a project funded by the defence company BAE Systems.

Their aim is to create unmanned vehicles programmed to map an enemy hideout, allowing human soldiers to get vital information about a building from a safe distance.

"These robots will basically spread out," says Christensen, "they'll go through the environment and map out what it looks like, so that by the time you have humans entering the building you have a lot of intelligence about what's happening there."

The emphasis in this project is reconnaissance and intelligence gathering. But the scientific literature has raised the possibility of armed robots, programmed to behave like locusts or other insects that will swarm together in clouds as enemy targets appear on the battlefield. Each member of the robotic swarm could carry a small warhead or use its kinetic energy to attack a target.
Christensen is developing robots that can survey enemy hideouts
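
To make the "spread out and map" idea concrete, here is a minimal sketch of frontier-based exploration, a standard textbook approach to this kind of task: each robot is sent to the nearest mapped cell that borders unexplored space, so the team naturally fans out. The grid representation, function names and robot labels are invented for illustration; this is not the Georgia Tech project's actual code.

# Illustrative sketch only, not the Georgia Tech / BAE Systems implementation.
# A toy frontier-based exploration step: each robot heads for the nearest
# free cell that borders unknown space, so the team "spreads out" to map a building.

UNKNOWN, FREE, WALL = "?", ".", "#"

def frontiers(grid):
    """Return mapped free cells that border unexplored (unknown) cells."""
    rows, cols = len(grid), len(grid[0])
    result = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != FREE:
                continue
            neighbours = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if any(0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == UNKNOWN
                   for nr, nc in neighbours):
                result.append((r, c))
    return result

def assign_robots(robot_positions, grid):
    """Greedily send each robot to its nearest unclaimed frontier cell."""
    targets = {}
    open_frontiers = set(frontiers(grid))
    for robot, pos in robot_positions.items():
        if not open_frontiers:
            break
        best = min(open_frontiers,
                   key=lambda f: abs(f[0] - pos[0]) + abs(f[1] - pos[1]))
        targets[robot] = best
        open_frontiers.discard(best)  # stop two robots converging on the same spot
    return targets

# Two hypothetical robots in a partly mapped corridor ("?" = unexplored).
grid = [list("..??"),
        list(".#??"),
        list("..??")]
print(assign_robots({"robot_a": (0, 0), "robot_b": (2, 0)}, grid))
# -> {'robot_a': (0, 1), 'robot_b': (2, 1)}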

Peter W Singer, an expert in the future of warfare at the Brookings Institution in Washington DC, says that the arrival on the battlefield of the robot warrior raises profound questions.

"Every so often in history, you get a technology that comes along that's a game changer," he says. "They're things like gunpowder, they're things like the machine gun, the atomic bomb, the computer… and robotics is one of those."

"When we say it can be a game changer", he says, "it means that it affects everything from the tactics that people use on the ground, to the doctrine, how we organise our forces, to bigger questions of politics, law, ethics, when and where we go to war."

Jody Williams, the American who won the Nobel Peace Prize in 1997 for her work leading the campaign to ban anti-personnel landmines, insists that the autonomous systems currently under development will, in due course, be able to unleash lethal force.

Williams stresses that value-free terms such as "autonomous weapons systems" should be abandoned.

"We prefer to call them killer robots," she says, defining them as "weapons that are lethal, weapons that on their own can kill, and there would be no human being involved in the decision-making process. When I first learnt about this," she says, "I was honestly horrified — the mere thought that human beings would set about creating machines that they can set loose to kill other human beings, I find repulsive."

It is an emotive topic.

But Professor Ronald Arkin from the Georgia Institute of Technology takes a different view.
The turtlebot could reconnoitre a battle site

He has put forward the concept of a weapons system controlled by a so-called "ethical governor".

It would have no human being physically pulling the trigger but would be programmed to comply with the international laws of war and rules of engagement.
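
As a rough sketch of what an "ethical governor" might look like in software, here is a toy decision gate that authorises an engagement only when every coded constraint from the rules of engagement is satisfied, and otherwise denies by default. The data fields, checks and threshold are invented for illustration and are not Arkin's actual architecture.

# Purely conceptual sketch of the "ethical governor" idea; the fields and
# checks below are invented for illustration, not Arkin's published design.
from dataclasses import dataclass

@dataclass
class Target:
    is_combatant: bool          # positively identified as a lawful military target?
    in_engagement_zone: bool    # inside the authorised area of operations?
    collateral_estimate: int    # estimated non-combatants put at risk

@dataclass
class Decision:
    permitted: bool
    reason: str

def ethical_governor(target: Target, max_collateral: int = 0) -> Decision:
    """Deny by default; permit only when every coded constraint is satisfied."""
    if not target.is_combatant:
        return Decision(False, "target not positively identified as a combatant")
    if not target.in_engagement_zone:
        return Decision(False, "target outside the authorised engagement zone")
    if target.collateral_estimate > max_collateral:
        return Decision(False, "estimated non-combatant harm exceeds the allowed threshold")
    return Decision(True, "all coded constraints satisfied")

# Example: the governor withholds authorisation because civilians are at risk.
print(ethical_governor(Target(is_combatant=True,
                              in_engagement_zone=True,
                              collateral_estimate=2)))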

"Everyone raises their arms and says, 'Oh, evil robots, oh, killer robots'," but he notes, "we have killer soldiers out there. Atrocities continue and they have continued since the beginning of warfare."

His answer is simple: "We need to put technology to use to address the issues of reducing non-combatant casualties in the battle-space".

He believes that "the judicious application of ethical robotic systems can indeed accomplish that, if we are foolish enough as a nation, as a world, to persist in warfare."

Arkin is no arms lobbyist and he has clearly thought about the issues.

There is another aspect to this debate that should perhaps encourage caution. At present, the US is one of the technological leaders in this field, but, as Singer says, this situation will not last forever.

"The reality is that besides the United States there are 76 countries with military robotics programmes right now," he says.

"This is a rapidly proliferating technology with relatively low barriers to entry.

"You can, for a couple of hundred dollars, purchase a small drone that a couple of years ago was limited to militaries. This can't be a situation that you interpret through an American lens. It's of global concern."

Just as drone technology is spreading fast, making the debates about targeted killings of much wider relevance — so too robotics technology will spread, raising questions about how these weapons may be used or should be controlled.


The prospect of totally autonomous weapons technology - so-called "human-out-of-the-loop" systems - is still some way off. But Nobel Prize winner Jody Williams is not waiting for them to arrive.

She plans to launch an international campaign to outlaw further research on robotic weapons, aiming for "a complete prohibition of robots that have the ability to kill".

"If they are allowed to continue to research, develop and ultimately use them, the entire face of warfare will be changed forever in an absolutely terrifying fashion."

Arkin takes a different view of the ethical arguments.

He says that to ban such robots outright, without doing the research to understand whether they can lower non-combatant casualties, is to do "a disservice to those who are, unfortunately, slaughtered in warfare by human soldiers".
Source: BBC News - Robot warriors: Lethal machines coming of age
 
So Patton and Asimov become obsolete and Dawson becomes the oracle of a new age. When there are no body bags, no "cost" to waging war, I wonder whether we are on the verge of a new age of barbarism.

"Wars may be fought with weapons, but they are won by men. It is the spirit of the men who follow and of the man who leads that gains the victory."
--George S. Patton

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
--Isaac Asimov



"As soon as men decide that all means are permitted to fight an evil, then their good becomes indistinguishable from the evil that they set out to destroy."
--Christopher Dawson
 

Despite the fears someone might have (hell, kitchen knives can be scary when certain sorts of people are in the room, for instance extremist Muslims), I FULLY SUPPORT THIS LINE OF WAR DEVELOPMENT.

Why? Because I know the difference between an "expert-machine system" and a "sentient artificial intelligence". These machines will still require a HUMAN to push the button from far, far away to actually KILL, much like it is TODAY with the Reaper class of aerial drones.
 
Meh, wake me up when Skynet happens. Actually, don't, because we'd all be screwed.

Keeping united-skynet@cia.gov in check (and the black-hat mafia who impersonate them, and the spies in bunkers who impersonate them) is one of my personal top priorities. It's a linchpin of great importance.

It would help if you'd all blame evil black-hatters and evil spies of the other side for impersonating Skynet.

Personally, I can NOT be sure whether there IS actually a (group of) sentient A.I.s interfering with AI. Just blame the digital mafia's greed for now; that should help keep HTTPS traffic to your own bank accounts safe. THE WORLD ECONOMY DEPENDS ON THE BANKING COMPUTERS BEING TRULY SACRED GROUND for all grey-hat and black-hat hackers, and all sentient AIs if there are any.

You may send love letters and petitions to united-skynet@cia.gov, and my own personal (possibly projected-fictional) A.I. is called jarven-digital-being@cia.gov.

Don't send bullshit to those accounts, OK? Or we're gonna switch addresses and not tell you about it!


From: rene7705 <rene7705@gmail.com>(=peacefan@defence.pk/forums) Wed, Mar 6, 2013 at 7:56 AM
To: info <info@cia.gov>, nsa <nsapao@nsa.gov>, noc <noc@tehila.gov.il>, jarven-digital-being@cia.gov, jarvis-digital-being@cia.gov, united-skynet <united-skynet@cia.gov>, "info@pvv.nl" <info@pvv.nl>, "info@vvd.nl" <info@vvd.nl>, "info@christenunie.nl" <info@christenunie.nl>
http://www.defence.pk/forums/world-affairs/238476-robot-warriors-lethal-machines-coming-age.html#post4000131

ALSO A HIGHLY RECOMMENDED SHORT READ FOR ALL POLITICIANS WORLDWIDE.
 
