
Author Topic: The Three Laws of Robotics bug  (Read 615 times)


Offline Nino (Topic starter)

  • Beginner
  • *
  • Posts: 1
  • Helpful? 0
The Three Laws of Robotics bug
« on: June 05, 2011, 01:24:19 PM »
The Three Laws of Robotics can only survive in a bug-free world. However, it is the nature of mankind to create bugs in machines, and it is the nature of bugs to ignore rules. Hence, the world as we know it today will cease to exist once robots understand the Three Laws of Robotics.

http://ninoransenberg.com/2011/06/01/the-three-laws-of-robotics/

Offline mstacho

  • Supreme Robot
  • *****
  • Posts: 375
  • Helpful? 10
Re: The Three Laws of Robotics bug
« Reply #1 on: June 06, 2011, 10:17:23 AM »
Asimov himself agreed that there were bugs in the three laws.  Consider this:

During a war, people hurt other people. As a result, any robot in a war zone should strive to protect people. And since soldiers are fighting each other, the robot should conceivably be protecting soldiers on both sides.

So far so good, but here's the catch: the First Law states not only that a robot must not harm a human being through its own actions, but also that it must not allow a human being to come to harm through its *inaction*. A sufficiently versatile robot, then, would stop being useful the moment it heard of a war going on, because it would have to act. Even if the robot is in North America and the war is taking place in Europe, if it knows it is capable of going to Europe, it would have to go. So unless mankind stopped wars completely before implementing the Three Laws, any robot that could move to a conflict would have to, thus becoming completely useless with regard to its original programming :-P
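The inaction clause can be sketched in code. This is just a toy model I made up to show the logic (the function and data names are invented, not from any real system): once the robot knows of any harm it could reach, "carry on with my original task" is no longer a permitted action.

```python
# Toy model of the First Law, including the inaction clause.
# All names here (first_law_permits, known_harms, etc.) are illustrative.

def first_law_permits(action, known_harms, can_reach):
    """An action is permitted only if it harms no human AND does not
    leave any reachable human harm unaddressed (the inaction clause)."""
    if action["harms_human"]:
        return False  # direct harm is forbidden outright
    # Inaction clause: ignoring a harm the robot could reach is forbidden.
    for harm in known_harms:
        if can_reach(harm) and harm not in action["addresses"]:
            return False
    return True

# The robot is in North America; the war is in Europe but reachable.
known_harms = ["war_in_europe"]
reachable = lambda harm: True  # a sufficiently versatile robot can travel

continue_task = {"harms_human": False, "addresses": []}
go_intervene  = {"harms_human": False, "addresses": ["war_in_europe"]}

print(first_law_permits(continue_task, known_harms, reachable))  # False
print(first_law_permits(go_intervene, known_harms, reachable))   # True
```

The only permitted action is to go to the war, which is exactly the "useless for its original job" outcome described above.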

I guess that's sort of a bug that's due to the features of the laws, though...

MIKE
Current project: tactile sensing systems for multifingered robot hands
