
Ethics, technology and you.

When I asked Siri what the three laws of robotics are, Apple’s intelligent assistant replied, “Something about obeying people and not hurting them. (Pauses) I would never hurt anyone.” While Siri is only joking about the laws (we hope), society is asking what laws should govern decision making for computers. In his 1942 short story “Runaround,” Isaac Asimov introduced his answer: the Three Laws of Robotics.
They are:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Sounds like a solid start, right? The problem is that humans are building machines designed specifically to harm and kill humans, so these laws don’t really work. Emerging technologies like self-driving cars raise many ethical questions of their own. If a self-driving car full of passengers must choose between driving off a cliff, killing everyone inside, and running over children, what should it do? Should the car decide based on who is riding in it?
With more and more decision making being surrendered to computers, what should the laws governing these computers and robots be?
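
To see why the question is hard, here is a minimal sketch, in Python, of one naive way a machine might encode the Three Laws: as a strict priority ordering over its possible actions. The function, fields, and scenario are hypothetical, invented purely for illustration. Notice that once every option violates the First Law, the decision quietly falls through to the lowest-priority rule:

```python
# Hypothetical sketch only: the names, fields, and scenario below are
# invented for illustration and are not from any real vehicle system.

def choose_action(candidates):
    """Return the candidate that violates the fewest high-priority laws.

    Each candidate is (name, harms_human, disobeys_order, harms_self).
    Python compares the key tuples left to right, so a First Law
    violation (harms_human) always outweighs the Second and Third Laws.
    """
    return min(candidates, key=lambda c: (c[1], c[2], c[3]))

# The dilemma from above: every available action harms humans.
options = [
    ("swerve_off_cliff", True, False, True),   # passengers die
    ("stay_on_course",   True, False, False),  # pedestrians die
]

# Both options violate the First Law, so the choice falls through to
# the Third Law (self-preservation) and the car stays on course.
# Hardly an ethical answer.
print(choose_action(options))  # ('stay_on_course', True, False, False)
```

The sketch picks an answer, but only because self-preservation happens to break the tie. That is exactly the gap: when every option harms someone, a priority list of rules has nothing ethical left to say.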
What do you think? Let us know.
