Isaac Asimov's Rules of Robotics.

(And how they apply to Megaman!)


Waaaaaay back in the 1950's, when science fiction movies were really starting to take off, a favorite subject of sci-fi authors, be they novelists or screenwriters, was the possibility of robots going berserk and beating up on people. The general populace, not being terribly sophisticated with regard to technology (they were mesmerized by hula hoops!), began to fear that one day, people would build robots that would go berserk, or go "Maverick," if you will, and kill all of humanity.

Enter Isaac Asimov, famed science fiction writer. He introduced guidelines to sci-fi writers in an effort to stop them from scaring people. Soon, engineers and scientists in real life took these guidelines to heart, and cling to them even now as the race continues to devise some viable form of artificial intelligence.

These Rules of Robotics are very relevant to this page, as it is dedicated to the Megaman series - a series based on robots, some of whom are capable of thinking and feeling for themselves. Who knows? There may be a budding scientist out there who shares Dr. Light's dream of robots and humans living and working side-by-side.

In cases where a robot's situation brings two or more Rules into conflict, the earlier Rule always overrides the later ones. Rule #1 is supreme above Rules #2 and #3. Rule #2 beats #3, but not #1. Rule #3 loses to both Rules #1 and #2.


THE RULES

RULE #1 - A robot must never harm a human being.

Seems simple enough. Anyone who's played the Megaman X games or has seen The Terminator knows why this is important. Can't have robots turning on their creators, can we? In the original Megaman games, we see that Dr. Wily's Robot Masters obey this law (though not many other man-made laws!): they wreak havoc, but do not set out to kill people. Sigma and the Mavericks in the X games, however, have broken all the Rules. Their primary mission is to destroy humanity.

RULE #2 - A robot must do what a human tells it to do, unless this conflicts with Rule #1.

Again, pretty simple. Robots must do what humans tell them. But if what a human says conflicts with Rule #1 ("Go hurt that guy!"), then the command is overridden. Dr. Wily's bots again stay in compliance, as his Robot Masters do exactly what he says. The Mavericks, on the other hand, don't take too kindly to people bossing them around.

RULE #3 - A robot must preserve itself, unless this conflicts with Rule #1 or Rule #2.

A principle basic to any intelligence, real or artificial: survive. But Rule #3 is overridden by the first two. If defending itself would mean harming a human, then a robot cannot defend itself. If a human tells a robot something like "Let yourself get beaten," then the robot must obey rather than defend itself. Again, we see the Robot Masters in agreement, as they fight Megaman for their survival as well as for his defeat. The Mavericks try to survive, but have no qualms about killing humans. They preserve themselves, but disregard the conflict corollary.
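The precedence among the three Rules works like a simple priority check: a proposed action is tested against the Rules in order, and the first Rule it violates wins. Here is a minimal Python sketch of that idea (the decide() helper and its flags are made up for illustration; this is a toy, not anything from an actual game or robotics system):

```python
# Toy model of rule precedence: lower-numbered Rules always
# override higher-numbered ones, so we check them in order.

def decide(action):
    """Return whether a proposed action is allowed.

    `action` is a dict of flags describing the behavior:
      harms_human    - would the action harm a human? (Rule #1)
      disobeys_order - does it disobey a human's order? (Rule #2)
    Self-preservation (Rule #3) never justifies breaking the
    first two Rules, so it is only honored if we get past them.
    """
    if action.get("harms_human"):
        return "refused: violates Rule #1 (never harm a human)"
    if action.get("disobeys_order"):
        return "refused: violates Rule #2 (obey human orders)"
    return "allowed: " + action["name"]

# "Go hurt that guy!" trips Rule #1, even though refusing the
# order technically breaks Rule #2 -- Rule #1 takes precedence.
print(decide({"name": "hurt that guy", "harms_human": True}))
print(decide({"name": "stand guard"}))
```

Note that the order of the if-checks *is* the precedence: Rule #1 is tested first, so it can never be trumped by anything below it, which is exactly how the Rules are stacked above.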


[Main]
[Original] [X] [More] [Reference] [Armor] [Efficiency] [Tricks] [Downloads] [Rules] [Zero]
[Return to the homepage.]
Last updated May 19, 2001