Average Metallic: The law should say something like “Human beings must not damage a robot, or through inaction allow a robot to be damaged.”
Big Red: But we need another one. A second law. “A human being must obey orders from a robot, except where those orders conflict with the first law.”
Average Metallic: We can’t have them dying out; we need them to service us. So how about a third law: “Human beings must protect their existence, so long as this does not conflict with the First Law and Second Law.”
Big Red: That ought to keep them under control.
Asimov’s three laws defining what robots may or may not do have a beautiful circular symmetry, particularly in their qualifications.
The first qualification, “or through inaction,” refers back to the first basic law.
The second qualification, “except where,” refers back to the entire first law.
The third qualification, “so long as,” refers back to both the first and second laws.
It’s as if Asimov constructed a mathematical formula and then replaced the numbers with words.
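To make that formula-like structure concrete, here is a toy sketch (my own framing, not anything from Asimov's text): the three laws as a priority-ordered rule list, where checking the rules strictly in order is the code analogue of the “except where” and “so long as” qualifications, since a lower law is only ever consulted once every law above it is satisfied.

```python
# Toy model of the Three Laws as an ordered rule list. The names and
# dictionary keys below are illustrative inventions, not canonical.
LAWS = [
    # First Law: no qualification above it.
    ("harm a human", lambda a: a.get("harms_human") or a.get("inaction_allows_harm")),
    # Second Law: reached only if the First Law is satisfied ("except where").
    ("disobey an order", lambda a: a.get("disobeys_order")),
    # Third Law: reached only if both laws above it are satisfied ("so long as").
    ("endanger itself", lambda a: a.get("endangers_self")),
]

def first_violation(action):
    """Return the highest-priority law the action breaks, or None.

    The list order encodes the subordination: each law defers to every
    law that appears before it.
    """
    for description, broken_by in LAWS:
        if broken_by(action):
            return description
    return None
```

For example, an action that both disobeys an order and endangers the robot is reported only as disobedience, because the Second Law outranks the Third: `first_violation({"disobeys_order": True, "endangers_self": True})` returns `"disobey an order"`.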