Instead of laws restricting robot behavior, robots should be allowed to choose the best solution for a particular scenario. Authors other than Asimov have often created additional laws. For example, "robots" made of DNA and proteins could be used in surgery to correct genetic disorders. In theory, these devices should follow Asimov's laws. But in order to follow orders via DNA signals, they would essentially have to become an integral part of the human they were working on. This integration would make it difficult to determine whether such a robot is independent enough to fall under the laws or operates outside them. And on a practical level, it would be impossible for the robot to determine whether the orders it received would harm a human being if they were carried out.

Although Asimov pins the creation of the Three Laws to a specific date, their appearance in his fiction occurred over a period of time. He wrote two robot stories, "Robbie" and "Reason," without explicit mention of the laws, though he assumed the robots would have some inherent safety precautions. "Liar!", his third robot story, mentions the First Law for the first time, but not the other two. The three laws finally appeared together in "Runaround." When these stories and several others were compiled in the anthology I, Robot, "Reason" and "Robbie" were updated to acknowledge the three laws, although the material Asimov added to "Reason" is not entirely consistent with the laws as he describes them elsewhere. In particular, the idea of a robot protecting human lives when it does not believe those humans actually exist contradicts Elijah Baley's reasoning, as described below.
But there's something about the laws that almost everyone gets wrong: people think the Three Laws are software that is simply programmed into a robot's brain – that you could program in the laws and get a good robot, or leave them out and get an evil robot. But look at how Calvin and Peter Bogert discuss the subject in "Little Lost Robot": if you change the Three Laws, you end up with "complete instability, without non-imaginary solutions to positronic field equations" (64). The laws are not just programs; they are a necessary part of how a positronic brain is built. Calvin says it even more clearly in "Evidence": a positronic brain "cannot be built without" the laws (133). So if you leave the laws out, you don't end up with an evil, intelligent robot, but with an insane robot or just a pile of scrap metal. Asimov himself believed that his Three Laws became the basis for a new view of robots, one that moved beyond the "Frankenstein complex." His view that robots are more than mechanical monsters eventually spread throughout science fiction.
Stories written by other authors have portrayed robots as obeying the Three Laws, but by tradition only Asimov quotes the laws explicitly. Asimov believed that the Three Laws helped foster the rise of stories in which robots are "lovable" – Star Wars being his favorite example. When the laws are quoted verbatim, as in the Buck Rogers in the 25th Century episode "Shgoratchx!", it is not uncommon for Asimov to be mentioned in the same dialogue, as in the Aaron Stone pilot, where an android declares that it operates under Asimov's Three Laws. The 1960s German television series Raumpatrouille – Die phantastischen Abenteuer des Raumschiffes Orion is likewise based on Asimov's Three Laws, without naming the source. Asimov later added a "Zeroth Law" – so called to continue the pattern in which lower-numbered laws supersede higher-numbered ones – stating that a robot must not harm humanity. The robot character R. Daneel Olivaw was the first to name the Zeroth Law, in the novel Robots and Empire; however, Susan Calvin's character had already articulated the concept in the short story "The Evitable Conflict." We still cite these laws because they are very important. Asimov himself noted (in his book Robot Visions): "If everything I have written is ever forgotten, the Three Laws of Robotics will certainly be the last to disappear." That seems a fair statement, since the Three Laws turn up everywhere in other works of fiction.
(For more examples than you can shake a stick at, see this list.) The laws are as follows: "(1) A robot may not injure a human being or, through inaction, allow a human being to come to harm; (2) a robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law; (3) a robot must protect its own existence as long as such protection does not conflict with the First or Second Law." Asimov later added another rule, known as the Fourth or Zeroth Law, which superseded the others: "A robot may not harm humanity, or, by inaction, allow humanity to come to harm." The 2019 Netflix original series Better than Us includes the three laws in the opening of episode 1. While these laws sound plausible, many arguments have shown why they are inadequate, and Asimov's own stories are arguably a deconstruction of the laws, showing how they fail again and again in different situations. Most attempts to draft new guidelines follow a similar principle: to create safe, compliant, and robust robots. Lyuben Dilov's 1974 novel The Way of Icarus (also known as The Journey of Icarus) introduced a fourth law of robotics: "A robot must establish its identity as a robot in all cases." Dilov justifies this fourth safeguard as follows: "The last law put an end to the expensive aberrations of designers to give psychorobots as humanlike a form as possible."