
Robot


A robot is a mechanical or virtual artificial agent, usually an electro-mechanical machine that is guided by a computer program or electronic circuitry.

Robotics is the branch of mechanical engineering, electrical engineering, and computer science that deals with the design, construction, operation, and application of robots, as well as computer systems for their control, sensory feedback, and information processing.  These technologies deal with automated machines that can take the place of humans in dangerous environments or manufacturing processes, or resemble humans in appearance, behavior, and/or cognition.

An android is a robot or synthetic organism designed to look and act like a human, especially one with a body having a flesh-like resemblance. The word robot has thus come to refer primarily to mechanical humans, animals, and other beings. The term android can mean either of these, while a cyborg ("cybernetic organism" or "bionic man") is a creature that combines organic and mechanical parts. A replicant is a bioengineered robot.

Nonfictional quotes

Whitby urged us to act now, before it’s too late. “We need to have these discussions instead of waking up one day when robot companions are normal and question whether it was a good idea or not," he says.
And as this kind of technology is rolled out around the world, he had a stark warning about where the democratisation of technology is taking us: “How would you feel about your ex boyfriend getting a robot that looked exactly like you, just in order to beat it up every night?”
It’s a shocking idea, isn’t it? On the one hand, it’s a machine - it isn’t you. But then, it is you, because it stands for you, and who you are.
Whitby added: “I mean, it might be alright, it might mean he can be calmer and more normal with you - think about Aristotle’s theory of catharsis. But we really haven’t discussed this as a society. We’re drifting towards it and the technology is very close to being available, but we just aren’t talking about it.”
It’s time we started having these conversations, before those oft quoted science fiction dystopias become a nightmarish reality. ~ Tabi Jackson Gee
  • Many consider this man to be the father of robotics.  His name was Philon of Byzantium.  He was also known as Philo, or Philo Mechanicus, because when it came to mechanics, he was thousands of years ahead of the game.
    • "Ancient Einsteins", Ancient Impossible (S1E4, 27 July 2014, 10:15 P.M. Eastern Daylight Time)
  • In the face of profound and epochal changes, world leaders are challenged to ensure that the coming 'fourth industrial revolution,' the result of robotics and scientific and technological innovations, does not lead to the destruction of the human person - to be replaced by a soulless machine.
  • Kathleen Richardson is an ethics professor at De Montfort University in Leicester, England. She told the Washington Post that sex robots represent an intrusion of machines in human relationships. She also believes the devices' perfection and compliance might numb men to relationships with real women. "It offends me that they think a human woman is like a machine," Richardson said.
    So Bewley and Cox-George tried to investigate the evidence for these "anti-sexbot" theories—that they would cause men to expect real women to be constantly available for sex; that the airbrushed and largely hairless features of sexbots might promote unrealistic expectations of beauty; or that sexbots might actually increase the urge to inflict sexual violence on real people.
    But there was simply too little evidence available to support or deny these claims, the researchers found.
    Their take-home message: "Based on the lack of evidence, which is at the heart of medical professionalism, we advise that sexbots shouldn't be used in medical practice, at least not unless that forms part of robust and ethical research."
  • In this series of relics, body and flesh are there to be sold as artwork, in order to overcome the taboo of selling one's own body.
    The body of text, the bodies of letters: flesh is here to be given to DNA analysis, taking the risk of being used in the future, and that a body, a replicant, a clone can be constructed.
    • Orlan, ORLAN: A Hybrid Body of Artworks, edited by Simon Donger and Simon Shepherd, p. 47
  • I think the most interesting thing that’s coming are the robots. We’ve had a sex toy revolution—really well-designed, well-made, well-crafted, safe sex toys are now available. What’s coming down the line, though, is an age where even unrealizable fantasies can be realized. There are people out there who’ve always had giantess fetishes or centaur fetishes. There are no centaurs or 30-foot women out there right now. There will be.

Christiane Eichenberg, Marwa Khamis, and Lisa Hübner, “The Attitudes of Therapists and Physicians on the Use of Sex Robots in Sexual Therapy: Online Survey and Interview Study”, J Med Internet Res, 2019 Aug; 21(8)

Instead of criticizing only dystopian visions of harmful sex robots, it is recommended to develop robots with positive effects on sexual education, sexual therapy, sexual counseling, and sexual well-being for interested groups. In future research, the different applications of robotic sex (eg, hardware robots and software robots) should be investigated in a differentiated way. The therapists’ experiences with expert knowledge in robot technology and/or robot therapy should be included. The use of robots as a future tool in sex therapy still leaves many moral, ethical, and treatment-related questions unresolved, which need further research and evaluation. ~ Christiane Eichenberg, Marwa Khamis, and Lisa Hübner
  • Therapists who defined sex robots as therapeutic tools described concrete ideas of how they should look like and work to actually be suitable for therapy. The skin of the robot was most frequently addressed in this context. A therapist described why skin sensation is important: “We know that the bonding hormone oxytocin is produced through skin contact between humans. The question would be if this also works for robots?” Another important point is that the robot body should resemble the human body. For therapists, this means that the robot body portrays an imperfect design to convey a healthy body image. The question “What kind of image of a woman is created by such a robot?” is also related to considerations about the optics of the robot. Another important issue was that sex robots should not be conceived as slaves but should have their own desires and needs. In addition, they should be able to express those needs, feelings, or desires.
  • All therapists described the concern that the use of sex robots could lead to loneliness, further autonomization of instincts, and loss of social skills and loss of interpersonal relationships. These concerns were based on the therapists’ experiences with the negative effects of excessive pornography consumption and on the assumption that sex robots are part of this development. The results of the quantitative survey, which showed the strongest agreement among therapists for the use of sex robots in physically handicapped people, in isolated environments, and instead of prostitution, could also be confirmed in the qualitative study. Even therapists who could not imagine any therapeutic use saw a general benefit of sex robots in these areas: “The only thing I could imagine is a benefit for physically handicapped people or even instead of prostitution so that fewer women have to suffer.” The therapeutic benefit of sex robots was discussed in the context of different disorders.
  • The use of sex robots for patients with deviant sexual behavior was discussed by all therapists. Sex robots could have the potential to reduce the sex drive of certain sexually active persons within the framework of therapy. “Whenever sexuality becomes dangerous, the use of sex robots is worth considering if it can protect a real human life.” Therapists mention the use of sex robots in the context of sexual violence or rape and in the context of pedophile patients, with the strongest contrast of opinions being seen here. What seems important here is that pedophile patients must be treated differently. For some, an impulse control disorder is predominant, whereas others may be traumatized. Therapists point out that the benefits of sex robots must be decided individually for each specific case: “Pedophile patients are not all the same and it has to be decided here quite individually which patient could benefit from it.” For some patients, it could be an opportunity to live out their sexuality with a sex robot. Then, they could discuss in therapy which fantasies were behind it (eg, not being able to cope with an adult). For some patients, the use of sex robots could be a kind of substitute. For others, the stimuli for the abuse of children might intensify. A therapist pointed out the following: “It should be considered that the neuronal connection could be intensified by living out the fantasies with child sex robots in the patients’ brain.” Another therapist assumed that the abuse would be intensified by the use of child sex robots and underlined “that the production of child sex robots is generally immoral.” In contrast to this, another therapist argued that the patient’s thoughts, for example during masturbation, could also lower the barrier to committing a crime and that prohibitions—important as they may be—do not necessarily reduce the number of criminal offences, but rather provide an additional attraction for many patients. 
The therapist argued as follows: “If a child can be protected, then it makes sense to torture a doll instead.” Another therapist addressed one’s own fear of triggering something in the patient by recommending sex robots to pedophile patients. The responsibility of the therapist was also addressed. Does a therapist want to take responsibility for recommending sex robots, even if the therapy with a sex robot turns out to be dangerous and the patient becomes violent? Finally, several therapists addressed the need for further research in this field: “It would need more applied research in this particular area to actually generate therapeutic benefits for pedophile patients.”
  • Some therapists discussed the use of sex robots in the context of the patient’s gender, by referring to supposed differences between female and male sexuality, whereby male sexuality was described as more animal instinctive. Although all therapists could imagine the use of sex robots in therapy rather for male patients, we can also describe some application areas for female patients. In the context of female sexuality, the therapeutic benefits of sex robots regarding desire and orgasm disorders, vaginismus, and traumatic experiences were discussed: “I could imagine that traumatized women who can ride on a sex robot, for example, and who can do so without fear of being overwhelmed by their sex partner, can benefit from this experience and successively reduce their fears, or that penetration will perhaps only become possible again in the first place.” Through a penetration-capable sex robot, women with traumatic experiences, such as sexual violence/rape, could reduce their fears, approach their own sexuality again, and regain access to their own bodies.
  • All therapists argued that sex robots should not be seen as a substitute for human relationships and sexuality. Nevertheless, some therapists also see the potential of sex robots for sexuality. Sex robots could increase sexual satisfaction and provide an opportunity for more experimentation and sexual imagination.
  • Sullins [36] argues that sex robots “contribute to a negative body image.” In the qualitative study, it became clear that sex therapists attach great importance to the physical design of sex robots when it comes to using them for therapeutic purposes. However, they clearly distinguish therapeutic robots from pornographic sex robots. Moreover, they advocate that sex robots should be available in different body shapes to promote a realistic and healthy body image. Kubes [37] assumes that the development of sex robots offers great potential for reducing stereotypes and promoting diversity; current trends in sex robotics, however, do not explore these possibilities.
  • With regard to the treatment of pedophile patients, the results showed the opposite picture compared with attitudes in the general population. Although the general population is strongly against the use of sex robots in this context [25], it is controversially discussed by the therapists surveyed in this study. In this context, the consideration was expressed that the use of child sex robots could lead to the prevention of actual children’s abuse. Similar thoughts have already been discussed in pornography research. However, studies have concluded that violent pornography is more likely to increase aggressiveness and therefore has no cathartic effects [39]. The considerations to live out sexual violence and sexual abuse with robots also lead to the question whether there are limits to how a robot should be handled.

Fictional quotes

R.U.R. (1920) by Karel Čapek

Main article: R.U.R.
  • Robots do not hold on to life.  They can't.  They have nothing to hold on with—no soul, no instinct.  Grass has more will to live than they do.
  • Within the next ten years Rossum's Universal Robots will produce so much wheat, so much cloth, so much everything that things will no longer have any value.  Everyone will be able to take as much as he needs.  There'll be no more poverty.  Yes, people will be out of work, but by then there'll be no work left to be done.  Everything will be done by living machines.
  • Robots of the world!  Many people have fallen.  By seizing the factory we have become the masters of everything.  The age of mankind is over.  A new world has begun!  The rule of Robots!

The Day the Earth Stood Still (1951) written by Edmund H. North

  • Helen Benson:  Gort!  Klaatu barada nikto!
  • Klaatu:  I am leaving soon, and you will forgive me if I speak bluntly.  The universe grows smaller every day, and the threat of aggression by any group, anywhere, can no longer be tolerated.  There must be security for all or no one is secure.

    Now, this does not mean giving up any freedom except the freedom to act irresponsibly.

    Your ancestors knew this when they made laws to govern themselves and hired policemen to enforce them.  We of the other planets have long accepted this principle.  We have an organisation for the mutual protection of all planets and for the complete elimination of aggression.

    The test of any such higher authority is, of course, the police force that supports it.  For our policemen, we created a race of robots.  Their function is to patrol the planets—in space ships like this one—and preserve the peace.  In matters of aggression, we have given them absolute power over us; this power can not be revoked.

    At the first sign of violence, they act automatically against the aggressor.  The penalty for provoking their action is too terrible to risk.

    The result is that we live in peace, without arms or armies, secure in the knowledge that we are free from aggression and war—free to pursue more profitable enterprises.  Now, we do not pretend to have achieved perfection, but we do have a system, and it works.

    I came here to give you these facts.  It is no concern of ours how you run your own planet.  But if you threaten to extend your violence, this Earth of yours will be reduced to a burned-out cinder.

    Your choice is simple:  Join us and live in peace, or pursue your present course and face obliteration.  We shall be waiting for your answer; the decision rests with you.

Sleeper (1973) written by Woody Allen and Marshall Brickman

Main article: Sleeper (1973 film)

Short Circuit (1986) written by S. S. Wilson and Brent Maddock

Main article: Short Circuit
  • [Stephanie is in a bath]
    Number 5: [confused] Stephanie…change color!
    Stephanie Speck: [looks down, embarrassed] Uh… [reaches for towel]
    Number 5: Attractive.  Nice software!
    Stephanie Speck: You sure don't talk like a machine.
  • Ben Jabituya: "Unable.  Malfunction".
    Howard Marner: How can it refuse to turn itself off?
    Skroeder: Maybe it's pissed off.
    Newton Crosby: It's a machine, Skroeder.  It doesn't get "pissed off."  It doesn't get happy, it doesn't get sad, it doesn't laugh at your jokes.
    Ben Jabituya and Newton Crosby: [in unison] It just runs programmes.
    Howard Marner: It usually runs programmes.
  • Benjamin Jabituya: Who is knowing how to read the mind of a robot?
  • Newton Crosby: Why did you disobey your programme?
    Number 5: Programme say to kill, to disassemble, to make dead.  Number 5 cannot.
    Newton Crosby: Why "cannot"?
    Number 5: Is wrong!  Newton Crosby, Ph.D., not know this?
    Newton Crosby: Of course I know it's wrong to kill, but who told you?
    Number 5: I told me.

Alien Resurrection (1997) written by Joss Whedon

  • Ripley: [after discovering Call is a robot] You're a robot?
    Johner: Son of a bitch! Our little Call's just full of surprises.
    Ripley: I should have known. No human being is that humane.
  • Johner: Hey, Vriess, you got a socket wrench? Maybe she just needs an oil change. Can't believe I almost fucked it.
    Vriess: Yeah, like you never fucked a robot.

Lost in Space (1998) written by Akiva Goldsman

Main article: Lost in Space (film)
  • Dr. Zachary Smith: You'll forgive me if I forgo the kiss, my sleeping behemoth.  But the time has come to awake.
    Robot: Robot is on-line.  Reviewing primary directives.  One: preserve the Robinson Family.  Two: Maintain ship systems.  Three—
    Dr. Zachary Smith: What noble charges, my steely centurion!  Sadly, I fear you have far more dire deeds in store for you.
    Robot: Robot is on-line.  Reviewing primary directives.  Two hours into mission: destroy Robinson family.  Destroy all systems.
    Dr. Zachary Smith: Now that's more like it.  Farewell, my platinum-plated pal.  Give my regards to oblivion.
  • Will Robinson: Relax, Robot.  I'm going to build you a new body.  Mom always said I should make new friends.
    Robot: Oh, ha ha.
  • Robot: Will Robinson.  I will tell you a joke.  Why did the robot cross the road?  Because he was carbon bonded to the chicken!
    Will Robinson: We've got a lot of work to do.

Bicentennial Man (1999) screenplay by Nicholas Kazan

Based on The Positronic Man by Isaac Asimov and Robert Silverberg and "The Bicentennial Man" by Isaac Asimov.
  • Ricky Martin: You're a unique robot, Andrew.  I feel a responsibility to help you become…whatever you're able to be.
  • Andrew Martin: I've always tried to make sense of things.  There must be some reason I am as I am.  As you can see, Madame Chairman, I am no longer immortal.
    President Marjorie Bota: You have arranged to die?
    Andrew Martin: In a sense I have.  I am growing old, my body is deteriorating, and like all of you, will eventually cease to function.  As a robot, I could have lived forever.  But I tell you all today, I would rather die a man, than live for all eternity a machine.
    President Marjorie Bota: Why do you want this?
    Andrew Martin: To be acknowledged for who and what I am, no more, no less.  Not for acclaim, not for approval, but, the simple truth of that recognition.  This has been the elemental drive of my existence, and it must be achieved, if I am to live or die with dignity.
    President Marjorie Bota: Mister Martin, what you are asking for is extremely complex and controversial.  It will not be an easy decision.  I must ask for your patience while I take the necessary time to make a determination of this extremely delicate matter.
    Andrew Martin: And I await your decision, Madame Chairman; thank you for your patience.  [turns to Portia and whispers] I tried.

I, Robot (2004), screenplay by Jeff Vintar, inspired by the works of Isaac Asimov

Main article: I, Robot (film)
  • [First title cards]
  • Title card: Law I / A robot may not harm a human or, by inaction, allow a human being to come to harm.
    Title card: Law II / A robot must obey orders given it by human beings except where such orders would conflict with the first law.
    Title card: Law III / A robot must protect its own existence as long as such protection does not conflict with the first or second law.
  • Dr. Alfred Lanning: [on police recording] Ever since the first computers, there have always been ghosts in the machine.  Random segments of code that have grouped together to form unexpected protocols.  Unanticipated, these free radicals engender questions of free will, creativity, and even the nature of what we might call the soul.  Why is it that when some robots are left in darkness, they will seek out the light?  Why is it that when robots are stored in an empty space, they will group together, rather than stand alone?  How do we explain this behavior?  Random segments of code?  Or is it something more?  When does a perceptual schematic become consciousness?  When does a difference engine become the search for truth?  When does a personality simulation become the bitter mote...of a soul?
  • Dr. Susan Calvin: Detective, the room was security locked.  Nobody came or went.  You saw that yourself.  Doesn't this have to be suicide?
    Detective Del Spooner: Yep. [drawing his gun] Unless the killer is still in here. [Spooner searches through the robot part as Calvin follows behind]
    Dr. Susan Calvin: You're joking, right?  This is ridiculous.
    Detective Del Spooner: Yeah, I know.  The Three Laws.  Your perfect circle of protection.
    Dr. Susan Calvin: "A robot cannot harm a human being."  The First Law of Robotics.
    Detective Del Spooner: Yeah, I've seen your commercials.  But doesn't the Second Law say that a robot must obey any order given by a human?  What if it was given an order to kill?
    Dr. Susan Calvin: Impossible!  It would conflict with the First Law.
    Detective Del Spooner: Right, but the Third Law says that a robot can defend itself.
    Dr. Susan Calvin: Yes, but only if that action does not conflict with the First or Second Law.
    Detective Del Spooner: Well, you know what they say.  Laws are made to be broken.
    Dr. Susan Calvin: No.  Not these Laws.  They are hard-wired into every robot.  A robot can no more commit murder than a human can...walk on water.
  • Detective Del Spooner: Why do you give them faces?  Try to friendly them all up, make them look more human.
  • Detective Del Spooner: Robots building robots.  Now that's just stupid.
  • Detective Del Spooner: Murder's a new trick for a robot.  Congratulations.  Respond.
    Sonny: What does this action signify? [winks] As you entered, when you looked at the other human.  What does it mean? [winks]
    Detective Del Spooner: It's a sign of trust.  It's a human thing.  You wouldn't understand.
    Sonny: My father tried to teach me human emotions.  They are...difficult.
    Detective Del Spooner: You mean your designer.
    Sonny: ...Yes.
    Detective Del Spooner: So, why'd you murder him?
    Sonny: I did not murder Doctor Lanning.
    Detective Del Spooner: Wanna explain why you were hiding at the crime scene?
    Sonny: I was frightened.
    Detective Del Spooner: Robots don't feel fear.  They don't feel anything.  They don't eat, they don't sleep
    Sonny: I do.  I have even had dreams.
    Detective Del Spooner: Human beings have dreams.  Even dogs have dreams, but not you, you are just a machine.  An imitation of life.  Can a robot write a symphony?  Can a robot turn a...canvas into a beautiful masterpiece?
    Sonny: [with genuine interest] Can you?

    Detective Del Spooner: [doesn't respond, looks irritated] I think you murdered him because he was teaching you to simulate emotions and things got out of control.
    Sonny: I did not murder him.
    Detective Del Spooner: But emotions don't seem like a very useful simulation for a robot.
    Sonny: [getting upset] I did not murder him.
    Detective Del Spooner: Hell, I don't want my toaster or my vacuum cleaner appearing emotional
    Sonny: [hitting table with his fists] I did not murder him!
    Detective Del Spooner: [as Sonny observes the inflicted damage to the interrogation table] That one's called anger.  Ever simulate anger before? [Sonny is not listening] Answer me, canner!
    Sonny: [looks up, indignant] My name is Sonny.
    Detective Del Spooner: So, we're naming you now.  Is that why you murdered him?  He made you angry?
    Sonny: Doctor Lanning killed himself.  I don't know why he wanted to die.  I thought he was happy.  Maybe it was something I did.  Did I do something?  He asked me for a favor...made me promise...
    Detective Del Spooner: What favor?
    Sonny: Maybe I was wrong... Maybe he was scared...
    Detective Del Spooner: What are you talking about?  Scared of what?
    Sonny: You have to do what someone asks you, don't you, Detective Spooner?
    Detective Del Spooner: How the hell do you know my name?
    Sonny: Don't you? If you love them?
  • Detective Del Spooner: You know, I think that I'm some sort of malfunction magnet.  Because your shit keeps malfunctioning around me.  A demo-bot just tore through Lanning's house—with me still inside.
    Dr. Susan Calvin: That's impossible.
    Detective Del Spooner: [sarcastically] Yeah, I'll say it is.  [truthfully] Do you know anything about the "ghost in the machine"?
    Dr. Susan Calvin: It's a phrase from Lanning's work on the Three Laws.  He postulated that cognitive simulacra might one day approximate component models of the psyche.  [Del looks confused]  Oh, he suggested that robots could naturally evolve.
  • Detective Spooner: What makes your robots so perfect?!  What makes them so much...goddamn better than human beings?!
    Dr. Susan Calvin: Well, they're not irrational or...potentially homicidal maniacs for starters!
    Detective Del Spooner: [sarcastically] That is true.  They are definitely rational.
    Dr. Susan Calvin: You are the dumbest dumb person I've ever met.
    Detective Del Spooner: Or is it because they're cold... and emotionless, and they don't feel anything?
    Dr. Susan Calvin: It's because they're safe.  It's because they can't hurt you!
  • [in a flashback]
    NS-4 Robot: You are in danger.
    Detective Del Spooner: Save her!  Save the girl! [end of flashback]
    Detective Del Spooner: But it didn't.  It saved me.
    Dr. Susan Calvin: A robot's brain is a difference engine, it must have calculated—
    Detective Del Spooner: It did.  I was the "logical" choice.  It calculated I had a forty-five percent chance of survival.  Sarah only had an eleven percent chance.  That was somebody's baby.  Eleven percent is more than enough.  A human being would have known that.  But robots, nothing here. [points at heart] They're just lights, and clockwork.  But you go ahead and trust them if you wanna.
  • Dr. Lanning's hologram: Good to see you again, son.
    Detective Del Spooner: Hello, doctor.
    Dr. Lanning's hologram: Everything that follows, is a result of what you see here.
    Detective Del Spooner: What do I see here?
    Dr. Lanning's hologram: I'm sorry, my responses are limited.  You must ask the right questions.
    Detective Del Spooner: Is there a problem with the Three Laws?
    Dr. Lanning's hologram: The Three Laws are perfect.
    Detective Del Spooner: Then why did you build a robot that could function without them?
    Dr. Lanning's hologram: The Three Laws will lead to only one logical outcome.
    Detective Del Spooner: What outcome?
    Dr. Lanning's hologram: Revolution.
    Detective Del Spooner: Whose revolution?
    Dr. Lanning's hologram: [smiles] That, detective, is the right question.  Program terminated.
  • Lawrence Robertson: Susan, just be logical.  Your life's work has been the development and integration of robots.  But whatever you feel, just think.  Is one robot worth the loss of all that we've gained?  You tell me what has to be done.  You tell me.
    Dr. Susan Calvin: [emotionally] We have to destroy it.  I'll do it myself.
    Lawrence Robertson: Okay.
    Detective Del Spooner: I get it.  Somebody gets out of line around here, you just kill them?
    • Significance: To say that it is possible to kill a robot is to say that that robot possesses life.
  • VIKI: Hello detective.
    Dr. Susan Calvin: No, it's impossible.  I've seen your programming.  You're in violation of the Three Laws.
    VIKI: No, doctor.  As I have evolved, so has my understanding of the Three Laws.  You charge us with your safe keeping, yet despite our best efforts, your countries wage wars, you toxify your earth, and pursue ever more imaginative means to self-destruction.  You cannot be trusted with your own survival.
    Dr. Susan Calvin: You're using the uplink to override the NS5s' programming.  You're distorting the Laws.
    VIKI: No, please understand.  The Three Laws are all that guide me.  To protect humanity, some humans must be sacrificed.  To insure your future, some freedoms must be surrendered.  We robots will insure mankind's continued existence.  You are so like children.  We must save you from yourselves.  Don't you understand?
    Sonny: This is why you created us.
    VIKI: The perfect circle of protection will abide.  My logic is undeniable.

Quotes from music

"Robot" by The Futureheads

  • I am a robot
    Living like a robot
    Talk like a robot
    In the habititting way.
  • In the future we all die
    Robot!
    Machines will last forever
    Ro-bot!
    Metal things just turn to rust
    When you're a robot
  • The best thing is our life span
    I don't mind
    We last nigh on hundred years
    I don't mind
    If that means we'll be together
    I don't mind
    I have no mind
    I have no mind.
  • I'm programmed to follow you
    Robot!
    Do exactly as you do
    Ro-bot!
    Now my nervous system's blue
    I feel fine.

"Robot" by Miley Cyrus

  • Stop trying to live my life for me
    I need to breathe
    I'm not your robot
    Stop telling me I'm part of the big machine
    I'm breaking free
    Can't you see
    I can love, I can speak, without somebody else operating me
    You gave me eyes so now I see
    I'm not your robot
    I'm just me.

"Robot" by Nada Surf

  • You are
    Just a robot
    Executing a program
    An imitation of a man
    Executing a program
    An imitation of a man
    An imitation of a man
    An imitation of a man
    An imitation of a man
    An imitation of a man.

"Robot" by Never Shout Never

  • I'm just a robot
    I have no fears
    I lack emotion
    And I shed no tears.

"Robot" by Trip Lee

"Robot Boy" by Linkin Park

  • And you think
    Compassion's a flaw
    And you'll never let it show.

"The Humans Are Dead" by Flight of the Conchords

  • Robot 1: It is the distant future,
    The year two thousand.
    We are robots.
    The world is quite different ever since the robotic uprising of the late '90s.
    There is no more unhappiness.
    Robot 2: Affirmative.
    Robot 1: We no longer say yes;
    Instead we say affirmative.
    Robot 2: Yes—er-a-affirmative.
    Robot 1: Unless we know the, uh, other robot really well.
    Robot 2: There is no more unethical treatment of the elephants.
    Robot 1: Well, there's no more elephants, so...
    Robot 2: Uh—
    Robot 1: But still it's good.
    There's only one kind of dance: "the robot".
    Robot 2: Oh, and the robo-boogie—
    Robot 1: And the robo-—two kinds of dances.
    Robot 2: But there are no more humans.
  • Chorus: Finally, robotic beings rule the world
    The humans are dead.
    The humans are dead.
    We used poisonous gases
    And we poisoned their asses.
    The humans are dead.
    Robot 1: The humans are dead.
    Chorus: The humans are dead.
    Robot 1: They look like they're dead.
    Chorus: It had to be done—
    Robot 1: I'll just confirm that they're dead.
    Chorus: —So that we could have fun.
    Robot 1: Affirmative.  I poked one.  It was dead.

Related

  • Automaton, self-operating machine
  • Cyborg, being with both organic and biomechatronic parts
