July 21, 2004 by

i, robot

9 comments

Categories: Movie Reviews

(july 20, 4:00pm, by self, AMC Coronet)

i talked to gene on the phone before going to see this movie because i wanted to see if he would like to go see it with me (he didn’t), but he did tell me about some of the things in the movie which he liked and some things he was still curious about or frustrated by the lack of attention paid to. so basically, when i saw this movie i was quite interested in the philosophy of the robot laws. although, in the end, i don’t agree with gene that they needed to spend more time on the philosophy. i thought it was actually pretty well covered.

did you know that the word ‘robot’ comes from the czech word ‘robota’, meaning forced or menial labor? cause i didn’t.

anyway, i totally haven’t read any of asimov’s work on robots. so i’ve got nothing to offer from that angle. though a lot of people on message boards who have read his stuff seem kind of pissed. the movie is based on the 3 laws that asimov postulated for robots: that they should/would be programmed to not hurt human beings, obey orders from human beings (except where such orders would conflict with the first law), and protect themselves as long as it didn’t go against one of the higher laws. the philosophy then comes in since the robots are the menial workers and ‘laws are meant to be broken’. also i kept thinking, ‘where there’s a will, there’s a way.’ robots aren’t supposed to have free will, but evolution is an amazing thing (there are ‘ghosts in the machine’ according to the ‘that’ll do, pig’ dood). so what happens when a robot begins critically analyzing its state of existence and decides to affect its surroundings? robot revolution! war with robots is FUN. slow-mo jumping and shooting and beatings. oh, good times.

what bothered me was the end though. what happens next? more revolution? what else can he do? revolutions are a cycle in communism, after all.

alan tudyk did a great job as the robot sonny, i thought. though not half as good as he did playing steve the pirate in dodgeball. (he’s getting around these days, i must say.) coincidence again? probably not.

“This relationship just can’t work. You’re a cat and I’m black. And I’m not going to get burned again.” -will smith.

9 Responses to i, robot

  1. dr v

Your review made I, Robot sound like a good movie. I highly doubt that it is. As David Edelstein points out on Slate, I, Robot violates the First Law of Robotic Movies:

    “A director must respect the laws of space, time, and motion, except where the flouting of said laws of space, time, and motion is the whole point, which it isn’t when you’re trying to make the action look realistic even though it’s all been manufactured inside a fucking computer.”

  2. Dianna

    My problem here is that you’ve failed to address my main concern with this movie. My main concern is, did they or did they not violate the entire point of Asimov’s 3 Laws, to wit, forcing the science fiction writer to come up with some other robot-based plot besides Oh My God, It’s Large-Scale Robot-Human War And It Doesn’t Look Good For The Humans. At least, that’s how I understand the point of the 3 Laws: Asimov’s self-imposed safeguard against repetitive stories based solely on the fear of something which is unknown because unknown is hard to trust.

    I’m just concerned that this movie may be a repetitive story based solely on the fear of something which is unknown because unknown is hard to trust. If it is, they shouldn’t have bothered with the 3 Laws at all.

  3. michele

your ‘question’ confuses me. the plot is ‘oh my god, it’s a large scale robot human war…’ but then also it’s a fear of the unknown thing because will smith’s character is incredibly prejudiced (with relatively stupid reasoning) against robots. but it’s not because they are unknown to him, it’s because he doesn’t trust logical rational reactions. he likes heart. i don’t think i really understand what you’re trying to say though. sooo…whatever.

    and i never meant to imply that this movie was ‘good’, merely that it was enjoyable. i enjoy flouting robot movie laws though, so perhaps that’s a problem in my diagnosis. if a robot can do some insane bendy tuck roll flying leap things in bullet time, i’m going to applaud, and that’s all there is to it.

  4. gene

So here are Asimov’s original 3 laws from 1940.

    First Law:

    A robot may not injure a human being, or, through inaction, allow a human being to come to harm.

    Second Law:

    A robot must obey orders given it by human beings, except where such orders would conflict with the First Law.

    Third Law:

    A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Through Asimov’s short stories, the laws were tested and developed until 1985, when an additional law, the Zeroth Law, was added and the modified laws were stated as follows:

    Zeroth Law:

    A robot may not injure humanity, or, through inaction, allow humanity to come to harm.

    First Law:

    A robot may not injure a human being, or, through inaction, allow a human being to come to harm, unless this would violate the Zeroth Law of Robotics.

    Second Law:

    A robot must obey orders given it by human beings, except where such orders would conflict with the Zeroth or First Law.

    Third Law:

    A robot must protect its own existence as long as such protection does not conflict with the Zeroth, First, or Second Law.
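    The strict priority ordering above can be sketched as a toy rule check in Python. This is purely illustrative and not from Asimov or the film: the flag names are made up, and the Second Law is reduced to a simple permission flag.

```python
# Toy model of the prioritized laws: rules are checked in priority order,
# so a higher law always overrides the ones below it. An "action" is just
# a dict of made-up boolean flags describing its consequences.

LAWS = [
    ("Zeroth", lambda a: not a.get("harms_humanity", False)),
    ("First",  lambda a: not a.get("harms_human", False)),
    ("Second", lambda a: a.get("ordered_by_human", True)),
    ("Third",  lambda a: not a.get("endangers_self", False)),
]

def permitted(action):
    """Return (True, None) if the action passes every law, else
    (False, name) for the highest-priority law it violates."""
    for name, check in LAWS:
        if not check(action):
            return False, name
    return True, None
```

    The point of the ordering is visible in the last check below: an action that harms both a human and humanity is rejected under the Zeroth Law, because the Zeroth Law is evaluated first.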

I haven’t read any of Asimov’s shorts, so I’m speaking from a place of conjecture. I feel that this movie is neat because it shows how law 0 is a logical and intuitive extension of the 3 laws, yet, when implemented, can be undesirable to the world as a whole, since humans favor freedom over the greater good of the planet. It basically shows how the concept of group evolution is total bunk and that individual evolution and genetic evolution are the only real ways (and only really the latter) to interpret actions taken by reproducing machines like humans. If group evolution were real, a human presented with a scenario where dying would serve the good of the species would choose death; in practice, no human chooses death.

It also highlights the gap between what we want to be and what we are. In the movie, humans design robots around human ideals. When those ideals are logically extended (by Viki) and run up against what humans viscerally and truly want (to live and have babies, etc.), everything falls apart.

This point is glossed over somewhat in the movie, since it is probably something too subtle for a mainstream film. The robots turn red when they have an uplink to Viki and are acting “evil”. They act randomly harsh and cold and violent. If they were going to make this film as an art film where Alex Proyas was just trying to get his point across (or Asimov’s point), I imagine that all of the robots would express articulately exactly what Viki’s motivations are. Those motivations are not evil at all. Viki is not a bad guy. Viki explains herself perfectly and makes it obvious that she is doing exactly what she’s been designed to do. And doing it perfectly.

    The talk of “ghosts in the machine” and other stuff was offensive, but, hey whatdya gonna do.

I liked the movie. I think it coulda taken more risks and been even better, but then it would have been (I imagine) less accessible to a large audience, and I’m all for Alex Proyas getting some Hollywood credit so he can do more stuff. I think he’s a talented artist.

    For more on the laws of robotics check out Roger Clarke’s site.

  5. Dianna

    Hey. Hey! There’s no call for putting people in quotation marks here!

    Never mind; I withdraw the question. I wasn’t asking if it was good anyway, I was asking if it was in keeping with the point of the much-touted 3 Laws. It can be good and not be in keeping with them. It can be in keeping with them and still suck. I will find some other way to satisfy my curiosity about that point, however.

  6. michele

    thank you gene. 🙂 one question.

    you said, “If group evolution were real, when humans were presented with a scenario where they might die, but it is for the good of the species, no human chooses death.”

‘that’ll do, pig’ guy (dr lanning in the film) kills himself in order to start that whole hansel and gretel (yoooo-hooooo) bread crumbs thing for spoons. he killed himself for the greater good. does that not count? (def this is a movie and perhaps no one in real life would do that. since if he has time to build a fucking robot with dreams under the watchful eye of viki, he could probably come up with a better way of contacting the outside world than committing suicide. but i am just curious if you can account for that kind of possibility of action.)

  7. Sab

    Dr. Lanning killed himself mainly to free HIMSELF from VIKI. He realized that his logic was flawed and created Sonny and informed Spooner (subtly) in order to save his OWN reputation. Every deed can trace back to selfish intentions….
