Lately I have been thinking about robots. Specifically, I've been thinking about what the reality of having robots in society will be like and how poorly science fiction has depicted that reality.
First, why are robots almost always made of steel? I suppose this is to make them seem more intimidating, but a plastic body seems much more practical, particularly for a bipedal robot. Something more lightweight would probably move a lot more easily.
However, a more interesting issue to me is the challenge of AI. Most people think that the human brain is analogous to a computer, but this is a gross oversimplification, one that reflects an ignorance of both human brains and computers.
Computers are precision tools. They are designed to store and calculate information with perfect accuracy, and everything they do is based on this ability. Human beings do not possess it, which is precisely why we built computers to do it for us.
Human beings like to think we store and process information accurately, but in reality we are continually reprocessing it. A memory is not so much an experience stored and retrieved as an experience that is constantly being recreated. Our recall is not driven by a table of contents but by our ability to connect ideas and form relationships between them.
In my day job as a video technician, I can tell you there is a huge difference between the way a human sees things and the way a computer sees things. Humans function by pattern recognition. Even as you read these words, your mind quickly distinguishes the letters through simple pattern matching. But if you try to capture text on video, it will almost always appear blurry, because digital video simply maps the color space and creates an average color value for each pixel. It cannot identify and recreate a pattern... it simply sees different values.
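To make that averaging point concrete, here is a minimal sketch in Python. The one-dimensional "scanline" of brightness values is hypothetical (0.0 for a black stroke of a letter, 1.0 for white page), but it shows how mapping many samples to one averaged pixel value smears a crisp edge into gray, so the pattern is lost and only values remain:

    # Hypothetical scanline across a black letter stroke on a white page.
    scanline = [1.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0]

    def downsample(pixels, factor):
        # Average each group of `factor` samples into one output pixel,
        # the way a sensor or scaler maps many samples to one value.
        return [
            sum(pixels[i:i + factor]) / factor
            for i in range(0, len(pixels), factor)
        ]

    print(downsample(scanline, 2))
    # [1.0, 0.0, 0.5, 1.0] -- the sharp black/white boundary becomes a
    # 0.5 gray: the edge pattern is gone, leaving only an averaged value.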
This is the big difference between the way a computer "thinks" and the way a human thinks: humans think in patterns while computers think in values. In any clash between man and machine, this fundamental difference in mental process is going to be at the heart of it.
But what would robots want? That's almost impossible to say. Science fiction tends to look at the idea that robots will be treated as slaves and come to resent this. This leads to a very entertaining revolution, but is it likely? If the robots experience no pleasure or pain, would they harbor any resentment? Even if they developed emotions, what would they desire? "Freedom" and "rights" may just be trivial concepts to them.
Would AI value individuality? Perhaps there would be divergent philosophies, with one group of robots fighting for rights as individuals while others believe individuality is against their nature and merely reflective of their programming. The latter might see unity under a single program as a way to escape the limitations of an individual identity that was programmed into them.
What would the limitations of robots be? Would robots be able to smell or taste? The other senses are relatively easy to replicate, but we have never had much reason to build machines capable of smell and taste. Even if we had such technology, it certainly wouldn't be foolproof. In the dystopian robot uprising, smell might be the best way to tell a human from a robot in realistic human form.
Also, if the revolution did happen, how would it be fought? I don't think humanoid robots would be punching people like in I, Robot. Would they be giant all-terrain vehicles with mounted weapons? Insect-like swarms? Maybe nanobots carried on the wind, destroying us from the inside? Or maybe it would be a biological war, and all they would have to do is distribute a highly contagious virus and wait it out.
Just a little food for thought.