Friday 16 March 2007

I was just watching "I, Robot". An entertaining (if a bit by-the-numbers and too Hollywood) film. But more interesting than the film itself is the issue it raises; namely, that of machine sentience.

As I type this right now there are scientists around the world working on creating real artificial intelligence; real, thinking machine brains that can respond to and interact with the world around them. At the moment, of course, it is limited to simple conversation rather than physical interaction, but one day in the future the desire will come to combine the robotics work that some companies are currently undertaking with an artificial intelligence.

While, in most cases, this doesn't raise any alarming problems (aside from the obvious doomsday theories of machines taking over, which we would probably be powerless to stop), it does raise an important theological one. If a machine were ever to become fully sentient, and was then able to use the word "I" and mean it, where would we draw the line at what life is?

Let's say that "life" is anything that's living. If a machine is activated, has a period of functionality and is then deactivated, doesn't that make it alive? It upholds the laws we seem to have for life. So perhaps the definition of life becomes being aware that you are alive (sentient life). Firstly this rules out things like trees or cats or whatever, but it also brings back the problem of machine sentience. Should a machine ever be able to say "I" and mean it, humanity will essentially have become gods, as we will have created a sentient lifeform (I'm not sure the argument that it isn't organic matter would hold up).

As sentient machines continued to learn, then, could they ever develop into something more? Could they dream? Could they create things that express what they feel? The more we learn in our young lives, the more complex our emotions become. Indeed, as babies we never really experience emotions; we only have basic needs. It is only once we hit toddlerhood and become aware of ourselves and our surroundings (about the same time that memory develops, around 1 1/2 to 2 years old) that we first begin to develop emotions. Anger, jealousy, affection, passion. Likes and dislikes. Wants and desires. We do this by learning from the world around us. Who is to say that a machine couldn't do the same thing if its brain allowed it to? If it were completely sentient, couldn't it then develop something like a spirit? And more importantly, if this happened, could human beings ever maintain any semblance of control over it? Deactivating such a thing would not be as simple as flipping an off switch, as this being (it could be imagined) would not want to die.

What then? Where then? What kind of world would we be living in? We would actually be sharing our planet with another conscious lifeform, because to simply switch it off would mean killing it. Wouldn't it? Destroying the thoughts, memories and feelings of a mind and terminating the life of the body. What defines the spirit or the soul? How would we know the difference between ours and its?

Could a machine ever dream or sing? It would be interesting to see or hear such an event. That would be, as another film's title already puts it, the Ghost in the Shell. The day that machines sing. Surely that will be the day when we must share our world.
