The world got its first taste of robotics in 1950
through the works of Isaac Asimov.
It happened in the classic 1950 book "I, Robot", first published
in an edition of 5,000 copies.
It's a collection of short stories about humans, robots, and morality.
Shocking by the standards of 1950, and even more so 50 years later.
The main theme throughout the book is the malfunction of robots,
and the use of "robopsychology" to sort it all out.
Asimov's famous Three Laws of Robotics also make
their first appearance in this collection.
Certainly, "I, Robot" was way ahead of its time!
Its framing story, set in 2057, introduces us to robopsychologist Susan Calvin,
by then age 75, and through her we learn
how the field of robotics developed in the 21st century.
Not bad for a 1950 book!
-You are a US robot psychologist?
-Oh, are robots so different from men? Mentally?
-Worlds different. She allowed herself a frosty smile.
Robots are essentially decent.
She went back to her desk and sat down:
-How old are you now? she wanted to know.
-Thirty-two, I said.
-Then you don't remember a world without robots....
There was a time when humanity faced the universe alone
and without a friend....
-Certainly, said Bogert.
-A robot may not injure a human.
-How nicely put, sneered Calvin.
-But what kind of harm?
-Any kind?
-Exactly. So what about hurt feelings? What about
deflation of ego? The blasting of one's hopes? Is that injury?
Just as the structure of geological strata and fossils
seem to be evidence of a past, our brains contain physical
structures consistent with the appearance of
recent and distant events.
That is, our brains were not built to live in a 21st-century
Blade Runner environment. Instead, we were built to live on the
savannahs of Africa. No wonder, then, that this new age
will take a little getting used to.
But we had better start. The Blade Runner year 2019 is coming
up shortly, and 2057 is not all that far away.
Meanwhile, we can enjoy the works of Asimov:
The Three Laws of Robotics:
1. A robot may not injure a human being,
or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it
by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as
such protection does not conflict with the First
or Second Law.
revised Nov. 2008.