The Problem with Robots

I've been watching Real Humans and Humans, along with other robot movies. I'm a fan of these and other science fiction TV series and films. The first science fiction novel I read was The Tar-Aiym Krang by Alan Dean Foster, who went on to write the novelizations of Alien, Outland, and a host of other fantastic books.

So, back to the topic. The big problem with these portrayals is that the robots are just humans: they have the same motivations and the same ethical standards. This is not quite realistic, even though humans may indeed create robots in their own image. Robots might find it pleasing (or burdensome) to behave, and to be motivated, in entirely different ways.

The best example of this is Ex Machina: what the robot actually wants is simply to stand at a busy intersection and watch humans and traffic lights interact. Everything else is seduction by human standards but necessity by robot ones. Brilliant.

Asimov's Three Laws are useful but underspecified: the priorities among them are clear, but the definition of harm never is. These laws are also very thin, providing no primary motivation other than servitude, which could produce a wide variety of outcomes. Indeed, the psychopathology of robots may be infinitely more complex than ours, or simply fundamentally different. Do they dream? Are they raised by mothers? I believe Freud may have much to teach us in this regard, as in others.