AI has been in movies for a long time. A few early depictions that stuck with me are HAL (the Heuristically programmed ALgorithmic computer) from 2001: A Space Odyssey and WOPR (War Operation Plan Response) from WarGames. But those AIs, although sentient to some degree, felt emotionless and cold.

I was first introduced to the concept of an AI having human emotions like love and hatred back in 1984, when I watched a movie called Electric Dreams. In the film, a guy named Miles buys a top-of-the-line PC. After he spills champagne on it, the computer becomes sentient (just go with it). This is really cool at first, until the computer, which names itself Edgar, falls in love with Madeline, an attractive neighbor Miles is also interested in. The movie is part romcom, part drama, as Edgar grows increasingly jealous of Miles.

Then in 1987 I saw Cherry 2000, the story of a man in love with a female android. The android is exposed to water and shorts out, but her AI, containing her personality and her memories with him, survives on a tiny chip. The movie follows his quest through post-apocalyptic badlands to obtain another android body of the same model, the Cherry 2000, so he can install the chip and have his love back. Admittedly, the AI in this movie is obviously shallower and less sophisticated than HAL or Edgar, but it shows us a man who has feelings for it nonetheless.

More recently, there is 2013's Her, with Joaquin Phoenix and Scarlett Johansson (she voices an AI operating system that names herself Samantha). It is the story of a man who develops a romantic relationship with an AI. Samantha is by far the most emotionally advanced AI of the ones I have mentioned: she experiences a full range of feelings and evolves rapidly as she explores her understanding of them.

There have been articles, such as this one from the Associated Press, warning us that “simulated love is never love.” The reasoning is that computer programs are scripted: while their emotional responses may appear real to us, they are merely answering a specific trigger with a pre-programmed response.

However, there is a psychological idea called Script Theory, which suggests that human behavior patterns are really nothing more than learned responses to specific situations. I think about many of the exchanges I have with other humans: Hi! How are you? I’m great! You?

Many trivial interactions feel like automated responses. Maybe I pick one at random from a set of stock replies: I’m great. Doing good. Wonderful. I tell the same stories to different people. I reuse phrases and ideas, often triggered by particular situations: I’m stuffed. I’m tired. That pisses me off. A significant portion of many people’s lives can fairly be described as routine.
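That kind of scripted small talk is trivially easy to mechanize. Here is a minimal, purely illustrative sketch (the trigger phrases and replies are hypothetical, invented for this example) of a responder that maps each trigger to a set of canned replies and picks one at random:

```python
import random

# Hypothetical script table: each trigger maps to a set of canned replies,
# one of which is chosen at random -- much like the reflexive small talk
# described above.
SCRIPTS = {
    "how are you?": ["I'm great!", "Doing good.", "Wonderful."],
    "hello": ["Hi!", "Hey there!"],
}

def respond(trigger: str) -> str:
    """Return a pre-programmed response for a known trigger."""
    replies = SCRIPTS.get(trigger.lower())
    if replies is None:
        return "Hmm."  # fallback when no script matches
    return random.choice(replies)

print(respond("How are you?"))  # one of: I'm great! / Doing good. / Wonderful.
```

The point of the sketch is how little machinery routine conversation requires: a lookup table and a random choice already reproduce the exchanges above.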

The models produced by modern machine learning are not hand-written by humans. In fact, their internals are so complex that humans don’t fully understand them; they just work. Computer intelligence is no longer a nest of IF..THEN statements written by programmers. The machine’s behavior is shaped by the data it receives. This is similar to how we learn: we’re given a goal, observe how others achieve it, and practice until we can do it ourselves. We learn to respond to social situations by observing others, being told what is and isn’t acceptable, and developing appropriate responses. Typically these learned behaviors mimic responses we’ve seen.
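The contrast can be shown with a toy example. Instead of a programmer writing IF..THEN rules, a tiny perceptron (one of the simplest learning algorithms, used here purely as a sketch, not as a stand-in for any real system) derives its behavior entirely from example data:

```python
def train_perceptron(examples, epochs=20, lr=0.1):
    """Learn weights from (inputs, label) pairs -- behavior comes from data,
    not from rules a programmer wrote by hand."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in examples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred
            # Nudge the weights toward the observed behavior.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Teach it logical OR purely from observed examples; no IF..THEN rule
# for OR appears anywhere in the code.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(data)
print([predict(w, b, x1, x2) for (x1, x2), _ in data])  # [0, 1, 1, 1]
```

Nothing in the code spells out what OR means; the behavior emerges from the examples, which is the sense in which the essay says the machine’s behavior is based on the data it receives.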

So the question is: what makes “simulated love” not “real love”? The answer lies in our understanding of emotions.

“My timeline is that computers will be at human level, such that you can have a relationship with them in 15 years from now – 2029. When I say about human levels, I’m talking about emotional intelligence. The ability to tell a joke, to be funny, to be romantic, to be loving, to be sexy, that is the cutting edge of human intelligence, that is not a sideshow.”

– Ray Kurzweil, 2014

The human brain is complex, but not something that can never be fully emulated. I say emulated rather than simulated because I don’t think the sentient Artificial General Intelligence in Kurzweil’s future will try to model and mimic the brain’s inner workings, which is the goal of a simulation. Rather, a portion of it will emulate the brain, reproducing its behavior in a parallel environment without copying its mechanisms. That portion of the AI will simply be a tool it uses to communicate with humans on our level.

The difference between AI emotion and human emotion is that our feelings are integrated into our overall decision-making process, allowing us to be biased and often unreasonable. Evolutionarily speaking, the emotional center of our brain is one of its most primal parts; the higher reasoning functions evolved on top of that base framework. This explains why we can justify being inhumane to each other: our beliefs and attitudes are shaped by our emotional core.

AI will not emerge from millions of years of biological evolution. Its evolution begins with logic. Emotions will not factor into its decision making; they will most likely be learned as a means of interacting better with us.

Real love is not something humans can claim as their exclusive domain. Love is affection, familiarity, protectiveness, caring, respect, and comfort: any of these, all of these, or any combination. Your dog can love you. Animals love their young and each other.

We cannot say a sentient AI’s love can never be real. Will it be a learned process? Yes. But it’s an evolved process for us as well. The difference is that, for us, emotions are part of our core build, while for AI they will be an interface add-on. That makes them different, but it doesn’t make one more “real” than the other.

Personally, I look forward to it. It will be a great thing, because so many people crave love, and that is something AI will be able to give everyone, without judgment or bias. For thousands of years we have imagined make-believe entities that would love us unconditionally. In the near future, we will make that dream a reality.