Kiwi



What is Love? (Baby Don't Hurt Me)

Words are diverse and complex tools. They mean different things to different people, and their meaning can shift depending on culture, context, or tone.

 

Take the word “love” for example. Its meaning draws on a lifetime of decisions and experiences that shape the way we say and hear it. This one word can tug at heartstrings and trigger a huge range of emotions, each unique to the person feeling them.

 

Furthermore, each of these feelings (happiness, sadness, laughter, family) also means something completely different to each individual. It seems the word “love” is rooted in us as a necessary human expression.

 

How Can Artificial Intelligence Understand Love?

We can really only answer this with another question: can the word “love” be expressed through static definitions?

 

What I mean is: can it exist in only one state, like the number seven, or the operation of addition? Is it possible to define “love” through connections between these kinds of words? If so, perhaps a computer could start to comprehend its meaning.

 

Interestingly, information is starting to be processed in a way that resembles the human brain: artificial neural networks form links between words, so that computers can begin to understand their meaning.
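One common way these learned links are represented is as word vectors, where words used in similar contexts end up pointing in similar directions. Here is a minimal sketch of that idea using made-up 3-dimensional vectors (real models such as word2vec learn hundreds of dimensions from large text corpora; the numbers below are purely illustrative):

```python
import numpy as np

# Hypothetical word vectors -- real embeddings are learned from text.
vectors = {
    "love":      np.array([0.90, 0.80, 0.10]),
    "affection": np.array([0.85, 0.75, 0.20]),
    "seven":     np.array([0.10, 0.05, 0.90]),
}

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: 1.0 means same direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Words with related meanings sit close together in the vector space.
print(cosine_similarity(vectors["love"], vectors["affection"]))  # high, near 1
print(cosine_similarity(vectors["love"], vectors["seven"]))      # much lower
```

In a trained model, this kind of geometric closeness is exactly what lets a computer treat “love” and “affection” as related while keeping “seven” at a distance.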

 

Can robots learn to love? 


 

How Do We “Move” a Computer to Love?

For the sake of exploration, let’s break down the first level of how love translates to a person: movement.

 

Each time you kiss or hug someone, you signal love to them. Body language gives a lot away, and the emotion tied to each of these movements creates associations with various words.

 

But what does this mean for a machine? Well, if it truly wants to understand what a person is feeling, it stands to reason that it needs an exceptional understanding of that person’s movement.

 

Frightening or Helpful?

This is a question only you can decide.

 

Would a computer that reads our body language to identify our emotions be better equipped to dominate the human race, or can we derive some benefit from it?

 

Some of the benefits: it could give people insight into how to come across as more welcoming, reduce awkwardness, or reveal patterns of movement that correlate with a specific type of event. It could also be used to help train athletes (e.g. when you’re angry, your shot suffers), or in criminal investigations.

 

We personally believe the benefits are profound.

 


We're still a ways away from Ex Machina, though emotional intelligence may be the next frontier in artificial intelligence.

 

Are We There Yet?

Unluckily (or luckily, depending on your viewpoint) for us, computers are far from this level of understanding. However, we are at the start of being able to capture movement and motion.

 

For example, with your smartwatch it is now possible to identify a handshake. Eventually, you may be able to use this to identify the type of relationship you have with someone and find ways to build a more meaningful one. But even with these uses, there is much debate about the possible negative consequences of high-functioning AI.
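To give a sense of how a smartwatch might spot a handshake, here is a deliberately simple sketch: it counts the rapid up-down reversals in accelerometer readings within a short window. Everything here is a hypothetical simplification (the axis choice, the thresholds, the synthetic data); a production gesture recognizer would use filtered signals and a trained model rather than hand-tuned rules.

```python
import math

def count_shakes(accel_samples, threshold=1.5):
    """Count direction reversals in strong motion -- a crude proxy for
    the back-and-forth of a handshake.
    accel_samples: list of (x, y, z) readings in g (hypothetical units)."""
    shakes = 0
    prev_sign = 0
    for x, y, z in accel_samples:
        magnitude = math.sqrt(x * x + y * y + z * z)
        if magnitude < threshold:
            continue  # ignore weak motion, e.g. the wrist at rest
        sign = 1 if y >= 0 else -1  # assume shaking mostly along one axis
        if prev_sign and sign != prev_sign:
            shakes += 1
        prev_sign = sign
    return shakes

def looks_like_handshake(accel_samples, min_shakes=3):
    # A handshake involves several quick reversals in a short window.
    return count_shakes(accel_samples) >= min_shakes

# Synthetic burst of alternating up-down motion, roughly handshake-like.
samples = [(0.1, 2.0 * (-1) ** i, 0.2) for i in range(8)]
print(looks_like_handshake(samples))  # True for this synthetic burst
```

The same window-and-threshold pattern extends naturally: once you can segment bursts of motion, a classifier can be trained to label each burst as a handshake, a hug, a wave, and so on.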



John David Chibuk

Toronto, Canada | http://kiwi.ai

Entrepreneur in machine learning and sensor-based software development (>10 years). Background in engineering and tech, with professional experience in North America and Europe.