Artificially intelligent systems map our journeys, unlock our homes, entertain us, and foretell the weather. But could our electronic assistants also start to learn our emotions and use that knowledge to serve us better?
In other words, does Alexa know when you get mad? Or sad?
In fact, Amazon teams have been working on analyzing your emotions from your vocal intonations for over a year. But what about our phones?
Most of us use touch screens hundreds of times a day, and many of them are force-sensitive. One researcher, Alicia Heraz of the Brain Mining Lab in Montreal, trained an algorithm to recognize anger, awe, desire, fear, hate, grief, laughter, love, plus no emotion … simply from the way we use touchscreens.
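To make the idea concrete, here is a minimal sketch of what such a classifier could look like, assuming scikit-learn. This is not Heraz's actual method: the touch features (force, tap duration, swipe speed, contact area) and the synthetic data are invented purely for illustration.

```python
# Hypothetical sketch: classifying emotion from touchscreen interaction
# features. Feature names and data are invented for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

EMOTIONS = ["anger", "awe", "desire", "fear", "hate",
            "grief", "laughter", "love", "no_emotion"]

rng = np.random.default_rng(0)
n = 900
# Invented per-touch features: force, tap duration, swipe speed, contact area
X = rng.normal(size=(n, 4))
# Synthetic labels tied to the first feature so the model has a real signal
y = np.array([EMOTIONS[int(abs(row[0]) * 4) % len(EMOTIONS)] for row in X])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

In a real system, the features would come from the phone's force-sensitive touch events rather than random numbers, and the labels from participants reporting their emotional state.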
Get the full story in my post at Forbes …