
Microsoft is developing eyeglasses that detect emotions in others

Technology giant Microsoft recently received a patent to develop emotion-sensing glasses.

Question remains whether it will ever become a commercial product

Microsoft's wearable emotion detection and feedback system is designed to detect how those around you feel. (Microsoft/U.S. Patent And Trademark Office)

Microsoft has received a patent to develop eyeglasses that could allow us to see what other people are thinking or feeling by sensing their emotions.

The patent is for what the technology giant calls "a wearable emotion detection and feedback system."

"Basically it's a pair of glasses that can tell whether the people around you are happy or sad, friendly or unfriendly, excited or calm … just by looking at them" said On The Coast technology expert Dan Misener. 

The patent also claims the emotion-sensing glasses can tell if someone is flirting.

The spectacles are regular see-through glasses that have cameras and microphones built into them. 

When you're looking at someone, the sensors measure the person's facial features and where their eyes are focused, while the microphones pick up speech patterns and rhythm.

That information is then sent to Microsoft in real time, and compared against databases of facial and audio patterns.

Microsoft's servers then crunch the data, and send back an analysis to whoever is wearing the glasses.
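The patent does not disclose how that comparison works, but the flow the article describes (capture features, match them against a database of known patterns, return a label to the wearer) can be sketched in broad strokes. Everything below is hypothetical: the feature values, the pattern database, and the nearest-match approach are invented for illustration and do not reflect Microsoft's actual system.

```python
# A purely illustrative sketch of the client-server flow described above:
# wearable sensors capture a feature vector, a server compares it against
# stored patterns, and an emotion label is sent back to the wearer.
# All names, values, and methods here are invented assumptions.
import math

# Hypothetical reference database: emotion label -> averaged facial/audio
# features (e.g. mouth curvature, brow position, speech rate). Invented data.
PATTERN_DB = {
    "happy":   (0.8, 0.6, 1.2),
    "sad":     (0.2, 0.3, 0.7),
    "excited": (0.9, 0.8, 1.6),
    "calm":    (0.5, 0.5, 0.9),
}

def analyze(features):
    """Server-side step: return the label whose stored pattern lies nearest
    (Euclidean distance) to the measured feature vector."""
    return min(PATTERN_DB, key=lambda label: math.dist(features, PATTERN_DB[label]))

# Glasses-side step: the cameras and microphones would produce something
# like this measured feature vector for the person being observed.
measured = (0.78, 0.62, 1.25)
print(analyze(measured))  # nearest stored pattern is "happy"
```

In a real system the matching would involve trained machine-learning models rather than a simple distance lookup, but the round trip (sensors, server-side comparison, feedback to the wearer) follows the same shape.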

"To be clear, this is a patent application," said Misener.

"We don't know if this will turn into a product that you or I could go out and buy in the store. And Microsoft isn't saying."

This type of technology could be helpful for those on the autism spectrum.

But some say it should be approached with caution. 

"Technology companies who are working in this area really need to consider the context of what they're putting out there and what is the need and the emotional state of the person who is using it," said Scott Smith, a futurist with the research and consulting group Changeist. 

There are also privacy considerations, and Smith worries that in the rush to commercialize emotional technology, there is a risk of trivializing real human emotion.

