Washington: A Binghamton University researcher wants computers to understand inputs from humans that go beyond the traditional keyboard and mouse.
Lijun Yin and team have developed ways to provide information to the computer based on where a user is looking as well as through gestures or speech.
Yin says the next step would be enabling the computer to recognize a user's emotional state.
"Computers only understand zeroes and ones. Everything is about patterns. We want to find out how to recognize each emotion using only the most important features," said Yin.
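The idea Yin describes, recognizing an emotion from a small set of important features, can be illustrated with a toy sketch. The feature names, prototype values, and nearest-prototype approach below are purely hypothetical assumptions for illustration, not Yin's actual method:

```python
import math

# Hypothetical facial features (illustrative only): eyebrow raise,
# mouth-corner lift, and eye openness, each normalized to [0, 1].
# Each emotion is summarized by a prototype feature vector.
PROTOTYPES = {
    "happy":     (0.3, 0.9, 0.6),
    "surprised": (0.9, 0.5, 0.9),
    "sad":       (0.2, 0.1, 0.4),
}

def classify(features):
    """Return the emotion whose prototype is nearest in Euclidean distance."""
    def dist(proto):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(features, proto)))
    return min(PROTOTYPES, key=lambda name: dist(PROTOTYPES[name]))

print(classify((0.25, 0.85, 0.55)))  # nearest to the "happy" prototype
```

Real systems use far richer features and learned models, but the pattern-matching principle is the same: reduce a face to measurements, then find the closest known emotional pattern.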
Yin is also considering the use of photographs, and even three-dimensional avatars able to display a range of emotions.
"We want not only to create a virtual-person model, we want to understand a real person's emotions and feelings. We want the computer to be able to understand how you feel, too. That's hard, even harder than my other work," said Yin.
"This technology could help us to train the computer to do facial-recognition analysis in place of experts," added Yin.