London: Video conferencing may undergo a sea change thanks to new technology that enables a computer monitor to mimic human motions.
Researchers David Sirkin and Wendy Ju from Stanford's Center for Design Research, who came up with the idea, created a motorized flat-screen display that mimics human motions such as shrugging, nodding, and laughing.
The team managed to do so by adding motors to the anglepoise-style arm of an Apple iMac G4 - the popular “screen-on-a-stick” model - to make it robotically controllable.
They then linked it to software that reads a person's movements and instructs the G4's moveable arm to perform one of nine motions.
These motions include nodding up and down for yes, moving side to side for no, and leaning in and out; all are controlled by the user via a Wii game controller.
A robotic arm was also added for extra effects, such as tapping on the table to catch someone's attention.
The duo revealed that their idea received a good response from volunteers at the Human-Robot Interaction conference, held last month in Boston.
“Consistency between physical and on-screen action improved understanding of the messages that remote participants communicated,” New Scientist quoted the researchers as saying.