Washington: Researchers have developed a computer programme that can monitor mental health by analysing 'selfie' videos recorded by a webcam as a person uses social media.
The new programme, developed by researchers at the University of Rochester in the US, can turn any computer or smartphone with a camera into a personal mental health monitoring device.
Professor of Computer Science Jiebo Luo explained that his team's approach is to "quietly observe your behaviour" while you use the computer or phone as usual.
He said that their programme is "unobtrusive; it does not require the user to explicitly state what he or she is feeling, input any extra information, or wear any special gear."
For example, the team was able to measure a user's heart rate simply by monitoring very small, subtle changes in the user's forehead colour.
The researchers were able to analyse the video data to extract a number of "clues," such as heart rate, blink rate, pupil radius, and head movement rate.
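The camera-based heart-rate measurement described above is commonly done with remote photoplethysmography: the pulse modulates skin colour slightly, so the dominant frequency of the forehead's average colour trace corresponds to beats per minute. The sketch below is a minimal illustration of that general technique, not the Rochester team's actual pipeline, and it assumes the per-frame mean green-channel intensity of the forehead region has already been extracted:

```python
import numpy as np

def estimate_heart_rate(forehead_green, fps=30.0):
    """Estimate beats per minute from a mean forehead green-channel
    intensity trace (a remote-photoplethysmography sketch; the
    function name and band limits are illustrative assumptions)."""
    signal = forehead_green - np.mean(forehead_green)  # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Restrict to a plausible human pulse band: 0.7-4 Hz (42-240 bpm)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0

# Synthetic 10-second trace with a 72 bpm (1.2 Hz) pulse plus noise
np.random.seed(0)
t = np.arange(0, 10, 1 / 30.0)
trace = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.05 * np.random.randn(len(t))
print(round(estimate_heart_rate(trace)))  # prints 72
```

Because the pulse signal is periodic while lighting noise is broadband, the FFT peak inside the physiological band recovers the heart rate even from very subtle colour changes.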
At the same time, the programme also analysed what the users posted on Twitter, what they read, how fast they scrolled, their keystroke rate, and their mouse click rate.
Not every input is treated equally though: what a user tweets, for example, is given more weight than what the user reads because it is a direct expression of what that user is thinking and feeling, researchers said.
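The unequal weighting can be illustrated with a toy combination function. The weight values and the `mood_score` name below are assumptions for illustration, not figures from the study; the only property taken from the article is that what a user writes counts for more than what the user reads:

```python
def mood_score(posted_sentiment, read_sentiment, w_posted=0.7, w_read=0.3):
    """Combine the sentiment of what a user writes with the sentiment
    of what they read. Writing is weighted more heavily because it is
    a direct expression of the user's state; the specific weights here
    are illustrative assumptions."""
    return w_posted * posted_sentiment + w_read * read_sentiment

# Sentiments on a -1 (negative) to +1 (positive) scale: a user who
# posts positively while reading negative material still scores positive.
print(round(mood_score(posted_sentiment=0.8, read_sentiment=-0.5), 2))  # prints 0.41
```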
To calibrate the system and generate a reaction they can measure, Luo explained, he and his colleagues enrolled 27 participants in a test group and "sent them messages, real tweets, with sentiment to induce their emotion."
This allowed them to gauge how subjects reacted after seeing or reading material considered to be positive or negative.
They compared the outcome of all this combined monitoring with the users' self-reports about their feelings to find out how well the programme actually performs, and whether it can indeed tell how the user feels.
The combination of the data gathered by the programme with the users' self-reported state of mind (called the ground truth) allows the researchers to train the system.
The programme then learns from the gathered data to judge whether the user is feeling positive, neutral or negative, the only three categories it currently considers. Luo said he hopes to add extra sensitivity by teaching it to further distinguish a negative emotion as, for example, sadness or anger.
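One minimal way to sketch this training step is a nearest-centroid classifier supervised by the self-reported labels: the ground truth defines a prototype reading for each emotional state, and a new reading is assigned the label of the closest prototype. The feature choices, numbers, and the classifier itself below are illustrative assumptions, not the researchers' actual model:

```python
import numpy as np

# Hypothetical feature rows: [heart_rate, blink_rate, tweet_sentiment]
features = np.array([
    [62, 10, 0.9], [64, 11, 0.7],    # self-reported positive
    [70, 15, 0.0], [72, 14, 0.1],    # self-reported neutral
    [85, 22, -0.8], [88, 20, -0.6],  # self-reported negative
], dtype=float)
labels = np.array([0, 0, 1, 1, 2, 2])  # 0=positive, 1=neutral, 2=negative

# The self-reports (ground truth) define one centroid per emotional
# state; an unseen reading gets the label of its nearest centroid.
centroids = np.array([features[labels == k].mean(axis=0) for k in range(3)])

def classify(reading):
    return int(np.argmin(np.linalg.norm(centroids - reading, axis=1)))

names = ["positive", "neutral", "negative"]
print(names[classify(np.array([86.0, 21.0, -0.7]))])  # prints negative
```

Splitting the negative class into finer labels such as sadness or anger, as Luo proposes, would amount to collecting self-reports at that finer granularity and adding the corresponding centroids.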
Currently, no app exists for the programme, but the researchers plan to create one that would let users track their emotional fluctuations and make adjustments themselves.
The research was presented at the Association for the Advancement of Artificial Intelligence (AAAI) conference in Austin.