Work by researchers at the Islamic University of Technology in Gazipur, Bangladesh, combining two methods of behavioural analysis, may well have brought us a step closer to the day when a computer can recognise the emotions of its user.
Emotion is a cognitive process and one of the important characteristics that distinguish human beings from machines. It has been suggested that if machines could feel emotion, our relationship with them would become far more efficient and pleasurable. This implies the advent of machines capable of detecting and analysing a user’s state of mind and reacting to his or her mood. With this goal, a group of Bangladeshi researchers* took a combined approach based on keystroke dynamics – the way users type on their keyboards – and text pattern analysis – the actual words that they type. It has previously been shown that keystroke dynamics are highly individual: as long ago as 2000, researchers developed a system of weak authentication that used keystrokes to identify whether the person typing at a keyboard was actually its rightful owner.
Combining two methods of analysis
The researchers took for their analysis seven states of emotion – joy, fear, anger, sadness, disgust, shame, and guilt. The behaviour of twenty-five test volunteers was analysed while they typed on their keyboards. A software programme recorded each key hit, plus ‘dwell time’ – the time between a key’s press and its release – and ‘flight time’ – the time between one key’s release and the next key’s press. In the first phase of the test the volunteers were given passages of fixed text to type, while the second phase consisted of writing free text. Every 30 minutes a pop-up asked the volunteers to note the emotion that corresponded most closely to their state of mind, while in the background the software continued to analyse their keystrokes. To categorise the data from the typed texts, the researchers drew on a vector space model with the Jaccard similarity measure, which enabled them to compare the two kinds of data. While previous work on identifying users’ emotions relied on a single method, the Gazipur SSL team’s combined approach identified the volunteers’ state of mind with an accuracy above 80%.
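As a rough illustration of the two kinds of features involved, the sketch below computes dwell and flight times from a timestamped key-event log, and the Jaccard similarity between the word sets of two texts. The event format and the texts are invented for illustration; this is not the SSL team’s actual implementation.

```python
def keystroke_features(events):
    """Compute mean dwell time (press -> release of the same key) and
    mean flight time (release of one key -> press of the next) from a
    chronological list of (key, action, timestamp) events, where action
    is "down" or "up" and timestamps are in seconds."""
    pending = {}            # key -> press timestamp awaiting its release
    dwell, flight = [], []
    last_release = None
    for key, action, t in events:
        if action == "down":
            if last_release is not None:
                flight.append(t - last_release)       # flight time
            pending[key] = t
        else:  # "up"
            if key in pending:
                dwell.append(t - pending.pop(key))    # dwell time
            last_release = t
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return mean(dwell), mean(flight)


def jaccard_similarity(text_a, text_b):
    """Jaccard similarity of two texts' word sets: |A ∩ B| / |A ∪ B|."""
    a = set(text_a.lower().split())
    b = set(text_b.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0
```

In such a setup the timing features would feed the keystroke-dynamics side of the analysis, while the word-overlap score would feed the text-pattern side, with the two combined for the final classification.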
Highest-ever scores for recognising emotions
The researchers claim that their combined method for identifying state of mind has proved a great success. With the exception of shame and fear, their approach achieved more accurate results than those previously attained using the techniques separately. Moreover, the hardware they chose for data-gathering – a standard computer keyboard – is a widely used interface that is far less expensive than the range of sensors capable of capturing user emotions, such as thermal imaging, cameras and physiological sensors attached to the skin. It remains to be seen how far such ‘cognitive’ capacity can be used in computers and whether it could, for example, be incorporated into operating systems. In the longer term, however, it seems clear that methods will have to be found to enable robots to recognise emotions in order to improve the way they interact with humans.
*at the Systems and Software Lab (SSL) in the Department of Computer Science and Engineering (CSE), Islamic University of Technology, Gazipur