Instructions
Do the preparation task first to help you with the difficult vocabulary. Then read the article and do the exercises to check your understanding.
As we increasingly depend on digital technology for almost everything in our lives, a new smartphone app offers help in understanding our moods and emotions.
In the future perhaps new technology will help us understand ourselves a little better, like the new app from Cambridge-based ei Technologies – ei stands for ‘emotionally intelligent’. The company is developing an app that will be able to identify people's moods from smartphone conversations, using the acoustics of a conversation rather than its content. This technology has obvious commercial uses in a world where we interact with computer voices for services such as banking. ‘In call centres,’ says CEO Matt Dobson, ‘it’s about understanding how satisfied my customers are. As a consumer you have a perception, and that is influenced by changes in the tone of their voice.’
Engineers have natural curiosity
Dobson worked in healthcare, where he developed an interest in mental health, an area in which this technology offers many possibilities. ‘I really wanted to do something in the area of emotion recognition and mental health,’ says Dobson. Then a friend of his in Cambridge showed him an article; they looked at some technical papers and thought they could build something. ‘If you look at the mental health market, it is one of the biggest needs, bigger than cancer and heart disease, yet it has about a tenth of the funding.’ Dobson points to the media coverage of the CEO of Lloyds taking time off due to stress as an example of greater public awareness of psychological issues.
Before Dobson did an MBA at Cambridge, he took his first degree in Mechanical Engineering at Bath, and this background in science gave him an advantage. ‘Engineering is all about natural curiosity, not being afraid to play with stuff,’ says Dobson. ‘I am not an expert in this area, but I know enough to ask the right, smart questions, and I can review a research paper and get a good idea of what the limits and possibilities are.’
Speech recognition
At the beginning, they needed expertise in speech and language, and in machine learning. So they called on Stephen Cox, a specialist in speech recognition and Professor of Computing Science at the University of East Anglia, who is now an adviser.
The ‘empathetic algorithm’ is based on the idea that we can differentiate between emotions without necessarily knowing what words mean – think of watching TV or films in a different language. ‘It’s about understanding what parts of the voice communicate emotions – acoustically, what features show emotion. We use probably 200 to 300 features in each section of speech we analyse.’ The researchers collected data to train the system, which then uses statistics to identify the most probable emotion of the speaker.
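The article only outlines the approach, but a toy version of the idea might look like the sketch below: summarise a stretch of speech as acoustic features (pitch, loudness, timbre) while ignoring the words themselves, then let a statistical classifier pick the most probable emotion. The feature set, the labels and the choice of librosa and scikit-learn are illustrative assumptions, not details of ei Technologies' actual system.

```python
# Hypothetical sketch of acoustic emotion recognition: features + statistics.
import numpy as np
import librosa                                   # audio feature extraction
from sklearn.linear_model import LogisticRegression


def acoustic_features(wav_path: str) -> np.ndarray:
    """Summarise one section of speech as a fixed-length feature vector."""
    y, sr = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)  # timbre
    rms = librosa.feature.rms(y=y)                      # loudness
    f0 = librosa.yin(y, fmin=65, fmax=400, sr=sr)       # pitch contour
    # Mean and spread of each feature track; a real system would use
    # hundreds of such statistics (the article mentions 200 to 300).
    extras = np.array([rms.mean(), rms.std(), np.nanmean(f0), np.nanstd(f0)])
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1), extras])


def train(paths, labels):
    """Fit a probabilistic classifier on labelled recordings."""
    X = np.stack([acoustic_features(p) for p in paths])
    return LogisticRegression(max_iter=1000).fit(X, labels)


def most_probable_emotion(model, wav_path):
    """Return the emotion label with the highest predicted probability."""
    probs = model.predict_proba([acoustic_features(wav_path)])[0]
    return model.classes_[np.argmax(probs)]


# Usage (hypothetical files and labels):
# model = train(["call1.wav", "call2.wav"], ["happy", "angry"])
# print(most_probable_emotion(model, "new_call.wav"))
```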
Emotional life-tracking
Soon, says Dobson, they will have a free app that can emotionally analyse the conversation we have just had, and users will be able to tweet the result to a Twitter page. ‘It will say “Matt had this conversation”; I can include your Twitter handle, and it creates the dialogue between us and says “I had a happy conversation with John”.’
But the next step, involving a kind of emotional life-tracking, is more complicated. ‘That is quite a sophisticated piece of software,’ says Dobson. The idea is that we will be able to cross-reference our emotional states with other data from other parts of our day. ‘How can we use this information to monitor and understand human behaviour?’ says Dobson. He also wants to investigate how and why people get depressed.
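To make the cross-referencing idea concrete, here is a minimal, entirely hypothetical sketch: line up per-conversation mood labels with other timestamped data from the same day (here, fictional location check-ins) and look for patterns. The data and column names are invented for illustration.

```python
# Hypothetical "emotional life-tracking": join moods to nearby daily events.
import pandas as pd

moods = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-05-01 09:05", "2024-05-01 13:40",
                                 "2024-05-01 18:20"]),
    "emotion": ["neutral", "angry", "happy"],
})
places = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-05-01 08:55", "2024-05-01 13:30",
                                 "2024-05-01 18:00"]),
    "location": ["office", "office", "home"],
})

# Match each analysed conversation to the nearest known location in time.
tracked = pd.merge_asof(moods.sort_values("timestamp"),
                        places.sort_values("timestamp"),
                        on="timestamp", direction="nearest")
print(tracked)  # e.g. do angry conversations cluster at the office?
```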
If you could design a new app that could do absolutely anything, what would it be able to do?