From a spider-like helmet on my head, my alternating happy, sad and stressed thoughts are channeled to a screen on the wall, which fills with growing graphs in different colors. Sensors in the helmet read my EEG signals as indicators of my feelings. I'm testing Emotiv, one of many new technologies that collect biometric data to measure emotional currents. Today, facial-recognition software can pinpoint happiness if you smile into the camera; our faces are one big field of expression that machines find easy to learn. A heart bursting with love can be recorded by the heart-rate monitor in a smartwatch. The sweat you secrete when you are late for work can be captured by sensors that analyze stress levels.
The happiness of nations and the feelings of customers have long been evaluated with the help of surveys. Now there is a whole host of new methods that, fed with bodily data, can calculate the most likely emotion at any given moment. There is even software that analyzes social media activity: not just our word choices and their import, but also context and patterns. In other words, we have some rather adept happiness meters.
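In spirit, calculating "the most likely emotion" from bodily data can be as simple as comparing a reading to learned prototypes. The sketch below is a toy nearest-prototype classifier: the emotions, features and centroid values are invented for illustration, and real systems would learn them from labeled biometric recordings.

```python
# Toy sketch: guessing a "most likely emotion" from two bodily signals.
# The emotion labels, features and prototype values are hypothetical;
# a real system learns these from labeled sensor data.
import math

# Hypothetical prototypes: (heart rate in bpm, skin conductance in microsiemens)
CENTROIDS = {
    "calm":     (65.0, 2.0),
    "happy":    (80.0, 4.0),
    "stressed": (95.0, 9.0),
}

def most_likely_emotion(heart_rate, skin_conductance):
    """Return the emotion whose prototype lies nearest to the reading."""
    def distance(centroid):
        hr, sc = centroid
        # Conductance spans a much smaller numeric range than heart rate,
        # so it is scaled up to weigh comparably in the distance.
        return math.hypot(heart_rate - hr, (skin_conductance - sc) * 10)
    return min(CENTROIDS, key=lambda e: distance(CENTROIDS[e]))

print(most_likely_emotion(92, 8.5))  # a racing heart, sweaty palms...
```

Running it on a reading of 92 bpm and 8.5 µS lands closest to the "stressed" prototype, which is roughly what the sweat sensor in the opening paragraph is doing.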
By moving from self-reported happiness in surveys to measuring actual feelings in real time, and by connecting them with world events, research will come closer to real experiences. Harvard has conducted the study "Track Your Happiness" with the help of an app that measures life happiness. The Hedonometer project has charted happiness levels in US cities by analyzing 37 million tweets by 180,000 people. Now even the great technology giants are jumping on the bandwagon. Apple recently invested in a company that measures emotion-related brainwaves, and Mark Zuckerberg said last week that Facebook is developing software that can read thoughts and feelings in order to turn them into text. For real.
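The Hedonometer's core idea is disarmingly simple: average the happiness scores of the words in a text. The real project uses the labMT lexicon of roughly 10,000 crowd-rated words scored from 1 to 9; the tiny lexicon below is invented just to show the mechanics.

```python
# Sketch of the Hedonometer's word-averaging idea. The real project
# uses the crowd-rated labMT lexicon (scores 1-9); this tiny lexicon
# is made up for illustration only.
TOY_LEXICON = {
    "love": 8.4, "happy": 8.3, "sunshine": 7.9,
    "work": 5.2, "late": 3.6, "traffic": 3.2, "sad": 2.4,
}

def happiness_score(text):
    """Mean lexicon score of the recognized words; None if none match."""
    scores = [TOY_LEXICON[w] for w in text.lower().split() if w in TOY_LEXICON]
    return sum(scores) / len(scores) if scores else None

print(happiness_score("late again stuck in traffic"))  # low
print(happiness_score("sunshine and love"))            # high
```

Scale that averaging up to millions of geotagged tweets and you get a rough, continuously updating happiness map of a country's cities.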
Technology that reveals our feelings can certainly be an invasion of privacy, but advertisers are rejoicing at how it will improve methods for measuring the effects of marketing. Customer service centers will be able to read customers' moods using voice analysis, security police will gain a new layer of information at airports and other sensitive sites, and it could even help people with autism interact better with others. We are also going to start seeing products built on emotional data. Nikon is experimenting with a sensor camera that reads location, sound and temperature in order to apply a filter matched to the emotional mood of the photo.
One of the most interesting application areas for emotional data would be to expand our rigid GDP measurements to include the value of things that have previously not been measurable, such as friendship, family happiness, ethics and a sense of meaning in life. Data shows, for example, that our feelings of happiness are clearest when we help others. Several nations speak of well-being as an important goal for sustainable community development. Can we calculate how much pleasure we actually take in parks, sporting events, or a visible police force in the vicinity? A study in Amsterdam showed that an increase in noise from the airport made people unhappier than a decrease in their own incomes.
It follows that in the future we will be able to give informed answers to the question of how we feel. In my case, with the spider on my head, all I could do was watch the screen: yes, thanks, the data says I feel totally fine. Hopefully it will make us more aware of which contexts we thrive in. The American Meteorological Society recently published an emotional map showing that happiness peaks at 57 degrees Fahrenheit (14 degrees Celsius). So enjoy spring and autumn; summer is too hot for happy days.