The Potential of Emotion in UX Design

Tech & Experience Design #3

Looking back on today (or yesterday, or this past week), how would you describe your feelings in one word? Try answering within 10 seconds. 10, 9, 8, … 1, 0!

What did you come up with? A positive emotion such as “happy”? A negative one such as “difficult” or “angry”? Or perhaps “relaxed”? Some of you probably said “tired” as well. I asked you to describe your emotions in one word, but do “tired” and “dull” even count as emotions? For Japanese people, the most familiar way of classifying emotions is probably the four categories of joy, anger, sadness, and pleasure. “Tired” is without doubt a word that describes a mental and physical condition, but opinions will be divided on whether it should be classified as an emotion. So, was your answer an emotion? Or was it a description of your mental or physical state?

Emotions are inseparable from the human experience. When we eat a special, delicious meal, taste may be the dominant sense, but the experience ultimately settles into a feeling: “I’m happy I got to eat something delicious today.” This time, we’re going to talk about emotion, which is central to both the human experience and UX.

Categories of emotions

I have one more question for you. If you could sense other people’s emotions almost perfectly, what kind of service would you create? What would you do yourself? If everyone’s emotions and thoughts were laid bare, society would likely descend into chaos and people would feel uneasy; as it stands, though, there is still a lot of room to improve the accuracy with which systems sense people’s emotions and thoughts.

I used to do research and development on emotional interaction: sensing users’ emotions and changing content and services according to those emotions. For example, to make a space more enjoyable when users were having fun, we created one that adapted its lighting, music, images projected on the walls and ceiling, scents, and so on. For a game, we made a prototype that suddenly raised the difficulty level to fluster players when it detected that they were getting bored. We also created and tested a prototype in which a robot or agent would offer words of encouragement when emotions extracted from the user’s utterances suggested they were feeling down.
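As a rough illustration of this sense-then-respond loop, here is a minimal Python sketch. The emotion labels and mapped actions are illustrative assumptions on my part, not the actual logic of those prototypes.

```python
# A toy mapping from a detected emotion to content/service changes,
# illustrating the sense-then-respond loop described above.
RESPONSES = {
    "fun":   ["brighten lighting", "play upbeat music", "project images on walls"],
    "bored": ["raise game difficulty"],
    "down":  ["have agent offer words of encouragement"],
}

def react(detected_emotion: str) -> list[str]:
    """Return the changes to apply for a detected emotion (or none)."""
    return RESPONSES.get(detected_emotion, ["no change"])

print(react("bored"))  # -> ['raise game difficulty']
```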

There were two difficulties in this emotional interaction. The first was that emotions could not be sensed accurately (the problem of emotion recognition and estimation). The second was deciding, once those emotions were identified, what the services and content should do in response to be effective and produce the best results (the problem of expression). As for the first, I think emotions are difficult to sense because they are difficult to classify in the first place. Do you know how emotions are categorized? As mentioned earlier, they can be squeezed into four types, though it is quite forced. Here, I will briefly introduce two well-known classifications: Plutchik’s Wheel of Emotions and Russell’s Circumplex Model.

Plutchik’s Wheel of Emotions

Plutchik’s Wheel of Emotions defines eight basic emotions: joy, trust, fear, surprise, sadness, disgust, anger, and anticipation (expectation). These sit in the ring just outside the center of the wheel, and other emotions are mapped by their intensity relative to these eight: stronger emotions toward the center, weaker ones toward the outside. Take disgust: its stronger form, loathing, sits at the center, while its weaker form, boredom, sits on the outer ring. Between each pair of adjacent basic emotions lie eight combinations: love, submission, awe, disapproval, remorse (regret), contempt, aggressiveness, and optimism. Looking at the diagram, some people may feel that the boundaries between emotions are very subtle and hard to perceive clearly. In addition, Plutchik’s wheel was originally written in English, so there is no Japanese vocabulary that fully captures each emotion, and some labels may not feel quite right.
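To make the structure easier to see, here is a minimal sketch of the wheel as a Python data structure. The article names only loathing and boredom (for disgust); the other inner/outer labels are filled in from Plutchik’s standard terminology.

```python
# Plutchik's eight basic emotions with their stronger (inner) and
# weaker (outer) forms, plus the eight dyads between adjacent pairs.
PLUTCHIK = {
    "joy":          ("ecstasy", "serenity"),
    "trust":        ("admiration", "acceptance"),
    "fear":         ("terror", "apprehension"),
    "surprise":     ("amazement", "distraction"),
    "sadness":      ("grief", "pensiveness"),
    "disgust":      ("loathing", "boredom"),
    "anger":        ("rage", "annoyance"),
    "anticipation": ("vigilance", "interest"),
}

DYADS = {  # combinations of adjacent basic emotions
    ("joy", "trust"): "love",
    ("trust", "fear"): "submission",
    ("fear", "surprise"): "awe",
    ("surprise", "sadness"): "disapproval",
    ("sadness", "disgust"): "remorse",
    ("disgust", "anger"): "contempt",
    ("anger", "anticipation"): "aggressiveness",
    ("anticipation", "joy"): "optimism",
}
```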

Russell’s Circumplex Model

Russell’s Circumplex Model (adapted and extended by Neoma Design Co., Ltd.)

In Russell’s Circumplex Model, emotions are arranged on a plane whose vertical axis runs from AROUSAL/ACTIVE to PASSIVE/CALM and whose horizontal axis runs from PLEASANT to UNPLEASANT. The arousal/non-arousal axis may seem strange at first, but it makes a lot of sense from the perspective of the autonomic nervous system. The autonomic nervous system consists of two types of nerves, the sympathetic and the parasympathetic, which balance each other like a seesaw to regulate the body. When the sympathetic nerves are dominant (the body’s accelerator, to borrow a car metaphor), one is on the arousal side of the vertical axis; when the parasympathetic nerves are dominant (the brake, rest), one is on the non-arousal side. Moreover, biometric signals such as heart rate, pulse, respiration, skin resistance, blood flow, and perspiration are almost synchronized with the autonomic nervous system. For example, when perspiration increases, the sympathetic nerves are dominant, so we can estimate that the person is in the positive region of the vertical axis (the area where Y>0), that is, feeling excitement or tension. Combine that with a judgment of whether the emotion is positive or negative, and don’t you think you could estimate the emotion to some extent?
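As a rough sketch of this idea, the snippet below maps a (valence, arousal) point to one of the model’s four quadrants. The quadrant labels are illustrative picks from the model, not fixed definitions.

```python
# Coarse emotion estimation on Russell's two axes:
# X = valence (pleasant/unpleasant), Y = arousal (sympathetic dominance).
def classify(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) point in [-1, 1]^2 to a quadrant label."""
    if arousal > 0:   # sympathetic dominance, e.g. rising perspiration
        return "excited / delighted" if valence > 0 else "tense / angry"
    else:             # parasympathetic dominance: rest and calm
        return "relaxed / calm" if valence > 0 else "bored / depressed"

print(classify(valence=0.4, arousal=0.7))    # -> excited / delighted
print(classify(valence=-0.6, arousal=-0.3))  # -> bored / depressed
```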

I think Russell’s Circumplex Model is very well suited to sensing emotions in this way. In fact, while Plutchik’s wheel is often cited at psychology conferences, in papers published by academic societies in information and communications, kansei (affective) engineering, and robotics, I see many examples of emotion estimation that reference Russell’s Circumplex Model.

Of course, many classification schemes have been proposed in fields with long histories of studying emotion, such as psychology and affect theory. At the same time, each of us defines individual emotions differently in the first place. Can you describe the difference between “frustration” and “nausea,” or between “leisure,” “calm,” and “relaxed,” as positions in Russell’s model? Every language has many words for emotions. Does the dictionary definition of a word for a particular feeling match your actual feeling and sensation? If someone describes their current feeling as “happy,” does that match your “happy”?

In this way, we need to be careful about the gap between the definitions of emotions and how individuals perceive them. If you regularly conduct user research and interviews, pay particular attention to both the definitions of these terms and each participant’s personal perception of them.

Biometric information and emotion

How can we sense human emotions? One way is to extract them from faces and voices; another is to sense them from biometric information. Biometric information refers to the variety of physiological, anatomical, and physical (organ-level) signals emitted by living organisms. It includes electrocardiograms, heart rate, and electroencephalograms, as well as skin temperature (body temperature), breathing, and blinking.

First, to sense emotions from faces, machine-learning algorithms are applied to facial-expression image patterns (feature points and the like). Among facial recognition technologies, smile detection is highly accurate regardless of region or culture; there are even cameras and apps with a “smile shutter” that triggers when you smile. However, we still have a long way to go before we can sense whether the emotion behind a smile is happiness, enjoyment, or amusement. Research is also being conducted on extracting emotions from the prosody and content of speech. Until fairly recently, it could only roughly distinguish three categories: joy, anger, and sadness. It has since evolved, and although it is not perfect, we are now in an era where we can infer a classification into seven emotions: calm, happiness, anger, sadness, fear, disgust, and surprise.
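As a hedged sketch of the smile-detection idea (the article names no specific tool), here is a minimal example using OpenCV’s bundled Haar cascades; the input file name is an assumption.

```python
# Minimal smile detection with OpenCV's stock Haar cascades:
# find faces first, then look for a smile inside each face region.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

img = cv2.imread("face.jpg")  # assumed input image
if img is None:
    raise SystemExit("face.jpg not found")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
    roi = gray[y:y + h, x:x + w]
    smiles = smile_cascade.detectMultiScale(roi, scaleFactor=1.7, minNeighbors=20)
    print("smiling" if len(smiles) > 0 else "not smiling")
```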

Next, as for sensing from biometric information, emotions can also be inferred from vital signs data (vital data) that indicates a person is alive, such as heartbeat, skin temperature, pulse, and perspiration. The difficulty with this method is that it is easily influenced by the environment. For example, the temperature of the room and the clothes worn at the time will change body temperature and perspiration rate. Meals and exercise (or the lack of either) immediately before measurement also affect the readings. In sports especially, body temperature and perspiration are dominated by physical activity rather than emotional change, so a normal, resting state is preferable. Emotions from before the measurement can also affect the readings; you have probably experienced emotions lingering like this, for instance when something irritating that happened just beforehand made you snap at a partner’s casual remark. If we want to measure changes in emotion caused by an interaction, we want to eliminate the effects of earlier emotions unrelated to the event being measured. When sensing emotions from vital data, then, the key is to take measurements in an environment that does not itself affect the subject’s emotions.
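To illustrate why a resting baseline matters, here is a toy sketch that scores arousal as a deviation from a resting heart-rate baseline. The signal choice, numbers, and threshold are all illustrative assumptions.

```python
# Baseline-corrected arousal estimation: compare a measurement window
# against a resting baseline so that ambient conditions and lingering
# prior emotions don't skew the reading.
import numpy as np

def arousal_score(baseline: np.ndarray, window: np.ndarray) -> float:
    """Standardized deviation of the window's mean from the resting mean."""
    mu, sigma = baseline.mean(), baseline.std() + 1e-9
    return (window.mean() - mu) / sigma

resting = np.array([62, 63, 61, 64, 62], dtype=float)    # bpm at rest
during_play = np.array([78, 81, 80, 83], dtype=float)    # bpm during a game

score = arousal_score(resting, during_play)
print("aroused (Y > 0 in Russell's model)" if score > 2.0 else "near baseline")
```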

Interactions using biometric information

For interactions and services that use emotions, it is important both to sense emotions accurately and to think about how to express the results back to users. At present, rather than trying to sense every emotion, many products and services seem to focus on emotions that are easy to acquire or that are effective when expressed. Emotions that are easy to acquire include anger, surprise, and joy; these lie in the arousal region (the area where Y>0) of Russell’s Circumplex Model, that is, the state of sympathetic-nerve dominance.

In fact, there have been several products and services that used biometric information in the past. Konami’s arcade game Tokimeki Memorial ~Tell Me Your Heart~, released around 1997, used a mouse-shaped piece of hardware with a vital sensor under the fingertip. By gripping the mouse, the game sensed the player’s pulse and perspiration. It is a dating simulation in which the main character, controlled by the player, approaches female classmates to invite them on dates. When answering the girls’ questions, the player’s heart rate and perspiration, that is, their degree of tension, were reportedly measured and used to decide which route the game would take. In 1998, Seta released Tetris 64 for the Nintendo 64, played with a biosensor clipped to the player’s ear. The attached heart-rate sensor would apparently extract the player’s degree of tension, and depending on it the Tetris pieces would become large, distorted shapes, or sometimes small, advantageous ones.

In this way, there have been several games that used biometric information, mainly perspiration and heartbeat, as a trigger to determine whether or not the player was tense. There are also games that use brain waves, such as Mindball, Mindflex, and the Force Trainer. Unfortunately, though, I feel that none of them has been a big hit.

Summary

Experiences are innately emotional. When I talk about UX design, I sometimes describe it as the pursuit of adjectives related to people’s emotions and thoughts. To verbalize emotions such as fun, satisfied, relaxed, happy, and wonderful, we need to define and classify them, but there really are many ways to categorize emotions. And although verbalized emotions are shared to some extent, they are not completely consistent from person to person. Even between Japanese and English alone, both languages have words for emotions and thoughts that are difficult to translate. For example, the Japanese words for feelings of reassurance, relief, and calm can each be rendered in English as “calm,” “relaxed,” or “relieved,” but every pairing carries a slightly different nuance; the mapping is not one-to-one. This is just one example of how different languages have different taxonomies and experiential values.

I think one of the keys to success in emotional interaction design is how accurately we can sense such elusive emotions and biometric information, and how we express the results.

So, in one word: how do you feel now, after reading this article…?

Note

To those whose interest has been piqued by this article and who are thinking of acquiring vital data:
These days, vital data can be obtained quickly and easily with wearable devices. That said, problems such as feeling unwell after wearing a device are not unheard of. It goes without saying that the subject’s consent is required; beyond that, when conducting a study that acquires biometric information or vital data, please check the ethical guidelines of your company or organization, or consult with the relevant body.

・・・

In Tech & Experience Design, we don’t talk about textbook UX design; instead, we talk about design by drawing on a wide range of knowledge. Whether digital or analog, we dig deeper into what it means to design an experience by interweaving practical development work, world affairs, and familiar perspectives such as our living spaces and human sensibilities and emotions.

Written By

Michinari Kohno

Michinari is a BXUX Director & Designer and the owner of NeomaDesign. He worked on UI/UX design at Sony for 22 years, mainly on global products such as the PlayStation 3 and PlayStation 4. After Sony, he went independent and now consults on next-generation UI/UX, handling everything from concept design to project management and direction. He loves dancing in musicals himself, watching motor races, and walking his dog.

Nanako Tsukamoto

Nanako is an editor for the English version of Spectrum Tokyo. After spending ten years in the US and graduating from Sophia University, she worked in finance for six years. She loves planning train trips with her 4-year-old son, an avid train enthusiast.
