How it works
Sketch App, XMind
Principle app for microinteractions
Processing for visual elements
Finding the Problem
People can detect negative emotions but struggle to overcome them.
From interviews with students and working professionals, I found that all participants, regardless of occupation, faced emotional issues. Working professionals felt stress from both work and family and experienced more mood swings than students. Most had learned to notice when their stress levels rose and to respond in the right way, but they found it hard to make significant progress.
Defining users' goals
Working professionals have strong motivation to improve their emotional intelligence.
I built a persona based on the working professionals, who suffered most from emotional issues. Their goals were to improve their ability to control their emotions, get along well with colleagues, and improve their relationships with family.
Finding the opportunity
Emotional data should feel emotionally resonant to users.
I analyzed existing services in the market and categorized them into three main functions: emotion tracking, emotion analysis, and emotional journaling.
I realized that emotional analysis services could only translate emotions into numbers, which are neither intriguing nor proactive. In contrast, the emotional journal is reflective, but it requires users to manually record their feelings and lacks objective data to show progress.
I defined two axes: emotional vs. data-centric, and inconsistent tracking vs. consistent tracking. I then placed the above products into this positioning matrix and found the opportunity.
The opportunity is to combine emotion tracking and analysis with users' daily reflection, in order to predict and prevent negative emotional outbursts and improve emotional intelligence in the long term.
Finding physical and mental constraints
Analyzing tasks, identification, and gestures in five key contexts
I analyzed five main contexts and found that users' moods were most likely to deteriorate during meetings and while working, so emotion tracking and alerting were critical features. Hallways and driving offered good individual moments for users to adjust their emotions, and before bed was a good time to review the day and self-reflect. I then analyzed the tasks, postures, identification, and gestures in each context.
Exploring possible devices and interactions
Using a watch and a pair of earphones as the input and output
Based on the constraints and tasks in each context, I ideated devices and interactions. The watch and earphones had these pros and cons.
Watch. Pros: Convenient for an entire day of wearing and tracking.
Cons: Can't play the recordings.
Earphones. Pros: Easy to use voice control.
Cons: Can't be guaranteed to be worn throughout the whole day.
Device and interaction ideation
odmo consists of a mobile app, a watch app, and a pair of earphones.
I finally chose the Apple Watch to consistently track users' emotions, the earphones to replay recordings, and the phone to review emotional data.
The three key features are emotion tracking, capturing reflection moments, and reviewing emotional data.
User Testing & Refinements
"The emotional data doesn't look emotional. I am curious how the data is collected and how the related content is generated."
Visualizing the data emotionally
Using emotional colors and generative shapes
Started by using color to express different emotions.
Added graphic elements generated in Processing.
Placed the icons into a circle, which made the interface more consistent.
Using Processing as a tool to generate visual elements
When I tried to figure out how to express emotion visually, I found many common design tools limiting. Processing is a flexible, expressive tool for generating dynamic visual art through code. I created three patterns corresponding to three emotional contexts and applied them to the interfaces below.
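To give a sense of the approach, here is a minimal Processing sketch in this spirit. It is purely illustrative: the colors, noise parameters, and shapes are assumptions for this example, not the actual odmo patterns.

```processing
// Illustrative generative pattern: soft blobs drifting through a
// Perlin-noise field, with the fill color mapped to an emotional state.
float t = 0; // time offset driving the noise field

void setup() {
  size(400, 400);
  noStroke();
}

void draw() {
  background(245);
  // A warm hue stands in for a "stressed" context here;
  // swap the fill color to represent other emotions.
  fill(235, 110, 90, 160);
  for (int i = 0; i < 40; i++) {
    // noise() returns smooth values in 0..1, so each blob
    // drifts organically instead of jittering randomly.
    float x = width  * noise(i * 0.13, t);
    float y = height * noise(i * 0.17, t + 100);
    ellipse(x, y, 24, 24);
  }
  t += 0.01; // advance the noise field each frame
}
```

Varying the palette, blob count, and noise speed per emotional context yields distinct but visually consistent patterns, which is what makes this approach more expressive than static design tools.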