Felt, created during Design Engineering Studio at Harvard, aids the deafblind in communication through a device worn on the forearm and a pin worn on the chest. Unlike current deafblind communication devices that only focus on the text of a conversation, Felt allows the wearer to understand the emotions of others, enabling richer communication and ultimately deeper relationships.
There are over 15 million people in the world who have severe deafblindness. Most of these people have to rely on human translators to communicate. Even with a speech-to-braille reader, people with deafblindness often feel left out when they cannot understand the subtext, or underlying emotion, of a conversation or social setting.
In order to create a solution to our design question within our constraints as students, we narrowed our target user group to people who are deafblind, verbal, and braille-reading. We also determined the values we wanted our product to embody, listed below.
We spent as much time as we could in this project on research and learning about our user. In this process, we engaged in the following:
• talked to 12 stakeholders including a person who is deafblind, teachers at a school for the deafblind, and a director of a center for the deafblind
• reviewed 8 research papers
• did 4 hours of field research and observation
Most people who are deafblind rely heavily on human translators, but they all have slightly different methods of communication. For instance, some can read braille; not all can. Protactile, a new deafblind language that is rapidly gaining popularity, has no standardization, either.
Based on various studies and experimentation on ourselves, we found that the forearm is a comfortable and easily accessible place for a communication device, with enough surface area for tactile sensation.
Because our user cannot rely on sight or hearing, we were limited to haptic feedback to convey emotion, such as heat, pressure, and vibration. We found that both heat and vibration can become uncomfortable when felt for a long time, so we opted for gentle pressure in our solution.
Based on our findings and analysis, we decided to create a three-part system. The armband would give haptic feedback to the wearer's forearm conveying the different emotions of the conversation. The detachable braille reader on top of the sleeve would be a speech-to-braille reader to convey the text of the conversation. The pin, worn on the wearer's chest and equipped with a camera and microphone, would detect emotion using computer vision and machine learning, as well as collect speech for the braille reader.
We analyzed human-to-human communication and divided it into three components: presence, text, and subtext. Our product addresses all of these parts in the user flow through various haptic feedback. The presence of someone wanting to engage in conversation is signaled with a light vibration. The text of the conversation is translated by the braille display. The subtext of the conversation is translated by the armband through a series of patterns.
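The three-channel flow above can be sketched as a simple event router. This is only an illustrative sketch; the names (`ConversationEvent`, `route_event`) and string outputs are hypothetical, not the actual Felt firmware.

```python
from dataclasses import dataclass

@dataclass
class ConversationEvent:
    kind: str     # "presence", "text", or "subtext"
    payload: str  # spoken text, or a detected emotion label

def route_event(event: ConversationEvent) -> str:
    """Map each component of communication to its output channel."""
    if event.kind == "presence":
        # Someone wants to engage: alert the wearer with a light vibration.
        return "armband: light vibration"
    if event.kind == "text":
        # Speech captured by the pin is rendered on the braille display.
        return f"braille display: {event.payload}"
    if event.kind == "subtext":
        # Detected emotion drives a pressure pattern on the armband.
        return f"armband pattern: {event.payload}"
    raise ValueError(f"unknown event kind: {event.kind}")
```

For example, `route_event(ConversationEvent("subtext", "happiness"))` would select the armband's happiness pattern, while a `"text"` event carries the transcript to the braille reader.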
We envisioned our solution to be gentle, soft, and beautiful — just like our project values. Our device would feel like a delicate extension of the arm that allows seamless communication for the wearer. Below is a moodboard we created to visualize our vision.
Translating emotion into haptic feedback is similar to creating a new language. To do this effectively, our team narrowed down the emotions to focus on: happiness, sadness, surprise, disgust, fear, anger, nervousness, and excitement. We then created a visualization of each emotion to build a common understanding of it. The colors of the emotions are based on Robert Plutchik's emotion wheel, and the shapes are based on our interpretation of each emotion.
Inspired by Protactile, American Sign Language, and our visual exploration, we created a unique pattern for each emotion on a 2x4 grid. We chose the grid dimensions based on the size of an average forearm.
We attached clay balls to pencils to emulate the actuators that would be used in our armband and tested the patterns on each other. Consistent with our research, we were able to distinguish each pattern with ease. We also experimented to find the minimum distance at which a person can distinguish two points of pressure, which we determined to be 3 centimeters. This dimension was later used in our product design.
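The pattern grid and spacing can be expressed concretely. In the sketch below, only the grid size (2x4) and the 3 cm minimum actuator spacing come from our findings; the specific cell sequences assigned to each emotion are placeholders, not our actual pattern language.

```python
ROWS, COLS = 2, 4          # the 2x4 actuator grid on the forearm
MIN_SPACING_CM = 3.0       # minimum distance for two points to feel distinct

# Hypothetical patterns: ordered sequences of (row, col) activations.
PATTERNS = {
    "happiness": [(0, 0), (0, 1), (0, 2), (0, 3)],  # sweep along the arm
    "sadness":   [(0, 0), (1, 0)],                  # short downward press
    "surprise":  [(0, 1), (0, 2), (1, 1), (1, 2)],  # central cluster
}

def grid_to_cm(cell, spacing=MIN_SPACING_CM):
    """Convert a grid cell to physical coordinates, in centimeters."""
    row, col = cell
    return (row * spacing, col * spacing)

# At 3 cm spacing, the grid spans 3 cm x 9 cm between outermost actuators,
# which fits comfortably on an average forearm.
GRID_HEIGHT_CM = (ROWS - 1) * MIN_SPACING_CM
GRID_WIDTH_CM = (COLS - 1) * MIN_SPACING_CM
```

Keeping every pair of adjacent actuators at least `MIN_SPACING_CM` apart is what lets the wearer resolve each point of a pattern as a distinct touch.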
While iterating on the form, we considered many factors — the product had to be comfortable, embody our aforementioned values, and be appropriately sized for the forearm.
Material was a very important factor in our product, since we wanted our solution to mimic human touch, which is the main method of communication for the deafblind. We chose materials that were soft, comfortable for long wear, and durable. We also wanted to emulate a fingerlike sensation with the actuators that would create the emotion patterns on the wearer's forearm. Thus, we opted for a lattice structure made of polyurethane for the tip of the actuators.
This was my first time designing a product for people with neither sight nor hearing, and while it was quite difficult to create interactions and flows without visual or auditory cues, I found that designing for extremes is just like any other design process. We made decisions based on research insights and validated them with stakeholders in the field. On the other hand, something I did not expect was learning about the sometimes predatory nature of user research. Our interviewees voiced concerns that people with deafblindness often get disappointed when projects they participate in user testing for never materialize into anything usable. I learned to be cognizant of user expectations and recognize the responsibility of a designer in these circumstances.