Tangible Media | Tangible Smile
Project 1 for the course Tangible Interfaces, taught by Hiroshi Ishii at the MIT Media Lab (2019)
Tangible Smile is a soft wearable object that responds to positive facial expressions. When a person facing the wearer smiles, the object inflates, providing the wearer with tangible feedback that simulates a hug. Tangible Smile can be used by blind and low-vision people who wish to capture the smiles in their physical space, letting them feel the happiness that surrounds them. The project combines machine vision (“smile recognition”), a Pneuduino board, and a pneumatic silicone wearable.
Collaborators: Lins Derry, Chenlu Wang, Peitong Chen, Piyush Verma, and Annie Wang.
Step 1
A facial detection program senses a smile, which triggers a byte to be sent over serial to the Pneuduino board (see the sketch below).
(Lins)
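A minimal sketch of this host-side step, assuming OpenCV's stock Haar cascades for face and smile detection and a serial connection to the Pneuduino at /dev/ttyUSB0. The cascade files, port path, and trigger byte '1' are placeholder assumptions, not the project's exact setup:

```cpp
// Host-side sketch: watch the webcam, and when a smile is found inside a
// detected face, send a single trigger byte to the Pneuduino over serial.
// Cascade paths, serial device, and baud handling are assumptions.
#include <opencv2/objdetect.hpp>
#include <opencv2/videoio.hpp>
#include <opencv2/imgproc.hpp>
#include <fcntl.h>
#include <unistd.h>
#include <vector>

int main() {
    cv::CascadeClassifier faceCascade("haarcascade_frontalface_default.xml");
    cv::CascadeClassifier smileCascade("haarcascade_smile.xml");
    cv::VideoCapture camera(0);

    // Open the Pneuduino's serial port (baud configuration via termios omitted).
    int serialPort = open("/dev/ttyUSB0", O_WRONLY | O_NOCTTY);

    cv::Mat frame, gray;
    while (camera.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);

        std::vector<cv::Rect> faces;
        faceCascade.detectMultiScale(gray, faces, 1.3, 5);

        for (const cv::Rect& face : faces) {
            std::vector<cv::Rect> smiles;
            smileCascade.detectMultiScale(gray(face), smiles, 1.7, 20);
            if (!smiles.empty()) {
                const unsigned char trigger = '1';   // agreed-upon trigger byte
                write(serialPort, &trigger, 1);      // tell the board to inflate
                usleep(3000000);                     // debounce while the pouch inflates
                break;
            }
        }
    }
    close(serialPort);
    return 0;
}
```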
Step 2
The Pneuduino board then inflates the silicone wearable, producing a hug-like sensation (see the sketch below).
(Chenlu)
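On the board side, a minimal Arduino-style sketch under stated assumptions: the pump and vent valve hang off two digital pins, and a single '1' byte from the host triggers one inflate-and-release cycle. The pin numbers and timings are hypothetical placeholders, not the Pneuduino library's actual API:

```cpp
// Board-side sketch (Arduino-style): wait for the trigger byte, then run one
// inflate-hold-release cycle. Pin numbers and timings are placeholders.
const int PUMP_PIN = 5;                 // drives the air pump
const int VALVE_PIN = 6;                // vents the pouch when HIGH
const unsigned long INFLATE_MS = 3000;  // how long to hold the "hug"

void setup() {
  Serial.begin(9600);
  pinMode(PUMP_PIN, OUTPUT);
  pinMode(VALVE_PIN, OUTPUT);
  digitalWrite(VALVE_PIN, HIGH);        // start vented, i.e. deflated
}

void loop() {
  if (Serial.available() > 0 && Serial.read() == '1') {
    digitalWrite(VALVE_PIN, LOW);       // seal the pouch
    digitalWrite(PUMP_PIN, HIGH);       // inflate
    delay(INFLATE_MS);                  // hold the hug-like squeeze
    digitalWrite(PUMP_PIN, LOW);
    digitalWrite(VALVE_PIN, HIGH);      // release and deflate
  }
}
```

Keeping the valve open by default leaves the pouch deflated between smiles, so each detection reads as a discrete squeeze rather than a steadily tightening band.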
Step 3
To prototype the wearable, we used clay molding, vacuum forming, and silicone casting techniques.
(Peitong)