Personal Project
2 weeks
[accessible technologies] [ubiquitous computing] [tangible interaction]
[natural user interface]
yellow foam, Rhino, 3D printing, Cinema4D, Arduino, coin vibration motors

What is Aero_?

Aero_ is a utensil set designed for visually impaired users, inspired by the concepts of ubiquitous computing and tangible interaction.

What can Aero_ do?

The 2018 Design

In my class Object Lessons in 2018, we were prompted to design for disabled users. I decided to design a utensil set for visually impaired users, envisioning a utensil family that is both elegant and helpful.


3D Sketches

I used yellow foam for form-finding. I made 3D sketches by carving the foam, then did simple tests, feeling each sketch in my hand, to choose which ones to carry forward for further form development.


3D-Printed Prototypes

I learned Rhino by watching online tutorials for this project. I built the final designs in Rhino and 3D-printed them as the final deliverables.

Revisiting in 2020

What if technology is added to the design?

User Research
I conducted secondary research online, reviewing articles, papers, and first-person experience videos covering the dining experiences of the visually impaired, assistive technologies, the human sensory system, and related topics in technology and design.

I interviewed one visually impaired individual about his dining experiences.

I conducted a “dark dining” experiment at home.
(An experience first created as “Le Goût du Noir”/“a taste of darkness” by Michel Reilhac in Paris in 1997, and now operated in many places around the world.)


Pain-points and Present Solutions

Many pain-points currently exist in visually impaired users’ dining process.
There are proven, valid solutions for some of them; however, a visually impaired user must study these solutions for a long time before being able to employ them in their everyday dining process.


Functionality Analysis

Utensils are perhaps the only “tools” in the dining process. For visually impaired diners to overcome the pain-points listed above, each utensil must serve multiple functions.

Technology Research


Different Kinds of Feedback

The following are the main kinds of “translation”/feedback used in designing for visually impaired users.

︎ 01 - Tactile

The sense of touch. Tactile designs for visually impaired users often offer simple, intuitive guidance, communicating information in a non-intrusive way. A commonly seen example of tactile design is tactile paving.

︎ 02 - Braille

Readable words for visually impaired users. Braille can be seen as one application of tactile design. However, I list it separately because, as part of the written language of visually impaired users, braille itself can serve as a vehicle for communicating ideas.

︎ 03 - Haptics

Haptics is an emerging technology that provides virtual simulations of touch and motion. Implementations of haptics technology include vibration, force feedback, air vortex rings, and ultrasound.

︎  04 - Auditory

The auditory sense is sometimes used to translate visual information for visually impaired users. In some cases, however, auditory feedback can be disruptive.

︎  05 - Optical Aid

For visually impaired users with partial vision loss, optical aids can offer great help. For example, in the dining process, high-color-contrast table settings can help these users distinguish plates.


Applications in Research Projects

To better understand the kinds of feedback that could potentially be used in designing for visually impaired users, I studied the following research projects.

︎ Objective: Help visually impaired users navigate virtual reality.

︎ Feedback: Physical resistance generated by a wearable programmable brake mechanism, vibrotactile feedback, and spatial 3D auditory feedback.

︎ Analysis: This project incorporates different kinds of feedback associated with different human senses. This approach makes it possible to rebuild, in the visually impaired user’s mind, the world presented in the VR headset.

︎ Objective: Help visually impaired users navigate and give them more information about what is in front of them.

︎ Feedback: Braille and haptic vibration.
︎ Analysis: This system takes input from a depth camera, processes it on a computer, and translates it into two levels of feedback with different amounts of detail. While the haptic vibration belt offers quick guidance, the refreshable braille display also lets the user know what is in front of them.

︎ Objective: Allow visually impaired users to read information from the digital world.

︎ Feedback: Haptics (specifically, ultrasound waves) communicating braille.

︎ Analysis: This project especially opened my eyes to the advancements in haptics technology. I had not realized that haptic feedback could be detailed enough to let the user read braille, or that it could be experienced mid-air.

︎ Objective: Help visually impaired users access information from text.

︎ Feedback: Audio.

︎ Analysis: I think voice feedback would be obtrusive in other interaction contexts. In the context of reading, however, it makes perfect sense to use voice feedback that reads for the user.

From studying the research projects above, I gained the following insights:

︎︎︎ In many cases, there is an input device that captures visual information, a computing process that translates the information, and the actual feedback mechanisms.

︎︎︎ Different kinds of feedback are appropriate for different contexts.

︎︎︎ Sometimes, combining different kinds of feedback helps build a realistic model of the world in the user’s mind.

︎︎︎ Different kinds of feedback can also be designed into a single device, letting the user choose between different levels of detail.


How can I leverage different assistive technologies in each utensil design,
with respect to its functions in visually impaired users’ dining process,
and improve the experience with natural interactions?




︎ The interaction is natural and calm.
︎ The interaction offers some room for the user to DIY.
︎ The interaction doesn’t counter a specific pain-point.
︎ The information offered by the feedback is not direct.

︎ The feedback offers direct guidance.
︎ Counters the pain-point related to food waste.
︎ Clicking a button is not a natural interaction.
︎ The button would sacrifice the overall aesthetic.

︎ The feedback can offer direct guidance.
︎ A utensil only offers so much space for potential tapping interactions. Pairing with 3 dishes may overcrowd the space on the utensil.

︎ An automatic process that does not interrupt the user’s dining process.

︎ Specifically targets the pain-point related to cutting food.
︎ Different users have different bite-size preferences.


Further Directions

From the storyboards above, I decided to move forward with storyboards 2, 3, and 4, with the following modifications and directions for further investigation:

Storyboard 2
︎ Change the proposed button-click interaction to a tap, making the interaction less intrusive and more natural.

Storyboard 3
︎ Investigate the holding areas of the utensils, determine the best positions for tapping interactions, and find out how many such interactions one utensil can afford.
︎ Investigate what technologies could help achieve this.

Storyboard 4
︎ Investigate what technologies could help achieve this.

Concept Development


Holding Gestures

I investigated different holding gestures for the three utensils to look for potential interaction areas. My main consideration was finding places that do not interfere with normal use (holding) of the utensils, yet are close enough to be convenient places for interaction.

I determined that each utensil has two areas that can afford interaction:
the bridge between the handle and the utensil’s head, and the end of the utensil’s handle.

︎ Area - 01

This area can afford interactions. Of the two interaction areas, it is the more convenient and intuitive one, since the user can easily tap on it with their nearby index finger.

︎ Area - 02 

This area is where the user holds the utensil, and is therefore not available for interactions. 

︎ Area - 03 

This area is for less intuitive interactions, since the user needs to stretch their thumb to tap on it.


Logistics - Coupling of Bits and Atoms

To achieve the proposed interaction in storyboard 3 (utensils guiding the user to specific dishes), I further explored technologies that could answer the following question:

How can I allow utensils (atoms) to store dish-information (bits), and allow pairing between different dishes and utensils?

Radio Frequency Identification (RFID) technology allows automatic identification and tracking of tags attached to objects. The data on the tags can be rewritten and modified. RFID is also cheap and durable: it would not greatly increase the cost of production, so the utensils could in turn maintain an affordable market price for everyone.

RFID tags could be placed on or embedded into the plates. The utensils would serve as “readers” that scan the tag data the user requires and send the user feedback through haptics.
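The dish–utensil pairing described above can be sketched as a small lookup table. This is a minimal illustration in plain C++, not the actual firmware: the tag UIDs and dish names are made up, and in a real utensil the table would be rewritten whenever the user pairs a tag on a plate with a dish.

```cpp
#include <cstdint>
#include <map>
#include <string>

// Hypothetical pairing table: RFID tag UID -> dish name.
// UIDs and dish names here are placeholders for illustration.
std::map<uint32_t, std::string> pairings = {
    {0x04A1B2C3, "soup"},
    {0x04D4E5F6, "salad"},
    {0x04778899, "rice"},
};

// Returns true if the scanned tag matches the dish the user asked for,
// i.e. the utensil should fire its "found it" vibration.
bool tagMatchesTarget(uint32_t scannedUid, const std::string& targetDish) {
    auto it = pairings.find(scannedUid);
    return it != pairings.end() && it->second == targetDish;
}
```

On the actual utensil, the UID would come from an RFID reader module instead of a hard-coded value, and a match would trigger the haptic feedback rather than return a boolean.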


Logistics - Let the Spoon Be Smart

To achieve the proposed interaction in storyboard 4, the spoon needs to detect its own angle and make adjustments (at the bridge between its head and handle) to stabilize itself.

To find the right kind of technology, I first looked at soft robotics. However, soft robotics would be rather expensive and complicated to implement in the context of utensils. Furthermore, none of the three main control methods of soft robotics (electric field, thermal, and pressure difference) is applicable in this case.

Looking for inspiration in existing products, I found that today’s portable gimbals can self-stabilize and adjust their angle in reaction to the user’s movements. This is very similar to the self-stabilizing and self-adjusting abilities I envisioned for the spoon, so I researched the technology that lies behind this interaction in smart portable gimbals.

Looking at the technical components of gimbals, I identified the Inertial Measurement Unit (IMU) as the part that does the magic.
An IMU uses three kinds of sensors (accelerometers, gyroscopes, and magnetometers) to measure and report an object’s movement, orientation, angular rate, and specific force.

According to a presentation by U-M EECS students, IMUs are used in motion capture, vehicle tracking, attitude and heading reference systems, and orientation sensors.
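The core of the spoon’s self-leveling behavior is simple trigonometry on the IMU’s accelerometer readings. The sketch below shows one way this could work, as plain C++; the axis convention and the ±30° joint range are assumptions for illustration, not measured properties of the design.

```cpp
#include <cmath>

const double kPi = 3.14159265358979323846;

// Estimate the spoon's pitch (tilt of the bowl) in degrees from raw
// accelerometer readings. Assumes the IMU is mounted so that the z-axis
// points up when the bowl is level, with x along the handle.
double pitchDegrees(double ax, double ay, double az) {
    return std::atan2(-ax, std::sqrt(ay * ay + az * az)) * 180.0 / kPi;
}

// The bridge joint rotates by the opposite angle to keep the bowl level,
// clamped to the joint's mechanical range (assumed here to be +/- 30 degrees).
double correctionDegrees(double pitch) {
    double c = -pitch;
    if (c > 30.0) c = 30.0;
    if (c < -30.0) c = -30.0;
    return c;
}
```

In a working prototype, the gyroscope readings would be fused with the accelerometer (e.g. with a complementary filter) to smooth out hand tremor before driving the joint.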


Logistics - Haptic Feedback

For simple haptic feedback, I plan to use an Arduino board to program coin vibration motors.

Coin vibration motors are small, cheap vibration motors with an enclosed vibration mechanism. These characteristics make them suitable for mounting onto the utensils in this project.

The Adafruit DRV2605L is a haptic motor controller that provides more than 100 vibration effects. It can also be used to test the guidance offered by different vibration patterns.
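One candidate guidance scheme is to encode “how close am I to the target plate?” as pulse timing: the nearer the utensil gets, the shorter the pause between pulses. The sketch below expresses a pattern as alternating on/off durations in milliseconds; all the distance thresholds and durations are assumptions to be tuned in the vibration-pattern tests, and on the Arduino the pattern would be played back by switching the coin motor on and off.

```cpp
#include <vector>

// A vibration pattern as alternating on/off durations in milliseconds,
// starting with "on". Values are placeholders pending user testing.
using Pattern = std::vector<int>;

// Map distance to the target plate to a pulse pattern: the closer the
// utensil gets, the shorter the pause between pulses.
Pattern guidancePattern(double distanceCm) {
    int off;
    if (distanceCm > 30.0)      off = 600;  // far: slow pulses
    else if (distanceCm > 10.0) off = 300;  // closer: faster pulses
    else                        off = 100;  // almost there: rapid pulses
    return {100, off, 100, off, 100, off};  // three 100 ms pulses
}
```

With the DRV2605L, the same idea could instead select among its built-in effects rather than timing a raw motor directly.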



Prototyping Interaction Models

I prototyped my proposed interactions onto the CAD models of my utensils design in Cinema4D:

︎ 01 Locate a specific plate with the help of haptic feedback

︎ 02 Haptics alert before potential “knock-down” event

︎ 03 Spoon self-stabilizes and adjusts angle to prevent spillage

︎ 04 Check for and guide user to the food left on the plate


Physical Prototype

I am currently testing different vibration patterns with an Arduino board and coin vibration motors.

I would like to continue my experiments with an Adafruit DRV2605L haptic controller, an inertial measurement unit, and RFID tags. After the experimentation phase, I plan to 3D-print modified versions of my CAD utensil models and add the motors and other components to turn them into functioning prototypes. After that, I hope to perform user testing and use the feedback to move the prototypes forward.

User Testing


Yinuo Han @2021