For Project Nourished, I developed a series of tableware to aid eating and drinking in VR and AR and to deliver scents through methods such as heating and diffusion. The specialized fork and chopsticks carry tracking markers that reflect infrared light emitted by an IR LED source; their movements were then translated into absolute XYZ coordinates from the video footage. An alternative method, tracking relative movement with an accelerometer, was also explored to compare the accuracy of the captured motion data. Furthermore, micro-interactions such as picking up food and releasing it into the user's mouth were detected as binary signals using conductive sensors.
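The marker-to-coordinate step can be illustrated with a minimal sketch: threshold a grayscale video frame for the bright IR reflections and take the centroid of the lit pixels as a marker's image-plane position. The project does not specify its actual pipeline or libraries, so the NumPy-based `marker_centroid` helper below is a hypothetical illustration, not the production code.

```python
import numpy as np

def marker_centroid(frame: np.ndarray, threshold: int = 200):
    """Return the (x, y) centroid of pixels at or above `threshold`
    (the bright IR-reflective marker), or None if no pixel qualifies."""
    ys, xs = np.nonzero(frame >= threshold)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic 8x8 grayscale frame with one bright 2x2 marker blob.
frame = np.zeros((8, 8), dtype=np.uint8)
frame[2:4, 5:7] = 255  # marker occupies rows 2-3, columns 5-6

print(marker_centroid(frame))  # -> (5.5, 2.5)
```

In a real setup, a per-frame centroid like this from two or more calibrated cameras would then be triangulated into the absolute XYZ coordinates described above.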
The experiment was conducted on the hypothesis that any form, including that of an inedible object, can satiate hunger or trigger the desire to eat, as long as the object follows specific patterns of attributes such as motion, color, and shape. Furthermore, these attributes may be considered subsets of our ability to classify food and non-food. Using transcranial direct current stimulation (tDCS) and the 10-20 electrode placement system, I applied a 2 mA current to the C4 region to stimulate the insular cortex while observing a series of animations of inedible objects that participants had identified as "appetizing" or something they "want to eat."
The experiment was conducted to explore dynamic forms and low-calorie configurations of food. Using a technique borrowed from molecular gastronomy, I polymerized carrot juice into gelatinized sheets that can be reconfigured into new forms like origami. This was accomplished by first mixing the juice with calcium lactate gluconate and then pouring the mixture onto a sheet of paper. The paper had previously been soaked in a solution of water and sodium alginate, allowing its surface to act as an activator. Once the edible sheets gelatinized, they were folded into complex shapes and objects. By leveraging hydrocolloids, the process transformed a carrot, with its expected physical and textural qualities, into complex three-dimensional forms while reducing the calories per unit volume.
As part of an ongoing exploration into emerging technologies and their implications, I created a physical prototype of a Google Glass app using readily available materials, in response to the release of Google Glass. The app was designed to upload real-time video from the headset and have Google's AI framework analyze its facial recognition data to measure the interactee's mood score every ten seconds. On the 10-point scale, zero represented a negative mood, five neutral, and ten positive. Interaction-specific tasks involved selecting the most appropriate UI patterns for a handful of social contexts and chaining various modalities of interaction and feedback (e.g. head-mounted display, voice commands, touch gestures [tap, swipe, and multi-touch], haptic and audio feedback, head gestures [tilt, nod, and shake], and gaze) to introduce additional UI patterns not found in the Google Glass API.
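The scoring scheme above can be sketched minimally: map a positivity estimate in [0, 1] onto the 0-10 mood scale and label it. The `mood_score` and `mood_label` helpers are assumptions for illustration, not part of the Google Glass API or the original app.

```python
def mood_score(positivity: float) -> int:
    """Map a hypothetical facial-expression positivity estimate in
    [0.0, 1.0] to the 0-10 mood scale (0 negative, 5 neutral, 10 positive)."""
    clamped = min(max(positivity, 0.0), 1.0)  # guard out-of-range estimates
    return round(clamped * 10)

def mood_label(score: int) -> str:
    """Label a 0-10 mood score per the scale described above."""
    if score < 5:
        return "negative"
    if score == 5:
        return "neutral"
    return "positive"

print(mood_score(0.5), mood_label(mood_score(0.5)))  # -> 5 neutral
```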
In collaboration with Francis Bitonti, I designed and generated an armor-like dress concept using a combination of 3D modeling, sculpting, and generative-algorithm software, including Rhino, Grasshopper, Maya, and ZBrush. The design is inspired by the Geobukseon, a 16th-century Korean battleship shaped like a tortoise, developed by the admiral and military general Yi Sun-Shin. Its ornamental yet functional elements, such as the dragon figurehead and the metal spikes on the battleship, were used to instill fear in enemy combatants and protect the ship from boarding tactics. Similarly, an array of ridges made of liquid metal extrudes out of the dress shell and morphs into metal spikes when the wearer senses danger.
With guidance from Steve Mann, the father of wearable computing, I prototyped a prosthetic device that can be worn like a scarf or a belt. The device captures and tracks the motion of the human body and its appendages with an embedded array of infrared LEDs. Once the wearer's IR motion is captured through a camera, the field of "metaveillance" they create can be visualized to communicate otherwise invisible data about motion, space, and time.
Based on the open-source project Pupil Core, I printed and assembled a head-mounted eye tracker with built-in eye and world cameras. To capture the world and the pupil's movement, I used two CCD sensors and controller units salvaged from webcams. The eye camera required detaching the IR filter and soldering two IR LEDs onto the controller board so that the distance between the two markers reflected above the pupil, as well as their XY positions, could be measured. The eye tracker was later used for various client and personal projects to capture information about gazed subjects and related attributes such as the frequency, duration, and order of gaze, and the color, form, and location of the gazed subject.
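The two-glint measurement can be sketched as follows: given the XY pixel positions of the two IR LED reflections, compute their midpoint and separation. The `glint_metrics` helper and its sample inputs are illustrative assumptions, not Pupil Core's actual detection code.

```python
import math

def glint_metrics(g1: tuple, g2: tuple):
    """Given XY pixel positions of the two IR LED glints on the eye image,
    return their midpoint and their separation in pixels."""
    mid = ((g1[0] + g2[0]) / 2, (g1[1] + g2[1]) / 2)
    dist = math.hypot(g2[0] - g1[0], g2[1] - g1[1])
    return mid, dist

# Hypothetical glint positions from one eye-camera frame.
mid, dist = glint_metrics((100, 120), (130, 160))
print(mid, dist)  # -> (115.0, 140.0) 50.0
```

Tracking the midpoint over time approximates where the reflection pair sits relative to the pupil, while the separation gives a scale reference as the eye-to-camera distance changes.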
Using modeling clay, I prototyped a wearable device that allows users who suffer from hyperhidrosis (excessive sweating) to control and manage unwanted body perspiration. The surface that touches the user's armpit contains numerous rotating beads powered by a high-torque servo and a vibration motor, as scientific studies have shown that massage therapy applied to the armpit can help reduce excess perspiration and unwanted body odor. Along with the massage, the device also applies an antiperspirant infused with antibacterial essential oil. The device itself is shaped like a pomegranate and its seeds to convey novelty and health benefits.