One of the first examples of virtual reality affecting smell and taste was showcased at last month's SIGGRAPH, the computer graphics conference in Los Angeles. Using visual and olfactory stimuli, the system attempted to alter the subject's perception of taste with virtual and real-life cookies. After donning a combined eye-and-nose goggle headset, the user saw a visual representation of a flavored cookie, inhaled scented air through nose tubes, and then took a bite of a neutrally flavored cookie.

The contributing team for this project, made up of Takuji Narumi, Takashi Kajinami, Tomohiro Tanikawa and Michitaka Hirose, comes from the University of Tokyo. They describe the project itself, "Meta Cookie," as "the world's first pseudo-gustation system that induces cross-modal effects so humans can perceive various tastes by changing only visual and olfactory information." The system recognizes sugar cookies that have augmented reality markers printed on them, so the software can track the cookie's movement and time its sensory output accordingly.
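The core loop the article describes, recognizing a printed marker and then driving matched visual and scent output, can be sketched in a few lines. This is a minimal illustration only: the marker IDs, flavor names, and scent-channel numbers below are invented, and the actual Meta Cookie software is not public in this form.

```python
# Hypothetical sketch of a marker-to-sensory-output lookup, as the article
# describes: recognize the AR marker printed on a cookie, then select which
# visual overlay to render and which scented-air channel to open.
# All IDs, filenames, and channel numbers are invented for illustration.

from dataclasses import dataclass
from typing import Optional

@dataclass
class SensoryOutput:
    overlay_texture: str   # image rendered over the plain cookie in the display
    scent_channel: int     # which tube of scented air to open

# Invented marker-to-flavor table
FLAVOR_TABLE = {
    7: SensoryOutput("chocolate_cookie.png", 1),
    12: SensoryOutput("lemon_cookie.png", 2),
}

def on_marker_detected(marker_id: int) -> Optional[SensoryOutput]:
    """Return the visual/olfactory output for a tracked marker, if known."""
    return FLAVOR_TABLE.get(marker_id)

output = on_marker_detected(7)
print(output.overlay_texture, output.scent_channel)
```

In a real system the lookup would be triggered continuously by a computer-vision tracker, so the overlay and scent stay registered to the cookie as the user moves it toward their mouth.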

As the team's article abstract explains, because flavor is a chemical sense, it has seen little computer-driven innovation compared with the other major senses (vision, hearing, touch, etc.). Since the chemistry of taste is still not well understood, this project instead takes advantage of visual and olfactory cues to affect it. Using these tools, as well as memory, an effective experience can be assembled that fools the brain into changing the user's perception of flavor. "We taste with our eyes and nose before any food enters our mouth," says Stuart Fox at MSNBC. If indeed "good chefs know this," then so now will virtual and augmented reality engineers.

Narumi has several projects in the computer science field, including others that use non-visual feedback. One, "Thermotaxis," uses thermal feedback in a social edutainment setting.