We’re simplifying the ways we interface with technology. Touch computing may finally get rid of the mouse as a pointer, but we still have to hold the device in one hand and manipulate the screen with the other.
What if the screen was our body?
That’s the scenario that Carnegie Mellon’s Skinput offers.
The goal of Skinput is to compensate for the small screen spaces on mobile computing devices.
“In my research I think about clever ways to appropriate surfaces that are already around us, like tables and walls,” said Skinput’s designer, Carnegie Mellon’s Chris Harrison.
Skinput uses bio-acoustic sensing technology that makes the human body the input source.
Parts of the body are acoustically distinct: different parts make different sounds due to differences in size, mass, and bone density, as well as the filtering effects of joints and soft tissue. The sounds are read by a device worn around the upper arm. Skinput’s software classifies the impacts, making the body an input device. A pico projector can be attached to the device to project a graphical interface onto the user’s body (see video for some really cool examples, including playing Tetris on your hand).
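The classification step can be pictured with a toy sketch. This is not Skinput’s actual pipeline (the real system trains a classifier on signals from an armband of vibration sensors); the locations and feature vectors below are invented for illustration, and the method shown is a simple nearest-centroid classifier, assuming each tap has already been reduced to a fixed-length acoustic feature vector:

```python
# Toy sketch of Skinput-style tap classification (hypothetical data).
# Each tap yields an acoustic feature vector (e.g., energy per frequency band);
# a new tap is labeled with the location whose training centroid is closest.

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]

def classify(tap, centroids):
    """Return the location whose centroid is nearest (squared Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda loc: dist(tap, centroids[loc]))

# Hypothetical training examples: feature vectors recorded per tap location.
training = {
    "wrist":   [[0.9, 0.2, 0.1], [0.8, 0.3, 0.1]],
    "forearm": [[0.3, 0.8, 0.2], [0.2, 0.9, 0.3]],
    "palm":    [[0.1, 0.2, 0.9], [0.2, 0.1, 0.8]],
}
centroids = {loc: centroid(vs) for loc, vs in training.items()}

print(classify([0.85, 0.25, 0.1], centroids))  # prints "wrist"
```

The real system replaces these made-up vectors with features extracted from the armband’s sensors, and the nearest-centroid step with a trained machine-learning classifier, but the idea is the same: acoustically distinct body locations map to distinct feature signatures.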
“Appropriating the human body as an input device is appealing not only because we have roughly two square meters of external surface area, but also because much of it is easily accessible by our hands (e.g., arms, upper legs, torso),” Harrison writes on his blog. “Furthermore, proprioception (our sense of how our body is configured in three-dimensional space) allows us to accurately interact with our bodies in an eyes-free manner.”
Just watching the demo video makes it obvious how useful this technology could be, and how radically computing paradigms are changing. Device capabilities have improved rapidly, but until the last few years, the ways we interface with technology remained clunky.
Technology like Skinput makes sense for the evolution of computer interfaces. And how can you beat playing Tetris on your hand?