Moves to bring human-like senses to computers, i.e. incorporating touch, voice, facial recognition and other technologies into our information and communication devices, are gaining ground.
Only a few years ago, electronic devices using voice and touch recognition were the stuff of science fiction movies. Today, you can already have some of these technologies installed in your living room or purchase eyewear incorporating them. Recently Atheer Labs introduced glasses which enable the user to combine augmented reality with gesture-based manipulation for design purposes. This field of development, which has been dubbed ‘perceptual computing’, is all about incorporating facial, voice and/or touch recognition into our electronic devices. Whether we are talking about building this sensory functionality into a ‘traditional’ PC, integrating it into objects or places, or embedding the technologies into clothing or furniture, this idea is now making great strides and increasingly integrating the machine into our daily lives. Intel is a leader in this field and has developed several initiatives to boost the sector: its online Perceptual Computing Challenge and, more recently, a $100 million fund to finance perceptual computing and similar projects that will enable the use of multiple senses to interact with computing devices.
A developing market
The term ‘perceptual computing’ covers a range of technologies and a variety of devices. Intel mentions touch, voice, and image recognition, as well as sentiment detection and measurement technologies. In the field of gesture recognition, for instance, a 3D depth camera sends signals into a room and measures the time they take to bounce back in order to detect movement and map the room’s 3D space. Intel, one of the pioneers in this emerging market, has provided backing for a startup called SoftKinetic, licensing its industry-leading middleware for close-range gesture tracking as part of the Intel® Perceptual Computing Software Development Kit (SDK). SoftKinetic enables three-dimensional mapping of a room using a camera incorporated into an ultra-flat device. Intel is also working with Creative on the Senz3D peripheral camera, which supports gesture-based interaction and is scheduled to be launched on the market this year.
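The time-of-flight principle described above can be sketched in a few lines: the camera emits a light pulse, each pixel records how long the reflection takes to return, and halving the light's travel distance gives the depth at that pixel. This is an illustrative simplification (real sensors typically measure per-pixel phase shift rather than raw round-trip time); the function names are hypothetical.

```python
# Illustrative sketch of time-of-flight depth sensing, as used by
# 3D depth cameras for gesture recognition. Not a real sensor API.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def depth_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface, in metres.

    The emitted signal travels to the surface and back, so the
    one-way distance is half the total distance the light covered.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

def depth_map(round_trip_times: list[list[float]]) -> list[list[float]]:
    """Convert a grid of per-pixel round-trip times into a depth map."""
    return [[depth_from_round_trip(t) for t in row] for row in round_trip_times]

# A reflection returning after roughly 6.67 nanoseconds corresponds
# to a surface about one metre away.
print(round(depth_from_round_trip(6.671e-9), 2))
```

The tiny round-trip times involved (nanoseconds for room-scale distances) are why time-of-flight sensing requires specialised hardware rather than ordinary cameras.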
Natural, intuitive and immersive experience
The basic approach with this type of device is to incorporate technologies which make the user experience more natural, intuitive and immersive. Being able to use one or more of your five senses to send commands to an electronic device is a huge step towards integrating the technology into all our day-to-day interactions. Talking directly to a machine rather than having to type on a keyboard is a move towards ‘humanizing’ the machine and provides the user with a more natural experience. With its perceptual computing fund, Intel aims to help startups on a technical, business development and financial level. Such technologies have a wide range of applications; education is one of them, especially the field of ‘edutainment’.