Using a programme designed for use with Microsoft Kinect, US university researchers have built a system that analyses posture and provides verbal instructions to people who cannot see a trainer’s movements and imitate them.
The Kinect system seems to be inspiring researchers to break new ground. It has already been used to restore human links between teachers and students, to develop distance learning and to model crowd behaviour. The Microsoft smart camera system, which enables capture and interpretation of human movement, now looks set to open up a variety of sports to blind or partially-sighted people.

One such application has been developed by researchers at the University of Washington specifically for yoga. In a traditional yoga class, the teacher leads the students through the various poses by demonstrating them physically, which means that if you cannot see very well it is almost impossible to follow the class. Setting out to solve the problems faced by the visually impaired when taking such classes, the researchers have created software for the Kinect using geometrical analysis to spot wrong posture. A voice system then gives verbal instructions to help the participant modify his/her position and attain the desired posture.

“My hope for this technology is for people who are blind or low-vision to be able to try it out, and help give a basic understanding of yoga in a more comfortable setting,” explained project lead Kyle Rector, a UW doctoral student in computer science and engineering.
Reading body angles, giving verbal feedback
Although Kinect software does have some limitations in the level of detail with which it can track movement, the researchers still opted to use it because the hardware is inexpensive and readily available on the market. The ‘Eyes Free Yoga’ system, as they have called it, covers six yoga poses, reading the angles of the user’s body based on simple geometrical rules and applying the law of cosines. For example, in certain positions, a yoga student’s leg is supposed to be flexed at an angle of 90 degrees, while the arm must be aligned to form an angle of 160 degrees. The Eyes Free system gives verbal feedback in real time, with instructions on placing the arms, legs, neck and back in order to attain the correct position. The product is not a conventional video game but what is known as an ‘exergame’ – a video game designed to encourage people to take exercise – in this case allowing people with poor sight to interact verbally with a simulated instructor.
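To illustrate the kind of geometry involved, here is a minimal sketch of how the angle at a joint can be computed with the law of cosines from three tracked skeleton points. This is an illustrative reconstruction, not the Eyes Free Yoga source code, and the `joint_angle` helper and the sample coordinates are hypothetical:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees), formed by 3D points a-b-c,
    computed with the law of cosines."""
    ab = math.dist(a, b)   # length of segment a-b
    bc = math.dist(b, c)   # length of segment b-c
    ac = math.dist(a, c)   # length of segment a-c (opposite the joint)
    # law of cosines: ac^2 = ab^2 + bc^2 - 2*ab*bc*cos(angle at b)
    cos_b = (ab ** 2 + bc ** 2 - ac ** 2) / (2 * ab * bc)
    cos_b = max(-1.0, min(1.0, cos_b))  # clamp against float round-off
    return math.degrees(math.acos(cos_b))

# Hypothetical hip, knee and ankle positions for a leg bent at a right angle
hip, knee, ankle = (0, 1, 0), (0, 0, 0), (1, 0, 0)
print(round(joint_angle(hip, knee, ankle)))  # 90
```

A system like the one described would compare such a measured angle against a target for the pose (e.g. 90 degrees at the knee, 160 degrees along the arm) and speak a correction when the difference exceeds some tolerance.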
Test group helped to hammer out the essential commands
The researchers worked with a group of 16 blind and low-vision people around Washington State to test the programme and gather feedback. Several of the participants had never done yoga before, others had tried it a few times, and a few took yoga classes regularly. The UW team also worked with a number of yoga instructors to establish the basic criteria for correct alignment in each pose. The result was around 30 different commands for improving each of the yoga positions taught, based on a dozen rules deemed essential for each position. The Kinect first checks the subject’s core and suggests alignment changes, then moves to the head and neck area, and finally focuses on the arms and legs.
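The region-by-region checking order described above can be sketched as a small rule table that is evaluated core first, then head and neck, then limbs, speaking only the first failing correction so the user fixes one thing at a time. The specific rule names, target angles, tolerances and phrasings below are hypothetical stand-ins, not the actual Eyes Free Yoga rules:

```python
# Checking order from the article: core, then head/neck, then arms and legs.
REGION_ORDER = ["core", "head_neck", "limbs"]

# (region, measurement name, target angle in degrees, tolerance, instruction)
RULES = [
    ("core",      "spine",      180, 15, "Straighten your back."),
    ("head_neck", "neck",       180, 20, "Lift your chin slightly."),
    ("limbs",     "left_knee",   90, 10, "Bend your left knee more."),
    ("limbs",     "right_arm",  160, 10, "Raise your right arm."),
]

def next_instruction(measured):
    """Return the first correction to speak, or None if the pose passes.
    `measured` maps measurement names to angles in degrees."""
    for region in REGION_ORDER:
        for rule_region, name, target, tol, message in RULES:
            if rule_region == region and abs(measured[name] - target) > tol:
                return message
    return None

# The spine is out of tolerance, so it is corrected before the knee,
# even though the knee is also wrong.
print(next_instruction({"spine": 160, "neck": 180,
                        "left_knee": 60, "right_arm": 160}))
```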