SoundSense, the result of research at Dartmouth College in New Hampshire, brings power to the most basic of sensors on a mobile phone: the microphone. While much attention has been paid to applications that use GPS or the accelerometer, the microphone is the only sensor that can be used effectively at all times, even when the phone is in a bag or pocket. According to the project authors, the microphone is "a powerful sensor that is capable of making sophisticated inferences about human activity, location, and social events from sound." The iPhone application makes use of a general-purpose sound sensing system designed specifically to run on resource-limited phones, rather than on a device with more processing power and memory.
SoundSense learns on its own to differentiate between general sound types (music, voice, background noise) and keeps track of how often each occurs. When a particular sound recurs often enough, the app asks the user to label it. The system runs entirely on the phone, with no back-end interaction.
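The classify-count-label loop described above can be sketched in a few lines. This is a toy model, not the authors' implementation: the feature names, thresholds, and the rule-based classifier below are all illustrative stand-ins for the learned models and parameters described in the paper.

```python
from collections import Counter

# Hypothetical threshold: how often a sound type must recur
# before the user is prompted to label it.
LABEL_THRESHOLD = 3

class SoundSenseSketch:
    """Toy model of the SoundSense pipeline: coarse classification,
    frequency tracking, and on-device user labeling."""

    def __init__(self):
        self.counts = Counter()  # occurrences per coarse sound type
        self.labels = {}         # user-provided labels, kept on-device

    def classify(self, features):
        """Stub classifier mapping audio features to a general type.
        The real system uses learned models over spectral features;
        these thresholds are invented for illustration."""
        if features.get("periodicity", 0.0) > 0.7:
            return "music"
        if features.get("energy", 0.0) > 0.5:
            return "voice"
        return "ambient"

    def observe(self, features, ask_user=None):
        """Process one audio frame; return (sound_type, label or None)."""
        sound_type = self.classify(features)
        self.counts[sound_type] += 1
        # Once a sound recurs often enough, prompt the user to label it.
        if (self.counts[sound_type] >= LABEL_THRESHOLD
                and sound_type not in self.labels and ask_user):
            self.labels[sound_type] = ask_user(sound_type)
        return sound_type, self.labels.get(sound_type)

# Example: three frames of highly periodic audio trigger a labeling prompt.
sense = SoundSenseSketch()
for _ in range(3):
    kind, label = sense.observe({"periodicity": 0.9},
                                ask_user=lambda k: "gym playlist")
```

Because the counters and label dictionary live only in the object (i.e., on the device), the sketch also mirrors the privacy stance described below: nothing leaves the phone.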
According to the published paper (PDF), the app can detect conversations, recognize activities or locations, track social networks, and even identify dietary habits simply by analyzing audio data.
Specific applications were not the main focus of the paper, but those discussed include an audio diary and a music detector. Time spent in the car or at rock shows is just one example of the countless activities that could be quantified or shared via social media.
The information the system collects is meant to stay local. Because everything runs on the phone and nothing is sent to a server, the authors argue that user privacy is protected.
The MetroSense project works with industries and agencies to develop new uses for the sensors already built into mobile phones.
Related sensor-based research such as CenceMe, Sensor Sharing, BikeNet, AnonySense, and Second Life Sensor can be found at the MetroSense Projects Page.