Today Apple pulled an application from its iPhone App Store that was reported to include a nude photo of a fifteen-year-old girl. Prohibited content has long been feared as a major weakness of the App Store, and this incident has placed that fear squarely in the spotlight. The app, BeautyMeter from Braun Software, lets users upload images of themselves and rate others' images on a five-star scale in the categories of face, body, and clothes. The screenshot in question, a partially nude, topless iPhone user, has not been verified as depicting an underage girl, but the listing carried just such a caption along with nearly five thousand ratings.

News headlines have claimed "Porn Comes to the iTunes App Store" and, one day later, "Apple: No Porn Allowed in iPhone's App Store," so it's easy to get confused. Add to the mix a catalog of over fifty thousand apps and the one billion downloads reached in April, and the question becomes: how can Apple effectively police all of this content?

Although applications like this are permitted to show nudity to users eighteen and older, the safeguards Apple relies on run on developer-maintained servers, according to Neowin. For BeautyMeter, those safeguards include tracking the iPhone's ID number, which can be traced back to the device's owner. There is no word yet on how this particular incident circumvented that security, but public opinion is placing the blame squarely on Apple's shoulders.

Wired's coverage suggests Apple cannot win with adult content: both failures of the system and perceived censorship cast the company as a villain. Gizmodo's article, meanwhile, hopes the issue can be resolved by raising public awareness.

After all, just because an app runs on Apple's platform does not mean Apple developed it. Hopefully the mainstream media will come to treat the platform the way it treats console video games: public decriers of Grand Theft Auto, for instance, did not condemn Microsoft simply because gamers could play the title on their Xbox 360s.