Apple has pulled the apps of popular photo-sharing site 500px over concerns that it is too easy to search for nude photographs within the app. This, despite the fact that 500px’s method of dealing with searches for nude images is even more prohibitive than that of the official Flickr iPhone app. Could Flickr be next?
500px’s mobile app was pulled around 1AM this morning after discussions with Apple over the relative ease with which users could search for nude photos within the app failed to result in a compromise.
TechCrunch explains the issue:
The Apple reviewer told the company that the update couldn’t be approved because it allowed users to search for nude photos in the app. This is correct to some extent, but 500px had actually made it tough to do so, explains Tchebotarev. New users couldn’t just launch the app and locate the nude images, he says, the way you can today on other social photo-sharing services like Instagram or Tumblr, for instance. Instead, the app defaulted to a “safe search” mode where these types of photos were hidden. To shut off safe search, 500px actually required its users to visit their desktop website and make an explicit change.
Tchebotarev said the company did this because they don’t want kids or others to come across these nude photos unwittingly. “Some people are mature enough to see these photos,” he says, “but by default it’s safe.”
What’s interesting to me is that 500px’s method of keeping minors from seeing nude images in its official iOS app is a lot more prohibitive than that employed by Flickr, a similar photo-sharing service. On Flickr, users also default to “Safe Search” mode, but when a nude image is searched for, it’s still only a single in-app tap away.
If 500px can be kicked off the App Store for allowing users to search for nude photographs, will Apple stop there, or start going after the apps of other sites that host nudity, like Flickr and Tumblr?
500px has apparently already submitted an updated version of its app, now awaiting approval, that makes it impossible to search for nude images within the app. But come on, this is just silly. 500px is an app for photographers, not pornographers, and it already has a system for weeding out pornography that is uploaded to the service. Surely, at a certain point, Apple has just got to trust that the world’s not going to end if an app takes sensible precautions and a minor accidentally sees a pair of boobs. Right?
Update: Apple has responded to the controversy with a statement.
The app was removed from the App Store for featuring pornographic images and material, a clear violation of our guidelines. We also received customer complaints about possible child pornography. We’ve asked the developer to put safeguards in place to prevent pornographic images and material in their app.