No, Apple’s not building a massive archive of bra pictures

The internet is upset at one of Apple's machine learning applications.
Photo: Michael Summers/Flickr CC

Is Apple building up a massive centralized archive of bra pictures belonging to its female users? To invoke Betteridge’s law about attention-grabbing headlines that end with a question mark, no, it’s not.

The internet went crazy yesterday after one Twitter user pointed out that typing “brassiere” into the search bar of the Photos app on her iPhone brought up what appeared to be a folder of images of her in a state of undress. As it turns out, though, that’s not exactly the case.

Apple’s image recognition tech

“Attention all girls! Go to your photos and type in brassiere. Why are Apple saving these and made it a folder!?” the Twitter user in question wrote. Immediately there was an outcry, with other users repeating the claim that Apple saves these photos in a folder of their own, and saying the discovery made them feel “really violated.”

In reality, Apple isn’t trawling through your photos and picking out the more salacious ones at all. What had been “discovered” was the on-device image recognition Apple introduced with last year’s iOS refresh, which uses machine learning to work out what objects are present in your pictures.

Bras are one such item, but the list also includes less controversial things such as abacuses, fairgrounds, fish tanks, Basset Hounds, citrus fruit, Japanese radishes and hundreds more. (For a full list, check out this comprehensive round-up.) When you search for one of these terms, Apple’s AI algorithms scan through your pictures and assemble a temporary folder so you can browse the matches.
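Apple hasn’t published the exact pipeline Photos uses, but its public Vision framework (iOS 13 and later) exposes a similar on-device classifier. Here’s a minimal Swift sketch, under the assumption that you have plain file URLs for the photos rather than the Photos library itself, of how an app could label images locally and build a temporary “search folder” without sending anything to a server:

```swift
import Foundation
import Vision

// Run Apple's built-in image classifier entirely on-device and return the
// labels it is reasonably confident about. The model, the request and the
// photo never leave the device.
func labels(forImageAt url: URL, minimumConfidence: Float = 0.5) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: url, options: [:])
    try handler.perform([request])

    guard let observations = request.results as? [VNClassificationObservation] else {
        return []
    }
    return observations
        .filter { $0.confidence >= minimumConfidence }
        .map { $0.identifier }
}

// Assemble a temporary "search folder": keep only the photos whose labels
// contain the query term (e.g. "brassiere"). This loosely mirrors the
// collection Photos shows when you type a term into its search bar.
func photos(matching query: String, in photoURLs: [URL]) -> [URL] {
    photoURLs.filter { url in
        let found = (try? labels(forImageAt: url)) ?? []
        return found.contains { $0.localizedCaseInsensitiveContains(query) }
    }
}
```

This is a rough approximation, not Photos’ actual implementation, but it illustrates the key point: the classification happens on the handset, and the “folder” is just a filtered view of pictures you already have.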

The biggest point, however, is that the images aren’t in any way Apple’s to look through. All of the face recognition, scene detection and object detection happens locally on your device, on images that are also stored locally. If anything, the backlash shows that Apple needs to do a better job of advertising some of these cutting-edge features.

An alternative solution

If you are worried about sensitive images on your device, however, I can recommend an app I wrote about in a recent “Awesome Apps of the Week” roundup. Called Nude, the app uses AI similar to Apple’s own deep-learning image recognition to scan your photos for nude pictures, move them into a private vault, delete them from your camera roll, and make sure none of them find their way to iCloud.

Nude, which is free to download but costs $10 a year to run, will even track attempts to get into your photo vault, and can also be used to protect other sensitive pictures, such as photos of credit cards, IDs or driver’s licenses. You can download it here.
