
No, Apple’s not building a massive archive of bra pictures

The internet is upset at one of Apple's machine learning applications.
Photo: Michael Summers/Flickr CC

Is Apple building up a massive centralized archive of bra pictures belonging to its female users? To invoke Betteridge’s law of headlines, which covers attention-grabbing headlines that end with a question mark: no, it’s not.

The internet went crazy yesterday after one Twitter user pointed out that typing “brassiere” into the search bar of her iPhone’s Photos app brought up what appeared to be a folder of images of her in a state of undress. As it turns out, though, that’s not exactly the case.

Apple’s image recognition tech

“Attention all girls! Go to your photos and type in brassiere. Why are Apple saving these and made it a folder!?” the Twitter user in question wrote. Immediately there was an outcry, with other users claiming that Apple was saving these photos in folders of its own making, and that this made them feel “really violated.”

In reality, Apple isn’t trawling through your photos and picking out the more salacious ones. Instead, what had been “discovered” was the image recognition Apple introduced with last year’s iOS refresh, which uses machine learning to identify the objects present in your images.

Bras are one such item, but the list also includes less controversial categories such as abacuses, fairgrounds, fish tanks, Basset Hounds, citrus fruit, Japanese radishes, and hundreds more. (For the full list, check out this comprehensive round-up here.) When you search for one of these terms, Apple’s AI algorithms scan through your pictures and assemble a temporary folder of matches.
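Apple hasn’t published the model that powers the Photos feature, but the general technique is available to developers through the Vision framework. The sketch below, assuming a local image file and Vision’s built-in classifier (a later public API, not the Photos app’s own code), shows how a photo can be labeled entirely on-device:

```swift
import Vision

// A minimal sketch of on-device image classification using Apple's public
// Vision framework (iOS 13+ / macOS 10.15+). The Photos app's actual model
// and label list are not public API, so this only illustrates the general
// technique: the photo is analyzed on the device and never uploaded.
func classify(imageAt url: URL) throws -> [String] {
    let handler = VNImageRequestHandler(url: url, options: [:])
    let request = VNClassifyImageRequest()
    try handler.perform([request])
    let observations = (request.results as? [VNClassificationObservation]) ?? []
    // Keep only reasonably confident labels ("fish", "fairground", etc.).
    return observations
        .filter { $0.confidence > 0.5 }
        .map { $0.identifier }
}
```

Searching in Photos then amounts to matching your query against labels like these, which is why the “folder” only exists as the result set of a search.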

The biggest point, however, is that these images aren’t in any way Apple’s to look through. All of the face recognition, scene detection, and object detection is performed locally on your device, on images that are also stored locally. If anything, the backlash shows that Apple needs to do a better job of advertising some of these cutting-edge features.

An alternative solution

If you are worried about sensitive images on your device, however, I can recommend an app I wrote about in a recent “Awesome Apps of the Week” roundup. Called Nude, the app uses AI similar to Apple’s own deep learning image recognition technology to scan your photos for nudes, place them in a private vault, delete them from your camera roll, and ensure that none of them make their way to iCloud.
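Nude’s implementation isn’t public, but conceptually its scanning step might resemble the sketch below, which uses PhotoKit to walk the camera roll; the `isSensitive` function is a hypothetical stand-in for the app’s actual deep learning classifier, and photo library authorization is assumed to have been granted already:

```swift
import Photos
import UIKit

// Hypothetical stand-in for a Core ML nudity classifier; Nude's real model
// and vault logic are not public, so this is purely illustrative.
func isSensitive(_ image: UIImage) -> Bool { false }

// A rough sketch of the scan-and-flag step an app like Nude might perform.
// Assumes PHPhotoLibrary authorization has already been granted.
func scanCameraRoll(flagged: @escaping (PHAsset) -> Void) {
    let assets = PHAsset.fetchAssets(with: .image, options: nil)
    let manager = PHImageManager.default()
    let thumbnail = CGSize(width: 224, height: 224) // typical classifier input size

    assets.enumerateObjects { asset, _, _ in
        manager.requestImage(for: asset,
                             targetSize: thumbnail,
                             contentMode: .aspectFill,
                             options: nil) { image, _ in
            if let image = image, isSensitive(image) {
                flagged(asset) // caller would move it to the vault and delete the original
            }
        }
    }
}
```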

Nude, which is free to download but costs $10 a year to use, will even track attempts to get into your photo vault, and can also be used to protect sensitive images such as photos of credit cards, IDs, or driver’s licenses. You can download it here.


2 responses to “No, Apple’s not building a massive archive of bra pictures”

  1. Cai says:

    Oi, people are so paranoid. I’d do some homework if I found this, before making false accusations!

  2. Sandeman21 says:

    Wonderful. So can you also please explain how machine learning works?
    How does a machine learn?

    Does it send your photos to Apple’s cloud to process and compare with other photos in order to categorize them?

    And do these tags get saved on Apple’s servers “for our convenience,” thus identifying who has such photos on their device? Isn’t Apple’s iCloud the easiest to hack, as has been proven time and again?
