Apple’s hidden Photos feature will blow some minds

“As we leave 2016 behind it’s worth taking a look at one of the features of Apple’s iOS 10 that doesn’t get as much coverage as it should. In this case, it’s the iPhone’s ability to search your photos and tag both people and objects,” Ian Morris writes for Forbes. “What’s interesting here though is that Apple does this locally, rather than using the cloud as Google does with its Photos service.”

“It’s not insignificant that this happens locally either. For one thing, it’s a lot harder to use a phone than a cluster of supercomputers,” Morris writes. “But also, it does mean that those who don’t sync images to iCloud can still enjoy the service.”

“Both Google and Apple’s services use ‘computer vision’ to detect what’s in photos,” Morris writes. “But on the iPhone it’s incredibly easy to pick a subject, let’s say ‘cats’, and search your photo library for images that feature a cat… It’s worth trying a few searches to see what you can come up with. To get going, just open the Apple Photos app and press the little search magnifying glass.”
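Conceptually, the on-device approach described above boils down to two steps: a vision model assigns labels to each photo locally, and search then becomes a simple lookup in a local index, with no cloud round-trip. A minimal sketch in pure Python, with hard-coded labels standing in for whatever Apple's actual classifier produces (the filenames and labels here are purely hypothetical):

```python
# Toy sketch of an on-device photo search index.
# A real on-device classifier would produce the labels;
# these hard-coded sets are illustrative stand-ins.
from collections import defaultdict

# Step 1: classification assigns labels to each photo locally.
photo_labels = {
    "IMG_0001.jpg": {"cat", "sofa"},
    "IMG_0002.jpg": {"ocean", "beach"},
    "IMG_0003.jpg": {"cat", "garden"},
}

# Step 2: build an inverted index (label -> photos), stored on device.
index = defaultdict(set)
for photo, labels in photo_labels.items():
    for label in labels:
        index[label].add(photo)

def search(term):
    """Look up a search term in the local index; no network required."""
    return index.get(term.lower(), set())

print(sorted(search("cat")))  # both cat photos, found entirely offline
```

Because the index lives on the device, the search works even for users who never sync images to iCloud, which is the point Morris makes above.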

Read more in the full article here.

MacDailyNews Take: Yes, it’s fast and works well for many things (try “ocean,” “lake,” “baby,” etc.).

9 Comments

    1. I’m honestly curious: did you expect iOS to locate individual snowflakes, or just to pull up every one of your photos that is half white or all white?
      It’s software, not Harry Potter magic. Be impressed, people! This is an amazing feat to pull off locally. And your personal private photos are yours, not Google’s, or whoever they’re selling your data to.

  1. But what a royal pain when it comes to tagging people, as each device is independent and there is no syncing. So with an iPhone, two iPads and an iMac (in my case), you are forced to maintain separate People libraries.

  2. I’ve been praising this feature for some time now.

    It’s not perfect, but it’s a very impressive starting point and of course it’s going to be getting better.

    I was especially impressed when it produced great matches with searches for “SUV” or “birthday”. “SUV” was an interesting one because, to an image recognition algorithm, it’s not an obviously different object from a normal car. When I searched for “truck”, I got an entirely different (and entirely appropriate) selection of images. “Birthday” was quite an abstract search term, but the images offered were of things like cakes, decorations and people in party hats.

    If you want to see some of the search terms that Photos recognises, just type a single letter and look at the list of suggested words that it offers. You might be surprised at how comprehensive the list really is.

    I’d be interested to know whether Photos learns from individual users, and whether we can train it to better identify our specific images.
