“As we leave 2016 behind it’s worth taking a look at one of the features of Apple’s iOS 10 that doesn’t get as much coverage as it should. In this case, it’s the iPhone’s ability to search your photos and tag both people and objects,” Ian Morris writes for Forbes. “What’s interesting here though is that Apple does this locally, rather than using the cloud as Google does with its Photos service.”

“It’s not insignificant that this happens locally either. For one thing, it’s a lot harder to use a phone than a cluster of supercomputers,” Morris writes. “But also, it does mean that those who don’t sync images to iCloud can still enjoy the service.”

“Both Google and Apple’s services use ‘computer vision’ to detect what’s in photos,” Morris writes. “But on the iPhone it’s incredibly easy to pick a subject, let’s say ‘cats’, and search your photo library for images that feature a cat… It’s worth trying a few searches to see what you can come up with. To get going, just open the Apple Photos app and press the little search magnifying glass.”
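The model Apple runs inside the Photos app isn’t public, but the general approach — an on-device classifier that turns each photo into searchable text labels, with no round trip to a server — can be sketched with Apple’s public Vision framework. To be clear about the assumptions: VNClassifyImageRequest shipped later (iOS 13) than the iOS 10 feature Morris describes, and the usage shown is hypothetical, so treat this as an illustration of local image tagging in general, not Apple’s actual Photos implementation.

```swift
import Vision
import UIKit

/// Illustrative sketch only: Apple's Photos app uses its own private on-device
/// models. This shows the same general idea with the public Vision framework
/// (VNClassifyImageRequest, iOS 13+), which also runs entirely on the device.
func labels(for image: UIImage, minimumConfidence: Float = 0.5) throws -> [String] {
    guard let cgImage = image.cgImage else { return [] }

    let request = VNClassifyImageRequest()           // built-in, on-device classifier
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])                   // no network call is made here

    let observations = request.results as? [VNClassificationObservation] ?? []
    return observations
        .filter { $0.confidence >= minimumConfidence }
        .map { $0.identifier }                       // e.g. "cat", "beach", "baby"
}

// Hypothetical usage: tag a photo locally once, then a search for "cat"
// only has to match against the stored label strings.
// let tags = try labels(for: somePhoto)
// let isCatPhoto = tags.contains("cat")
```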

Read more in the full article here.

MacDailyNews Take: Yes, it’s fast and works well for many things (try “ocean,” “lake,” “baby,” etc.).