Sometimes, even as a tech reporter, the pace of technology catches you off guard. Case in point: it wasn’t until today that I learned my iPhone offers a feature I’ve long wanted: the ability to identify plants and flowers from a photo.
It’s true that several third-party apps have offered this feature for years, but the last time I tried them, I was disappointed by their speed and accuracy. And yes, there’s Google Lens and Snapchat Scan, but it’s always less convenient to open an app I wouldn’t otherwise use.
But since the release of iOS 15 last September, Apple has offered its own version of this visual search feature. It’s called Visual Look Up, and it’s damn good.
It works very simply. Just open a photo or screenshot in the Photos app and look for the blue “i” icon below it. If it has a small sparkling ring around it, iOS has found something in the photo that it can identify using machine learning. Tap the icon, then tap “Look Up,” and it will try to pull up some useful information.
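Apple hasn’t said much about what happens under the hood, but if you’re curious, the public Vision framework exposes a related building block that developers can play with. Here’s a minimal sketch using `VNClassifyImageRequest`, a general-purpose image classifier; to be clear, this is just an illustration of the kind of on-device machinery involved, not Apple’s actual Visual Look Up pipeline:

```swift
import Foundation
import Vision

// Sketch: ask Vision's built-in classifier what it sees in an image.
// VNClassifyImageRequest is a general-purpose classifier, not the
// Visual Look Up feature itself, but it runs on-device in a similar way.
func classifyLabels(for imageURL: URL) throws -> [(label: String, confidence: Float)] {
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    let request = VNClassifyImageRequest()
    try handler.perform([request])
    // Keep only labels the model is reasonably confident about.
    return (request.results ?? [])
        .filter { $0.confidence > 0.3 }
        .map { ($0.identifier, $0.confidence) }
}
```

Point it at a photo of a houseplant and you’ll get back a list of candidate labels with confidence scores, which gives you a rough feel for what iOS can recognize without ever leaving the device.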


It works not only on plants and flowers, but also on landmarks, art, pets, and “other objects.” It’s not perfect, of course, but it has surprised me far more often than it has disappointed me. Here are some more examples from my camera roll:

Although Apple announced this feature at WWDC last year, it hasn’t really touted its availability. (I spotted it via a link in one of my favorite tech newsletters, The Overflow.) Even the official Visual Look Up support page gives mixed messages, telling you in one place that it’s “US only” and then listing other compatible regions on another page.
Visual Look Up is still limited in availability, but access has expanded since launch. It’s now available in English in the US, Australia, Canada, the UK, Singapore, and Indonesia; in French in France; in German in Germany; in Italian in Italy; and in Spanish in Spain, Mexico, and the US.
It’s a great feature, but it also got me wondering what else visual search could do. Imagine taking a picture of your new houseplant, say, and having Siri ask, “Do you want me to set reminders for a watering schedule?” Or snapping a photo of a landmark on vacation and having Siri search the web for its hours of operation and where to buy tickets.
I learned a long time ago that it’s foolish to pin your hopes on Siri doing something too sophisticated. But these are the kinds of features we could eventually get with future AR or VR headsets. Let’s hope that if Apple introduces this kind of functionality, it makes a bigger splash.