Now that Lens is no longer buried in Google Photos, we'll see faster adoption of visual search.
Google Lens is now available in the Google search app on the iPhone. Previously, iOS users could access Lens visual search only through the Google Photos app, which required taking a picture and then running Lens on it, an awkward workflow.
In the search bar.
Lens now appears in the search bar next to the mic icon and is available with a single tap. That enables visual search of products and objects, buildings and places, plants and animals, QR codes, bar codes, business cards and virtually anything featuring text. Lens currently supports English, Spanish, French, German, Italian, Portuguese and Korean.
Lens gets a B.
In my earlier tests of Lens in Google Photos and on a Google Pixel phone, Lens performed pretty well; I'd give it a "B." On products, books and media, it performs about as well as Amazon's visual search does today. But it outperforms the latter overall with a broader range of object-recognition capabilities.
Using my iPhone in my living room this morning, Lens got about 75 percent of object and text searches right. Moreover, now that Lens is a visible search option, many more people will start to use it. That usage should further improve its image recognition.
Why you should care.
Though it remains to be seen how broadly Lens is adopted, it could become extremely popular, especially for objects and products. It might also become popular as a way to pull up reviews for restaurants and other places as you're out in the world, an augmented reality use case.
So far, there's no search optimization approach for visual search the way there is for images. But what about ads? In a product-search context, one can easily imagine the eventual inclusion of shopping ads.
The bigger point, however, is that search on mobile devices is going to keep diversifying. Visual search will likely take its place beside voice as an alternative to text-query input.