Google Lens in Assistant rolls out to Pixel and Pixel 2

Google Lens, the smart feature that analyzes and identifies what your camera sees, was one of the most exciting capabilities Google touted at its I/O and Pixel events earlier this year. Although a limited version of Google Lens has already been available in Google Photos for Pixel users, some users have now begun to find the visual search feature integrated with the Assistant on their Pixel and Pixel 2 phones.

Google Lens in Assistant was first spotted by users on Friday evening, who found the visual search feature up and running on their Pixel and Pixel 2 phones. Once the rollout is complete, all Pixel and Pixel 2 users will be able to use Google Lens with Assistant as well as with Photos.

In Photos, Lens can be activated when viewing any image or screenshot, and it can recognize and capture information such as phone numbers, addresses and URLs. In Assistant, by contrast, Lens is built into the new Assistant interface that is triggered by holding down the home button.

In Photos, Google Lens will recognize objects and provide you with information. / © ANDROIDPIT

The new button in the bottom right corner opens up a camera viewfinder. Tapping anywhere on the image freezes the view, outlines the item in question and Google Lens starts a search which offers up a range of possible results to identify the object, as well as suggested actions such as search the web, open other apps, and more. Naturally, you can also share the result along with a feedback rating (thumbs up or thumbs down) which should improve the effectiveness of Google Lens over time.

The new interface also allows users to quickly start a voice search with the microphone, and you can start a new visual query with Google Lens by re-tapping the Lens icon in the bottom right.

Although the update has yet to roll out to our own Pixel phones, it will certainly be welcomed by Google Pixel users of both generations, who will benefit from a machine learning assisted user experience unparalleled by other devices. In the very long term, this feature should roll out to all Android phones, but it will likely be a Pixel exclusive for a while.

Are you excited to use Google Lens in Assistant? Have you already received the update?

Source: 9to5google


1 Comment


  • Abdul Ghafoor Nov 20, 2017 Link to comment

    Bixby Vision already does this, but in a more practical way. You point the camera at something, it recognizes it, and gives you the option of seeing matching images, or shopping for the thing it identifies. That's a lot more useful than pointing your camera at a copy of "Cat in the Hat" and having the assistant tell you "That's a book called 'Cat in the Hat'."