Google Lens will live in Google Photos on the Pixel 2, coming to Google Assistant later this year
Google Pixel 2 owners will get the first taste of Google Lens' computer vision powers.
Google has announced that Google Lens will come to Google Photos on the new Pixel 2 phones, with support in Google Assistant coming later this year.
First shown at this year's Google I/O developer conference, Lens is a piece of behind-the-scenes software that makes sense of what the camera is looking at and acts on it in real time. Imagine taking a photo of a flower and identifying the species at the same time, or pointing the camera at a restaurant's storefront and immediately pulling up customer ratings.
On the Pixel 2 and Pixel 2 XL, Lens is baked into Google Photos. Simply tap the new Lens button on an existing photo to discover new data or get actionable options. For example, if it's a photo of a building, you'll get more info about the place; if it's an invitation with phone numbers and links, you'll get options to create new contact and event entries.
Lens will also come to Google Assistant via an update later this year. Say you chance upon a concert poster on the street while chatting with your friend; you can just point the camera at said poster midway through the conversation to pull up ticketing info. That's way faster than jumping in and out of different apps.
At today's fall hardware event, Google demonstrated a few things Lens can do. To recap:
Google Lens can transcribe email addresses and phone numbers for you to use.
It can recognize an artist by a painting.
It'll give you details about a movie if you capture its poster.
It can tell you about a book when you point the camera at it.
It can identify a music artist from album art.
It can tell you about a location just from its photo.
Google has also shared a video showing Lens in action in both Google Photos and Google Assistant.