Google Lens is an image recognition technology developed by Google, designed to bring up relevant information about objects it identifies using visual analysis based on a neural network. First announced at Google I/O 2017, it initially appeared inside Google Assistant and Google Photos, was later released as a standalone app, and has since been integrated into the standard camera app on some Android phones.

Google Lens was one of Google’s biggest announcements in 2017, but it was primarily a Google Pixel-exclusive feature at launch.
If you’ve been waiting for Google Lens to come to more phones, that day has finally arrived: at Google I/O 2018, Google announced that Lens is rolling out to many more devices, and the app is now available to download on Google Play.
Google Lens is an AI-powered technology that uses your smartphone camera and deep machine learning not only to detect an object, but also to understand what it detects and offer actions based on what it sees. Anyway, here is everything you need to know about the feature.
What is Google Lens?
Google Lens is a super-powered version of Google Goggles, and it’s quite similar to Samsung’s Bixby Vision. It enables you to point your phone at something, such as a specific flower, and ask Google Assistant what the object is. You’ll not only be told the answer, but you’ll also get suggestions based on the object, like nearby florists in the case of a flower.
Other examples of what Google Lens can do include taking a picture of the SSID sticker on the back of a Wi-Fi router, after which your phone will automatically connect to that Wi-Fi network without you needing to do anything else. Yep, no more crawling under the cupboard to read out the password whilst typing it into your phone. Now, with Google Lens, you can literally point and shoot.
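For the curious, those router stickers usually work because the QR code encodes the network details in a de-facto plain-text format (the “WIFI:” convention popularised by the ZXing barcode library). Below is a minimal Kotlin sketch of parsing that payload; the data class and function names are ours, and real payloads can also escape special characters, which this simplified version ignores.

```kotlin
// Minimal sketch: parsing the de-facto "WIFI:" QR payload found on router
// stickers (the ZXing convention). Real payloads can escape ; , : and \
// characters; this simplified parser ignores escaping.

data class WifiCredentials(val ssid: String, val password: String?, val security: String?)

fun parseWifiQr(payload: String): WifiCredentials? {
    if (!payload.startsWith("WIFI:")) return null
    // Fields look like T:WPA;S:mynetwork;P:mypass;; and can appear in any order.
    val fields = payload.removePrefix("WIFI:")
        .split(';')
        .filter { it.contains(':') }
        .associate { field ->
            val (key, value) = field.split(':', limit = 2)
            key to value
        }
    val ssid = fields["S"] ?: return null
    return WifiCredentials(ssid, fields["P"], fields["T"])
}

fun main() {
    println(parseWifiQr("WIFI:T:WPA;S:HomeNetwork;P:hunter2;;"))
    // WifiCredentials(ssid=HomeNetwork, password=hunter2, security=WPA)
}
```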
Features
When you direct the phone’s camera at an object, Google Lens will attempt to identify it by reading barcodes, QR codes, labels and text, and then show relevant search results and information. For example, when pointed at a Wi-Fi label containing the network name and password, it can automatically connect the phone to the scanned Wi-Fi network. Lens is also integrated into the Google Photos and Google Assistant apps. The service is a successor to Google Goggles, a previous app that functioned similarly but with less capability; Lens uses more advanced deep learning routines to power its detection, as do comparable apps such as Bixby Vision (on Samsung devices released after 2016) and Image Analysis Toolset (available on Google Play). At Google I/O 2019, Google announced four new features: Lens will be able to recognize and recommend items on a menu, calculate tips and split bills, show how to prepare dishes from a recipe, and read text aloud using text-to-speech.
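Google hasn’t published the models behind Lens, but you can get a feel for the label-the-object part of the pipeline with Google’s ML Kit image labeling library (com.google.mlkit:image-labeling), which runs a much simpler classifier on-device. A rough Android sketch, not Lens’s actual implementation:

```kotlin
// Hypothetical Android sketch: on-device image labeling with Google's ML Kit
// (com.google.mlkit:image-labeling). This is not how Lens itself is built --
// Lens's models are proprietary -- but it shows the same kind of
// camera-frame -> label pipeline.
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.label.ImageLabeling
import com.google.mlkit.vision.label.defaults.ImageLabelerOptions

fun labelObjects(bitmap: Bitmap) {
    val image = InputImage.fromBitmap(bitmap, /* rotationDegrees = */ 0)
    val labeler = ImageLabeling.getClient(ImageLabelerOptions.DEFAULT_OPTIONS)
    labeler.process(image)
        .addOnSuccessListener { labels ->
            // Each label carries a description and a confidence score.
            labels.forEach { label ->
                println("${label.text}: ${"%.2f".format(label.confidence)}")
            }
        }
        .addOnFailureListener { e -> println("Labeling failed: $e") }
}
```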
How does Google Lens work?
Google Lens app
Google has a standalone app on Android for Google Lens if you want to get straight into the features. You can access Google Lens through a whole range of other methods – detailed below – but the Lens app is the most direct. In truth, the experience is similar whichever approach you take, and tapping the Lens icon in Google Assistant takes you through to the same view you get directly in the Lens app.
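If you’re building your own app and want to hand users off to the standalone Lens app, here is a hedged sketch using standard Android APIs; “com.google.ar.lens” is the package id on the Lens Play Store listing at the time of writing, so treat it as an assumption.

```kotlin
// Sketch: jumping straight into the standalone Lens app from your own code.
// "com.google.ar.lens" is assumed to be the Lens app's package id; Google
// could change it.
import android.content.Context
import android.content.Intent
import android.net.Uri

fun openGoogleLens(context: Context) {
    val lensPackage = "com.google.ar.lens" // assumed Play Store package id
    val launch = context.packageManager.getLaunchIntentForPackage(lensPackage)
    if (launch != null) {
        context.startActivity(launch)
    } else {
        // Fall back to the Play Store listing if Lens isn't installed.
        context.startActivity(
            Intent(Intent.ACTION_VIEW, Uri.parse("market://details?id=$lensPackage"))
        )
    }
}
```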
Google Assistant
Within Google Assistant you’ll see a Google Lens icon in the bottom right-hand corner. You can tap it and point your smartphone camera at something like the show times outside a cinema or a gig venue’s information board.
You’ll then be presented with a number of suggestions in the viewfinder, such as hearing songs from the artist picked up from the information board, getting tickets for the event through Ticketmaster, or adding the event to your calendar. Using Lens to get information without having to write it down is handy; you’ll be able to call numbers, for example, without having to remember them or manually type them.
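As an illustration of that last point, here’s a hypothetical sketch of the “call a number without typing it” flow using public APIs: ML Kit’s text recognition (com.google.mlkit:text-recognition) for the OCR step, Android’s Patterns.PHONE to spot a number, and an ACTION_DIAL intent to open the dialer. Lens does this internally; this only approximates the idea.

```kotlin
// Hypothetical sketch of the "call a number without typing it" idea: OCR a
// photo with ML Kit text recognition, pull out the first phone number, and
// hand it to the dialer. Not Lens's own code path.
import android.content.Context
import android.content.Intent
import android.graphics.Bitmap
import android.net.Uri
import android.util.Patterns
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

fun dialNumberFromPhoto(context: Context, bitmap: Bitmap) {
    val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)
    recognizer.process(InputImage.fromBitmap(bitmap, 0))
        .addOnSuccessListener { result ->
            // Find the first thing that looks like a phone number in the OCR text.
            val match = Patterns.PHONE.matcher(result.text)
            if (match.find()) {
                val number = match.group()
                // ACTION_DIAL opens the dialer pre-filled; no call permission needed.
                context.startActivity(Intent(Intent.ACTION_DIAL, Uri.parse("tel:$number")))
            }
        }
}
```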
Google Lens is now integrated into Google Assistant across a wide range of Android devices, not just Pixel phones.
Google Photos
Within Google Photos, Google Lens can identify buildings or landmarks, for instance, presenting users with directions and opening hours for them. It can also present information about a famous work of art. Maybe it will settle the debate over whether the Mona Lisa is smiling or not.
When browsing your pictures in Google Photos, you’ll see the Google Lens icon at the bottom of the window. Tapping the icon will bring up the scanning dots on your picture, and Google will then serve up suggestions.
Again, Google Lens is now in place in Google Photos across a wide range of Android devices.
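Lens’s landmark recognition isn’t exposed as a public API, but Google’s Cloud Vision API offers a comparable LANDMARK_DETECTION feature if you want to experiment with the same idea. A bare-bones JVM Kotlin sketch (the function name is ours, and you’d need your own API key):

```kotlin
// Hypothetical sketch: landmark identification like the Photos example above,
// via the Cloud Vision API's LANDMARK_DETECTION feature (a public Google API,
// not Lens itself). Assumes you supply your own API key.
import java.net.HttpURLConnection
import java.net.URL
import java.util.Base64

fun detectLandmark(imageBytes: ByteArray, apiKey: String): String {
    val body = """
        {"requests":[{
          "image": {"content": "${Base64.getEncoder().encodeToString(imageBytes)}"},
          "features": [{"type": "LANDMARK_DETECTION", "maxResults": 3}]
        }]}
    """.trimIndent()
    val conn = URL("https://vision.googleapis.com/v1/images:annotate?key=$apiKey")
        .openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.doOutput = true
    conn.setRequestProperty("Content-Type", "application/json")
    conn.outputStream.use { it.write(body.toByteArray()) }
    // The response's landmarkAnnotations entries carry a description
    // (e.g. "Eiffel Tower"), a confidence score, and lat/long coordinates.
    return conn.inputStream.bufferedReader().readText()
}
```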
Camera app
In some Android phones, Google Lens has been added directly to the device’s own camera app. This functionality is still rolling out, so we haven’t had a chance to test it yet, though we suspect it’ll be similar to the implementations described above.

Which devices offer Google Lens?
Google announced Google Lens at I/O 2017, giving us a preview of the new software, before rolling into the announcement of the Pixel 2 and Pixel 2 XL. At launch, Google Lens was integrated into Google Assistant and Google Photos on those phones. Other devices eventually followed, though you still had to use Google Assistant and Google Photos to access the feature.
Then, in May 2018, Google announced that Google Lens would soon be added directly to the camera app on “supported devices” from several manufacturers, including LG, Motorola, Xiaomi, Sony, HMD/Nokia, Transsion, TCL, OnePlus, BQ, and Asus. Google said Lens would arrive for these phones “in the coming weeks”, though how long it takes depends on how quickly each manufacturer updates its devices.
The standalone app is now available too and supports a lot of devices, but there are some omissions from the list, so it’s worth checking Google Play to see if you can get it.