The Internet is a wonderful thing. Whatever questions you have, the Internet most likely has the answers. Or does it? What if you sometimes find it hard to describe in words what you're trying to search for? Well, this is where Google Lens aims to help. First introduced last year in Google Photos and the Assistant, Lens will now be directly accessible in native camera apps.
In addition to Google Pixel devices, Lens will be supported on devices from ten other manufacturers, namely LG, Motorola, Xiaomi, Sony Mobile, HMD/Nokia, Transsion, TCL, OnePlus, BQ and ASUS. Besides integrating Lens with the native camera app, Google also announced three updates to Google Lens. The three updates are as follows:
1) Smart text selection, similar to what you have seen in Google Translate, has now come to Google Lens. It will let you copy and paste text from the real world, as seen through your camera's sensor. It will also help you make sense of the words, for example, by showing you a picture of a dish on a menu, so you'd know what that particular dish looks like.
2) Image search straight through the lens in real time. When you see an outfit in the physical world, you can simply use Lens to search for the specific item online, as well as see items in a similar style. It is basically image search, but done straight from the camera.
3) Finally, Lens works in real time, enabling it to proactively pull up information on items the camera 'sees'. For example, if you come across a book in, say, someone else's home, you can point Lens at it and it will pull up information relating to that book.
Google Lens in the native camera app was announced last month at the annual Google I/O. It was supposed to roll out "over the next few weeks" in May, but I am not seeing it on my Google Pixel yet. If your device has been updated with the Lens camera integration, you should see an icon at the bottom right of the native camera app for activating the feature.
Images and animated GIFs: Google.