Google has started rolling out its visual search feature, Google Lens, in Assistant for the first batch of Pixel and Pixel 2 smartphones.
“The first users have spotted the visual search feature up and running on their Pixel and Pixel 2 phones,” 9to5Google reported late on Friday.
First built into the Photos app with the launch of the Pixel 2 and Pixel 2 XL, Google Lens can recognise things such as addresses and books. In Photos, the feature can be activated when viewing any image or screenshot.
In Google Assistant, however, it is integrated right into the sheet that pops up after long-pressing the home button. It will soon be integrated into the camera and other apps as well.
“Lens was always intended for both Pixel 1 and 2 phones,” Google had earlier said in a statement. Google says Lens has an accuracy of over 95 percent.
The tech giant announced the feature at Google I/O 2017. It is designed to bring up relevant information using visual analysis.
When it was first unveiled, Google CEO Sundar Pichai described Google Lens as a set of vision-based computing capabilities that can understand what you’re looking at and take actions based on that information. For example, if you want to know what a particular flower is, you can bring up Google Lens through Assistant and it will identify the flower for you.
[Source: gadgets.ndtv]