Google Lens, the technology that combines the smartphone camera’s ability to see the world around you with A.I., is coming to the Pixel 3 camera, Google announced this morning. That means you’ll be able to point your phone’s camera at something – a movie poster, say, to get local theater listings or look up an actor’s bio, or a sign in another language to translate it – and see results right in the camera app itself.
The integration is the result of Google’s investment in A.I. technologies, the common thread running through everything Google announced today at its hardware event.
Lens, in particular, was first shown off at Google I/O back in 2017, before rolling out to new products like Google Image Search just weeks ago.
The feature has also been inside the camera apps of older Pixel devices as well as those from other manufacturers, including LG, Motorola, Xiaomi, Sony Mobile, HMD/Nokia, Transsion, TCL, OnePlus, BQ, and Asus. But Google touted Lens today as one of the new Pixel 3 camera’s big features.
With Lens, you can point your camera at a takeout menu, Google says, and it will highlight the number to call.
Another feature is centered around shopping. With a long press, you can have Lens identify a product it sees in the viewfinder and match it to real products you can buy. This is called “Style Search,” Google says.
This time we built Google Lens right into the Pixel 3 camera. Point it at a takeout menu and Lens can pull up the number to call. Or with a long press, find out where you can buy those sunglasses that would look so good on you. #madebygoogle pic.twitter.com/yQZBSFCDXS
— Made by Google (@madebygoogle) October 9, 2018
As Google explained at the event, you can point your Pixel 3 camera at a friend’s cool new pair of sunglasses or some shoes you like in a magazine, and Lens will point you to where you can find them online and browse similar styles. The feature is similar to Pinterest’s visual search, which has been available for some time.
Style Search has been available in Lens since earlier this year.
Also of note, Lens will be able to take some of its more common actions instantly in the camera, without the need for a data connection.
Google says this is possible by combining the Pixel Visual Core chip with its years of work in search and computer vision.
“Being able to search the world around you is the next logical step in organizing the world’s information and making it more useful for people,” said Brian Rakowski, VP of Product Management at Google.