CEO Sundar Pichai unveiled Google Lens, a set of vision-based computing capabilities that can understand what you are looking at. It will first be available as part of Google’s voice-controlled digital assistant — which bears the straightforward name “Google Assistant” — and Photos app. In the real world, that means you could, for instance, point your phone camera at a restaurant and get reviews for it.
Pinterest has a similar tool. Also called Lens, it lets people point their cameras at real-world items and find out where to buy them, or find similar things online.
Another tool in Google Photos will prompt you to share photos you take with people you know. For instance, Photos will notice when you take a shot of a friend and nudge you to send it to her, so you don’t forget. Google will also let you share whole photo libraries with others. Facebook has its own version of this feature in its Moments app.
Many of Google’s products are vying against similar offerings from Apple, Amazon and Microsoft. Some features won’t be out until later this year.
The announcements come Wednesday during Google’s annual conference for thousands of computer programmers.
Google is expected to give the crowd a look at new twists in its Android software for mobile devices. Executives are also laying out how the company is expanding the reach and capabilities of the Google Assistant, which is currently available on some smartphones and an internet-connected speaker called Home.
Wednesday’s keynote is taking place at an outdoor theater near the company’s Mountain View, California, headquarters.
AP technology reporter Tali Arbel contributed from New York.