from the believe-it-when-i-see-it dept.
Google has announced an augmented reality service that overlays information on top of objects seen by a smartphone camera:
On Wednesday, the search giant announced a big push into augmented reality, which overlays digital images on what you'd normally see through a camera.
The new technology, announced at the company's I/O developer conference, is called Google Lens. It's a way to use your phone's camera to search for information. For example, point your camera at that flower and Google will tell you what kind it is. Point it at a book, and you get information on the author and see reviews. Ditto with restaurants: You'll be able to see reviews and pricing information on a little digital card that appears above the building on your phone's screen.
[...] Google Lens marks a big, ambitious attempt by a mainstream company to get into augmented reality in a way we haven't much seen yet. Snapchat, Facebook and Instagram (owned by Facebook) use AR for now to make you laugh and smile with filters like rainbow vomit or Iron Man masks. That stuff is important, but Google is taking a different approach when it comes to AR: utility.
Indeed, photo filters are very important.
Google has shown off some new augmented reality features at its I/O conference. Google Maps will get an augmented reality Street View:
Google showed off new features for Google Maps at I/O today, including an augmented reality Street View mode to help you follow directions in real time, along with personalized recommendations to help you discover places in your neighborhood.
The new AR features combine Google's existing Street View and Maps data with a live feed from your phone's camera to overlay walking directions on top of the real world and help you figure out which way you need to go. It's a lot like the promises Google had made with the original version of Google Glass, except without the need for wearing an additional AR headset.
No need for an AR headset? What if I want to walk around without holding a phone?*
Google is launching a new version of its augmented reality platform for Android, ARCore 1.2. Version 1.2 adds support for wall detection, launching an AR experience via image recognition, a new "Sceneform" framework, and a "Cloud Anchors" feature that enables shared experiences not just across Android devices—it works on top of iOS' ARKit, too.
Google launched ARCore version 1.0 in February as its big reboot of the Project Tango augmented reality project. Where Tango was focused on special hardware with extra sensors and cameras, ARCore tries to replicate some of that functionality on a normal smartphone. ARCore doesn't work on every single Android phone; instead, it works on a model-by-model basis (mostly on flagships) and requires some work from the device manufacturer. Most of the major Android OEMs, including Samsung, LG, and OnePlus, have signed on, though, and today ARCore has a supported install base of more than 100 million devices.
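The Cloud Anchors idea is worth unpacking: each device tracks the world in its own coordinate frame, so shared AR content is expressed relative to a common physical anchor that both devices have observed, rather than in either device's world frame. The sketch below illustrates that resolution step with 2D rigid transforms; it is a conceptual toy, not the ARCore API, and all function names here are made up for illustration.

```python
import math

def compose(a, b):
    """Compose pose b (x, y, heading) expressed in the frame of pose a."""
    ax, ay, at = a
    bx, by, bt = b
    return (ax + bx * math.cos(at) - by * math.sin(at),
            ay + bx * math.sin(at) + by * math.cos(at),
            at + bt)

def inverse(p):
    """Invert a pose, so compose(p, inverse(p)) is the identity."""
    x, y, t = p
    return (-x * math.cos(t) - y * math.sin(t),
             x * math.sin(t) - y * math.cos(t),
            -t)

# Device A and device B each observe the same physical anchor,
# but at different poses in their own (different) world frames.
anchor_in_A = (2.0, 1.0, math.pi / 2)
anchor_in_B = (-1.0, 3.0, 0.0)

# Device A places virtual content half a metre "in front of" the anchor
# and shares only this anchor-relative pose.
content_rel_anchor = (0.5, 0.0, 0.0)
content_in_A = compose(anchor_in_A, content_rel_anchor)

# Device B resolves the shared pose against its own anchor observation;
# both devices now agree on where the content sits in the real world.
content_in_B = compose(anchor_in_B, content_rel_anchor)
```

The key property is that the anchor-relative pose is frame-independent: either device can recover it from its own measurements via `compose(inverse(anchor_in_X), content_in_X)`, which is what lets the experience span Android and iOS devices.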
Google Lens, an augmented reality service which overlays information on top of objects seen by a smartphone camera, will be integrated into the default camera app of at least 10 Android smartphones, instead of operating separately. It will also add new features, including real-time search displayed within the camera app (e.g. point your camera at a concert poster and begin playing a video of the artist's new single), and "smart text selection" (e.g. point it at a handwritten recipe to convert it into a document).
*And still be blessed by Google's guidance.
Google said today that it'll be shutting down Project Tango next year, on March 1st. Project Tango was an early effort from Google to bring augmented reality to phones, but it never really panned out. The system was introduced in 2014 and made it into developer kits and even a couple of consumer devices as recently as last year.
But those devices required special sensors. And in the meantime, Google (and competitors, like Apple) figured out ways to bring AR features to phones with just the hardware that's already on board. Google introduced a new augmented reality system, known as ARCore, in late August. It just brought that system to the Pixel and Pixel 2 in the form of some augmented reality stickers — immediately opening AR features to more people than Tango is likely to have reached in its lifetime.
Related: Google's Project Tango Coming to 12 More Countries
Google Tango Means You'll Never Get Lost in a Store Again
Google Announces "Lens" Augmented Reality Service
Google Partnering With HTC and Lenovo for Standalone VR Headsets
HTC Cancels U.S. Release of a Google Daydream VR Headset, Reveals Own Standalone Headset
Apple is buying music recognition service Shazam. The Shazam app uses your microphone to listen to a snippet of whatever music is playing in your vicinity, identify the song, and store it along with a timestamp. The company was also working on visual recognition technology similar to Google Lens:
Apple is finalizing a deal to acquire Shazam, the app that lets you identify songs, movies, and TV shows from an audio clip, according to TechCrunch. The deal is reportedly for $400 million, according to Recode, which also confirmed the news.
For Apple, the obvious benefit of acquiring Shazam is the company's music and sound recognition technologies. It will also save some money on the commissions Apple pays Shazam for sending users to its iTunes Store to buy content, which made up the majority of Shazam's revenue in 2016, and drove 10 percent of all digital download sales, according to The Wall Street Journal.
A side benefit: if Apple decides to shut down the app, it would hurt competing streaming services like Spotify and Google Play Music, to which Shazam sends over 1 million clicks a day, the WSJ reported. Shazam also has a deal with Snapchat. It's unclear how the acquisition will affect any of these agreements.
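Shazam has publicly described the outline of its matching technique (Wang's 2003 paper): reduce audio to a "constellation" of spectrogram peaks, hash pairs of nearby peaks into landmarks, and score stored tracks by landmark overlap with the query snippet. The sketch below is a toy illustration of that general idea using synthetic tone sequences as "songs"; it is nothing like Shazam's production system, and every function here is an assumption made for the example.

```python
import numpy as np

def spectrogram(signal, frame=256, hop=128):
    """Magnitude spectrogram via a short-time FFT with a Hann window."""
    win = np.hanning(frame)
    frames = [signal[i:i + frame] * win
              for i in range(0, len(signal) - frame, hop)]
    return np.abs(np.fft.rfft(frames, axis=1))

def peaks(spec):
    """Crude 'constellation': the strongest frequency bin per time frame."""
    return [(t, int(np.argmax(row))) for t, row in enumerate(spec)]

def fingerprint(signal, fanout=3):
    """Hash pairs of nearby peaks into (freq1, freq2, time-delta) landmarks."""
    pts = peaks(spectrogram(signal))
    hashes = set()
    for i, (t1, f1) in enumerate(pts):
        for t2, f2 in pts[i + 1:i + 1 + fanout]:
            hashes.add((f1, f2, t2 - t1))
    return hashes

def match(query, db):
    """Return the stored track with the most landmarks in common."""
    q = fingerprint(query)
    return max(db, key=lambda name: len(q & db[name]))

# Two synthetic "songs" built from different tone sequences.
sr = 8000
t = np.arange(sr) / sr
song_a = np.concatenate([np.sin(2 * np.pi * f * t) for f in (440, 660, 880)])
song_b = np.concatenate([np.sin(2 * np.pi * f * t) for f in (300, 500, 700)])
db = {"song_a": fingerprint(song_a), "song_b": fingerprint(song_b)}

# A short, noisy snippet of song A should still match song A.
rng = np.random.default_rng(0)
snippet = song_a[sr:2 * sr] + 0.1 * rng.standard_normal(sr)
best = match(snippet, db)
```

Hashing peak *pairs* rather than single peaks is what makes the landmarks distinctive enough to survive noise and short snippets, which is why the app works in a loud bar.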
Related: The Shazam Effect