The image-identification technology in Google Photos is now available in the iOS version of the app. Google Lens can identify objects and text in pictures and then provide more information about them. For example, by examining a photo of a business card, the artificial intelligence can pull out all the contact information.
Google made the announcement a short time ago: “Starting today and rolling out over the next week, those of you on iOS can try the preview of Google Lens to quickly take action from a photo or discover more about the world around you.”
Show the system a picture of a book and it can present reviews, or offer more details about a famous painting. Google Lens can even identify plants and animals and display information about them, and it recognizes historical buildings and businesses as well.
Of course, there’s a commercial aspect. Identifying a book gets you more than reviews, as links to buy it are also displayed. Taking a picture of a flower yields its name as well as links to florists.
Google’s AI at work
This isn’t a real-time process; it’s necessary to take a picture and store it in Google Photos before Lens will try to identify what’s in it. That’s because the image processing is being done by artificial intelligence running on Google’s servers, not the iOS device.
Access to Google Lens within Google’s photo-storage app is rolling out over the next several days, so even those on the latest version, 3.15, might not see the Google Lens button appear until next week. Those who don’t have Google Photos at all can get it from the App Store; it’s free to download and use. Don’t look for a stand-alone Google Lens application—there’s no such thing.
Despite making the Android operating system, Google offers quite a few iOS apps. Versions of almost all its top utilities are available for iPhone and iPad, including Google Assistant, Google Maps, and Street View, along with a Safari extension and an iMessage plug-in.