Apple’s new Live Text feature is a years-late Google Lens ripoff

We’re here at WWDC — well, not “here”; it’s remote, and I’m sitting in my office streaming, blogging, and drinking coffee — and Apple is showing off all the new software features coming to its products over the next year. But one announcement just caught our eye: a new “Live Text” feature that promises to let you pull text and contact details from photos. For Android users, this sounds pretty goddamn familiar. Ever hear of Google Lens, Apple?


Literally Google Lens by another name.


I’m honestly surprised that Apple hadn’t duplicated Google’s functionality for its platform before now. After all, Google Lens dates back nearly half a decade to I/O 2017. Though the feature set has expanded quite a lot since then, Google Lens allows you to do things like see details from Google’s extensive knowledge graph when looking at objects or landmarks, copy relevant information like contact details, and even scan documents. Those are all features Apple’s announcing now in 2021 as if they’re revolutionary — par for the course.


Apple’s new Live Text feature does pretty much all the same stuff that Google Lens does, including recognizing dog breeds or landmarks with data pulled from “Siri Knowledge” or results from other Apple services — essentially the same as Google’s knowledge graph. Live Text also lets you parse and pull text from what you see, including images you’ve already taken. It even has a few integrations that Apple is marketing as part of Live Text and which Google already offers through other products like Google Photos, including the ability to search for recognized landmarks, objects, or locations in Spotlight.

It’s also cross-platform, working on Macs as well as iOS and iPadOS devices. You might think that would give it a leg up on Google Lens, but Lens is also integrated into Chrome’s image search, and (as we mentioned) some of the features Apple markets as part of Live Text are already found in other Google services.


Apple says that Live Text will work with seven languages, including two different forms of Chinese (which makes it eight, Apple). We assume it will be part of iOS 15 and later software updates for Apple’s other platforms.
