14 June 2021
Apple opened its digitally held WWDC 2021 event by announcing significant upgrades across its OS platforms. Among the headline features were improvements to FaceTime, new capabilities for Apple Wallet, AirPods, and more. But one feature arrives on iOS rather late: Live Text, which recognizes the text in a photo and lets users act on it. Sound familiar? If you own an Android phone, you already know this is exactly what Google Lens has been doing, and doing remarkably well, for years.
The feature comes to Apple's camera system with the new iOS 15 update. Using the Apple Neural Engine, Live Text recognizes the text in a photo and lets you take action on it. Apple illustrates this with everyday scenarios: copy the Wi-Fi password displayed at a local coffee shop, or capture a phone number from a storefront with the option to place a call.
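Apple has not published Live Text's internals, but its Vision framework has offered on-device text recognition to developers since iOS 13, which gives a sense of the underlying capability. Here is a minimal sketch; the function name and flow are illustrative, not Apple's implementation:

```swift
import UIKit
import Vision

// Illustrative sketch: on-device OCR with Vision's public API.
// Live Text itself is a system feature; this only shows the kind of
// text recognition the framework exposes to apps.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { return completion([]) }

    // VNRecognizeTextRequest performs the recognition entirely on device.
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the top candidate string for each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate  // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

Once the strings come back, an app can offer the same kinds of actions Apple demonstrated, such as copying a password or dialing a detected phone number.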
Furthermore, users can use "Visual Look Up" to learn more about what they see. Point the camera at a well-known artwork or a flower, for instance, and Visual Look Up surfaces information about it.
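Visual Look Up has no public API, but Vision's image classification request illustrates the same basic idea of labeling what is in a photo on device. A hedged sketch, with the function name and the confidence cutoff chosen for illustration:

```swift
import UIKit
import Vision

// Illustrative sketch: on-device image classification via Vision.
// This is not Visual Look Up itself, just the analogous public API
// that can label a photo as containing, say, a flower.
func classifyImage(_ image: UIImage) -> [(label: String, confidence: Float)] {
    guard let cgImage = image.cgImage else { return [] }

    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])

    // Keep only reasonably confident labels (threshold is arbitrary here).
    let observations = request.results ?? []
    return observations
        .filter { $0.confidence > 0.3 }
        .map { ($0.identifier, $0.confidence) }
}
```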
Apple touts that iOS 15 "uses intelligence to elevate your experience." Apple naturally has to sell its brand-new feature, but it is hardly revolutionary. Still, it is good to see the camera system doing more than it used to.
Also, Spotlight, Apple's existing search feature, can now search photos by location, the people in them, scenes, or objects with the help of the same "intelligence," much like Google Photos. You can search for photos of mountains or photos of your pet. Apple also touts that, thanks to Live Text, Spotlight can find text and handwriting in photos.
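The photo search shown at WWDC is built into the system, but the mechanism apps use to make their own content appear in Spotlight is the Core Spotlight framework. A minimal sketch of indexing an item, with hypothetical identifiers and keywords:

```swift
import CoreSpotlight
import UniformTypeIdentifiers

// Illustrative sketch: how a third-party app exposes content to Spotlight.
// The system's own photo search does not go through this API; this only
// shows the general indexing mechanism available to developers.
func indexPhoto(id: String, title: String, keywords: [String]) {
    let attributes = CSSearchableItemAttributeSet(contentType: .image)
    attributes.title = title
    attributes.keywords = keywords  // e.g. ["mountains", "pet", "beach"]

    let item = CSSearchableItem(uniqueIdentifier: id,
                                domainIdentifier: "photos",
                                attributeSet: attributes)
    CSSearchableIndex.default().indexSearchableItems([item]) { error in
        if let error = error { print("Indexing failed: \(error)") }
    }
}
```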