Apple announced plenty of exciting news at WWDC about the upcoming iOS 15 update, which lands later this year: changes to FaceTime (including enhancements some have waited years for) and new ways to cut down on the distractions caused by all the notifications you get throughout the day. Another update is coming to your camera with the addition of Live Text.
The company will (finally) take on the likes of Google Lens and similar features from other companies by letting the iOS camera identify text within any image. That could be a web address on a sign or business card that you can tap instantly, or a recipe in a document that you'd like to save for later.
Live Text uses artificial intelligence, via deep neural networks, to find text in photos you've already taken or live in the viewfinder as your camera points at something. That text can then be copied into a document, text message, or email, where it can be sent along or saved. The feature even moves beyond text: iOS 15 will automatically tag your images, making them easier to search for (Apple refers to this as "Visual Look Up").
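Apple hasn't detailed how Live Text works under the hood, but developers can already experiment with this kind of on-device text recognition through the existing Vision framework (available since iOS 13). Here's a minimal Swift sketch; the helper function name is illustrative, and this is only an analogue of the idea, not Live Text's actual pipeline:

```swift
import UIKit
import Vision

// Hypothetical helper: pulls recognized text out of a UIImage using the
// public Vision framework. Live Text's own implementation is Apple-internal;
// this simply demonstrates the same on-device OCR concept.
func recognizeText(in image: UIImage, completion: @escaping (String) -> Void) {
    guard let cgImage = image.cgImage else { return }

    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Take the best candidate string for each detected line of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines.joined(separator: "\n"))
    }
    request.recognitionLevel = .accurate   // favor accuracy over speed
    request.usesLanguageCorrection = true  // let the model correct likely OCR slips

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])        // runs entirely on-device
}
```

The recognized string could then be dropped into a document, message, or email, which is essentially the workflow Apple demoed on stage.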
Live Text will roll out across all Apple devices, including iPhone, iPad, and even Mac computers. It will support multiple languages, likely with more to come.
This is another example of how Apple can be slow to catch up at times. Google Lens, which can accomplish most of this, has been around for nearly four years, and Microsoft was experimenting with similar technology long before that on the Windows Phone series. Still, it's nice to see Apple finally paying more attention to these areas rather than spending all of its time turning what's already out there into proprietary tech that locks people into expensive accessories. It will be fascinating to see how well Live Text really works when iOS 15 rolls out.