
Live Text in iOS 15.1 is Deep – with Layers of Options and Powers

The huge iOS and iPadOS 15 update keeps yielding more tips and tricks to learn

We’ve all gotten so used to a slew of system software updates from Apple each year that we kind of just sleepwalk into them, it seems. With so many new features and new ways to use them, it would take an encyclopedia to catalog them all.

Live Text is one of the best-liked and most useful of this year’s mountain of changes. The ability to extract text from a photo or from the live camera feed – accessible from an assortment of built-in apps – may seem like a nifty trick at first, but once mastered in a deeper way it becomes something that will, one day, be hard to imagine having lived without. Like having a camera in your pocket at all times.

The interdependence and interaction between built-in apps is key

In this how-to video, the second from Madison on Live Text so far, a business card is used as an example. This is a great choice as it contains a web address, a phone number, a street address and an email address, as most business cards do.

This allows the Live Text feature to get busy – and the video shows how you can, just from the Camera app, extract the text from the card (or text on any visible object) and then “route” the information in the text to the best app to do what you want with it.

The at-times dramatic examples include taking the address and sending it with a tap to the Maps app, where a route can be generated to drive or walk there immediately. Naturally, once the address is ingested into Maps there are additional things that can be accessed, like the satellite view or the 360-degree Look Around feature. If it’s a business in a shopping center, there is also a new “look inside” feature where hours of operation, photos and more can be accessed.

The phone number on the card can be extracted and made clickable to call, copy, use to initiate a FaceTime session, add to contacts, and so forth.

Email addresses and web addresses can be instantly used to compose and send an email message or open a web page in Safari. These examples give only a short peek into a single layer of what you might use Live Text for, in this case with a single business card.
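For the curious, the “routing” described above is essentially what Apple’s public data detection APIs do: scan a string for phone numbers, links and street addresses and classify each one. Below is a minimal Swift sketch using NSDataDetector – purely an illustration of the concept with made-up business card text, not the actual Live Text implementation.

import Foundation

// Hypothetical text, as it might be lifted from a business card by Live Text.
let cardText = """
Jane Appleseed
(555) 123-4567
jane@example.com
https://example.com
1 Infinite Loop, Cupertino, CA 95014
"""

// NSDataDetector finds phone numbers, links (including mailto:) and street addresses.
let types: NSTextCheckingResult.CheckingType = [.phoneNumber, .link, .address]
let detector = try! NSDataDetector(types: types.rawValue) // these checking types are always valid
let range = NSRange(cardText.startIndex..., in: cardText)

detector.enumerateMatches(in: cardText, options: [], range: range) { match, _, _ in
    guard let match = match else { return }
    switch match.resultType {
    case .phoneNumber:
        print("Call or FaceTime:", match.phoneNumber ?? "")      // route to Phone / FaceTime
    case .link:
        print("Open:", match.url?.absoluteString ?? "")          // Mail for mailto:, Safari for http(s)
    case .address:
        print("Directions to:", match.addressComponents ?? [:])  // route to Maps
    default:
        break
    }
}

Live Text does all of this kind of work on-device and simply surfaces the results as the tap targets you see in the Camera and Photos apps.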

Check out Lynxotic on YouTube:

A Steamroller where a flyswatter would suffice?

Like any high-tech magic, there are times when Live Text seems like a million-dollar way to avoid using a pen and paper. At other times it is, well, magical – for example, when a large amount of otherwise inaccessible text can be captured instantly.

I have taken to using it to extract text from a screenshot – one that I take when a web site has text that is not selectable and can’t be copied directly (so annoying!) – and that alone is a lifesaver when dealing with data for life’s everyday chores and information gathering.

All in all, the new iOS 15 update is one that we will all have to grow with and adapt to – and while that won’t always be smooth and bug-free, the ways that life’s little tasks are made just a tiny bit easier are what will make the extra effort ultimately worthwhile.

Check out Live Text #1 on YouTube:



Live Text in iOS 15 shocks people with its utility and power

Master the ways you can use it and information gathering on iPhone will never be the same

We are all cyborgs now, to paraphrase Elon Musk speaking about his brain-computer interface project, Neuralink. The implication is that we rely so heavily on the iPhone in our pocket that it has become a literal extension of our minds and bodies, like a bionic arm.

With the Live Text features and functions on iPhone (or iPad) in iOS 15 and 15.1, there is a new and very powerful addition to our already amazing arsenal of sensory extensions. But the use of these new powers is not always obvious, and since the iPhone comes without a user manual – or rather, with an infinite number of them via YouTube and the web – learning just how far this feature can take you is a journey in itself.

In typical Apple fashion, if there are 10 ways to use Live Text then there are 100

The first thing that is not immediately apparent, but becomes clearer with repeated use, is that you can extract text, including most handwriting, from any existing photo. It can be a photo you took to store some text (like a menu posted behind the deli counter or a for-sale sign in front of the house you might want to bid on).
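Apple has not published the Live Text internals, but developers can get a feel for this kind of extraction with the Vision framework, which has offered on-device text recognition since iOS 13. The following is a rough, hypothetical Swift sketch of pulling the text out of an existing photo – an approximation of the idea, not the feature itself.

import UIKit
import Vision

// Hypothetical helper: recognize the lines of text in an existing photo,
// e.g. a shot of a deli menu or a for-sale sign.
func extractText(from image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { return completion([]) }

    let request = VNRecognizeTextRequest { request, _ in
        // Each observation is one detected line of text; keep its best candidate string.
        let lines = (request.results as? [VNRecognizedTextObservation])?
            .compactMap { $0.topCandidates(1).first?.string } ?? []
        completion(lines)
    }
    request.recognitionLevel = .accurate   // favor accuracy over speed
    request.usesLanguageCorrection = true  // helps with menus, signs and handwriting

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}

Calling extractText(from: photo) { lines in print(lines) } is, roughly speaking, the developer-facing cousin of tapping the Live Text button on a photo.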

Less obvious is that you do not need to take a photo at all to activate the Live Text recognition options. You can just point the camera at the “thing” that has the text you want to extract. Taking this to its most extreme logical conclusion, Apple has made it possible to access the camera from within various apps, specifically to make it less of a hassle to get the text directly into the app you plan to store it in or send it from.

Examples in the video below include the Notes app, the Messages app (home of iMessage) and the Mail app.

It will also allow you to go directly to the Phone app or Apple Maps to “act” on information that you gather, from any of those camera access points or from the Camera app itself – such as extracting a phone number from a business card or a billboard and then just tapping to call, or sending an address from either of those examples into Apple Maps and immediately getting directions.
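Acting on that extracted text is, at least conceptually, just a matter of handing it to the right URL. As a hedged illustration (these helper names are made up, but the tel: and maps.apple.com URL schemes are documented), a Swift sketch could look like this:

import UIKit

// Hypothetical helpers that "act" on strings lifted by Live Text.
// Assumes they run inside an iOS app, where UIApplication is available.

func call(_ phoneNumber: String) {
    // Keep only digits so "(555) 123-4567" becomes a valid tel: URL.
    let digits = phoneNumber.filter(\.isNumber)
    if let url = URL(string: "tel://\(digits)") {
        UIApplication.shared.open(url) // hands off to the Phone app
    }
}

func getDirections(to address: String) {
    // The documented maps.apple.com URL scheme opens Apple Maps with directions.
    let query = address.addingPercentEncoding(withAllowedCharacters: .urlQueryAllowed) ?? address
    if let url = URL(string: "https://maps.apple.com/?daddr=\(query)") {
        UIApplication.shared.open(url)
    }
}

Live Text packages this same kind of hand-off into the menus that appear when you tap recognized text, so no code is ever needed – the sketch only shows what sort of routing is going on.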

These are just a few of the “live” uses of the feature that come up amazingly often in real life. More detailed use cases will be covered in both the video below and in subsequent videos on this feature, already in the works.

iOS 15.1 is filled with features that have a myriad of use cases, almost too many to list or describe

Every year when a major upgrade is sent down from on high, there is adapting to do and bugs to avoid. Sometimes it seems like the effort to learn how to use the new features is nearly on par with the gains in productivity from the better performing software. Three steps forward and two steps back, as it were.

This is not the case with iOS 15 – it’s more like 10 steps forward and only four steps back! Seriously, there are so many new features that it is completely reasonable to want to slowly adapt to the improvements, no matter how exciting they may be.

But in the case of Live Text, as well as the extensive upgrades to nearly all the built-in apps for iPhone, iPad and Macs, the future will reward those of us who proactively evolve with the software’s upgraded abilities.

For those who use iPhones and iPads with a Mac laptop or desktop, the changes coming with iOS 15.1 and macOS 12 Monterey (scheduled to go public next week) are just the beginning of an intense evolution toward what we have been calling the “Apple OS ecosystem singularity”.

The added power from improved hardware in all Apple devices, along with the ever converging and evolving ability to interact with one another via software upgrades, is going to make the world feel like a very different (better?) place a year or two from now.

It’s only a question of whether we, with our non-bionic brains and bodies, can adapt to the new powers that come our way fast enough to gain from them before the next wave of changes hits us with new challenges of adaptation.



Find books on Music, Movies & Entertainment and many other topics at our sister site: Cherrybooks on Bookshop.org

Enjoy Lynxotic at Apple News on your iPhone, iPad or Mac.

Lynxotic may receive a small commission based on any purchases made by following links from this page