4 Android Features That Can Help ELs

Last month, we talked about how English learners (ELs) can use a number of features built into the iPhone to help them on their language learning journey. This month, as promised, we’re going to look at ways to help language learners who are part of the other 70+% of people in the world: Android users. That’s right, the vast majority of people using mobile operating systems are on some variation of Android.

One frequent problem with talking about Android is that devices are by no means uniform, so what works one way on one device might be totally different on another. Still, with a little digging or flexibility, most of the following features or apps should be available to you or your students. For reference, I’m basing the info here on a Samsung Galaxy Tab on Android 11.

Let’s take a look at some features you can share with students to help them out:


TalkBack

TalkBack is designed for blind and low-vision users, but its deep customization options make it a boon for ELs who want to practice English while using their phones, or for those whose speaking and listening skills are stronger than their reading skills.

TalkBack allows ELs to tap on their screen and have the phone read out the section they have selected. This may take a little getting used to, as the design is meant to talk low-vision users through the screen, but if a user adds TalkBack as a custom accessibility button, they can toggle it on and off quickly as needed.

Where to find it: Settings > Accessibility > TalkBack

How to make a custom button: Settings > Accessibility > Advanced Settings > Accessibility button

Live Transcribe

Live Transcribe in Action

Much like TalkBack is built for the blind and low-vision communities, Live Transcribe is built for the deaf and hearing-impaired communities. But some quick creativity can show how useful it is for language learners, too.

Live Transcribe is a simple app that does exactly what you’d expect it to do: transcribe your words as you speak. Over the years, Google’s transcription engine has gotten quite strong, but as we talked about in “4 Promising Programs to Practice Pronunciation in Private,” it’s not flawless. Still, live transcriptions can be very useful to ELs who want to check what they think they’re saying against what the computer interprets, helping learners fine-tune their pronunciation as they work toward specific goals.

Where to find it: After a user downloads Live Transcribe from the Google Play store, it will show up under Settings > Accessibility > Installed Services

Live Caption

Last month, I mentioned that subtitles are invaluable for following along with videos, understanding fine details, or simply picking up unusual vocabulary words. This is true on any device, so activating captions by default can be very useful to ELs. While captions are fairly easy to find in YouTube, they can be harder to track down in other apps, like Twitter or Instagram, or in videos embedded in websites.

Activating Live Caption across the entire device can make things much easier for students who want to read what people are saying without hunting through each video’s menus. Turn it on once, and then never worry about it again!

Where to find it: Settings > Accessibility > Hearing Enhancements > Google subtitles (CC) [Note that Live Caption varies wildly across different devices, so some searching may be in order]

Google Lens


Google Lens in action, recognizing not only that this is a pair of scissors, but also the brand and where to buy them.

I don’t always save the best for last, but I definitely did here. Google Lens is nothing short of magic, allowing users to identify objects (and buy them), copy text from a photo of a typed or handwritten page, translate text from multiple languages, identify the plant or animal they’re looking at, and even help with homework. In other words, it’s the Computer from Star Trek: it can tell people just about anything they’d want to know.

For ELs, this can be as simple as learning the name of an unfamiliar object or as complicated as working through complex documents they need to fill out to survive in a new country. For those of us who learned a second language without all of this tech, it’s hard to imagine just how much it could have helped. Google Lens basically puts a tutor in every student’s pocket, guiding them through daily life without interrupting.

The truth is that Google Lens deserves its own post (or book), but a brief examination will show how valuable and versatile it can be. If you’re going to share any one resource here with your students, make it Google Lens.

Where to find it: Download it from the Google Play Store, or access its features through Google Photos or the default Google app.

Services like these are regularly changed, updated, and moved around by software engineers, so keep an eye on them to make sure you know how to access and use them. Sharing resources like these may seem like a small move on your part, but it can change a student’s learning trajectory. Be generous in letting them know about services that can help!

Did I miss anything? There are far more resources than I could possibly cover, so please feel free to share your favorites in the comments below!

About Brent Warner

Brent Warner is a professor of ESL at Irvine Valley College in California, and an educational technology enthusiast. He is co-host of the DIESOL podcast, the only podcast with a specific focus on EdTech in ESL. He frequently presents on the crossroads of technology and language learning, focusing on student engagement and developing learner autonomy. Brent likes his coffee black and his oranges orange. He can be found on Twitter at @BrentGWarner.
This entry was posted in TESOL Blog. Bookmark the permalink.
