Android Accessibility: Two Years Later, Is It Finally Catching Up?
Two years ago, I left the Android ecosystem and switched to iOS. To this day, I don't regret that decision. As someone who relies heavily on accessibility features, iOS offered a more consistent, reliable experience that Android couldn't match back then. Recently, however, I was sent a Pixel 6a, and out of curiosity (and the practical need for a backup device in case my main phone fails), I decided to dive back into Android to see what's changed. Spoiler alert: it's been a mixed bag of good, not-so-good, and downright ugly.
The Good: Progress Has Been Made
I'll give credit where it's due: Android has made strides since I left. TalkBack, Android's screen reader, has seen real improvements. It's a bit faster, more responsive, and finally, they fixed an old annoyance: TalkBack no longer speaks right over you when invoking Google Assistant! That's a relief. TalkBack 15 also finally lets you control how much punctuation the TTS engine speaks. About time on that one, too. Credit to Google as well for letting you use Gemini to get image descriptions, but it's a shame that it appears to be a manual process (i.e. move to an image element on screen, open the TalkBack menu, and select "Describe image"). That makes things a bit slow. Maybe a future version will make it automatic? One can always hope.
The onscreen Braille keyboard has also received some much-needed updates. You can now type in the same way you would on a Perkins typewriter—placing the phone on a flat surface like a table will automatically trigger this mode. That’s a great addition, but getting the Braille keyboard up and running is still far from intuitive. You have to first enter a text field, then locate the input method button, and finally select the Braille keyboard. Compare that to iOS’s VoiceOver rotor, where Braille screen input is just one or two turns away, or the even easier method introduced in iOS 18, where you can enable Braille screen input by simply tapping twice with one finger at each edge of the phone. It’s a difference in accessibility philosophy that stands out.
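For what it's worth, that fiddly last step is really just Android's standard input-method picker. Here's a minimal Kotlin sketch, purely my own illustration (the helper name is made up), of how an app could surface that same picker directly instead of making you hunt for the tiny keyboard-switch button:

```kotlin
import android.content.Context
import android.view.View
import android.view.inputmethod.InputMethodManager

// Hypothetical helper: opens the system "Choose input method" dialog, the same
// one the small keyboard-switch button triggers, so the Braille keyboard is one
// pick away once you're in a text field.
fun showKeyboardPicker(view: View) {
    val imm = view.context.getSystemService(Context.INPUT_METHOD_SERVICE)
            as InputMethodManager
    imm.showInputMethodPicker()
}
```

Something along these lines, wired up to a global TalkBack gesture, would get Android a lot closer to the two-rotor-turns experience on iOS.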
The tutorial for said Braille keyboard is also quite outdated. In fact, it appears to have received no updates since the feature launched all the way back in spring 2020. It only covers screen-away mode, never mentions that you can now apparently type 8-dot braille on screen (unless I misunderstood the Braille keyboard configuration settings), and most concerning of all, it teaches you the wrong thing first. It shows you how to put the phone in screen-away mode, completely skips over tabletop mode, and proceeds immediately to teaching you how to type "a", "b", and "c". It never explains how to properly calibrate the Braille keyboard, so if you place the dots incorrectly and they shift on screen, you have no way to get them back, and believe me, one millimeter in either direction is more than enough to cause problems.
The Not-So-Good: Confusing Gestures and Lag
But not everything is smooth sailing. Google Assistant, which I used to consider one of Android’s strong points, is now far from intuitive. The removal of a physical or haptic-driven home button has made invoking it unnecessarily complicated. While I’m glad TalkBack no longer talks over the assistant, I’m left wondering how I’m supposed to invoke Google Assistant at all on a phone with no home button. There’s no tactile feedback, and even though you can re-enable a simulated button in the settings, it’s hard to hit precisely, especially if you’re blind.
For some blind folks, this might not be a big deal, but for me, trying to double-tap exactly the right spot on the screen to summon Google Assistant is a frustrating challenge. I often end up missing the button entirely or hitting the wrong one. Sure, it's possible to get used to it, but it doesn't feel like a thoughtful or accessible design choice. It's one of those things that makes Android feel unnecessarily clunky compared to iOS. Now of course, I could use the famous wake phrase "Hey Google", but I don't want to, and I shouldn't be required to.
EDIT: It looks like the proper way to invoke the assistant is now to hold the power button. I only found that out by searching the internet. Holding the power button produced nothing audible, just some haptic feedback I was apparently meant to interpret as "speak now". That is, once again, nowhere near intuitive. Maybe the screen reader should announce what just happened before it goes silent to let you speak? (A rough sketch of the kind of cue I mean follows this edit note.) VoiceOver does something nice here as well: the screen reader's own speech gets filtered out of the microphone, so it isn't interpreted by Siri or overheard by others during a phone call. Alternatively, maybe the recognizable Google Assistant sound should be enabled by default. Because I'll say, for me at least, it's nowhere near obvious with just a tiny haptic buzz and nothing spoken. At most I understood I had triggered something, but what? Not a clue.
On another note about the power button: was it made clear to people that once holding it invokes the assistant, you have to press, not hold, the power button and volume up button together to reach the power menu? As in, is this stated explicitly somewhere I might have missed? Because I only found out by going to Settings → System → Navigation → Gestures and checking what happens when you hold the power button. That doesn't sound very intuitive to me. END EDIT
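To make the "say something before going silent" wish from the note above concrete, here's a tiny Kotlin sketch. It's entirely my own illustration of the mechanism, not how the Assistant overlay actually works, and the function name is made up: any visible view can push a one-off spoken announcement to TalkBack, so the listening state wouldn't have to rely on a haptic buzz alone.

```kotlin
import android.view.View

// Hypothetical helper: push a one-shot spoken announcement to the screen reader
// so the user knows the assistant is listening before everything goes quiet.
fun announceListening(anyVisibleView: View) {
    anyVisibleView.announceForAccessibility("Assistant is listening, speak now")
}
```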
Now let's talk about performance. The Pixel 6a has 6 GB of RAM, which should be more than enough for smooth multitasking, especially with Google's Tensor processor running the show. And yet, I've experienced lag, sometimes brief but noticeable. It's not constant, but when it happens, it's frustrating, especially when I'm not even running many apps. You'd expect a phone with these specs to handle basic tasks smoothly, but for some reason, it doesn't always live up to that expectation. Small side note: if you want to warm the phone up, just launch the camera app and use it for a few minutes. It will make the Tensor run really, really hot.
Another odd issue I ran into is with newly installed apps. Sometimes, they just don’t show up in the app list until the phone is charged. That’s not exactly a disability-related issue, but it’s certainly not intuitive. It left me confused more than once, and it’s just another example of how Android can feel a bit disjointed at times.
The Ugly: Lock Screen Frustrations and Hidden Sensors
Let's dive into what really annoyed me. When your phone is charging and you're on the lock screen, TalkBack constantly announces the charge level, the remaining time until fully charged, and the charging speed. It's hard to focus on anything else, and trying to unlock the phone becomes a task in itself. TalkBack seems to misbehave in this situation, making it difficult to tap the "Device Locked" option. Instead, I often end up hitting something else entirely.
And if you're using a PIN or an unlock pattern? Good luck. The phone doesn't always give feedback on which number or cell you've selected, preferring to spam you with the one piece of information you couldn't care less about at that moment: the charging status. I've found myself failing to unlock the phone six or seven times in a row because of this. It's a minor detail in my case, but it makes a big difference when you're blind and relying on tactile or audible feedback to know what you're doing, or when you need to unlock your phone as fast as possible. And in any case, nobody wants to spend a good two minutes fighting to unlock their phone.
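To be concrete about the kind of throttling I'd like to see, here's a rough Kotlin sketch. It's entirely my own illustration, not how TalkBack or the lock screen is actually implemented, and the class name is made up: listen to the battery broadcast the system already exposes, but only speak when the level crosses a 10% step, leaving the screen reader free to announce the PIN pad.

```kotlin
import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.content.IntentFilter
import android.os.BatteryManager
import android.view.View

// Hypothetical sketch: announce charging progress only at 10% steps instead of
// on every battery tick, so the unlock flow can breathe.
class QuietChargeAnnouncer(private val anchor: View) : BroadcastReceiver() {
    private var lastAnnouncedStep = -1

    override fun onReceive(context: Context, intent: Intent) {
        val level = intent.getIntExtra(BatteryManager.EXTRA_LEVEL, -1)
        val scale = intent.getIntExtra(BatteryManager.EXTRA_SCALE, 100)
        if (level < 0 || scale <= 0) return
        val percent = level * 100 / scale
        val step = percent / 10
        if (step != lastAnnouncedStep) {
            lastAnnouncedStep = step
            anchor.announceForAccessibility("Battery at $percent percent")
        }
    }

    // ACTION_BATTERY_CHANGED is a sticky broadcast, so the current level
    // is delivered as soon as the receiver is registered.
    fun register(context: Context) {
        context.registerReceiver(this, IntentFilter(Intent.ACTION_BATTERY_CHANGED))
    }
}
```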
Lastly, whoever thought it was a brilliant idea to hide fingerprint sensors under the display should’ve thought a few hundred times more about disabled users. I spent 15 minutes trying to set up my fingerprint, only to fail more times than I could count. The sensor is hard to locate precisely, and I ended up switching to an unlock pattern out of sheer frustration. While some may find the under-display sensor convenient, for me, it was just another hurdle that added to the growing list of accessibility headaches.
Conclusion: Android Still Has Work to Do
So, has Android improved in the two years since I switched to iOS? Yes, undeniably. But is it enough to make me reconsider my decision to leave? Not quite. While Google has made some commendable progress in accessibility, there's still too much inconsistency, confusion, and friction for me to comfortably return to Android full-time. For my use case, a backup phone in case something goes wrong with my main one, it can definitely work. But I still find it clunky and slower to use.
The gap between Android and iOS is shrinking, but it’s still very much there. For now, iOS remains my platform of choice, but I’ll be keeping an eye on Android’s progress in the hopes that one day, it will become a viable alternative for users like me.