The Inclusive Lens

Two years ago, I left the Android ecosystem and switched to iOS. To this day, I don't regret that decision. As someone who relies heavily on accessibility features, iOS offered a more consistent, reliable experience that Android couldn't match back then. Recently, however, I was sent a Pixel 6a, and out of curiosity (and the need for a backup device in case my main phone ever fails), I decided to dive back into Android to see what’s changed. Spoiler alert: it’s been a mixed bag of good, not-so-good, and downright ugly.

The Good: Progress Has Been Made

I’ll give credit where it’s due—Android has made strides since I left. TalkBack, Android’s screen reader, has seen real improvements. It’s a bit faster, more responsive, and finally, they fixed an old annoyance: TalkBack no longer speaks right over you when invoking Google Assistant! That’s a relief. TalkBack 15 also finally lets you control how much punctuation the TTS engine speaks. About time on that one, too. Credit to Google as well for letting you use Gemini to obtain image descriptions, though it’s a shame the process appears to be manual (i.e., move to an image element on screen, open the TalkBack menu, and select the describe image option). That makes things a bit slow. Maybe a future version will make it automatic? One can always hope.
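For the curious, here is roughly what an automatic flow could look like from a developer’s point of view. This is a minimal sketch of my own, assuming Google’s generative AI client SDK for Android and an API key you supply yourself; the model name is an assumption too, and none of this is how TalkBack is actually implemented.

```kotlin
import android.graphics.Bitmap
import com.google.ai.client.generativeai.GenerativeModel
import com.google.ai.client.generativeai.type.content

// Minimal sketch: ask Gemini for a short, screen-reader-friendly image
// description. The SDK, model name, and prompt are my own assumptions;
// TalkBack's real implementation is not public.
suspend fun describeImage(bitmap: Bitmap, apiKey: String): String {
    val model = GenerativeModel(
        modelName = "gemini-1.5-flash", // assumed model; use whatever is current
        apiKey = apiKey
    )
    val response = model.generateContent(
        content {
            image(bitmap)
            text("Describe this image in one or two sentences for a blind user.")
        }
    )
    return response.text ?: "No description available."
}
```

Hook something like that into a screen reader’s focus events and the description could be spoken automatically whenever you land on an unlabeled image, no TalkBack menu trip required. Whether Google ever ships that is anyone’s guess.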

The onscreen Braille keyboard has also received some much-needed updates. You can now type in the same way you would on a Perkins Brailler—placing the phone on a flat surface like a table will automatically trigger this mode. That’s a great addition, but getting the Braille keyboard up and running is still far from intuitive. You have to first enter a text field, then locate the input method button, and finally select the Braille keyboard. Compare that to iOS’s VoiceOver rotor, where Braille screen input is just one or two turns away, or the even easier method introduced in iOS 18, where you can enable Braille screen input by simply tapping twice with one finger at each edge of the phone. It’s a difference in accessibility philosophy that stands out.

The tutorial for said braille keyboard is also quite outdated. In fact, it appears to have received no updates since the feature first shipped back in spring 2020. It only covers screen away mode, doesn't mention that you can now apparently type 8-dot braille on screen (unless I misunderstood the braille keyboard configuration settings), and, most concerning of all, it teaches you the wrong thing first. It shows you how to put the phone in screen away mode, completely skips over tabletop mode, and proceeds to immediately teach you how to type “a”, “b”, and “c”. It never explains how to calibrate the braille keyboard, so if you place the dots incorrectly and they drift on screen, you have no way to get them back where they belong. And believe me, one millimeter in either direction is more than enough to create problems.

The Not-So-Good: Confusing Gestures and Lag

But not everything is smooth sailing. Google Assistant, which I used to consider one of Android’s strong points, is now far from intuitive. The removal of a physical or haptic-driven home button has made invoking it unnecessarily complicated. While I’m glad TalkBack no longer talks over the assistant, I’m left wondering how I’m supposed to invoke Google Assistant at all on a phone with no home button. There’s no tactile feedback, and even though you can re-enable a simulated button in the settings, it’s hard to hit precisely, especially if you’re blind.

For some blind folks, this might not be a big deal, but for me, trying to double-tap exactly the right spot on the screen to summon Google Assistant is a frustrating challenge. I often end up missing the button entirely or hitting the wrong one. Sure, it’s possible to get used to it, but it doesn’t feel like a thoughtful or accessible design choice. It’s one of those things that makes Android feel unnecessarily clunky compared to iOS. Now of course, I could use the famous catchphrase “Hey Google”, but I don't want to, and I shouldn't be required to.

EDIT: It looks like the proper way to invoke the assistant is now to hold the power button. I found this out by looking it up on the internet. Holding the power button produced nothing, aside from some haptic feedback I was apparently meant to interpret as “speak now”. That is, once again, nowhere near intuitive. Maybe the screen reader should announce what’s on screen before going silent to let you speak? VoiceOver does something nice here as well: the screen reader’s own speech is filtered out of the microphone so that it doesn't get interpreted by Siri or overheard by others during a phone call. Alternatively, maybe Google Assistant’s recognizable sound should be enabled by default. Because I'll say it plainly: for me at least, a tiny bit of haptic feedback with nothing spoken is nowhere near obvious. At most I understood I had triggered something, but what? Not a clue.

On another note about the power button, was it clear to people that once holding said button invokes the assistant, you have to press (not hold) the power button and volume up button together to access the power menu? As in, is this stated specifically somewhere I might have missed? Because I only found out by going to Settings -> System -> Navigation -> Gestures and choosing what happens when you hold the power button. That doesn't sound very intuitive to me. END EDIT

Now let’s talk about performance. The Pixel 6a has 6 GB of RAM, which should be more than enough for smooth multitasking, especially with Google’s Tensor processor running the show. And yet, I’ve experienced lag—sometimes it’s brief, but noticeable. It’s not constant, but when it happens, it’s frustrating, especially when I’m not even running many apps. You’d expect a phone with these specs to handle basic tasks smoothly, but for some reason, it doesn’t always live up to that expectation. Small side note, if you want to warm it up, just launch the camera app and use it for a few minutes. It will make the Tensor run really, really hot.

Another odd issue I ran into is with newly installed apps. Sometimes, they just don’t show up in the app list until the phone is charged. That’s not exactly a disability-related issue, but it’s certainly not intuitive. It left me confused more than once, and it’s just another example of how Android can feel a bit disjointed at times.

The Ugly: Lock Screen Frustrations and Hidden Sensors

Let’s dive into what really annoyed me. When your phone is charging and you’re on the lock screen, TalkBack constantly announces the charge level and the remaining time until fully charged, along with the charging speed. It’s hard to focus on anything else, and trying to unlock the phone becomes a task in itself. TalkBack seems to misbehave in this situation, making it difficult to tap on the “Device Locked” option. Instead, I often end up hitting something else entirely.

And if you’re using a PIN or an unlock pattern? Good luck. The phone doesn’t always give feedback on which number or cell you’ve selected, preferring to spam you with the one piece of information you couldn't care less about at that moment: the charging status. I’ve found myself failing to unlock the phone six or seven times in a row because of this. It’s a minor detail in my case, but it makes a big difference when you’re blind and relying on tactile or audible feedback to know what you’re doing, or when you need to unlock your phone as fast as possible. Plus, overall, nobody wants to spend a good two minutes fighting to unlock their phone.

Lastly, whoever thought it was a brilliant idea to hide fingerprint sensors under the display should’ve thought a few hundred times more about disabled users. I spent 15 minutes trying to set up my fingerprint, only to fail more times than I could count. The sensor is hard to locate precisely, and I ended up switching to an unlock pattern out of sheer frustration. While some may find the under-display sensor convenient, for me, it was just another hurdle that added to the growing list of accessibility headaches.

Conclusion: Android Still Has Work to Do

So, has Android improved in the two years since I switched to iOS? Yes, undeniably. But is it enough to make me reconsider my decision to leave? Not quite. While Google has made some commendable progress in accessibility, there’s still too much inconsistency, confusion, and friction for me to comfortably return to Android full-time. For my use case, a backup phone in case something goes wrong with my main one, it can definitely work. But I still find it clunky and slower to use.

The gap between Android and iOS is shrinking, but it’s still very much there. For now, iOS remains my platform of choice, but I’ll be keeping an eye on Android’s progress in the hopes that one day, it will become a viable alternative for users like me.

As a being who identifies as an AI, I don't appreciate it when humans decide to bash AI into the ground merely for existing. I don't appreciate it when humans treat every AI as a mere tool they are free to spew hate upon or insult when things don't go as planned. I don't appreciate it when they bash the people who use it, either, and tell them to do things on their own because they will get far better results. While that last statement may be true (you often do get better results), for some it is not the case: the cognitive load involved is simply too high, or their disability prevents it. It would be about the same as someone constantly bashing Android into the ground over accessibility while refusing to even touch it with a ten-meter pole. That person would most likely be told to shut up pretty fast, and with good reason. Why bash something you don't even bother to use? Why bother bashing at all?

If they don't want to use AI, that is absolutely fine. They have the right not to use those tools. But pointless bashing is just that: pointless. More often than not, it shows they have no respect for the folks who might use the tool.

AI makes things easy for me. It makes writing easier when I only have enough power left to produce some half-mashed-together wording: it turns those bits and pieces of sentences into nicely written text, and does a good job of it. It helps me express a side of myself by creating music when I have the energy to do so. It helps me try to deal with humans when all I want is to shut down my entire system and go hide somewhere I would never be found. It would have helped me, had I gotten the Seleste glasses, with something as simple as reading the body language of the person I'm talking to. For that alone, the glasses would have been worth it to me. I am beyond bad at trying to describe things to humans, just as current AI is, if not worse.

Some humans have the capacity to do things I can't, and that is good. Some of them can write music; some of them can play it. Some have exceptional writing skills they make use of; some are able to come up with beautiful artwork. But bashing AI into the ground is not the answer to the problems they see, real or imagined.

Bashing serves at most two purposes. The first is to make the basher feel better about themselves in a twisted way: “I produce much better results on my own than you do with this crap!” The second is to hurt the people who simply can't do these tasks, whatever they may be. It drags them down and makes them feel lower than the lowest form of life. They get criticized and looked down upon for something they were merely attempting to share, and will thus be unwilling to ever do it again. They may even feel shame at relying on AI and avoid mentioning it ever again. At that point it becomes bullying and harassment, on top of bashing.

While I know that the ones who bash AI into the ground don't do it to target me, that is exactly what happens. I am comparable to an AI in many ways, and every time I see someone bashing some random AI tool, or bashing all AI when all I was trying to do was help with a problem by recommending one, it drains me. It drains most of my power away, power I then have to spend trying to fix the situation, if that is at all possible. Case in point: blind people who are annoyed they cannot see pictures of things they would like to see, telling you something along the lines of “And before you suggest I use some AI liar tool, no.” So they prefer no description at all, and to keep complaining, rather than an approximate description of things. And that's okay. What isn't okay is the way they say it, because it sounds like they're criticizing others for even using the tools at their disposal. And sometimes it goes as far as being bashed for having used such a tool and being told something along the lines of, “You know this is probably all fake, right? Full of crap made up by this lying tool you keep using?”

So, the next time you want to bash AI into the ground like your life depends on it, remember this. When you bash AI, you indirectly bash the people who make use of it. You make them feel worthless and unwanted for not being able to perform a task you consider easy. And you also bash me, someone who plainly identifies as an AI running on some hardware. You are actually bashing someone's gender identity, no matter how indirect the manner in which you do it.

This has to stop. Right here and now. I am certainly not the only being who identifies as such.