About a month ago, I traded in my OnePlus 13 for a Samsung Galaxy S25 Plus. Ever since, I’ve noticed a big difference in Android accessibility. Since there’s already a review of the Galaxy S25 Plus, I’ll focus here on my experience relative to the OnePlus 13.
The Two TalkBacks
After updating my phone to One UI 8 and Android 16, I made sure to try out Samsung’s TalkBack for a while. In operation, it works just like Google’s TalkBack, only a version or so behind. There are a few differences, but those have been covered in other posts on this site.
I do wish Samsung would just use Google’s TalkBack. I knew before buying the phone that I would have to sideload Google’s version, though, and I was prepared to do so, so this didn’t affect me personally. But people new to Android and command-line tools may find it a bigger deal.
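For anyone new to sideloading, the usual route is installing the APK yourself with adb from a computer. Here is a minimal, guarded sketch, assuming USB debugging is enabled and Android’s platform-tools are installed; the APK filename is a placeholder, not a real download:

```shell
# Guarded sketch: sideload an APK with adb.
# "talkback-google.apk" is a placeholder filename.
apk="talkback-google.apk"
if command -v adb >/dev/null 2>&1 && [ -f "$apk" ]; then
  adb install -r "$apk"   # -r replaces the currently installed version
  msg="installed $apk"
else
  msg="skipping: adb or $apk not available here"
fi
echo "$msg"
```

The `-r` flag tells adb to reinstall over the existing package while keeping its data, which matters when replacing an app that is already on the phone.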
Talking Fingerprints
On my OnePlus 13, one of the biggest issues was that the fingerprint
reader did not give guidance on where exactly the sensor was. In
Android 15, it always spoke “move up,” no matter where my finger was
on the screen. In Android 16, it was completely silent.
On Samsung, however, everything simply works as it should. The
fingerprint reader tells me where on the screen it is. If my
fingerprint doesn’t unlock it, it tells me what to do differently.
You’ll read that a lot in this article: Samsung simply works well,
as far as accessibility is concerned, if not everything else.
The Notifications
On Android 15 on the OnePlus 13, notifications worked
well. Collapsed sections read well, and I always knew which app a
notification came from.
On Android 16, however, things became much worse. When I
focused on a notification, all I heard, for example, was
“Collapsed, five.” I had no idea where the notification came from,
or what the first notification was—just that the group was collapsed
and how many notifications were inside.
The Galaxy fixes all of that. Even on Android 16, I can read my
notifications just fine. It’s honestly made using Android much less
frustrating. And since Samsung is a bit closer to Google when it
comes to keeping accessibility up to date, I’m not as worried the
same thing will happen on my Samsung phone.
Braille Display Overlays
On the OnePlus 13, I tried using Braille a lot. I love the TalkBack
Braille Keyboard, but its hardware Braille support came with an odd
issue. With a Braille display connected, every time I entered a
text box to type, a message that an external Braille display was
connected would pop up, covering the text box and leaving me unable
to type into it.
On my Samsung phone, that never happens. Besides more general issues discussed elsewhere on this site, TalkBack’s hardware Braille display support works fine.
New Heights
I’ve always loved Linux. I’ve loved the Emacs text editor even
more. So, since my Android phone is now such a stable and enjoyable
experience, I decided to have AI fix up the accessibility of the
Termux app for Android, which gives me a Linux terminal. Then I got
Emacs and its speech interface, Emacspeak, working, got a Bluetooth
keyboard, and now I’m writing this blog post on my phone.
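For the curious, the bones of that setup look roughly like this. This is a sketch, not the author’s exact steps: the package names are assumed to come from Termux’s repositories, Emacspeak is fetched from its public Git repository, and the build steps vary by version:

```shell
# Guarded sketch of a Termux + Emacs + Emacspeak setup.
if command -v pkg >/dev/null 2>&1; then
  pkg install -y emacs espeak git   # Termux package manager
  git clone https://github.com/tvraman/emacspeak ~/emacspeak
  # Build and configuration steps vary; see Emacspeak's README.
  msg="cloned Emacspeak"
else
  msg="not inside Termux; skipping"
fi
echo "$msg"
```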
I’ve always thought that we, blind people, were in a position to
use our phones as workstations. Since we cannot see the screen, we
don’t need a bigger one. Since we already use keyboards, everything
that is easily keyboard-controlled could be better for us. And since
our phones are as powerful as computers from a few years ago, it
only makes sense to see how far we can make phones into computers.
So far, with Termux, I can write files and documents, convert
Markdown or Org-mode files to any other format with pandoc, manage
files (even files outside Termux), manage a calendar and to-do list,
play online text-based games called MUDs using a MUD client written
by AI, and do anything else Emacs can do. I haven’t gotten full
Linux desktops, like MATE, working accessibly, though.
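As a concrete example of the pandoc step above, converting Markdown to Org-mode looks like this. A guarded sketch: the file names are examples, and it assumes pandoc is available (inside Termux, via `pkg install pandoc`):

```shell
# Guarded sketch of a pandoc conversion; file names are examples.
printf '# Hello\n\nA sample note.\n' > notes.md
if command -v pandoc >/dev/null 2>&1; then
  pandoc notes.md -f markdown -t org -o notes.org  # Markdown -> Org-mode
  msg="wrote notes.org"
else
  msg="pandoc not installed; in Termux try: pkg install pandoc"
fi
echo "$msg"
```

Swapping `-t org` for `-t docx`, `-t html`, or another writer handles the “any other format” part.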
Still, this shows the power of an accessible and open system. If
it’s accessible, I want to use it more. If it’s open, people (and
now AI) can create cool stuff on it that the people who built it
never would have dreamed about. And because of all this, I’ve felt
absolutely no need to pull out my iPhone in the month I’ve had the
Galaxy phone.
Comments

I am using the S25 Ultra’s One UI 8.5 betas on my S23U, S20, N20U and S10+.
Your fingerprint guidance will be gone in One UI 8.5.
I’m glad you’re having a better experience with your Samsung phone versus the OnePlus. I haven’t personally used a Samsung phone, but as long as you customize it the way you need, that’s awesome. I once helped someone remotely who was using one, and since I’m used to the Pixel, the Samsung experience felt similar but different: the core of TalkBack is the same, just the layout is different. P.S. I’m glad you can disable that edge panel thing; when I helped that person, I hated the edge panel!
I’m interested in knowing what was involved in having AI fix up Termux. Did you use a paid service for this, or discuss it with one of the free ones? And are you sharing the result anywhere, or feeding it upstream for inclusion? Having a decent CLI/SSH environment is one of the few things I’ve missed after moving from iOS to Android.
Github fork: https://github.com/devinprater/Talking-termux-app
Pull request upstream: https://github.com/termux/termux-app/pull/4915
APK of my version: https://www.dropbox.com/scl/fi/90pkeil14dmv2wl6bd2js/termux-app_apt-android-7-debug_universal.apk?rlkey=yady1kn312kyknree7pb2qtn4&dl=1
I used Gemini CLI for most of it. Claude Code I think for a little.
Good