Since Android 9 Pie, Google has made the accessibility framework and its APIs part of the Android Open Source Project. This means that developers and manufacturers can modify this side of Android, just like they modify many others. Some manufacturers leave it as it is (Nokia, Motorola), while others modify it (Samsung, Xiaomi, Realme, OnePlus, Oppo).
What is the accessibility framework?
The accessibility framework contains many systems and APIs that developers can use. Some include:
- Gestures API – This API is responsible for the handling of gestures by any accessibility service, be it a screen reader or not. It also lets a service intercept the volume keys or the power button, which is why some button remappers ask you to enable an accessibility service.
- Speakthrough API – This one is for screen readers only. It allows apps to send text to screen readers, text that’s usually not visible on the screen. Not to be confused with the other method of communicating information to screen readers, where the app swaps in a different string entirely while a screen reader is active. That latter method does change the visible text on the screen, but it’s rather old and professionals rarely use it. Some information sent to screen readers that you might have encountered includes:
- “Voice message, button. Double tap and hold to record, swipe left to cancel recording, swipe up to lock recording on. Release to send”
- “Current location, Los Angeles. Double tap to open Weather”
- “(On the keyboard) Period. Hold for options.”
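On the app side, hints and announcements like the ones above come from a handful of standard view APIs. Here's a minimal sketch, assuming `recordButton` is a voice-message button in your layout; the function names and strings are made up for illustration, not taken from any real app:

```kotlin
import android.view.View
import androidx.core.view.ViewCompat
import androidx.core.view.accessibility.AccessibilityNodeInfoCompat.AccessibilityActionCompat

// Hypothetical setup for a voice-message button; none of this text is drawn on screen.
fun setUpRecordButton(recordButton: View) {
    // What the screen reader calls the control:
    recordButton.contentDescription = "Voice message"

    // Customize the hint TalkBack builds for the click action, so it reads
    // "Double tap to record" instead of the generic "Double tap to activate":
    ViewCompat.replaceAccessibilityAction(
        recordButton,
        AccessibilityActionCompat.ACTION_CLICK,
        "record",
        null // keep the view's normal click handling
    )
}

// Later, for transient state that never appears visually:
fun announceLock(recordButton: View) {
    recordButton.announceForAccessibility("Recording locked on. Release to send")
}
```

`announceForAccessibility` is the usual way to push one-off, invisible text to whatever screen reader is running, which is why the spoken phrases don't have to match anything on screen.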
There are many other APIs that developers can use to their advantage, such as the accessibility click API. It helps not only accessibility developers but also many other kinds of apps, such as memory-management tools: it lets them automate actions like force-stopping apps or clearing their cache, which has been done in the past and is really helpful, especially when you have many apps to manage.
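As a sketch of how that kind of automation works: from inside a running AccessibilityService, an app can search the active window for a node by its visible label and click it programmatically. The helper name and the "Force stop" label below are illustrative (and the label is locale-dependent), but the framework calls are the real ones:

```kotlin
import android.accessibilityservice.AccessibilityService
import android.view.accessibility.AccessibilityNodeInfo

// Illustrative helper: find a node by its visible label and click it,
// the way a cleaner app might press "Force stop" on the App info screen.
fun AccessibilityService.clickButtonLabeled(label: String): Boolean {
    val root = rootInActiveWindow ?: return false
    for (node in root.findAccessibilityNodeInfosByText(label)) {
        // The matched node may not be clickable itself; walk up to one that is.
        var target: AccessibilityNodeInfo? = node
        while (target != null && !target.isClickable) target = target.parent
        if (target?.performAction(AccessibilityNodeInfo.ACTION_CLICK) == true) return true
    }
    return false
}

// Usage from inside the service, once the App info screen is open:
// clickButtonLabeled("Force stop")
```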
The accessibility framework isn’t just a set of resources for app developers; it’s one of the biggest deals for us screen reader users. The framework is responsible for all the information a screen reader receives and the way it receives it: what can or can’t be focused, whether an element is clickable, how a screen reader sees progress bars, or how window changes are reported. In short, the accessibility framework is the heart of a screen reader. Without a good accessibility framework, your screen reader experience can’t be good either.
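To make the receiving end concrete, here's a minimal sketch of a service on the screen-reader side. `TinyReaderService` is a made-up name, and a real reader would speak rather than log, but every event it sees is information the framework decided to deliver:

```kotlin
import android.accessibilityservice.AccessibilityService
import android.util.Log
import android.view.accessibility.AccessibilityEvent

// A toy "screen reader": it only logs what the framework sends it.
class TinyReaderService : AccessibilityService() {
    override fun onAccessibilityEvent(event: AccessibilityEvent) {
        when (event.eventType) {
            // Accessibility focus moved; the framework tells us the text
            // and whether the focused element is clickable.
            AccessibilityEvent.TYPE_VIEW_ACCESSIBILITY_FOCUSED ->
                Log.d("TinyReader", "focused=${event.text} clickable=${event.source?.isClickable}")
            // A window changed: a dialog opened, an app switched, the shade came down.
            AccessibilityEvent.TYPE_WINDOW_STATE_CHANGED ->
                Log.d("TinyReader", "window changed: ${event.className}")
            // Scrolling or content updates, e.g. a progress bar moving.
            AccessibilityEvent.TYPE_VIEW_SCROLLED,
            AccessibilityEvent.TYPE_WINDOW_CONTENT_CHANGED ->
                Log.d("TinyReader", "content/scroll update")
        }
    }
    override fun onInterrupt() { /* a real reader would stop speech here */ }
}
```

If the framework mislabels a node as non-clickable or reports windows in the wrong order, a service like this has no way to know better, which is exactly why a broken framework means a broken screen reader.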
What happened to the OneUI 6.0 and 6.1 accessibility frameworks?
Next, we’ll go over everything that has happened in the 5 months since I received OneUI 6 on my phones; I am using an S10+ and an S20 Ultra. Some factors may influence your accessibility framework, especially on phones like these, where the updates are, in fact, ports and the phone is running an old vendor partition, and therefore old drivers. Do note that newer phones, starting with the S24 series, are part of Google’s “Upstream Kernel” program, which means that both Google and Samsung work on the kernel. After the release period ends, important drivers and kernel patches can easily be taken from other phones in the program, and the kernel can be updated with every update the phone receives, no matter how big or small. For reference, the S10+’s kernel hasn’t been updated since it received Android 11, even though the phone later received Android 12, and the same goes for other phones. “Upstream Kernel” is meant to remove all this delay and ship the latest kernel versions to phones. The A series released this year might be included as well, although I can’t confirm that, as it hasn’t been sourced anywhere and I’d rather not talk about something I’m not sure of.
The matter with bad TalkBack sound on speakers. Finally revealed!
Since TalkBack 13.1 (on Google’s side) and 14.0 (on Samsung’s side), TalkBack started sounding weird and bad on phones’ speakers. Some phones, however, don’t experience this behavior. Here’s why:
Samsung’s soundbooster library, found in both /system and /vendor (the /system copy takes priority over the /vendor one), is responsible for the audio effects on Samsung phones’ speakers. The reason your speakers normally sound good, unlike TalkBack does, is that soundbooster is there to equalize everything. I came across this realization quite by accident while porting an OneUI 6.0 S23 FE base to my S20 Ultra. The vendor drivers were fine, since I was using the S21 Ultra ones, which work well on this phone because they haven’t changed much. The soundbooster library in /system, however, was the one from the S23 FE base, with a higher version number than the one the drivers were asking for. Because of that, soundbooster couldn’t be loaded, and all my audio sounded just as bad: the ringtone channel, media, notifications, system sounds, let alone the accessibility channels.
Knowing that the version in /system was newer than what the drivers expected, I decided to modify them: I made the drivers look for soundbooster version 2000 instead of the 1500 they were looking for initially. After that, surprise! Even my TalkBack sounded the way it’s meant to sound.
In conclusion, your soundbooster version determines this: 2400, found on this year’s phones, makes both Samsung’s and Google’s TalkBack sound good; 2000, used on the S22 and S23 series, works only with Samsung’s TalkBack; 1500 and lower, found on the S21 series and older, isn’t supported by either TalkBack. The simple change TalkBack made was to drop support for some soundbooster versions while keeping others, and as we’ve seen, missing soundbooster support is exactly what causes the bad sound.
The issue this brings. What’s wrong with this approach?
Samsung usually doesn’t update these libraries with software updates, mostly because they’re unaware of the problem. Many phones released in 2021 and earlier suffer from this bad sound on their speakers. For regular users who don’t know how to root and port, there is no fix, which is an issue if sound quality matters to you. Having absolutely no way of making a change is, from my point of view, totally against what Android stands for and something that should never happen. If Samsung happens to read this, they should note that updating libraries for as long as a phone is supported is important, because apps keep evolving and, like Google’s TalkBack, they may drop support for old library versions.
Do note that, while this isn’t truly about the accessibility framework, I decided to clarify it, since in the past I assumed it had to do with the framework, genuinely unaware of what was happening. I was in a testing environment: not only could I not look into what was actually going on, I wasn’t even near the device. I truly apologize for the misleading information!
SIM info, Wi-Fi, phone signal, Bluetooth info, battery level, date, and time in the notification shade. What changed between OneUI 6.0 and 6.1
As mentioned before, the framework is responsible for how screen readers focus things. On OneUI 6.0, this information at the top of the notification shade was quite disorganized. Here’s how it looked; to replicate the UI, I’ll list things exactly as they appeared there. Remember, we’re swiping right:
- SIM info, phone signal, Wi-fi info, battery level, time.
- SIM info
- Time
- Date
With OneUI 6.1, things have improved, at least in my opinion. Each item is separate and clear to swipe through. Here’s how it looks:
- SIM info
- Wi-fi
- Phone signal
- Bluetooth info
- Battery level
- Date
- Time
Do note that some items may be added or removed here, depending on whether a feature is enabled (like NFC or Bluetooth) or on things like location requests. No matter what is added or changed, the UI and the way items are displayed stay the same.
Jieshuo’s focus issues:
I won’t go into too much detail, as this isn’t only about the accessibility framework; it’s also about Jieshuo itself, although the framework is partly responsible for these as well:
- Notifications are wrongly focused: As you open the notification shade, you first see the notifications, then the quick panel, which we talked about a bit earlier, then the media players. That is not the order they appear in on the screen, though. The right order is the one TalkBack actually shows them in: quick panel, media players, then notifications.
- Google Messages conversations are focused in reverse. In the past, around Q3 or Q4 2023, Google made a change where messages are shown in reversed order, i.e. the last message received is at the top of the screen. I don’t know at all how TalkBack manages to focus these in the right order; that’s truly a secret to me as of now.
The matter with the fingerprint hint message on the screen. When is it sent to screen readers?
For those of you who haven’t come across it, there’s a message that some people hear on the lock screen. The message is only sent to screen readers, can’t be focused, and is quite unstable. I don’t know the exact reasoning behind including it, or whether it’s Samsung-only or an AOSP thing, but here’s how it works:
The message, “The fingerprint sensor is centered in the lower part of the screen. You’ll hear feedback when you touch and hold it.”, is supposed to be announced after you change windows and come back to the lockscreen.
If that sounded confusing for some, here’s an example of how this feature should actually work:
You turn your screen on because you want to add a new Bluetooth device from quick settings. After turning the screen on, you swipe down with two fingers to open quick settings, or press your keyboard shortcut if you’re using a keyboard. You long-press the Bluetooth button, open the Bluetooth floating panel, add your device, then back out of it and return to your lockscreen. Now that you’ve returned, the message is read: “The fingerprint sensor is centered in the lower part of the screen. You’ll hear feedback when you touch and hold it.”
This feature is unstable. For one, the message is announced every time you change windows while on the lockscreen, which means it would be spoken 3 times instead of just once in our scenario. Secondly, it’s delayed; the delay means the message is sometimes spoken after you unlock the device. The accessibility framework hasn’t yet received word that the device is unlocked, so it sends the message anyway because the window has changed. In some cases this feature would actually have been nice; as it stands, however, it’s a miss, something gone quite wrong.
Conclusion:
Apart from the focus issues and the idea behind the fingerprint message, the Samsung accessibility framework is quite stable and not bad at all for a daily driver. In the 5 months I’ve used OneUI 6.0 and 6.1, I can’t say the framework did anything so wrong that I couldn’t use a feature of my phone, unlike OneUI 5.1’s quick settings closing out of nowhere, or 5.0’s notification shade rebooting the phone.
The sad thing, however, is that not all the devices that were updated to Android 14 and OneUI 6.0 got the full, stable accessibility framework, especially phones from the A series. Let’s hope that with OneUI 6.1, Samsung thinks again and updates the framework, as 6.1’s framework is way faster than 6.0’s.
Samsung Android Accessibility API 34: Improvements and My Findings So Far
