iOS app accessibility review
JustinGuitar: an app with no accessibility tree at all.
VoiceOver cannot see anything in this app. Not the login screen, not the lessons, not even a single button. The only way I could interact with it was through Apple's Screen Recognition, which uses on-device ML to guess what is on screen. Even then, most of the app still felt closed off.
What this case study shows
This is a useful hiring sample because the issue is bigger than a missing label.
The important judgment here is recognizing that this is not a scattered collection of accessibility defects. It is a structural failure: the app has no accessibility layer exposed to iOS at all. That changes both the severity and the fix path.
What to notice
- User impact tied to the app's core purpose, not abstract compliance language
- Distinction between partial ML fallback and actual accessibility support
- Root-cause reasoning that points to implementation, not just symptoms
- Remediation guidance shaped by likely rendering architecture
Summary
VoiceOver sees nothing. Screen Recognition sees some things. The app is still unusable.
Without Screen Recognition
VoiceOver announces only the iOS status bar. Every button, text field, and control on every screen is invisible; you cannot even get past the login screen.
With Screen Recognition
Apple's ML fallback detects some visual elements and makes partial navigation possible. I was able to log in through "Continue with Google" and get through onboarding. But core features like the guitar tuner stayed broken.
Root cause
The app uses a non-native rendering framework with no accessibility bridge to iOS. This is not a case of missing labels on native controls. The entire accessibility layer is absent.
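To illustrate the gap, here is a minimal Swift sketch of the difference between a native control and custom-drawn content. A UIButton is exposed to VoiceOver automatically; a view that paints its own pixels exposes nothing unless someone opts it in. TunerCanvasView is a hypothetical stand-in, not code from this app.

```swift
import UIKit

final class ComparisonViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Native control: UIKit puts this in the accessibility tree for free.
        // VoiceOver announces "Log in, button" with no extra work.
        let loginButton = UIButton(type: .system)
        loginButton.setTitle("Log in", for: .normal)
        view.addSubview(loginButton)

        // Custom-drawn content: pixels only. Unless the rendering framework
        // (or the developer) populates the accessibility tree, VoiceOver
        // sees nothing here.
        let canvas = TunerCanvasView(frame: view.bounds)
        view.addSubview(canvas)

        // Opting a custom view in requires explicit configuration:
        canvas.isAccessibilityElement = true
        canvas.accessibilityLabel = "Guitar tuner"
        canvas.accessibilityTraits = .adjustable
    }
}

// Hypothetical view that draws its UI directly (Core Graphics, GPU, etc.).
final class TunerCanvasView: UIView {}
```

An app built entirely from views like TunerCanvasView, with none of that configuration, produces exactly the symptom described above: VoiceOver announces only the status bar.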
Screen-by-screen findings
Here is what happened when I tried to use the app with VoiceOver.
Login screen
Blocked
- Without Screen Recognition: VoiceOver sees nothing. No login button, no email field, no "Continue with Google" option. The app is a blank wall.
- With Screen Recognition: "Continue with Google" becomes tappable. Login works because the Google account picker uses native iOS UI, which is fully accessible on its own.
Onboarding
Partially navigable
- Without Screen Recognition: Nothing is announced. Cannot proceed.
- With Screen Recognition: Screen Recognition picks up buttons and text on the skill level and music preference screens. Selection is possible but inconsistent. The microphone permission dialog works correctly because it is native iOS.
Guitar tuner
Blocked
- Without Screen Recognition: VoiceOver sees nothing.
- With Screen Recognition: The string letters (D, B, G, E, A, E) show up as one long unbroken line of text. You cannot tap individual strings. The tuner is completely unusable even with the ML fallback running.
- Why this matters: Tuning your guitar is the first thing any beginner needs to do. If the tuner does not work, the app does not work.
The key insight
Screen Recognition partially working proves the visual UI is there. The buttons exist. The text exists. The controls exist. They are just invisible to VoiceOver because no one connected them to the accessibility layer.
This is not a design problem. It is an implementation gap.
Impact
What this means for blind users.
Cannot sign in
Without Screen Recognition, a blind user cannot operate the login screen at all. The most basic action in the app is blocked.
Cannot learn
Lessons, practice features, and progress tracking were not reachable during testing. The reason someone downloads this app is the part that does not work.
The workaround is not reliable
Screen Recognition is a best-effort ML system. It guesses wrong, merges adjacent controls, and cannot handle complex interactive elements like the tuner. It is not a substitute for real accessibility.
Remediation
Because the app has no accessibility tree at all, the fix is not adding a few labels. It requires connecting the rendering framework to the iOS accessibility API.
If the app uses Flutter
- Action: Enable the SemanticsBinding accessibility bridge. Flutter has built-in support for this, but it needs to be configured for each widget.
If the app uses Unity
- Action: Implement the iOS UIAccessibility protocol for interactive elements. Unity does not do this automatically. Each button, field, and control needs a label, trait, and action.
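As a sketch of what that native-side work could look like, the view below exposes one interactive region to VoiceOver with a label, a trait, and an activation action. This assumes a native overlay sitting above the engine's rendering surface; UnityAccessibleRegion and its callback are hypothetical names, not part of Unity's API.

```swift
import UIKit

// Hypothetical overlay view placed above the engine's rendering surface.
// Each interactive region the engine reports gets UIAccessibility
// properties so VoiceOver can focus, announce, and activate it.
final class UnityAccessibleRegion: UIView {
    private let onActivate: () -> Void

    init(frame: CGRect, label: String, onActivate: @escaping () -> Void) {
        self.onActivate = onActivate
        super.init(frame: frame)

        isAccessibilityElement = true
        accessibilityLabel = label     // e.g. "Continue with Google"
        accessibilityTraits = .button  // announced as "button"
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) is not supported") }

    // A VoiceOver double-tap lands here; forward it into the engine.
    override func accessibilityActivate() -> Bool {
        onActivate()
        return true
    }
}
```

One region per on-screen control gives VoiceOver a real tree to traverse, which is exactly what is missing today.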
If the app uses a custom canvas
- Action: Build an accessibility overlay that maps visual elements to UIAccessibilityElement instances with labels, roles, frames, and actions.
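For the tuner specifically, such an overlay could expose each string as its own element rather than one merged run of text. A hedged sketch under assumed geometry; TunerView, its layout, and its string ordering are illustrative, not taken from the app:

```swift
import UIKit

// Hypothetical canvas-drawn tuner view. Instead of making the whole view one
// accessibility element, it publishes one element per string, each with its
// own label, trait, and on-screen frame.
final class TunerView: UIView {
    private let stringNames = ["E", "A", "D", "G", "B", "E"] // assumed layout

    override func layoutSubviews() {
        super.layoutSubviews()

        let stringWidth = bounds.width / CGFloat(stringNames.count)
        var elements: [UIAccessibilityElement] = []

        for (index, name) in stringNames.enumerated() {
            let element = UIAccessibilityElement(accessibilityContainer: self)
            element.accessibilityLabel = "String \(index + 1), \(name)"
            element.accessibilityTraits = .button
            // Frame must match where the string is actually drawn, so the
            // VoiceOver focus rectangle lands on the right spot.
            element.accessibilityFrameInContainerSpace = CGRect(
                x: CGFloat(index) * stringWidth, y: 0,
                width: stringWidth, height: bounds.height)
            elements.append(element)
        }

        // VoiceOver now swipes through six discrete, tappable strings
        // instead of reading one unbroken line of letters.
        accessibilityElements = elements
    }
}
```

This is the direct fix for the finding above: discrete elements with frames are what let a VoiceOver user select an individual string.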
Verification criteria
What "fixed" looks like.
A VoiceOver user should be able to do everything a sighted user can do, without needing Screen Recognition.
Passing criteria
- Login works with VoiceOver alone
- Onboarding screens announce all options and allow selection
- Guitar tuner strings can be individually selected
- Lesson content is navigable and playable
- No screen requires Screen Recognition to function
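Several of these criteria can be smoke-tested in CI, because XCUITest queries the same accessibility tree VoiceOver reads: if the tree is empty, the queries find nothing. A sketch, assuming hypothetical element identifiers and navigation:

```swift
import XCTest

final class AccessibilityTreeSmokeTests: XCTestCase {
    func testLoginControlsAreExposedToAccessibility() {
        let app = XCUIApplication()
        app.launch()

        // These queries resolve through the accessibility tree. Today they
        // would find nothing; after remediation they should pass.
        XCTAssertTrue(app.buttons["Continue with Google"].waitForExistence(timeout: 5))
        XCTAssertTrue(app.textFields["Email"].exists)
    }

    func testTunerStringsAreIndividuallyExposed() {
        let app = XCUIApplication()
        app.launch()
        app.buttons["Tuner"].tap() // hypothetical navigation path

        // Each string should be its own element, not one merged text run.
        XCTAssertTrue(app.buttons["String 1, E"].exists)
        XCTAssertTrue(app.buttons["String 6, E"].exists)
    }
}
```

This does not replace manual VoiceOver testing (rotor navigation, announcement quality), but it catches regressions to the "empty tree" state automatically.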
Why the severity is critical
This is not just inconvenient. It blocks the product at its first meaningful steps.
Account access fails
If a user cannot reach login controls with VoiceOver alone, the product is inaccessible before any lesson content even begins.
Core task fails
The tuner is a foundational feature for beginners. When that interaction is unusable, the app's main promise breaks down.
Fallback is not a product strategy
Screen Recognition may occasionally reveal visual controls, but it is inconsistent by design and cannot stand in for a real accessibility implementation.
Need this kind of review for your app?
I test iOS apps, websites, and digital products with VoiceOver and keyboard navigation, then write findings your team can act on.