Accessibility portfolio
Real findings from real products, tested the way blind users actually experience them.
Each case study documents what I found, what broke, and what needs to happen next. These are not hypothetical audits. They are hands-on evaluations of public websites and apps using VoiceOver, keyboard navigation, and manual inspection. Below the audits, you will also find engineering projects where I built accessibility solutions for professional desktop software that had no VoiceOver support at all. If you are hiring, this page is here so you can judge the work directly: how I identify blockers, describe impact, and turn what I notice into something another person can actually use.
How to read this page
The point is not just that I can spot issues. It is whether I can explain the real user impact and the next move clearly.
Hiring managers and collaborators should be able to skim a case study and answer a few practical questions quickly: Can this person identify the actual blocker? Can they tell the difference between something irritating and something that stops the experience cold? Can engineering use the writeup without losing the human context?
What to look for
- Clear statement of the user task that failed
- Specific reproduction details instead of vague commentary
- Severity grounded in user impact, not drama
- Fix direction written for teams that need to ship
Case studies
Detailed accessibility reviews of real products.
JustinGuitar
This popular guitar learning app has no accessibility tree whatsoever. VoiceOver sees nothing but the status bar. Login, onboarding, and the guitar tuner are all completely blocked for blind users.
Read case study

Eventbrite homepage
Event card links expose unresolved placeholder text instead of actual event titles. Screen reader users hear repeated meaningless names when browsing events, making event discovery slow and unreliable.
Read case study

Retail product page
Fulfillment and quantity controls create inconsistent screen reader navigation. VoiceOver jumps away from the expected interaction path near pickup and quantity selection, disrupting the purchase flow.
Read case study

Engineering projects
Accessibility solutions I built for software that had no VoiceOver support at all.
These are not audits. These are working tools I designed and engineered using macOS Accessibility APIs, machine learning, and platform-level integration to make inaccessible professional software usable with VoiceOver.
ScreenRecognition
A macOS app that makes inaccessible applications usable with VoiceOver through ML-powered UI element detection, OCR-based label extraction, and a transparent overlay of interactive accessible controls.
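The core labeling step in this kind of overlay can be sketched as follows: detected UI elements get their spoken labels by matching OCR text regions against element bounding boxes. This is an illustrative sketch in Python, not ScreenRecognition's actual code; the types and the 50% containment threshold are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned bounding box in screen coordinates."""
    x: float
    y: float
    w: float
    h: float

    def overlap_area(self, other: "Box") -> float:
        """Area of the intersection of two boxes (0 if disjoint)."""
        dx = min(self.x + self.w, other.x + other.w) - max(self.x, other.x)
        dy = min(self.y + self.h, other.y + other.h) - max(self.y, other.y)
        return dx * dy if dx > 0 and dy > 0 else 0.0

def label_elements(elements, ocr_tokens):
    """Attach OCR text to each detected UI element.

    `elements` is a list of (role, Box) pairs from the UI detector;
    `ocr_tokens` is a list of (text, Box) pairs from OCR. Each element
    gets the concatenated text of tokens that mostly fall inside its
    box, ordered left to right, so the overlay can expose a meaningful
    VoiceOver label instead of an unlabeled control.
    """
    labeled = []
    for role, ebox in elements:
        inside = [
            (tbox.x, text)
            for text, tbox in ocr_tokens
            # keep a token if most of its area lies inside the element
            if ebox.overlap_area(tbox) >= 0.5 * (tbox.w * tbox.h)
        ]
        label = " ".join(text for _, text in sorted(inside))
        labeled.append((role, label or "unlabeled"))
    return labeled
```

For example, a detected button containing the OCR token "Play" would be announced as a "Play" button, while text outside its bounds is ignored:

```python
elements = [("button", Box(10, 10, 120, 40))]
tokens = [("Play", Box(20, 20, 30, 16)), ("0:42", Box(300, 20, 40, 16))]
label_elements(elements, tokens)  # → [('button', 'Play')]
```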
Read project details

Pro Tools accessibility
Custom accessibility layer that exposes hidden plugin parameters as VoiceOver-readable sliders and surfaces application startup status messages to screen reader users.
Read project details

Komplete Kontrol accessibility
VoiceOver support engineered for Native Instruments' Komplete Kontrol, including accessible preferences, keyboard navigation, and toolbar label correction for an app with zero native accessibility.
Read project details

How I work
What I pay attention to when I evaluate an experience.
Screen reader flow
I look for whether the experience stays understandable and operable in sequence, not just whether individual controls have labels.
Keyboard flow
I care about focus order, interaction states, trap conditions, and whether a keyboard user can complete the task without guesswork.
Structure and meaning
Headings, landmarks, buttons, links, form labels, error handling, and state changes all need to communicate the right thing at the right time.
Finding format
Findings written for people who need to fix things, not admire a report.
When I document accessibility issues, I focus on what the user was trying to do, what failed, how to reproduce it, and what kind of fix direction the team needs next. I want the writeup to stay precise without flattening the actual experience.
What each finding includes
- Issue title with clear scope
- Affected users and user impact
- Reproducible steps and observed behavior
- Relevant standards reference when needed
- Plain-language remediation direction
Tools and methods
The point is not to name every tool. It is to show how I use them to evaluate real user experience.
Assistive technology
VoiceOver, JAWS, and NVDA to evaluate whether a flow stays understandable, navigable, and complete for screen reader users.
Manual interaction testing
Keyboard-only review to check focus order, interaction states, dialogs, forms, and error recovery alongside screen reader testing.
Reporting and remediation
Findings in plain language with reproducible steps, user impact, and practical remediation guidance teams can actually work from.
Next step