Voice-powered AI computer use, built for the blind.
State rehab programs spend multi-week training cycles teaching keyboard shortcuts for Outlook. Then another cycle for every new app. Existing tools don't scale to how people actually work.
Read me my unread emails
Shop for gaming headphones on Amazon under $100
Schedule a call with Sarah next Tuesday afternoon
Write me an intro about VCs in 2026 and save it to a doc
Today: email, calendar, browsing, files, Word, Google Workspace.
The bet isn't parity with sighted users. It's making blind users faster: finishing voice-driven multi-step work while sighted users are still clicking through menus.
Institutions have line-item budgets. We're not inventing demand. We replace a more expensive, worse incumbent.
1% of 50M blind users at $50/mo = $300M ARR. Adjacent (visually impaired, motor-impaired, elderly): same product, much larger TAM.
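The ARR figure above follows directly from the stated assumptions; a quick sketch of the arithmetic (user count, capture rate, and price are the deck's numbers, not independent estimates):

```python
# Back-of-envelope check of the TAM math quoted above.
blind_users = 50_000_000   # global blind population, as cited in the deck
capture_rate = 0.01        # 1% penetration
price_per_month = 50       # $/user/month

paying_users = int(blind_users * capture_rate)   # 500,000 users
arr = paying_users * price_per_month * 12        # annual recurring revenue

print(f"{paying_users:,} users -> ${arr:,} ARR")  # 500,000 users -> $300,000,000 ARR
```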
JAWS (41% share, $1,475)
NVDA (free)
VoiceOver (free)
Keyboard-shortcut interfaces. Describe pixels. Don't do tasks.
Be My Eyes · Aira
Seeing AI · Envision
Visual interpretation only. None target full computer use.
Claude Computer Use
OpenAI Operator
Screenshot-based. 30K tokens. 30-60s. Not voice-first. Not built for blind users.
State vocational rehab agencies (ADA-funded). Federal: VA, DoE, Section 508 procurement.
Universities (disability services), employers (accommodation budgets), nonprofits and rehab centers.
Free trial, no card. Self-serve seats. Institutional users convert family and colleagues.
Across all three channels: same budget code, same approval process, same compliance review as the JAWS line item we replace.
Software engineer. Built Darvy solo since late 2025. Background in AI tooling and ML pipelines. Going full-time on funding.
Brother. Lost his vision as an adult. Tests every feature daily. His frustration is the roadmap. Iteration cycles measured in hours, not sprints.
Use of funds: founder runway, infra (LLM credits + voice infrastructure), pilot deployments at 2 to 3 rehab institutions.