HapMap: Auditory & Tactile Walking Navigation
An iOS and Android walking navigation app that guides individuals with visual impairments through auditory and tactile feedback. 95% of testers found the step-by-step voice assistance and haptic patterns very helpful.
problem
Standard map applications rely heavily on visual cues, making them inaccessible for individuals with visual impairments. There was a need for a navigation tool that could provide clear, non-visual directions.
solution
I co-developed HapMap, a mobile app built with Dart and Flutter on top of the Google Maps API. It guides users with step-by-step voice assistance and distinct haptic vibration patterns, making navigation intuitive without any visual input.
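As an illustration, the distinct vibration patterns for each cue might be mapped like this. This is a minimal sketch, assuming the community `vibration` Flutter package; the cue names and pattern timings are hypothetical, not HapMap's actual values:

```dart
import 'package:vibration/vibration.dart';

/// Hypothetical mapping from navigation cues to vibration patterns.
/// A pattern alternates wait/vibrate durations in milliseconds.
const Map<String, List<int>> cuePatterns = {
  'turnLeft': [0, 200, 100, 200],          // two short pulses
  'turnRight': [0, 500],                   // one long pulse
  'arrived': [0, 100, 100, 100, 100, 100], // three quick taps
};

Future<void> playCue(String cue) async {
  final pattern = cuePatterns[cue];
  if (pattern == null) return;
  // Guard: not every device exposes a vibration motor.
  if (await Vibration.hasVibrator() ?? false) {
    await Vibration.vibrate(pattern: pattern);
  }
}
```

Keeping the cue-to-pattern table in one place makes it easy to tune patterns during user testing without touching the navigation logic.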
Engineering for Accessibility
The inspiration for HapMap was to make navigation more accessible for everyone, so we focused on an experience that never requires looking at a screen. Using Dart and Flutter, we integrated with the device's sensor and haptic feedback APIs to generate intuitive cues, such as distinct vibration patterns for "turn left" and "turn right." We adopted a rigorous development process, structuring the app with an MVC architecture and setting up a CI pipeline with GitHub Actions. To keep the app reliable, we also used mutation testing, which helped us reduce bugs by 80% before launch. The most rewarding part was the user feedback: learning that 95% of our testers found the auditory and haptic cues genuinely helpful confirmed that we had built something truly impactful.
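A CI pipeline for a Flutter app like the one described above could look roughly like the following GitHub Actions workflow. This is a sketch assuming the widely used `subosito/flutter-action`, not HapMap's actual configuration:

```yaml
# .github/workflows/ci.yml — hypothetical minimal Flutter pipeline
name: CI
on: [push, pull_request]
jobs:
  build-and-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: subosito/flutter-action@v2
        with:
          channel: stable
      - run: flutter pub get     # fetch dependencies
      - run: flutter analyze     # static analysis
      - run: flutter test        # run the unit/widget test suite
```

Running analysis and tests on every push and pull request catches regressions before they reach the main branch, which is what made the mutation-testing results actionable.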