The Inspiration

Arriving at the York University Markham Campus, we noticed that while GPS is great for getting to a building, it fails at micro-navigation. For a visually impaired student, finding a specific empty chair in a busy lounge or knowing if a door is slightly ajar is a high-stress gamble. Since it's Valentine's Day weekend, we wanted to build something that bridges the gap between sight and soul, helping users stay "In-Touch" with their world.

What it does

S.E.N.S.E. is an eyewear system that translates the visual world into a "language of feeling."

- Voice Guidance: Uses AI to whisper spatial context like, "An empty chair is at your 11 o'clock."

- Haptic Pulse: A frame-mounted buzzer provides a tactile "heartbeat" through the temples of the glasses. Using custom PWM logic, the pulse frequency increases as you approach obstacles, creating an intuitive spatial map.
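The distance-to-pulse mapping can be sketched roughly as follows. This is an illustrative Python sketch, not the actual firmware; the function name, thresholds, and linear mapping are assumptions.

```python
# Hypothetical sketch of the haptic "heartbeat" mapping: closer obstacles
# produce a shorter gap between pulses, i.e. a faster heartbeat.

def pulse_interval_ms(distance_cm: float,
                      min_interval: int = 60,
                      max_interval: int = 800,
                      max_range_cm: float = 200.0) -> int:
    """Map an obstacle distance to the gap (ms) between haptic pulses."""
    # Clamp the reading to the sensor's useful range.
    d = max(0.0, min(distance_cm, max_range_cm))
    # Linear interpolation: 0 cm -> min_interval, max range -> max_interval.
    return int(min_interval + (max_interval - min_interval) * (d / max_range_cm))
```

For example, an obstacle at arm's length maps to a rapid pulse near the 60 ms floor, while anything beyond two meters settles at the slow 800 ms idle rate.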

- Adaptive Ally Dashboard: A highly customizable dashboard that allows either the user or a caregiver to adjust color palettes (e.g., high-contrast neon, protanopia-friendly) and UI sizing in real-time. This dual-control system ensures the interface meets the specific visual needs of the "Sighted Ally."
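A palette-plus-scaling scheme like the dashboard's could be structured as below. This is a minimal sketch under our own assumptions; the palette names, hex values, and CSS-variable approach are illustrative, not the project's actual theme data.

```python
# Illustrative theme table: each palette is a small dict of colors the
# dashboard can swap in real time, alongside a UI scale factor.

PALETTES = {
    "high-contrast-neon": {"bg": "#000000", "fg": "#39FF14", "accent": "#FF10F0"},
    "protanopia-friendly": {"bg": "#FFFFFF", "fg": "#0072B2", "accent": "#F0E442"},
    "default":             {"bg": "#FAFAFA", "fg": "#222222", "accent": "#0057B8"},
}

def theme_css(palette: str, scale: float = 1.0) -> str:
    """Render a palette and UI scale factor as CSS custom properties."""
    p = PALETTES.get(palette, PALETTES["default"])
    return (f":root {{ --bg: {p['bg']}; --fg: {p['fg']}; "
            f"--accent: {p['accent']}; --ui-scale: {scale}; }}")
```

Keeping themes as data rather than hard-coded styles is what lets either the user or the caregiver switch palettes live without a redeploy.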

How we built it

- Hardware: Raspberry Pi 4 (Brain), Arduino Nano (Reflexes), Logitech C270 (Eyes), and an active buzzer for tactile feedback.

- Software: We decoupled the sensor logic from the AI logic. The Arduino handles the high-speed pulse-width modulation (PWM) for the buzzer at 115200 baud, while the Pi runs a Python-based server that manages the Gemini Vision API and ElevenLabs TTS.
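The Pi-to-Arduino link can be sketched as a simple newline-framed serial protocol. The `HAPTIC:<interval_ms>` frame format and function names here are assumptions for illustration; only the 115200 baud rate comes from the build itself, and the hardware example assumes the pySerial library.

```python
# Sketch of the Pi-side ("Brain") end of the serial link to the Arduino
# ("Reflexes"). Frames are one ASCII line each, so the Arduino can parse
# them with a simple read-until-newline loop.

def encode_haptic_cmd(interval_ms: int) -> bytes:
    """Build a newline-terminated command frame for the Arduino."""
    if not 0 <= interval_ms <= 10_000:
        raise ValueError("interval out of range")
    return f"HAPTIC:{interval_ms}\n".encode("ascii")

def send_haptic(port, interval_ms: int) -> None:
    """Write one command frame; `port` is an open pySerial Serial object."""
    port.write(encode_haptic_cmd(interval_ms))

# Example usage (requires hardware):
# import serial
# with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as port:
#     send_haptic(port, 250)
```

Keeping frames short and text-based makes the stream easy to debug with a serial monitor while the Pi stays free to run the slower vision and TTS calls.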

- Form Factor: Built into a pair of glasses to ensure the AI's perspective perfectly matches the user's line of sight.

Challenges we ran into

- Active Buzzer Control: We had to implement a custom PWM "software volume" trick to prevent the buzzer from being too loud while still providing clear tactile feedback.
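The "software volume" trick boils down to duty-cycle arithmetic: rapidly gating the buzzer's power pin so it is only energized for a fraction of each period. The real loop runs on the Arduino, but the timing math is simple enough to sketch in Python; the function name and 1 ms period are illustrative assumptions.

```python
# Duty-cycle math behind the "software volume" trick: gating an active
# buzzer on for only part of each period lowers its perceived loudness
# while keeping the tactile pulse crisp.

def gate_timings_us(volume: float, period_us: int = 1000) -> tuple:
    """Return (on_us, off_us) for one gating period at the given volume (0.0-1.0)."""
    if not 0.0 <= volume <= 1.0:
        raise ValueError("volume must be between 0 and 1")
    on_us = int(period_us * volume)
    return on_us, period_us - on_us
```

On the Arduino side the equivalent loop would hold the pin HIGH for `on_us` microseconds and LOW for `off_us` inside each haptic pulse.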

- Decoupling Architecture: Orchestrating two microcontrollers to share a single data stream without lag was a major hurdle we solved using a "Brain-Spinal Cord" architectural split.

Accomplishments we're proud of

- The "Language of Feeling": We achieved a low-latency haptic vocabulary that lets the user navigate short distances using only the pulses, relying on the AI solely for high-level object identification.

- Democratizing Assistive Tech: We re-engineered the smart-glass experience using off-the-shelf components, reducing the cost from the industry standard of $3,000+ to under $150. We proved that high-performance spatial awareness doesn't have to be a luxury item.
