Inspiration
During Covid, my cousin in Mumbai, who lived alone, suffered a stroke and fell. He lay on the floor for three days without anyone knowing, and by the time help arrived, it was too late. He passed away simply because no one knew he needed help.
Many of us live with a subconscious worry about our aging parents (or vulnerable family members) who may be miles away. We wonder if they are safe. We want to protect them, but we don't want to invade their privacy with constant cameras or check-in calls.
Himom is my answer to that anxiety. It is a comprehensive safety solution that uses the smartphone—and eventually other smart home devices—to orchestrate a safety net. It respects privacy by processing everything locally on the device (computing at the edge, where $latency \approx 0$) and only escalating when it truly matters.
What it does
Himom acts as an intelligent, invisible guardian that runs on an Android phone. It uses a "Hybrid AI" approach to monitor for emergencies without constantly sending private data to the cloud.
- Listen: The app quietly listens for distress sounds, like a fall or a cry for help, using technology that lives directly on the phone. By processing audio locally, we ensure $Privacy = True$.
- Verify: If—and only if—it hears something concerning ($Probability_{distress} > Threshold$), it wakes up Gemini 3 Flash, Google's advanced AI. The AI then speaks to the user ("I heard a noise, are you okay?") to check on them.
- Act: If the user asks for help or doesn't respond, the AI immediately sends an SMS alert to their emergency contacts.
It also watches for other danger signs, such as wandering outside a safe area or a phone that hasn't moved for an unusually long time ($t_{inactivity} > Limit$).
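The listen → verify → act flow above can be sketched as a tiny decision function. This is an illustrative Python sketch, not the shipped Dart code; the threshold, inactivity limit, and action names are all assumptions:

```python
# Illustrative sketch of Himom's listen -> verify -> act escalation logic.
# All constants and names here are hypothetical, not the app's real values.
DISTRESS_THRESHOLD = 0.7        # assumed confidence cutoff for the local detector
INACTIVITY_LIMIT_S = 6 * 3600   # assumed "phone hasn't moved" limit (6 hours)

def decide(distress_score, seconds_since_motion, user_reply):
    """Return the next action for one monitoring tick.

    user_reply is the user's spoken answer after the AI check-in,
    or None if they did not respond.
    """
    triggered = (distress_score > DISTRESS_THRESHOLD
                 or seconds_since_motion > INACTIVITY_LIMIT_S)
    if not triggered:
        return "KEEP_LISTENING"          # nothing concerning; stay local and private
    # Verify: the on-device detector hands off to the voice assistant.
    if user_reply is None or "help" in user_reply.lower():
        return "SEND_SMS_ALERT"          # no answer, or an explicit call for help
    return "STAND_DOWN"                  # e.g. "I'm just watching TV"
```

The key property is that the expensive, cloud-backed verification path is only reachable after a local trigger, which is what keeps the default state private.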
How I built it
I built Himom entirely within the Google Antigravity IDE, using its advanced tools to develop the app rapidly.
- The App: I built the mobile application in Flutter, drawing on my prior experience with the framework to create a smooth, cross-platform experience.
- The Brain: Gemini 3 Flash is the core intelligence. It powers the voice assistant that verifies emergencies, understanding the difference between "I'm just watching TV" and "I've fallen and can't get up" with remarkable speed and accuracy.
- The Ears: I used TensorFlow Lite (YAMNet) for the on-device listening. This allows the phone to detect sounds instantly, with near-zero latency and no internet connection required, ensuring privacy.
- Design: Nano Banana Pro helped generate the visual assets to make the app look professional and trustworthy.
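To give a feel for the "Ears" step above: YAMNet emits a probability for each of its 521 AudioSet sound classes per audio frame, and those scores have to be collapsed into a single distress signal. A minimal sketch of that collapsing step (the subset of class names and the max-pooling choice are my illustrative assumptions, not necessarily Himom's exact mapping):

```python
# Sketch: collapsing YAMNet's per-class scores into one "distress" score.
# The class set below is a small assumed subset of YAMNet's AudioSet labels.
DISTRESS_CLASSES = {"Screaming", "Crying, sobbing", "Thump, thud", "Glass"}

def distress_score(class_scores):
    """class_scores: dict mapping YAMNet class name -> probability
    for a single audio frame. Returns the strongest distress-class score."""
    return max((p for name, p in class_scores.items()
                if name in DISTRESS_CLASSES),
               default=0.0)
```

Taking the max (rather than a sum) means one confident detection, like a scream, is enough to trigger verification, while many weak scores stay below threshold.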
Challenges I ran into
Building an app that reliably runs 24/7 on modern smartphones is incredibly difficult because phones are designed to shut down background apps to save battery.
- Staying Awake: I had to engineer a robust system that keeps the "digital ears" open even when the screen is off and the phone is sleeping, without killing the battery.
- Seamless Handover: One of the hardest parts was ensuring that when the local sensor hears a thud, it instantly and smoothly hands over control to the sophisticated Gemini AI without losing a split second of audio or context.
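The handover problem above essentially comes down to keeping a rolling pre-trigger buffer, so that when the thud is detected, the verification stage receives the seconds of audio leading up to it. A minimal sketch of that idea (buffer sizes and names are assumptions for illustration):

```python
from collections import deque

# Sketch of a rolling pre-trigger audio buffer: the lightweight detector
# appends frames continuously; on a trigger, the most recent seconds are
# snapshotted for the verification stage, so no pre-thud context is lost.
FRAMES_PER_SECOND = 50      # assumed frame rate of the local detector
PRE_TRIGGER_SECONDS = 3     # assumed amount of context to preserve

class HandoverBuffer:
    def __init__(self):
        # deque with maxlen drops the oldest frame automatically on append
        self.frames = deque(maxlen=FRAMES_PER_SECOND * PRE_TRIGGER_SECONDS)

    def push(self, frame):
        self.frames.append(frame)

    def hand_over(self):
        """Snapshot the buffered context for the verification stage."""
        return list(self.frames)
```

Because the buffer is always full of the latest frames, the handover is a cheap copy rather than a re-recording, which is what makes the transition feel seamless.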
Accomplishments that I'm proud of
- Privacy First: I built a system that offers peace of mind without constant surveillance. No audio leaves the phone unless a potential emergency is actually detected.
- Instant Response: The transition from detecting a sound to the AI asking "Are you okay?" feels instantaneous.
- Battery Efficiency: Despite listening 24/7, the app is smart enough to use lightweight technology most of the time, so it doesn't drain the battery aggressively.
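The battery claim above rests on a tiered design: the cheap on-device classifier runs always, and the expensive verification path wakes rarely. A toy back-of-the-envelope cost model makes the point (every number here is an illustrative assumption, not a measured figure):

```python
# Toy power model for the tiered design. All numbers are illustrative
# assumptions, not measurements from the app.
LIGHT_MW = 15             # assumed average draw of the on-device classifier
HEAVY_MW = 900            # assumed draw while full verification is active
TRIGGER_FRACTION = 0.001  # assumed fraction of time spent verifying

# Expected average draw is dominated by the lightweight path.
avg_mw = LIGHT_MW * (1 - TRIGGER_FRACTION) + HEAVY_MW * TRIGGER_FRACTION
```

Under these assumptions the average draw stays within a few milliwatts of the lightweight baseline, which is why always-on listening remains affordable.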
What I learned
- Speed Saves Lives: The sheer speed of Gemini 3 Flash allows the voice verification to feel natural and conversational, which is critical during a stressful emergency.
- Context is Key: A simple loud noise trigger isn't enough. By using advanced AI to understand the meaning of a user's response, we drastically reduce false alarms—the biggest problem with most safety apps.
- Agentic Development: Using the Antigravity IDE allowed me to build complex, enterprise-grade features much faster than traditional coding methods.
What's next for Himom
If I am fortunate enough to win a prize at this hackathon, the funds would go directly toward transforming this prototype into a robust, life-saving product. I plan to use the resources to focus on:
- Smart Home Integration: Expanding beyond the phone to connect with smart home sensors.
- Wearable Support: Bringing fall detection directly to smartwatches.
- Privacy-Preserving Vision: Experimenting with on-device camera technology to detect lack of movement during an emergency, ensuring images never leave the device unless explicitly authorized.
- Refined UX/UI: Collaborating with professional designers to create an even more seamless, elderly-friendly interface.
Built With
- android
- dart
- flutter
- google-antigravity
- google-cloud
- google-gemini
- nano-banana-pro
- tensorflow
- tensorflow-lite
- yamnet