Inspiration

Emergencies outrun official alerts. When there’s a fire, a gas leak, a crash, or a missing person, neighbors need a way to tell each other now, without apps, logins, or Wi-Fi. The Web is a “neighborhood pager”: one phone number + SMS + a small local LLM that turns raw texts into instant, hyper-local alerts.

What it does

  • Anyone joins by texting a short oath; then they can report incidents by SMS.
  • A local language model classifies each message (spam vs. real), asks only for missing pieces (what/where/when), and when complete, composes a concise public alert (the decision object it returns is sketched after this list).
  • The alert is broadcast by postcode to nearby subscribers. Every alert includes a safety note and stays under SMS limits.
  • Works fully offline in demo mode; can use Twilio (A2P 10DLC needed for unrestricted sending) or a USB GSM modem for real SMS.
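
Concretely, the model’s whole job is to fill in one small decision object. A minimal sketch of what that contract could look like in C# (the field names and shape here are illustrative assumptions, not the exact schema the app uses):

```csharp
// Illustrative decision contract -- field names are assumptions, not the app's exact schema.
// A reply like {"spam":false,"missing":["where"],"postcode":null,"alert":null} means:
// not spam, but send the reporter one short follow-up asking for the location.
using System.Text.Json;

public sealed record Decision(
    bool Spam,            // drop silently if true
    string[] Missing,     // any of "what" / "where" / "when" still needed
    string? Postcode,     // broadcast target once known
    string? Alert);       // composed public alert text, kept under SMS length

public static class DecisionParser
{
    private static readonly JsonSerializerOptions Options =
        new() { PropertyNameCaseInsensitive = true };

    public static Decision? Parse(string minifiedJson) =>
        JsonSerializer.Deserialize<Decision>(minifiedJson, Options);
}
```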

How I built it

  • Stack: .NET 8 console app + SQLite.
  • Onboarding: strict SMS “oath” parser so joining is human-simple yet machine-reliable (a sketch follows this list).
  • LLM brain: OpenAI-compatible local endpoint (e.g., LM Studio). I send only the last few messages from the same reporter (within 30 minutes), extract hints (postcode/time/location), and ask the model for one minified JSON decision (the call is sketched after this list).
  • Broadcast: when complete, I format a short alert and SMS it to subscribers in the target postcode, excluding the reporter (sketched after this list, together with the cooldown check).
  • Safety/ops: 30-minute cooldown per reporter, inbound de-dupe, and a demo replay script for predictable demos.
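
The oath parser is deliberately dumb: normalize the text, look for a fixed phrase plus a postcode, reject everything else. A sketch with a hypothetical join phrase (the real wording and reply copy differ):

```csharp
// Oath-parser sketch -- the join phrase is a placeholder, not the real copy.
// Expected shape: "I JOIN THE WEB <postcode>"; anything else gets a short correction by SMS.
using System.Text.RegularExpressions;

public static class Oath
{
    private static readonly Regex Pattern = new(
        @"^\s*I\s+JOIN\s+THE\s+WEB\s+(?<postcode>[A-Z0-9 ]{3,10})\s*$",
        RegexOptions.IgnoreCase);

    // Returns the new subscriber's postcode when the oath is valid, otherwise null.
    public static string? TryParse(string sms)
    {
        var m = Pattern.Match(sms);
        return m.Success ? m.Groups["postcode"].Value.Trim().ToUpperInvariant() : null;
    }
}
```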
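
The “one minified JSON decision” is a plain chat-completions POST against whatever OpenAI-compatible server is running locally. A sketch under those assumptions (LM Studio’s default port, a placeholder model name, and a toy reporter thread; the real system prompt and error handling are longer):

```csharp
// LLM-brain sketch: recent reporter messages in, one minified JSON decision out.
// Endpoint, port, and model name are LM Studio-style assumptions.
using System.Net.Http.Json;
using System.Text.Json;

var http = new HttpClient { BaseAddress = new Uri("http://localhost:1234") };

var request = new
{
    model = "local-model",      // whatever model the local server has loaded
    temperature = 0,
    messages = new object[]
    {
        new { role = "system", content =
            "Classify the incident report. Reply with ONE minified JSON object: " +
            "{\"spam\":bool,\"missing\":[...],\"postcode\":string|null,\"alert\":string|null}. " +
            "Never give safety instructions beyond advising people to call emergency services." },
        new { role = "user", content = "Reporter thread (last 30 min):\nstrong gas smell on the high street\nnear SW1A 1AA" }
    }
};

using var response = await http.PostAsJsonAsync("/v1/chat/completions", request);
response.EnsureSuccessStatusCode();

using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
var reply = doc.RootElement.GetProperty("choices")[0].GetProperty("message").GetProperty("content").GetString()!;

Console.WriteLine(reply);       // e.g. {"spam":false,"missing":[],"postcode":"SW1A 1AA","alert":"..."}
```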
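
The broadcast and cooldown side boils down to a couple of SQLite queries. A minimal sketch assuming hypothetical subscribers(phone, postcode) and alerts(reporter, sent_at) tables; SendSms is a demo-mode stand-in for the Twilio / GSM-modem path:

```csharp
// Geofenced broadcast sketch: pick subscribers by postcode, skip the reporter,
// and refuse to send while the reporter is still inside the 30-minute cooldown.
// Table and column names are assumptions for illustration.
using Microsoft.Data.Sqlite;

public static class Pager
{
    public static void Broadcast(SqliteConnection db, string postcode, string reporter, string alert)
    {
        if (InCooldown(db, reporter)) return;       // per-reporter 30-minute cooldown

        var select = db.CreateCommand();
        select.CommandText =
            "SELECT phone FROM subscribers WHERE postcode = $postcode AND phone <> $reporter;";
        select.Parameters.AddWithValue("$postcode", postcode);
        select.Parameters.AddWithValue("$reporter", reporter);

        using var rows = select.ExecuteReader();
        while (rows.Read())
            SendSms(rows.GetString(0), alert);      // Twilio / GSM modem sits behind this call

        var log = db.CreateCommand();
        log.CommandText =
            "INSERT INTO alerts (reporter, sent_at) VALUES ($reporter, datetime('now'));";
        log.Parameters.AddWithValue("$reporter", reporter);
        log.ExecuteNonQuery();
    }

    private static bool InCooldown(SqliteConnection db, string reporter)
    {
        var count = db.CreateCommand();
        count.CommandText =
            "SELECT COUNT(*) FROM alerts WHERE reporter = $reporter AND sent_at > datetime('now', '-30 minutes');";
        count.Parameters.AddWithValue("$reporter", reporter);
        return (long)count.ExecuteScalar()! > 0;
    }

    private static void SendSms(string to, string text) =>
        Console.WriteLine($"SMS -> {to}: {text}");  // demo-mode stand-in for a real provider
}
```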

Challenges I ran into

  • SMS realities: needing A2P 10DLC registration (otherwise “verified numbers only”) made true end-to-end testing tricky.
  • LLM reliability: getting strictly valid JSON under token limits; preventing the model from giving unsafe advice; keeping follow-ups ultra-short.
  • Partial info: robustly extracting postcode/time/location from messy texts and threads, not just the last message (see the extraction sketch after this list).
  • Race/dup: avoiding double processing when polling inboxes; enforcing a per-reporter cooldown so alerts don’t spam.
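
The hint extraction that came out of this scans the whole recent thread, newest message first, instead of only the latest text. A rough sketch, assuming UK-style postcodes and a deliberately simplified regex (the real extractor also pulls times and place names):

```csharp
// Hint-extraction sketch: walk the reporter's recent thread backwards looking for a postcode.
// The pattern assumes UK-style postcodes and is simplified for illustration.
using System.Collections.Generic;
using System.Text.RegularExpressions;

public static class Hints
{
    private static readonly Regex Postcode = new(
        @"\b[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}\b", RegexOptions.IgnoreCase);

    public static string? FindPostcode(IReadOnlyList<string> recentMessages)
    {
        // Newest message first, so the latest correction wins if the thread has several postcodes.
        for (int i = recentMessages.Count - 1; i >= 0; i--)
        {
            var match = Postcode.Match(recentMessages[i]);
            if (match.Success) return match.Value.ToUpperInvariant();
        }
        return null;
    }
}
```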

Accomplishments that I’m proud of

  • A working, end-to-end SMS → LLM → geofenced broadcast pipeline that runs locally.
  • A clean demo mode so anyone can try it without a provider or hardware.
  • Human-friendly UX in pure SMS: a one-line oath to join, one-line follow-ups for missing info, and clear, short public alerts.

What I learned

  • For crisis tools, SMS first wins: zero friction beats any fancy app.
  • LLMs behave best with strict schemas + guardrails (seed the JSON, extract only the first balanced object, post-validate; the extraction step is sketched below).
  • Design for failure paths (timeouts, 400/429s, A2P constraints) from day one—your demo will thank you.
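
The “first balanced object” guardrail is tiny but did a lot of work: local models love to wrap their JSON in prose or markdown fences. A sketch of that extraction step (schema post-validation happens on whatever it returns):

```csharp
// Pull the first balanced {...} object out of a model reply, ignoring any prose around it.
// Braces inside JSON strings are skipped so alert text may safely contain { or }.
public static class JsonExtractor
{
    public static string? FirstBalancedObject(string reply)
    {
        int start = reply.IndexOf('{');
        if (start < 0) return null;

        int depth = 0;
        bool inString = false;

        for (int i = start; i < reply.Length; i++)
        {
            char c = reply[i];
            if (inString)
            {
                if (c == '\\') i++;               // skip the escaped character
                else if (c == '"') inString = false;
            }
            else if (c == '"') inString = true;
            else if (c == '{') depth++;
            else if (c == '}' && --depth == 0)
                return reply.Substring(start, i - start + 1);
        }
        return null;                              // unbalanced: treat as invalid and re-ask the model
    }
}
```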

What’s next for The Web

  • Go fully offline: USB GSM/LTE modem + SIM (no cloud dependency).
  • Stronger signal: incident clustering + simple “2-reporter” corroboration gates; lightweight YAML playbooks for severity.
  • Ops panel: minimal web UI to review/override alerts, see heatmaps, and adjust templates.
  • A2P 10DLC & onboarding: register properly, add multilingual flows, and polish the safety copy.
  • Open source packaging: one-command setup, sample replay scripts, and a tiny local model preset so anybody can run a neighborhood pager.

Built With

.NET 8 (C#), SQLite, Twilio, a local LLM behind an OpenAI-compatible endpoint (LM Studio), and optionally a USB GSM modem.