🏛️ Townhall: The AI-Powered Civic Radar

💡 Inspiration

Imagine waking up to a bulldozer in your backyard.

That is exactly what happened to Wake County families who were recently blindsided by a highway extension because the plans were buried in a PDF.

At the same time, cities like Boston are unlocking millions by rezoning commercial districts into residential housing, yet most developers are missing these "goldmine" opportunities because they can't decipher the bureaucratic noise.

The problem isn't that this data is secret; it's that it's effectively hidden, buried in bureaucratic filings. Townhall bridges this gap, using Elasticsearch to decode the paperwork and turn buried government files into real-time alerts for everyone.

📡 What it does

Townhall is an intelligent "Civic Radar" that scrapes, indexes, and visualizes local government proceedings. It serves three distinct users:

  • 🏡 The Concerned Resident (Civic Defense): Residents can simply ask the AI to "watch" their neighborhood, receiving instant email alerts if any zoning petitions appear within a specific radius of their home.
  • 💼 The Business Owner (Opportunity Hunter): Entrepreneurs can visualize real-time "up-zoning" shifts (e.g., Residential to Commercial) on a heatmap, securing prime leases before the market reacts.
  • 🤖 The Operator (Autonomous Ops): To handle scraper instability, an Autonomous Ops Agent powered by Elastic Observability monitors the system, using ES|QL to instantly detect and alert on silent failures or anomalies.

🤖 The Agents: The Townhall Trinity

The Problem: One Interface Does Not Fit All

Solving the "Dark Data" problem in local government requires more than a search bar. A resident asking about "noise near my house" needs a fundamentally different interaction than a developer analyzing "approval rates for R-3 zoning," or an engineer debugging a failed scraper. A single AI model often hallucinates when trying to be an analyst, a reliability engineer, and a notification service simultaneously.

The Solution: A Multi-Agent Ecosystem

To solve this, I developed Townhall as a system of three specialized agents, each with a distinct "Job Description" and "Definition of Done":

  1. Townhall City Analyst (The Interface):

    • Role: The user-facing intelligence that translates natural language into precise ES|QL queries.
    • Logic: It uses a "Strategy Selector" to distinguish between a "General Search" (looking for keywords) and "Statistical Analysis" (calculating approval rates). It seamlessly joins the petitions index with the parcels index to answer complex questions like "Is it hard to convert Residential to Commercial in Durham?"
  2. Townhall Ops Agent (The SRE):

    • Role: A specialized Site Reliability Engineer that monitors the crawlinglogs index.
    • Logic: Instead of generic logging, this agent enforces "3 Health Pillars" using ES|QL. It actively hunts for "Silent Deaths" (scrapers that run successfully but return < 10 logs) and "Infinite Loops" (abnormal volume > 5,000 logs), flagging them with specific emojis (✅/❌/⚠️) so I can assess system health in seconds.
  3. Townhall Alert Checker (The Watchman):

    • Role: A background service that acts as a "Civic Radar."
    • Logic: It iterates through active user subscriptions and executes the Haversine formula to calculate precise distances between new petitions and user homes. Beyond just distance, it performs an "Impact Analysis," classifying petitions as High/Medium/Low severity based on zoning codes (e.g., Industrial near Residential = High Severity).
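The Watchman's distance check can be sketched in a few lines of Python. This is a minimal illustration: the coordinates are made up, and the zoning-code pairs in the severity mapping are assumptions, not Townhall's exact rules.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    R = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def impact_severity(petition_zone, home_zone):
    """Toy impact analysis: industrial activity near residential is high severity.
    The zone-prefix pairs here are illustrative assumptions."""
    if petition_zone.startswith("I") and home_zone.startswith("R"):
        return "High"
    if petition_zone.startswith("C") and home_zone.startswith("R"):
        return "Medium"
    return "Low"

# A petition roughly 0.8 km from a subscriber's home (coordinates illustrative)
dist = haversine_km(35.9940, -78.8986, 35.9990, -78.9050)
if dist <= 2.0:  # subscriber's chosen alert radius in km
    print(f"{dist:.2f} km away, severity: {impact_severity('I-2', 'R-3')}")
```

Computing the distance in the agent (rather than only filtering in Elasticsearch) lets the severity classification run in the same pass over each new petition.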

Elastic Features & Challenges

  • Feature Liked (ES|QL for Anomaly Detection): The Ops Agent was my favorite to build. Using ES|QL to create the "Silent Death" detector (COUNT(logs) < 10) was a game-changer. It turned a complex debugging task into a simple SQL-like query that runs automatically.
  • Challenge (Geospatial Impact): The Alert Checker required complex logic. Simply finding petitions "near" a point wasn't enough; I had to implement logic to determine context. Integrating the Haversine distance calculation directly into the agent's workflow ensured users only got alerts for things that actually mattered, reducing "notification fatigue."
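The "3 Health Pillars" thresholds boil down to a tiny classifier. The thresholds below come from the description above; the exact emoji-to-state mapping and function name are my assumptions, and the ES|QL comment is only a sketch of the kind of query the Ops Agent runs.

```python
SILENT_DEATH_MAX = 10      # a "successful" run with fewer logs than this is suspicious
INFINITE_LOOP_MIN = 5_000  # more logs than this suggests a runaway scraper

# Roughly the kind of ES|QL the Ops Agent uses to hunt silent deaths (sketch):
#   FROM crawlinglogs | STATS c = COUNT(*) BY scraper_id | WHERE c < 10

def classify_run(log_count, succeeded=True):
    """Map one scraper run's log volume to a health flag.
    Thresholds follow the post; the emoji mapping is an assumption."""
    if not succeeded:
        return "❌ failed"
    if log_count < SILENT_DEATH_MAX:
        return "❌ silent death"
    if log_count > INFINITE_LOOP_MIN:
        return "⚠️ infinite loop"
    return "✅ healthy"

print(classify_run(3))       # ran "successfully" but returned almost nothing
print(classify_run(1_200))   # normal volume
```

Collapsing each run to a single flag is what makes the at-a-glance emoji dashboard possible.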

⚙️ The Tech Stack

Townhall is built on a "Multi-Agent" architecture, heavily relying on the Elastic Stack for storage, search, and observability.

  • Ingestion: Python Crawling Agents scraping agendas and minutes into Elasticsearch.
  • Search Engine: Elastic geo_distance for radius alerts and Full-Text Search for unstructured PDF content.
  • Visualization: Mapbox for rendering zoning parcels and Kibana for "Bloomberg-Grade" real estate analytics dashboards.
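The radius alerts map naturally onto Elasticsearch's standard `geo_distance` filter. Here is a minimal sketch of the query body; the index and field names (`petitions`, `location`) are assumptions, while the `bool`/`filter`/`geo_distance` shape is the stock DSL.

```python
def radius_alert_query(lat, lon, radius_km):
    """Build an Elasticsearch query body that finds petitions within
    radius_km of a subscriber's home. The "location" field name is an
    assumption; the geo_distance filter shape is the standard DSL."""
    return {
        "query": {
            "bool": {
                "filter": {
                    "geo_distance": {
                        "distance": f"{radius_km}km",
                        "location": {"lat": lat, "lon": lon},
                    }
                }
            }
        }
    }

body = radius_alert_query(35.994, -78.899, 2)
# e.g. es.search(index="petitions", body=body) with the elasticsearch client
print(body["query"]["bool"]["filter"]["geo_distance"]["distance"])
```

Putting `geo_distance` in a `filter` clause (rather than `must`) skips scoring, which is all a yes/no radius check needs.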

🚧 Challenges & Accomplishments

  • The "Zombie Project" Data: Local governments often leave projects in "Filed" status for years. I wrote heuristics to determine which projects were actually active versus abandoned.
  • Bloomberg-Grade Dashboards: I'm proud of the Kibana dashboard that breaks down zoning stats by "Asset Class" (Residential vs. Commercial); it looks like a professional real estate analytics tool.
  • Geospatial Complexity: Mapping raw addresses to lat/long coordinates for the radius search required precise geocoding and efficient spatial indexing.

🚀 What's next for Townhall.ai

  • Scale to 50 Counties: Now that the Ops Agent helps manage reliability, I can scale the number of scrapers confidently.
  • Predictive Zoning: Using Elastic Machine Learning to predict which parcels are likely to be rezoned next based on historical trends.
  • LLM Integration: Feeding the indexed PDFs into a RAG pipeline so users can ask, "Summarize the arguments against this project."
