Inspiration

I live on a rural property in Northern California with my wife, who's a fine-dining chef specializing in traditional French and Italian cuisine. Between her kitchen, our garden, and the chickens, food is basically our entire life. My wife would love a way to record her cooking trials and tribulations — the experiments that work, the ones that don't, the tweaks to a recipe over dozens of iterations. That's the kind of food context that gets lost.

I originally started building FoodLog — a Flutter app where you snap a photo of your meal and Gemini analyzes it. Camera pointed at dinner, instant nutrition breakdown. But as I kept building, I kept hitting the same wall: every food feature I wanted — recipe extraction, ingredient substitution, safety recalls, pantry tracking — I was rebuilding from scratch each time. There was no shared layer.

Then, 48 hours before the deadline, I had the pivot: why build one app when you could build the protocol that powers all of them? The same way Stripe standardized payments, food AI needed a standard interface. I scrapped FoodLog and went all-in on Food Context Protocol. FCP was born out of that frantic 48-hour sprint — and it turned out to be the right call, because the protocol is bigger than any single app.

What it does

Food Context Protocol (FCP) is an open protocol that gives any AI assistant instant access to food intelligence through 40+ typed tools across six domains:

  • Nutrition — Analyze meals from photos, log food, track macros, learn taste profiles
  • Recipes — Extract recipes from images/video/URLs, scale ingredients, find substitutions
  • Safety — Real-time FDA recall monitoring, allergen detection, freshness predictions
  • Inventory — Smart pantry tracking, shopping lists, barcode lookup
  • Discovery — Restaurant finder, meal planning, food trend analysis
  • Clinical — Dietitian reports, meal donation coordination, content publishing

FCP works with any MCP-compatible AI assistant (Claude, Gemini, ChatGPT) via dual transport: MCP stdio for local tools and REST/SSE for remote access.
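As a rough illustration of what a domain-qualified, typed tool call might look like over the REST transport (the tool name and field layout below are hypothetical, not quoted from the published FCP spec):

```python
import json

# Hypothetical FCP tool invocation -- names and fields are illustrative,
# not taken from the published protocol specification.
request = {
    "tool": "nutrition.analyze_meal",  # domain-qualified tool name
    "arguments": {
        "image_url": "https://example.com/pad-thai.jpg",
        "units": "metric",
    },
}

# A typed response, validated against the tool's output schema before
# it is returned to the AI assistant or REST client.
response = {
    "tool": "nutrition.analyze_meal",
    "result": {
        "dish": "pad thai",
        "calories_kcal": 620,
        "macros": {"protein_g": 24, "carbs_g": 78, "fat_g": 22},
        "allergens": ["peanut", "egg"],
    },
}

print(json.dumps(response["result"], indent=2))
```

The same request/response shape can ride either transport: serialized over MCP stdio for a local assistant, or POSTed to the REST endpoint with results streamed back via SSE.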

How we built it

This was a solo project with significant AI assistance: I used Claude and Gemini extensively throughout development, which is what made building 40+ tools in 48 hours possible.

The core server is Python/FastAPI with Pydantic for strict type validation across all tool schemas. Every tool connects to Gemini 3 through a unified connector layer leveraging 15+ API features:

  • Multimodal vision for analyzing food photos, reading nutrition labels, and extracting recipes from video
  • Search grounding for real-time FDA recall data and restaurant discovery
  • Extended thinking for complex meal planning and nutritional optimization
  • Image generation (Gemini 3 Pro) for project branding — even the logo was Gemini-generated
  • JSON mode for structured recipe and nutrition output
  • Context caching for efficient handling of large recipe databases
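To make the "strict type validation" concrete, here is a hedged sketch of what one tool's Pydantic schemas could look like. The class and field names are illustrative, not the actual FCP schemas:

```python
from pydantic import BaseModel, Field

# Hypothetical input/output schemas for a single FCP tool.
# Names and fields are illustrative, not the published spec.

class AnalyzeMealInput(BaseModel):
    image_url: str
    units: str = "metric"

class MacroBreakdown(BaseModel):
    protein_g: float = Field(ge=0)
    carbs_g: float = Field(ge=0)
    fat_g: float = Field(ge=0)

class AnalyzeMealOutput(BaseModel):
    dish: str
    calories_kcal: float = Field(ge=0)
    macros: MacroBreakdown
    allergens: list[str] = []

# Strict validation means a malformed Gemini response fails loudly
# here instead of leaking bad nutrition data to the client.
result = AnalyzeMealOutput(
    dish="pad thai",
    calories_kcal=620,
    macros=MacroBreakdown(protein_g=24, carbs_g=78, fat_g=22),
    allergens=["peanut", "egg"],
)
print(result.dish, result.calories_kcal)
```

Because FastAPI derives the OpenAPI spec from these same models, the Python and TypeScript SDKs inherit the types for free.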

From the OpenAPI spec I auto-generated Python and TypeScript SDKs, built a TUI-based CLI with Rich/Textual, wrote the full protocol specification, and deployed to Google Cloud Run with Firestore.

Challenges we ran into

The 48-hour pivot. Scrapping FoodLog and rebuilding as a protocol server two days before the deadline was terrifying. But it forced ruthless prioritization — every tool had to justify its existence.

Multimodal consistency. Getting Gemini to reliably return structured JSON from food photos took real work. The same photo of pad thai might come back with nutrition data in different units, or miss allergens that were visually ambiguous. I built a validation layer that cross-references Gemini's analysis against USDA FoodData Central.
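A minimal version of that cross-check could look like the sketch below. The reference table is a stand-in for a real FoodData Central lookup, and the values and tolerance are illustrative:

```python
# Hedged sketch of the cross-validation idea: compare Gemini's per-100g
# calorie estimate against a reference value and flag large deviations.
# Illustrative numbers only -- not official USDA data.
USDA_REFERENCE_KCAL_PER_100G = {
    "pad thai": 153.0,
    "margherita pizza": 266.0,
}

def validate_calories(dish: str, gemini_kcal_per_100g: float,
                      tolerance: float = 0.35) -> bool:
    """Return True if Gemini's estimate is within `tolerance`
    (relative error) of the reference, or if no reference exists."""
    reference = USDA_REFERENCE_KCAL_PER_100G.get(dish.lower())
    if reference is None:
        return True  # nothing to cross-check against
    return abs(gemini_kcal_per_100g - reference) / reference <= tolerance

print(validate_calories("pad thai", 160.0))  # close to reference: True
print(validate_calories("pad thai", 400.0))  # flagged: False
```

Estimates that fail the check can be retried with a stricter prompt or surfaced to the user as low-confidence rather than silently logged.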

Scope management. 40+ tools across 6 domains is a lot of surface area for one person. AI pair programming made it possible, but I still had to be disciplined about which tools shipped and ensure proper error handling and type safety throughout.

MCP is still young. The Model Context Protocol ecosystem is evolving fast. I hit edge cases around SSE transport reliability and authentication patterns that aren't well documented yet.

Accomplishments that we're proud of

  • 40+ production-quality tools built, tested, and deployed in 48 hours as a solo developer
  • Full protocol specification — not just an app, but an open standard other developers can build on
  • Dual transport — same server works as an MCP tool (local AI assistants) and a REST API (web/mobile apps)
  • Auto-generated SDKs in Python and TypeScript from the OpenAPI spec
  • Gemini all the way down — the server uses Gemini for intelligence, the logo was generated with Gemini 3 Pro Image, even the demo video assets were AI-generated
  • The pivot itself — recognizing 48 hours before the deadline that the protocol was the real product, not the app

What we learned

  • AI-assisted development is a force multiplier — one developer with good AI tools can build what used to take a small team
  • Gemini 3's multimodal capabilities are genuinely production-ready for food analysis
  • Search grounding is a game-changer for food safety — real-time FDA recall data through a natural language interface
  • Protocol design is harder than implementation — getting the tool schemas right for different AI assistants took more iteration than writing the server

What's next for Food Context Protocol

  • Public launch of all repositories under Apache 2.0
  • Community contributions for additional food data providers and regional food databases
  • FoodLog app — the original Flutter app idea, now powered by FCP as its backend protocol
  • My wife's cookbook tracker — a dedicated tool for recording recipe iterations, substitutions, and cooking notes over time
  • Mobile companion app using the TypeScript SDK
  • Integration guides for popular AI assistants and food industry platforms
