Inspiration

"Finding myself going back to RSS/Atom feeds a lot more recently. There's a lot more higher quality longform and a lot less slop intended to provoke.

We should bring back RSS - it's open, pervasive, hackable.

You will lose a lot fewer brain cells. I don't know, something has to change." - Andrej Karpathy

The modern internet is drowning in noise. RSS feeds - once the cleanest way to follow content - now deliver hundreds of articles daily, most of which are clickbait, SEO spam, or AI-generated fluff. I wanted to build a tool that brings back the signal: an AI-powered RSS aggregator that reads everything so you don't have to, filters out the junk, and delivers only what matters.

The idea was simple: what if your RSS reader had a brain? Not just keyword matching, but genuine understanding of article quality, relevance, and informational value.

What it does

DeepFeed is a full-stack AI-powered RSS platform that transforms any website into a curated, high-quality feed:

  • Smart Feed Creation — Paste any URL. DeepFeed's AI extracts articles, scores their quality (1-10), and automatically detects "slop" (clickbait, SEO spam, low-effort content). Clean articles are published as a standard RSS 2.0 feed you can subscribe to in any reader (see the serialization sketch after this list).

  • Daily AI Digest — A curated daily briefing across all your feeds: top 5 most important stories with explanations of why they matter, automatic topic clustering, and trend insights. Available in 15+ languages.

  • Audio Digest (TTS) — Your daily briefing as a podcast-style audio file, generated with Gemini's text-to-speech (sketched after this list). Listen to your news on the go.

  • AI Cover Art — Each digest gets a unique cinematic cover image generated by Gemini's image models, themed around the day's top stories.

  • Real-Time Processing — Live streaming logs via SSE (Server-Sent Events) show exactly what the AI is doing as it crawls, parses, and analyzes your feeds.

  • Auto-Refresh Scheduler — Premium feeds are automatically refreshed on configurable intervals with intelligent deduplication.

  • OPML Import/Export — Bring your existing subscriptions or export DeepFeed feeds to other readers.
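
To make the output format concrete, here is a minimal sketch of serializing already-filtered articles into an RSS 2.0 document with only the Python standard library. The Article record and its field names are illustrative assumptions, not DeepFeed's actual schema.

```python
# Minimal sketch: serialize already-filtered articles to RSS 2.0.
# The Article dataclass and its fields are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime
from email.utils import format_datetime
import xml.etree.ElementTree as ET


@dataclass
class Article:
    title: str
    link: str
    summary: str
    published: datetime


def build_rss(feed_title: str, feed_link: str, articles: list[Article]) -> str:
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = feed_title
    ET.SubElement(channel, "link").text = feed_link
    ET.SubElement(channel, "description").text = "AI-filtered feed"

    for a in articles:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = a.title
        ET.SubElement(item, "link").text = a.link
        ET.SubElement(item, "description").text = a.summary
        # RSS 2.0 expects RFC 822-style dates
        ET.SubElement(item, "pubDate").text = format_datetime(a.published)

    return ET.tostring(rss, encoding="unicode", xml_declaration=True)
```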

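And a rough sketch of the audio step using the google-genai SDK's speech output; the model name, voice, and prompt framing are assumptions rather than DeepFeed's exact configuration.

```python
# Sketch: turn a text digest into podcast-style audio with Gemini TTS.
# Model name, voice, and prompt wording are illustrative assumptions.
from google import genai
from google.genai import types

client = genai.Client()  # reads the API key from the environment


def digest_to_audio(digest_text: str, out_path: str = "digest.pcm") -> str:
    response = client.models.generate_content(
        model="gemini-2.5-flash-preview-tts",
        contents=f"Read this daily news briefing in a calm podcast tone:\n\n{digest_text}",
        config=types.GenerateContentConfig(
            response_modalities=["AUDIO"],
            speech_config=types.SpeechConfig(
                voice_config=types.VoiceConfig(
                    prebuilt_voice_config=types.PrebuiltVoiceConfig(voice_name="Kore")
                )
            ),
        ),
    )
    # The response carries raw audio bytes in the first part's inline_data.
    audio_bytes = response.candidates[0].content.parts[0].inline_data.data
    with open(out_path, "wb") as f:
        f.write(audio_bytes)
    return out_path
```
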
How we built it

Architecture: FastAPI backend + React/TypeScript frontend + PostgreSQL, deployed on a Hetzner VPS via Docker Compose with an nginx reverse proxy and Let's Encrypt SSL.

AI Pipeline (Gemini 3.x):

  1. URL Context Tool — Gemini 3's native tool-use reads URLs directly and extracts articles in a single API call (no separate crawling step needed)
  2. Fallback Crawler — For JavaScript-heavy sites, crawl4ai with Playwright renders pages, then Gemini parses the markdown
  3. Quality Analysis — A reasoning model scores each article and flags slop
  4. Digest Generation — Gemini 3 Pro with thinking capability analyzes all high-quality articles, identifies the most important stories, clusters topics, and generates insights
  5. Dynamic Model Fallback — On rate limits (429s), the system automatically falls back to cheaper/available models based on a dynamically-built model registry
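
As a rough illustration of steps 1 and 3, the sketch below asks Gemini to read a page through the URL Context tool and return schema-validated articles via the google-genai SDK. The schema, model ID, and the assumption that the tool can be combined with a response schema in a single call are ours, not necessarily DeepFeed's production code.

```python
# Sketch of step 1: read a page via the URL Context tool and return
# structured article data. Schema and model ID are assumptions.
from pydantic import BaseModel
from google import genai
from google.genai import types


class ExtractedArticle(BaseModel):
    title: str
    url: str
    summary: str
    quality_score: int   # 1-10, higher is better
    is_slop: bool        # clickbait / SEO spam / low-effort content


client = genai.Client()


def extract_articles(page_url: str) -> list[ExtractedArticle]:
    response = client.models.generate_content(
        model="gemini-2.5-flash",  # substitute the Gemini 3 model ID in use
        contents=f"Extract the articles listed on {page_url} and rate each one.",
        config=types.GenerateContentConfig(
            tools=[types.Tool(url_context=types.UrlContext())],
            response_mime_type="application/json",
            response_schema=list[ExtractedArticle],
        ),
    )
    # response.parsed holds the schema-validated objects when a
    # response_schema is supplied.
    return response.parsed or []
```

For JavaScript-heavy sites, the fallback path (step 2) would feed crawl4ai's rendered markdown into the same kind of structured call instead of a URL.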

Key technical decisions:

  • Structured output with response_schema for consistent JSON from AI
  • asyncio.Queue + SSE for real-time log streaming during processing
  • Fernet encryption for user API keys at rest
  • Per-user feed isolation with composite unique constraints
  • 30-day article deduplication window for digests
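
To illustrate the asyncio.Queue + SSE decision, here is a minimal FastAPI sketch that streams log lines pushed by a background processing task; the endpoint path, queue wiring, and message format are illustrative assumptions.

```python
# Sketch: stream processing logs to the browser with SSE.
# Endpoint path, queue wiring, and message format are illustrative.
import asyncio
from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()
log_queue: asyncio.Queue[str] = asyncio.Queue()


async def process_feed(url: str) -> None:
    # The real pipeline crawls, parses, and scores here, pushing progress
    # messages onto the queue as it goes (started elsewhere with
    # asyncio.create_task(process_feed(url))).
    await log_queue.put(f"Fetching {url} ...")
    await log_queue.put("Scoring articles ...")
    await log_queue.put("[done]")


@app.get("/feeds/logs")
async def stream_logs() -> StreamingResponse:
    async def event_stream():
        while True:
            line = await log_queue.get()
            # SSE frames are "data: ...\n\n"
            yield f"data: {line}\n\n"
            if line == "[done]":
                break

    return StreamingResponse(event_stream(), media_type="text/event-stream")
```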

Challenges we ran into

  • Content quality at scale — Early versions let too much junk through. We iterated on prompts and added a two-stage pipeline: fast parsing first, then deep quality analysis with a reasoning model.
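
A compressed sketch of that two-stage split, assuming the google-genai SDK; the model IDs, thinking budget, prompts, and verdict schema are placeholders rather than the production pipeline.

```python
# Sketch of the two-stage pipeline: fast parse first, then a deeper
# quality pass with a reasoning model. Names and model IDs are assumptions.
from pydantic import BaseModel
from google import genai
from google.genai import types

client = genai.Client()


class QualityVerdict(BaseModel):
    quality_score: int   # 1-10
    is_slop: bool
    reason: str


def parse_fast(page_markdown: str) -> str:
    """Stage 1: a cheap model just pulls out the article text."""
    resp = client.models.generate_content(
        model="gemini-2.5-flash",
        contents=f"Extract the main article text from this page:\n\n{page_markdown}",
    )
    return resp.text or ""


def score_deep(article_text: str) -> QualityVerdict:
    """Stage 2: a reasoning pass that scores quality and flags slop."""
    resp = client.models.generate_content(
        model="gemini-2.5-pro",
        contents=(
            "Rate this article 1-10 for informational value and flag it as slop "
            "if it is clickbait, SEO filler, or low-effort content:\n\n" + article_text
        ),
        config=types.GenerateContentConfig(
            thinking_config=types.ThinkingConfig(thinking_budget=1024),
            response_mime_type="application/json",
            response_schema=QualityVerdict,
        ),
    )
    return resp.parsed
```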

Accomplishments that we're proud of

  • Zero-config feed creation — Paste a URL, get a filtered RSS feed. No manual configuration, no feed discovery needed. The AI handles everything.

  • Production deployment — Not just a demo. DeepFeed runs at https://deepfeed.app with SSL, connection pooling, auto-refresh scheduling, and multi-user isolation.

  • Multi-modal digest — Text briefing + audio podcast + AI cover art, all generated from the same daily analysis. A complete media experience from RSS feeds.

  • Slop detection actually works — The quality scoring reliably filters out clickbait and content-farm articles, surfacing only genuinely informative content.

What we learned

  • Gemini 3's URL Context tool is a game-changer for web scraping — it eliminates the need for a separate crawling step for most sites, dramatically simplifying the pipeline.

  • Structured output (response_schema) makes AI responses predictable and parseable, but requires careful handling of model version differences.

  • Building a multi-model fallback system pays off quickly — rate limits are inevitable at scale, and graceful degradation is better than errors.

  • SSE is the right choice for long-running AI operations in web apps — it gives users immediate feedback and makes 30-second processing feel interactive rather than frozen.
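
To make the fallback point concrete, here is a minimal sketch of walking a model registry on HTTP 429s with the google-genai SDK; the model ordering and error handling are assumptions, and the real registry is built dynamically.

```python
# Sketch: walk a model registry and fall back on rate limits (HTTP 429).
# The ordering is illustrative; DeepFeed builds its registry dynamically.
from google import genai
from google.genai import errors

client = genai.Client()

# Preferred model first, cheaper / more available models after it.
MODEL_REGISTRY = ["gemini-2.5-pro", "gemini-2.5-flash", "gemini-2.5-flash-lite"]


def generate_with_fallback(prompt: str) -> str:
    last_error: Exception | None = None
    for model in MODEL_REGISTRY:
        try:
            resp = client.models.generate_content(model=model, contents=prompt)
            return resp.text or ""
        except errors.APIError as e:
            # 429 means this model is rate limited; try the next one.
            if e.code == 429:
                last_error = e
                continue
            raise
    raise RuntimeError("All models in the registry are rate limited") from last_error
```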

What's next for DeepFeed.app

  • Feed discovery — AI-powered suggestions based on user interests and reading patterns
  • Collaborative feeds — Share curated feeds with teams or communities
  • Mobile app — Native experience with push notifications for important stories
  • Advanced analytics — Track reading habits, topic trends, and source quality over time
  • Webhook integrations — Push digest summaries to Slack, Telegram, or email

Built With

crawl4ai, docker, fastapi, gemini, nginx, playwright, postgresql, python, react, typescript