Inspiration
We love games, but most people can't make them. What if you could describe a game in four words and have it exist 60 seconds later? Not a mockup, not a prototype — a real, playable, 60fps game running right in your Reddit feed. We wanted to turn game creation into something as casual and social as posting a meme.
The slot machine metaphor clicked immediately: you don't need to know what you want. Just pull the lever. Don't like it? Reroll. The randomness is the fun — every combination produces something nobody's ever played before.
What It Does
RerollGame turns Reddit posts into playable AI-generated games.
For creators:
- Spin a 5-reel slot machine to pick Genre, Theme, Mechanic, Twist, and Mood
- 75,600 possible combinations (7 x 15 x 10 x 12 x 6)
- AI generates a complete, playable Canvas2D game in ~2 minutes
- Live preview lets you play instantly before publishing
- Don't like the result? Tap any slot to reroll just that part, or pull the whole lever again
- Save up to 10 drafts with full version history (10 versions each)
- Publish to any subreddit as a playable Reddit post
For players:
- See a game in your feed, hit PLAY — no downloads, no redirects
- Games run at 60fps with full audio right inside Reddit
- Every post is a unique game someone generated
Sample outputs from the slot machine:
| Genre | Theme | Mechanic | Twist | Mood |
|---|---|---|---|---|
| Platformer | Space | Gravity Flip | Shrinking Arena | Frantic |
| Puzzle | Candy Land | Combo Chains | Controls Reverse | Silly |
| Survival | Haunted House | Phase Through Walls | Limited Vision | Creepy |
| Racing | Cyberpunk | Dash | Everything Speeds Up | Competitive |
How We Built It
The Sandwich: Three Layers of Isolation
The key insight is that AI-generated code can't be trusted — so we never trust it. The architecture is a security sandwich:
```
[ Reddit Post ]
       |
[ Host Layer ]            -- Canvas2D renderer, Web Audio engine, input handling
       |
[ QuickJS WASM Sandbox ]  -- AI-generated game code runs here
       |
(no DOM, no fetch, no eval, no escape)
```
Games are pure functions. Every frame, the sandbox calls update(dt, input), which returns an array of draw commands. The host layer interprets those commands on a real Canvas2D. The game code literally cannot touch the DOM, make network requests, or access anything outside its 16MB heap.
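A minimal sketch of what a generated game looks like from the sandbox's point of view. Command and field names here are illustrative, not the exact protocol:

```js
// Runs inside QuickJS: no DOM, no fetch, no timers. Pure state in, commands out.
// Command and field names below are illustrative, not the exact protocol.
let x = 40;
let vx = 60;

function metadata() {
  return { title: "Bounce", width: 320, height: 240 };
}

function resources() {
  return []; // no sprites needed for this sketch
}

function update(dt, input) {
  x += vx * dt;
  if (x < 0 || x > 304) vx = -vx;   // bounce off the walls
  if (input.tap) vx *= 1.1;         // speed up on tap
  return [                          // describe the frame; the host draws it
    { op: "clear", color: "#101018" },
    { op: "rect", x: x, y: 200, w: 16, h: 16, color: "#ff4500" },
    { op: "text", x: 8, y: 16, text: "tap to go faster", color: "#ffffff" },
  ];
}
```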
AI Generation Pipeline
- Slot machine picks 5 parameters → human-readable game description
- OpenAI (GPT-5.2 Codex) receives the description + a detailed Canvas2D API reference + example game code → generates complete game source
- Validation layer checks for forbidden patterns (eval, fetch, import), verifies required functions exist (metadata(), resources(), update()), and enforces a 200KB size limit (sketched below)
- Image generation — if the game defines generate-type sprites, Google Gemini creates pixel art, then a CIEDE2000 perceptual chromakey algorithm removes backgrounds at the pixel level
- Sandbox test — game boots in QuickJS to verify it actually runs before saving
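A simplified version of that validation pass; the real checks are stricter and these regexes are only illustrative:

```js
// A deliberately crude structural check; the real validator is stricter.
const FORBIDDEN = [/\beval\s*\(/, /\bfetch\s*\(/, /\bimport\b/];
const REQUIRED_FNS = ["metadata", "resources", "update"];
const MAX_SIZE = 200 * 1024; // 200KB

function validateGameSource(src) {
  const errors = [];
  if (src.length > MAX_SIZE) errors.push("source exceeds 200KB limit");
  for (const pattern of FORBIDDEN) {
    if (pattern.test(src)) errors.push(`forbidden pattern: ${pattern}`);
  }
  for (const fn of REQUIRED_FNS) {
    if (!new RegExp(`function\\s+${fn}\\s*\\(`).test(src)) {
      errors.push(`missing required function: ${fn}()`);
    }
  }
  return { ok: errors.length === 0, errors };
}
```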
The whole pipeline is async: client fires a generate request, polls for status, and the server lazy-starts the AI call on first poll. No websockets, no long-polling — just simple HTTP that works within Devvit's constraints.
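Client side, the whole flow is a create-then-poll loop. A minimal sketch, with hypothetical endpoint paths and response shapes:

```js
// Runs in the host webview (not the sandbox). Paths and shapes are hypothetical.
async function generateGame(slots) {
  const createRes = await fetch("/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(slots),
  });
  const { jobId } = await createRes.json();

  // Poll until done; the server kicks off the AI call on the first poll.
  while (true) {
    const status = await (await fetch(`/api/generate/${jobId}`)).json();
    if (status.state === "done") return status.game;
    if (status.state === "error") throw new Error(status.message);
    await new Promise((resolve) => setTimeout(resolve, 2000)); // back off between polls
  }
}
```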
Command-Driven Rendering (30+ Operations)
Games don't draw — they describe what to draw. The command protocol supports:
- Shapes: rect, circle, line, polygon, arc, complex paths (bezier, quadratic)
- Text: fonts, alignment, baseline, word wrap
- Images: sprite rendering with source rectangles, rotation, alpha
- Transforms: save/restore, translate, rotate, scale, clip regions
- Audio: 8-channel synthesizer with sine/square/sawtooth/triangle waves, white/pink/brown noise, ADSR envelopes, pitch sweeps, sample playback
All of this from code that has zero access to Canvas2D or Web Audio directly.
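On the host side, the renderer is essentially a switch over that command stream. A trimmed sketch with illustrative op names:

```js
// Interpret one frame's worth of sandbox-produced commands on a real canvas.
function renderFrame(ctx, commands) {
  for (const cmd of commands) {
    switch (cmd.op) {
      case "rect":
        ctx.fillStyle = cmd.color;
        ctx.fillRect(cmd.x, cmd.y, cmd.w, cmd.h);
        break;
      case "circle":
        ctx.fillStyle = cmd.color;
        ctx.beginPath();
        ctx.arc(cmd.x, cmd.y, cmd.r, 0, Math.PI * 2);
        ctx.fill();
        break;
      case "text":
        ctx.fillStyle = cmd.color;
        ctx.font = cmd.font || "12px monospace";
        ctx.fillText(cmd.text, cmd.x, cmd.y);
        break;
      case "save":      ctx.save(); break;
      case "restore":   ctx.restore(); break;
      case "translate": ctx.translate(cmd.x, cmd.y); break;
      case "rotate":    ctx.rotate(cmd.angle); break;
      // ...30+ ops in total: paths, sprites, clipping, audio, and so on
    }
  }
}
```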
Three Resource Types for Sprites
Instead of uploading image files, games define resources declaratively:
- Hex — Palette-indexed pixel art in compact notation (great for 16x16 sprites)
- Generate — A text prompt like "pixel art red dragon 32x32" that triggers Gemini at load time
- Procedural — Canvas draw commands composed into a reusable sprite
Stack
All vanilla JavaScript — no React, no TypeScript, no UI frameworks. ~3,000 LOC total.
| Layer | Tech |
|---|---|
| Platform | Reddit Devvit |
| Client | Vanilla JS, Canvas2D, Web Audio API |
| Sandbox | QuickJS compiled to WASM |
| Server | Hono (lightweight HTTP framework) |
| Storage | Redis (Devvit-managed) |
| Game AI | OpenAI GPT-5.2 Codex |
| Image AI | Google Gemini Flash |
| Image Processing | PNGjs + CIEDE2000 chromakey |
| Build | Vite |
Challenges We Ran Into
Making AI-generated code safe to run. The obvious answer is "use an iframe sandbox," but we went further — QuickJS WASM gives us a completely separate JavaScript runtime with hard memory limits (16MB), stack limits (1MB), and a 50ms-per-frame timeout that kills infinite loops. The game code physically cannot access the host environment.
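A minimal sketch of how those limits can be enforced, assuming the quickjs-emscripten binding (an assumption on our exact wiring):

```js
import { getQuickJS, shouldInterruptAfterDeadline } from "quickjs-emscripten";

// Sketch only: hard memory/stack limits plus a per-frame interrupt deadline.
async function createSandbox(gameSource) {
  const QuickJS = await getQuickJS();
  const runtime = QuickJS.newRuntime();
  runtime.setMemoryLimit(16 * 1024 * 1024);  // 16MB heap
  runtime.setMaxStackSize(1024 * 1024);      // 1MB stack

  const vm = runtime.newContext();
  vm.unwrapResult(vm.evalCode(gameSource)).dispose(); // load the game once

  return function step(dt, input) {
    // Any frame that runs past 50ms is interrupted (infinite-loop guard).
    runtime.setInterruptHandler(shouldInterruptAfterDeadline(Date.now() + 50));
    const result = vm.evalCode(
      `JSON.stringify(update(${dt}, ${JSON.stringify(input)}))`
    );
    const handle = vm.unwrapResult(result);  // throws if the game crashed
    const commands = JSON.parse(vm.dump(handle));
    handle.dispose();
    return commands;
  };
}
```

Because the interrupt handler is re-armed every frame, a runaway update() costs one dropped frame instead of freezing the feed.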
Chromakey that doesn't suck. Simple RGB distance matching leaves ugly halos around sprites. We implemented CIEDE2000 — the gold standard perceptual color distance algorithm — converting RGB through Lab color space to match how humans actually see color differences. The result: clean sprite extraction from AI-generated images with no manual cleanup.
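A simplified sketch of the per-pixel pass, using plain Euclidean (CIE76) distance in Lab space for brevity; the full CIEDE2000 formula adds chroma and hue weighting on top of the same idea:

```js
// Convert sRGB (0-255) to CIELAB so color distance roughly matches perception.
function rgbToLab(r, g, b) {
  const lin = (c) => {
    c /= 255;
    return c > 0.04045 ? Math.pow((c + 0.055) / 1.055, 2.4) : c / 12.92;
  };
  const [R, G, B] = [lin(r), lin(g), lin(b)];
  // sRGB -> XYZ (D65), then XYZ -> Lab
  let x = (R * 0.4124 + G * 0.3576 + B * 0.1805) / 0.95047;
  let y =  R * 0.2126 + G * 0.7152 + B * 0.0722;
  let z = (R * 0.0193 + G * 0.1192 + B * 0.9505) / 1.08883;
  const f = (t) => (t > 0.008856 ? Math.cbrt(t) : 7.787 * t + 16 / 116);
  [x, y, z] = [f(x), f(y), f(z)];
  return [116 * y - 16, 500 * (x - y), 200 * (y - z)];
}

// Key out every pixel close to the background color (e.g. sampled from a corner).
function chromakey(pixels, width, height, keyRGB, threshold = 12) {
  const keyLab = rgbToLab(...keyRGB);
  for (let i = 0; i < width * height * 4; i += 4) {
    const lab = rgbToLab(pixels[i], pixels[i + 1], pixels[i + 2]);
    const dE = Math.hypot(lab[0] - keyLab[0], lab[1] - keyLab[1], lab[2] - keyLab[2]);
    if (dE < threshold) pixels[i + 3] = 0;   // make the background transparent
  }
  return pixels;
}
```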
Async jobs on a stateless platform. Devvit doesn't support websockets or long-running requests. We designed a lazy-start polling pattern: the client creates a job, then polls for status. The server only starts the expensive AI call on the first poll, avoiding wasted compute if the user navigates away. All state lives in Redis with TTLs.
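Server side, the lazy start is roughly this shape. Hono routes are real; `redis` is assumed to be Devvit's Redis client (get/set/expire) and startAiGeneration() is a stand-in for the actual OpenAI call:

```js
import { Hono } from "hono";

// Sketch: `redis` and startAiGeneration() are passed in, not invented here.
export function createGenerateApi(redis, startAiGeneration) {
  const app = new Hono();

  // POST creates the job record but does NOT start the AI call yet.
  app.post("/api/generate", async (c) => {
    const slots = await c.req.json();
    const jobId = crypto.randomUUID();
    await redis.set(`job:${jobId}`, JSON.stringify({ state: "queued", slots }));
    await redis.expire(`job:${jobId}`, 600);  // TTL: abandoned jobs clean themselves up
    return c.json({ jobId });
  });

  // The first status poll is what actually kicks off generation (lazy start).
  app.get("/api/generate/:jobId", async (c) => {
    const key = `job:${c.req.param("jobId")}`;
    const raw = await redis.get(key);
    if (!raw) return c.json({ state: "error", message: "unknown job" }, 404);
    const job = JSON.parse(raw);
    if (job.state === "queued") {
      job.state = "running";
      await redis.set(key, JSON.stringify(job));
      startAiGeneration(key, job.slots);      // fire-and-forget; writes "done" back to Redis
    }
    return c.json(job);
  });

  return app;
}
```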
Making the AI output actually playable. Raw LLM output often has subtle bugs — off-by-one physics, missing collision checks, broken game loops. The prompt engineering includes a complete working example game, a full API reference, and strict constraints. The validation layer catches structural issues before the game reaches players.
Accomplishments We're Proud Of
- Zero-install game creation. From "I want a game" to "here's a playable game in my Reddit feed" in under 2 minutes, with no coding knowledge required.
- The security model actually works. QuickJS WASM sandbox with hard limits means we can run arbitrary AI-generated code with confidence. No escapes, no exploits, no trust required.
- 75,600 unique game combinations from 5 slot reels — and because the AI interprets them creatively, even the same combination produces different games each time.
- Full audio synthesis from a sandbox. Games define tones, noise, and samples through commands, and the host synthesizes them through Web Audio with ADSR envelopes and pitch sweeps (see the sketch after this list). No audio files needed.
- Perceptual chromakey (CIEDE2000) for AI-generated sprites — a genuinely hard computer vision problem solved in pure JS.
- ~3,000 lines of vanilla JS. No frameworks, no transpilers, no build complexity. Just JavaScript that runs.
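For flavor, a minimal version of how a single tone command can be synthesized with Web Audio and an ADSR envelope (field names are illustrative):

```js
// Play one synthesized tone from a command like:
// { op: "tone", wave: "square", freq: 440, attack: 0.01, decay: 0.1,
//   sustain: 0.4, release: 0.2, duration: 0.5 }
function playTone(audioCtx, cmd) {
  const osc = audioCtx.createOscillator();
  const gain = audioCtx.createGain();
  osc.type = cmd.wave;                 // sine | square | sawtooth | triangle
  osc.frequency.value = cmd.freq;

  const t = audioCtx.currentTime;
  const g = gain.gain;
  g.setValueAtTime(0, t);
  g.linearRampToValueAtTime(1, t + cmd.attack);                       // attack
  g.linearRampToValueAtTime(cmd.sustain, t + cmd.attack + cmd.decay); // decay to sustain
  g.setValueAtTime(cmd.sustain, t + cmd.duration);                    // hold sustain
  g.linearRampToValueAtTime(0, t + cmd.duration + cmd.release);       // release

  osc.connect(gain).connect(audioCtx.destination);
  osc.start(t);
  osc.stop(t + cmd.duration + cmd.release);
}
```

Pitch sweeps work the same way, ramping osc.frequency instead of the gain.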
What We Learned
- LLMs are surprisingly good at generating games when you give them tight constraints. The Canvas2D command protocol acts as a creative straitjacket — the AI can't cheat with DOM hacks, so it has to actually implement game logic.
- Sandbox-first architecture pays off. Designing the security boundary first (not bolting it on later) made everything simpler. The command protocol emerged naturally from "what can we safely expose?"
- Slot machines are better than text prompts. Giving users random constraints instead of a blank text box produces more creative and fun results. Constraints breed creativity — for humans and AIs alike.
- CIEDE2000 is worth the complexity. Perceptual color distance sounds academic, but the difference between "ugly green halo around every sprite" and "clean transparent background" is the difference between a demo and a product.
What's Next
- Multiplayer — Real-time shared game state between players viewing the same post
- Leaderboards — Per-post high score tables with Reddit username attribution
- Remix — Fork someone's game, tweak the slots, regenerate with your twist
- Community voting — Upvote the best generated games to surface them in feeds
- More resource types — Tilemaps, particle systems, skeletal animation commands
Built With
- chatgpt
- javascript
- nanobanana
- quickjs
- redis
- webassembly