ACSS - AI Coding Session State: About The Project
🎯 Inspiration
The inspiration for ACSS came from a deeply frustrating pattern I experienced multiple times every single day: I'd be making great progress with ChatGPT on a complex coding problem, finally getting it to understand my project architecture, my tech stack decisions, and the specific blocker I was facing. Then—rate limit hit.
So I'd switch to Claude. And spend the next 10-15 minutes re-explaining everything from scratch:
- "I'm building a Node.js API with Express..."
- "I decided to use JWT for authentication because..."
- "I'm getting this CORS error in this specific file..."
By the time Claude understood the context, I'd lost my momentum and wasted precious coding time. This happened 3-4 times per day.
I realized this wasn't just my problem—28 million developers globally use AI coding tools. If everyone faces this friction multiple times daily, we're talking about millions of hours of wasted productivity every single week.
The solution seemed obvious: we need a universal format for AI coding context—like Git for version control, Docker for containers, or OpenAPI for APIs. A standardized way to capture and transfer coding sessions between any AI tools.
That's how ACSS was born.
💡 What It Does
ACSS (AI Coding Session State) is a universal format and CLI tool that captures complete coding context and enables instant handoffs between any AI coding assistants.
Core Workflow
1. Capture Context
acss init # Deep scan: project structure, tech stack, entry points
2. Log As You Work
acss log decision "Using JWT in HTTP-only cookies for security"
acss log error "WebSocket disconnect after auth" --file ws.js --line 45
acss log next "Debug JWT middleware"
3. Import AI Conversations
acss import chatgpt-export.json # AI extracts decisions/insights
4. Generate Instant Handoff
acss load --for claude > handoff.txt
# Paste into Claude → Perfect context in 30 seconds
Advanced Features
- Watch Mode: Auto-captures file changes and git commits in real-time
- Session Merging: Combines work from ChatGPT, Claude, Gemini into one unified session
- LLM Compression: Reduces token usage by summarizing long sessions with local/cloud models
- Multi-AI Support: Optimized prompts for ChatGPT, Claude, Gemini, local LLMs
- Local + Cloud: Works with Ollama locally or cloud endpoints (Google Colab via ngrok)
Result: What used to take 10-15 minutes now takes 30 seconds.
🛠️ How We Built It
Architecture
ACSS is a TypeScript monorepo with three core packages:
1. Core Package (@acss/core)
- ACSS Schema: Complete TypeScript interface defining session structure
- DirectoryScanner: Recursive file tree builder with `.gitignore` support (using globby)
- Validation: JSON Schema validation with Ajv
- Session Utilities: Create, update, merge, compress operations
- Prompt Generator: AI-specific template system for handoff optimization
2. CLI Package (@acss/cli)
- Commander.js: Command framework with 9 main commands
- Interactive Prompts: Inquirer.js for user-friendly initialization
- Document Parsing: pdf-parse (PDF), mammoth (DOCX), native (Markdown/JSON)
- LLM Integration: Ollama client with fallback to cloud endpoints
- File Watcher: Chokidar for real-time monitoring
- Git Observer: Hooks to capture commits as decisions
3. VS Code Extension (Experimental)
- WebView-based sidebar UI with glassmorphism design
- Real-time session monitoring via WebSocket
- Live decision/error tracking
Tech Stack
- Language: TypeScript 5.x for type safety
- Runtime: Node.js 18+ for modern features
- CLI Framework: Commander.js + Inquirer
- File System: globby (scanning), chokidar (watching)
- Validation: Ajv with JSON Schema
- Document Parsing: pdf-parse, mammoth
- LLM Integration: Ollama API, custom cloud client
- Testing: Jest (unit tests)
- Build: Native TypeScript compiler
Development Workflow
- Day 1: Defined ACSS schema, built core utilities, implemented validation
- Day 2: CLI commands (`init`, `log`, `export`, `load`, `merge`)
- Day 3: LLM integration (Ollama + cloud), chat import, compression
- Day 4: Watch mode, file observer, git integration, session merging
- Day 5: Testing suite, documentation, demo preparation, final polish
Total development time: ~40 hours over 5 days
🧗 Challenges We Faced
Challenge 1: Designing the Universal Schema
Problem: How do you capture "everything" needed for AI handoff without creating a bloated format?
Solution: After studying real AI conversations, I identified the minimum viable context:
- Project metadata (name, tech stack, structure)
- Current task intent
- Chronological decisions
- Errors (with file/line context and resolution status)
- Prioritized next steps
- Sources (which AI tools contributed)
I rejected capturing raw code diffs (too large), full chat transcripts (too noisy), and file contents (privacy concern). The schema balances completeness with efficiency.
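As a rough sketch, the minimum viable context above could be modeled like this in TypeScript (field names are illustrative, not the actual `@acss/core` types):

```typescript
// Illustrative sketch of the minimal session shape described above.
// All field names are assumptions for illustration, not the real ACSS schema.
interface AcssSession {
  meta: { name: string; techStack: string[]; structure: string[] };
  intent: string;                                   // current task, one line
  decisions: { text: string; at: string; source: string }[]; // chronological
  errors: { text: string; file?: string; line?: number; resolved: boolean }[];
  nextSteps: string[];                              // prioritized
  sources: string[];                                // which AI tools contributed
}

const session: AcssSession = {
  meta: { name: "my-api", techStack: ["node", "express"], structure: ["src/"] },
  intent: "Debug JWT middleware",
  decisions: [
    { text: "Use JWT in HTTP-only cookies", at: "2025-01-01", source: "chatgpt" },
  ],
  errors: [
    { text: "WebSocket disconnect after auth", file: "ws.js", line: 45, resolved: false },
  ],
  nextSteps: ["Debug JWT middleware"],
  sources: ["chatgpt"],
};
```

Note what is absent: no diffs, no transcripts, no file contents — only the distilled state an AI needs to resume work.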
Challenge 2: LLM-Based Chat Import
Problem: Chat exports are unstructured conversations. How do you extract structured decisions, errors, and insights automatically?
Solution: Integrated Ollama for local LLM processing with a carefully crafted extraction prompt:
Extract from this AI conversation:
1. Technical decisions made (architecture, libraries, patterns)
2. Errors discussed and their resolutions
3. Key insights about the problem domain
4. Concrete next steps suggested
Format as JSON: {decisions: [...], errors: [...], insights: [...], nextSteps: [...]}
Challenge: Ollama isn't always available (students on low-spec laptops).
Solution: Added cloud endpoint support—users can run a model on Google Colab and expose it via ngrok. Graceful fallback with clear error messages if no LLM is available.
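A minimal sketch of that local-first call with cloud fallback, assuming Ollama's standard `/api/generate` endpoint with `format: "json"`; the endpoint list, prompt text, and helper names are illustrative, not the actual CLI code:

```typescript
// Hypothetical extraction prompt, abbreviated from the one shown above.
const EXTRACTION_PROMPT =
  "Extract decisions, errors, insights, and next steps from this AI conversation. " +
  "Format as JSON: {decisions: [...], errors: [...], insights: [...], nextSteps: [...]}";

// Build the Ollama request body; format: "json" asks the model for valid JSON.
function buildExtractionRequest(chatText: string, model = "gemma:2b") {
  return {
    model,
    prompt: `${EXTRACTION_PROMPT}\n\n${chatText}`,
    stream: false,
    format: "json",
  };
}

// Try local Ollama first, then any cloud endpoints (e.g. a Colab model via ngrok).
async function extract(chatText: string, endpoints = ["http://localhost:11434"]) {
  for (const base of endpoints) {
    try {
      const res = await fetch(`${base}/api/generate`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(buildExtractionRequest(chatText)),
      });
      if (!res.ok) continue;                 // bad status — try the next endpoint
      const data = (await res.json()) as { response: string };
      return JSON.parse(data.response);      // {decisions, errors, insights, nextSteps}
    } catch {
      continue;                              // endpoint unreachable — fall through
    }
  }
  throw new Error("No LLM endpoint available — start Ollama or pass a cloud URL");
}
```

The loop is the whole fallback strategy: the order of `endpoints` encodes the local-first preference, and the final throw is the "clear error message" path.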
Challenge 3: File Watcher Race Conditions
Problem: Watch mode was triggering duplicate events, infinite loops, and missing rapid file changes.
Initial attempt: Listen to every file system event → resulted in 100+ events per second during build processes.
Solution:
- Debouncing: Group events within 500ms windows
- Ignore patterns: Respect `.gitignore`; ignore `node_modules`, `dist`, `.git`
- Deduplication: Track file hashes to avoid logging unchanged files
- Git hooks over polling: Use git commit hooks instead of watching the `.git` folder
This reduced noise by 95% while capturing meaningful changes.
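The three filters can be sketched as small pure functions (pattern lists, the 500 ms window, and all names are illustrative, not the actual watcher code):

```typescript
import { createHash } from "node:crypto";

// Ignore patterns: skip build artifacts and VCS internals entirely.
const IGNORED = [/node_modules/, /\bdist\b/, /\.git\b/];

function shouldIgnore(filePath: string): boolean {
  return IGNORED.some((re) => re.test(filePath));
}

// Deduplication: only report a file whose bytes actually changed since last seen.
const lastHash = new Map<string, string>();

function contentChanged(filePath: string, contents: string): boolean {
  const h = createHash("sha256").update(contents).digest("hex");
  if (lastHash.get(filePath) === h) return false; // same hash — skip the event
  lastHash.set(filePath, h);
  return true;
}

// Debouncing: collapse a burst of events inside a 500 ms window into one call.
function debounce<T extends unknown[]>(fn: (...args: T) => void, ms = 500) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: T) => {
    if (timer) clearTimeout(timer);
    timer = setTimeout(() => fn(...args), ms);
  };
}
```

In practice these sit in front of a chokidar `change` handler, so a webpack build that touches thousands of files produces at most one meaningful log entry.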
Challenge 4: Session Merging Deduplication
Problem: When merging sessions from ChatGPT and Claude, how do you avoid duplicate decisions like:
- "Using JWT for authentication" (from ChatGPT)
- "JWT authentication for security" (from Claude)
Naive solution: Exact string matching → fails because wording differs
Smart solution:
- Normalize strings (lowercase, trim)
- Extract key terms (JWT, authentication)
- Use similarity threshold (Levenshtein distance)
- Mark as duplicate if >80% similar
Result: Intelligent deduplication that preserves unique insights while removing true duplicates.
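A sketch of that similarity check (the exact normalization and threshold in `@acss/core` may differ):

```typescript
// Classic dynamic-programming Levenshtein edit distance.
function levenshtein(a: string, b: string): number {
  const dp = Array.from({ length: a.length + 1 }, (_, i) =>
    [i, ...Array<number>(b.length).fill(0)],
  );
  for (let j = 0; j <= b.length; j++) dp[0][j] = j;
  for (let i = 1; i <= a.length; i++)
    for (let j = 1; j <= b.length; j++)
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1,                                   // deletion
        dp[i][j - 1] + 1,                                   // insertion
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1), // substitution
      );
  return dp[a.length][b.length];
}

// Normalize, then convert edit distance into a 0..1 similarity score.
function similarity(a: string, b: string): number {
  const x = a.toLowerCase().trim();
  const y = b.toLowerCase().trim();
  const longest = Math.max(x.length, y.length) || 1;
  return 1 - levenshtein(x, y) / longest;
}

// Two decisions are duplicates when they exceed the similarity threshold.
function isDuplicate(a: string, b: string, threshold = 0.8): boolean {
  return similarity(a, b) >= threshold;
}
```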
Challenge 5: Token Optimization for Handoff Prompts
Problem: Some sessions have 20+ decisions, 10+ errors. Dumping everything into a handoff prompt exceeds token limits for smaller models.
Solution: Implemented priority-based compression:
- Always include: Unresolved errors, current task, project structure
- Summarize: Resolved errors ("Fixed 3 CORS issues"), old decisions
- Prioritize recent: Last 5 decisions in full, older ones summarized
- Use LLM compression: Optional `acss compress` uses Gemma to create human-readable summaries
Math: Original session = 5000 tokens, compressed = 2500 tokens (50% reduction) while preserving critical context.
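The priority rules can be sketched as a single assembly function (structure, names, and the summary wording are illustrative, not the real prompt generator):

```typescript
interface Decision { text: string; at: string }
interface Err { text: string; resolved: boolean }

// Build handoff lines: unresolved errors in full, resolved errors summarized,
// the last N decisions verbatim, older decisions collapsed into a count.
function buildHandoff(decisions: Decision[], errors: Err[], recentCount = 5): string[] {
  const lines: string[] = [];

  for (const e of errors.filter((e) => !e.resolved)) {
    lines.push(`UNRESOLVED: ${e.text}`);             // always included in full
  }

  const resolved = errors.filter((e) => e.resolved);
  if (resolved.length) lines.push(`Fixed ${resolved.length} earlier error(s)`);

  const recent = decisions.slice(-recentCount);      // last N in full
  const older = decisions.length - recent.length;
  if (older > 0) lines.push(`(+${older} earlier decision(s), summarized)`);
  for (const d of recent) lines.push(`DECISION: ${d.text}`);

  return lines;
}
```

Token savings come directly from the summarization lines: a summary costs a few tokens regardless of how many old items it replaces.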
Challenge 6: Cross-Platform File Paths
Problem: Windows uses `\`, Unix uses `/`. Session files aren't portable.
Solution: Always store paths with forward slashes in JSON, and normalize on read:
const normalizedPath = path.normalize(storedPath).replace(/\\/g, '/');
Tested on Windows, macOS, Linux—works seamlessly.
📚 What We Learned
Technical Learning
Schema Design is Hard: Balancing flexibility vs. structure requires real-world testing. I iterated the ACSS schema 4 times before settling on the current version.
LLMs Are Powerful But Unpredictable: Local models like Gemma 2B can extract surprising insights from conversations, but prompt engineering is critical. Small changes to the extraction prompt improved accuracy from 60% to 90%.
CLI UX Matters: Adding colors (chalk), spinners (ora), and clear progress messages transformed ACSS from "works" to "feels professional." Developers care about terminal aesthetics.
Monorepo Management: Managing shared types between `@acss/core` and `@acss/cli` taught me about package linking, build-order dependencies, and TypeScript project references.
Watch Mode Complexity: File watching seems simple until you encounter:
- OS-specific quirks (macOS FSEvents vs Linux inotify)
- Editor save strategies (atomic writes vs in-place)
- Build tool interference (webpack triggers 1000s of events)
Lesson: Always debounce and filter aggressively.
Product Learning
Solve Your Own Problem: ACSS resonated because I built it for myself first. Every feature addresses real friction I experienced.
Demo > Documentation: A 5-minute video demo explaining the problem and showing the solution is worth 10 pages of docs.
Open Source Strategy: Releasing under MIT license isn't just idealistic—it builds trust. Developers won't use a closed tool that captures their codebase.
Future-Proof Architecture: Building with extensibility in mind (plugin system, custom compressors, new AI tool support) sets up long-term success.
Soft Skills Learning
Time Boxing: Setting hard limits (e.g., "30 minutes to fix watch mode or disable it") prevented perfectionism paralysis.
Prioritization: Shipping 7 working core features beats 10 half-broken features. I disabled experimental features that weren't ready instead of rushing them.
Documentation Discipline: Writing README, CLI reference, and demo script as I built (not after) kept everything aligned and prevented scope creep.
🚀 What's Next for ACSS
Short-Term (v0.2 - Next 2 Months)
- npm Package: Publish to npm for global installation via `npm install -g acss`
- Comprehensive Tests: 90%+ coverage with Jest, integration tests for all workflows
- VS Code Marketplace: Publish extension for live session monitoring
- Video Tutorials: Step-by-step guides for common workflows
- Bug Fixes: Address issues from early adopters
Medium-Term (v0.3-0.4 - Next 6 Months)
Native IDE Integrations:
- Cursor: Auto-capture + native ACSS export
- GitHub Copilot: Read ACSS sessions for better suggestions
- Windsurf: Seamless handoff integration
Browser Extensions: Chrome/Firefox for one-click export from ChatGPT/Claude web interfaces
Auto Context Logger (Zero Manual Effort):
- Automatically detect file changes, commits, errors from terminal
- Smart intent detection ("user renamed 15 auth files → refactoring auth")
- Background daemon mode
Long-Term (v1.0 - 1 Year)
- Team Collaboration: Share sessions with teammates, see all active coding sessions
- AI Model Comparison: Track which AI gave better suggestions, A/B test prompts
- Pattern Detection: "You've hit this CORS error 3 times—here's the permanent fix"
- Marketplace: Share anonymized sessions as templates ("How senior devs solve X")
💭 The Vision
ACSS becomes the Git of AI-assisted development.
Just like:
- Git standardized version control
- Docker standardized containers
- OpenAPI standardized API specs
ACSS will standardize AI coding context.
Every AI tool, IDE, and platform speaks ACSS. Developers never lose context again, regardless of which tools they use. The era of wasting time re-explaining projects ends.
🌟 Impact Potential
- 28 million developers use AI coding tools globally
- 10-15 minutes saved per AI tool switch
- 3-5 switches per day for active developers
- 30-75 minutes saved daily per developer
If just 1% adopt ACSS:
- 280,000 developers
- At the upper end (75 minutes saved per day each), roughly 350,000 hours saved per day
- About 2.4 million hours per week globally—the equivalent of roughly 300,000 eight-hour workdays of productivity reclaimed
This isn't just a tool—it's a movement toward frictionless AI-assisted development.
🙏 Acknowledgments
Thank you to:
- The open-source community for incredible tools (TypeScript, Node.js, Commander, Ollama)
- Early testers who provided feedback on the alpha version
- Hackathon organizers for the opportunity to build and showcase ACSS
- Every developer who's ever felt the frustration of context loss—this is for you
Built with ❤️ by a developer tired of wasting time.
Star on GitHub: https://github.com/sp25126/ACSS_context
Try it now: Installation instructions in README
Never lose AI coding context again. 🚀
MIT License | Open Source | Community-Driven
Built With
- commander
- node.js
- typescript