https://github.com/am1ree/cojam

About the project (CoJam)

CoJam is a collaboration-first music editor built for artists who want to create together — with other humans and AI agents — in the same shared workspace.

What we built

CoJam combines a real-time music editor with AI-powered creation tools:

  • Real-time collaboration (Figma-style)

    • Live “presence” with a cursor that shows where teammates are working
    • Shared timeline/arrangement so everyone sees changes as they happen
  • AI agent (Claude SDK + tool access)

    • We built the agent with the Claude SDK; it takes real actions inside CoJam, not just chat
    • We expose an MCP server that gives the agent access to CoJam’s tools (e.g., search and import audio, add clips, edit the timeline, add MIDI notes)
  • Music generation (Suno)

    • Integrated Suno to generate musical ideas quickly (drums, melodies, vocals, etc.)
    • Generated results can be dropped straight into the project as usable clips
  • MIDI board (compose directly)

    • A built-in MIDI piano board for adding notes with piano and other instrument sounds
    • You can compose manually or ask the AI agent to create notes for you
  • Strudel integration

    • We integrated Strudel so users can create and play pattern-based music and bring it into the workflow
    • This enables fast “live coding” / pattern creation alongside timeline editing
  • Editing + arranging

    • Basic editing for building a song quickly:
      • add, move, and delete regions/clips
      • simple arrangement across tracks
  • Export

    • Export projects as audio so your work is shareable outside the app (for demos and sharing)
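
As a rough illustration of the tool layer, here is a minimal TypeScript sketch of how editor actions could be registered as agent-callable tools. Names like `add_clip` and the `Project`/`Clip` shapes are hypothetical, not CoJam’s actual API — the point is that the agent calls the same actions a user would:

```typescript
// Hypothetical project state the tools operate on (not CoJam's real data model).
interface Clip { id: string; track: number; start: number; length: number; src: string }
interface Project { clips: Clip[] }

// A tool the agent can call: a name plus a handler over project state.
type ToolHandler = (project: Project, args: Record<string, unknown>) => string;

const tools = new Map<string, ToolHandler>();

function registerTool(name: string, handler: ToolHandler): void {
  tools.set(name, handler);
}

// Example tool: drop an audio clip onto the timeline.
registerTool("add_clip", (project, args) => {
  const clip: Clip = {
    id: `clip-${project.clips.length + 1}`,
    track: Number(args.track ?? 0),
    start: Number(args.start ?? 0),
    length: Number(args.length ?? 4),
    src: String(args.src ?? ""),
  };
  project.clips.push(clip);
  return clip.id;
});

// The agent invokes tools by name — the same code path a user action takes.
function callTool(project: Project, name: string, args: Record<string, unknown>): string {
  const handler = tools.get(name);
  if (!handler) throw new Error(`unknown tool: ${name}`);
  return handler(project, args);
}

const project: Project = { clips: [] };
const id = callTool(project, "add_clip", { track: 1, start: 8, length: 4, src: "suno-drums.wav" });
console.log(id, project.clips.length); // clip-1 1
```

In a real MCP server each tool would also carry an input schema so the agent knows what arguments are valid; this sketch omits that for brevity.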

How we built it (high level)

  • Collaboration layer: real-time updates for timeline changes + cursor presence (Figma-like)
  • AI layer: Claude SDK agent connected to CoJam via an MCP server so the agent can use the same actions a user can
  • Generation layer: Suno for fast audio idea generation that becomes real clips in the editor
  • Composition layer: MIDI board + Strudel patterns to support both note-based and pattern-based creation
  • Output layer: export pipeline so projects can be shared as actual music
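
For the collaboration layer, one common pattern is to broadcast small operation messages and apply them identically on every client so all timelines converge. A hedged TypeScript sketch under that assumption (the message kinds and fields are illustrative, not CoJam’s actual wire format):

```typescript
// Illustrative messages a client might broadcast over a websocket.
type Msg =
  | { kind: "cursor"; user: string; x: number; y: number }          // presence
  | { kind: "move_clip"; clipId: string; track: number; start: number }
  | { kind: "delete_clip"; clipId: string };

interface Clip { id: string; track: number; start: number }
interface State {
  clips: Map<string, Clip>;
  cursors: Map<string, { x: number; y: number }>;
}

// Every peer applies the same messages in the same order,
// so all clients end up with the same timeline state.
function apply(state: State, msg: Msg): State {
  switch (msg.kind) {
    case "cursor":
      state.cursors.set(msg.user, { x: msg.x, y: msg.y });
      return state;
    case "move_clip": {
      const clip = state.clips.get(msg.clipId);
      if (clip) { clip.track = msg.track; clip.start = msg.start; }
      return state;
    }
    case "delete_clip":
      state.clips.delete(msg.clipId);
      return state;
  }
}

const state: State = {
  clips: new Map([["c1", { id: "c1", track: 0, start: 0 }]]),
  cursors: new Map(),
};
apply(state, { kind: "cursor", user: "amira", x: 120, y: 40 });
apply(state, { kind: "move_clip", clipId: "c1", track: 2, start: 16 });
console.log(state.clips.get("c1")?.track); // 2
```

Presence messages (cursors) are ephemeral and can be dropped safely, while timeline operations need ordered delivery — which is why they are modeled as separate message kinds here.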

