Inspiration

My inspiration for this project came from my passion for board games and online games. I've been getting really into board games and having fun playing them regularly with my co-workers at my current internship. Especially Catan! But I've never played a game that adapts to you as a person, learns your weaknesses, and takes advantage of them. I feel this is a missing piece in the gaming market, and I hope to see more of these adaptive games in the future.


What it does

Adaptive Ascend is an AI-powered space shooter that adapts to your skill level in real time using behavioral analytics and machine learning.

  • Adaptive Difficulty System: The game analyzes your accuracy, reaction time, and damage patterns to adjust enemy count, speed, and attack patterns between levels
  • AI-Powered Boss Battle: The final boss fight at Level 5 features 3 coordinated alien ships that use different AI strategies - predictive shooting (leads your movement by 30px), direct targeting (shoots at your current position), and randomized attacks
  • Gesture Controls: Use hand gestures detected via webcam (fist = shield, gun sign = shoot in desired direction) powered by TensorFlow.js
  • Smart Feature Flags: Struggling players automatically receive contextual hints when they take 5+ hits in early levels, triggered by Amplitude Feature Flags
  • Comprehensive Analytics: Movement heatmaps, damage location tracking, accuracy analysis, and AI-generated performance insights
  • Claude AI Integration: Real-time coaching messages and post-game adaptive difficulty analysis explaining how the game adjusted to your playstyle

The game tracks more than 20 events in Amplitude, including shots fired, enemies destroyed, gesture usage, and damage patterns, to create a personalized experience.
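To show how the tracked events feed the adaptive difficulty, here is a minimal sketch of the per-session stats the game might derive before forwarding events to Amplitude. The event names and the `SessionStats` helper are illustrative, not the actual implementation.

```typescript
// Illustrative event shape; the real game tracks 20+ event types.
interface GameEvent {
  type: "shot_fired" | "enemy_destroyed" | "damage_taken";
  timestamp: number;
}

class SessionStats {
  private events: GameEvent[] = [];

  track(event: GameEvent): void {
    this.events.push(event);
    // In the game, each event would also be forwarded to Amplitude here.
  }

  /** Hits per shot - one of the signals driving adaptive difficulty. */
  accuracy(): number {
    const shots = this.events.filter((e) => e.type === "shot_fired").length;
    const hits = this.events.filter((e) => e.type === "enemy_destroyed").length;
    return shots === 0 ? 0 : hits / shots;
  }
}
```

Derived numbers like this, rather than raw event streams, are what the difficulty system compares between levels.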


How we built it

Tech Stack:

  • Frontend: Next.js, React, TypeScript, TailwindCSS
  • AI & ML:
    • Anthropic Claude (Haiku & Sonnet) for adaptive analysis and coaching
    • TensorFlow.js + MediaPipe for hand gesture detection
  • Analytics & Experimentation:
    • Amplitude Unified SDK for event tracking
    • Amplitude Session Replay for debugging player experiences
    • Amplitude Experiment for behavioral feature flags
  • Game Engine: Custom React-based game loop with 30ms collision detection

Key Implementation Highlights:

  1. Real-time Position Tracking: Player position is sampled every 500ms to generate 10x20 grid heatmaps
  2. Damage Location Tracking: Track every hit location to analyze where players struggle most
  3. Boss AI System: Implemented coordinated AI strategies (aggressive push, corner trap, focus fire, chaotic swarm) that adapt based on player patterns
  4. Feature Flag Integration: User properties automatically update in Amplitude when players struggle, triggering contextual hints via feature flags
  5. Gesture Recognition Pipeline: Real-time hand landmark detection → gesture classification → game action mapping
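The heatmap step above can be sketched as a simple bucketing function: each sampled position is mapped into a cell of the 10x20 grid and the cell's count is incremented. Playfield dimensions and helper names here are illustrative.

```typescript
// 10x20 grid from the heatmap design; orientation assumed (cols x rows).
const GRID_COLS = 10;
const GRID_ROWS = 20;

// Map a pixel position to its grid cell, clamping to the grid edges.
function toCell(x: number, y: number, width: number, height: number): [number, number] {
  const col = Math.min(GRID_COLS - 1, Math.floor((x / width) * GRID_COLS));
  const row = Math.min(GRID_ROWS - 1, Math.floor((y / height) * GRID_ROWS));
  return [col, row];
}

// Fold the 500ms position samples into a count-per-cell grid.
function buildHeatmap(
  samples: Array<{ x: number; y: number }>,
  width: number,
  height: number
): number[][] {
  const grid = Array.from({ length: GRID_ROWS }, () => new Array<number>(GRID_COLS).fill(0));
  for (const { x, y } of samples) {
    const [col, row] = toCell(x, y, width, height);
    grid[row][col] += 1;
  }
  return grid;
}
```

The same bucketing works for the damage-location heatmap; only the input samples differ.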

Challenges we ran into

  1. Feature Flag Timing: Amplitude cohorts require a paid plan, so we engineered a hybrid solution that uses User Properties for targeting, which also meant handling the Unified SDK's identify API differently.
  2. Heatmap Data Persistence: Initially, heatmaps captured only partial movement. We fixed this by keeping the position refs in sync with state via useEffect hooks.
  3. Boss AI Coordination: Making 3 AI ships work together without being too easy or impossible required extensive balancing. We implemented different shooting strategies per ship (predictive, direct, random) for dynamic difficulty.
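The three per-ship shooting strategies can be sketched as small aiming functions. The 30px lead distance comes from the boss design described above; everything else (function names, jitter range) is illustrative.

```typescript
type Vec = { x: number; y: number };

// Direct targeting: unit vector from the ship toward the player's current position.
function directAim(ship: Vec, target: Vec): Vec {
  const dx = target.x - ship.x;
  const dy = target.y - ship.y;
  const mag = Math.hypot(dx, dy) || 1;
  return { x: dx / mag, y: dy / mag };
}

// Predictive shooting: lead the player 30px along their movement direction.
function predictiveAim(ship: Vec, player: Vec, velocity: Vec): Vec {
  const LEAD = 30; // px, from the boss design
  const mag = Math.hypot(velocity.x, velocity.y) || 1;
  const lead = {
    x: player.x + (velocity.x / mag) * LEAD,
    y: player.y + (velocity.y / mag) * LEAD,
  };
  return directAim(ship, lead);
}

// Randomized attacks: jitter the direct aim by a small random angle (range assumed).
function randomAim(ship: Vec, player: Vec, maxJitterRad = 0.3): Vec {
  const base = directAim(ship, player);
  const angle = Math.atan2(base.y, base.x) + (Math.random() * 2 - 1) * maxJitterRad;
  return { x: Math.cos(angle), y: Math.sin(angle) };
}
```

Giving each of the three ships one of these strategies is what makes the fight feel coordinated without any single ship being unbeatable.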

Accomplishments that we're proud of

Intelligent Behavioral Triggers: Successfully implemented feature flags that activate based on real player behavior, serving game tips matched to the user's skill level

Dual Heatmap System: Built both movement and damage location heatmaps displayed side-by-side, revealing player patterns and struggle zones

Boss AI with Personality: Created a boss fight where each of 3 ships has distinct attack patterns - one predicts movement, one shoots directly, one randomizes - making it challenging yet fair

Zero-Latency Gesture Controls: Achieved real-time hand tracking at 30fps+ with TensorFlow.js, making gesture controls feel responsive
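At the end of the gesture pipeline, the hand landmarks have to be reduced to a game action. This is a toy sketch of that classification step only: the real detection runs through TensorFlow.js + MediaPipe, and the distance threshold here is an assumption.

```typescript
type Point = { x: number; y: number };

// Classify a hand as "fist" (shield) or "open" from fingertip positions.
// MediaPipe landmarks are normalized to [0, 1], so the threshold is unitless.
function classifyGesture(fingertips: Point[], palm: Point): "fist" | "open" {
  const CLOSE = 0.1; // assumed: how near the palm a curled fingertip sits
  const allClose = fingertips.every(
    (p) => Math.hypot(p.x - palm.x, p.y - palm.y) < CLOSE
  );
  return allClose ? "fist" : "open";
}
```

Keeping this step a cheap pure function is part of what lets the pipeline run at 30fps+ without stalling the game loop.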


What we learned

Amplitude Deep Dive:

  • How to use User Properties for behavioral targeting instead of cohorts
  • The power of Session Replay for understanding player frustration points
  • Implementing Amplitude Experiment SDK with proper async/await patterns
  • The difference between Amplitude's Unified SDK and legacy SDKs
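The User Properties approach boils down to a small decision function: when the behavioral condition fires (5+ hits in an early level, per the hint trigger), a user property is set via the SDK's identify API, and the feature flag targets that property. The property name and level cutoff below are illustrative.

```typescript
// Decide whether the player qualifies as "struggling" for flag targeting.
function strugglingProperty(
  level: number,
  hitsTaken: number
): Record<string, string> | null {
  const EARLY_LEVEL = 2;   // assumed cutoff for "early levels"
  const HIT_THRESHOLD = 5; // from the hint trigger: 5+ hits
  if (level <= EARLY_LEVEL && hitsTaken >= HIT_THRESHOLD) {
    return { player_status: "struggling" }; // illustrative property name
  }
  return null;
}
// In the game, a non-null result is sent through the SDK's identify API,
// and the Amplitude Experiment flag targets on that user property.
```

This is the cohort-free workaround: targeting on a property the client sets itself, instead of a server-computed cohort.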

AI Integration Patterns:

  • How to structure prompts for game analysis (sending level progression data for adaptive insights)
  • Balancing AI response times with gameplay flow (using Haiku for speed, Sonnet for depth)
  • Generating actionable coaching messages vs generic feedback
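One concrete shape this prompt structuring can take is serializing the level progression data into a compact, labeled block before handing it to Claude. The field names and wording below are a hypothetical sketch, not the prompts we actually ship.

```typescript
// Illustrative per-level summary sent to Claude for adaptive analysis.
interface LevelStats {
  level: number;
  accuracy: number;  // 0..1
  hitsTaken: number;
}

function buildAnalysisPrompt(progression: LevelStats[]): string {
  const rows = progression
    .map(
      (s) =>
        `Level ${s.level}: accuracy ${(s.accuracy * 100).toFixed(0)}%, hits taken ${s.hitsTaken}`
    )
    .join("\n");
  return (
    "You are a game coach. Given this player's level progression:\n" +
    rows +
    "\nExplain in two sentences how the difficulty should adapt."
  );
}
```

Pre-aggregating per level keeps the prompt short, which matters when the fast Haiku model is handling mid-game coaching.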

Game Development:

  • Building React-based game loops without triggering unnecessary re-renders
  • Collision detection optimization (spatial partitioning concepts)
  • Balancing difficulty curves using player data rather than guesswork

Real-time Analytics:

  • Position sampling strategies for heatmap generation
  • Debouncing analytics calls to avoid overwhelming APIs
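The debouncing point can be sketched as a small event buffer: events accumulate locally and go out as one batch per flush, instead of one API call per event. The class and its flush-on-demand design are illustrative; the real game would flush on a timer.

```typescript
// Buffer analytics events and send them in batches.
class EventBuffer {
  private buffer: string[] = [];

  constructor(private send: (batch: string[]) => void) {}

  add(event: string): void {
    this.buffer.push(event);
  }

  // Called on an interval (or on level end) rather than per event,
  // so a burst of shots produces one API call instead of dozens.
  flush(): void {
    if (this.buffer.length === 0) return;
    this.send(this.buffer);
    this.buffer = [];
  }
}
```

The `send` callback is where the actual Amplitude call would go; injecting it keeps the batching logic trivially testable.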

What's next for Adaptive Ascend

Immediate Roadmap:

  1. More Feature Flags:
    • spawn_extra_powerups: Increase heart/shield drops for struggling players
    • boss_strategy_tips: Show boss pattern hints after 3 deaths
    • enable_aim_assist: Add targeting reticle for players with <25% accuracy
  2. Multiplayer Mode: Use Amplitude's user segmentation to match players of similar skill levels
  3. Leaderboards: Implement Amplitude cohorts (when available) to create skill-based leaderboards
  4. Advanced AI Coaching: Real-time Claude streaming responses during gameplay for live tips

Built With

  • amplitude
  • anthropic
  • claude
  • mediapipe
  • nextjs
  • react
  • tensor