Inspiration

We believe that for AI to truly enable human flourishing, it needs to go beyond software and consumer products. Accelerated science could compress a century's worth of discoveries and quality-of-life gains into a decade. Our team is composed of EE, physics, and materials majors interested in next-generation compute, energy storage, and therapeutics -- and in how AI can help get us there.

Graphene is an incredibly important material, both for advanced nanoelectronics and for quantum materials research (it hosts a 2D electron system). We knew we wanted to build an autonomous lab for Treehacks, and we realized we could match state-of-the-art synthesis of graphene flakes with simple hardware, since the state of the art is literally peeling apart the layers with Scotch tape. Profs. Mannix and Goldhaber-Gordon in the materials/physics departments pointed us toward Raman spectroscopy and gave us graphite chips to exfoliate.

What it does

AutoLab is an autonomous end-to-end 2D materials discovery solution -- from exfoliation to characterization, with no human in the loop.

Our system exfoliates graphene, identifies candidate flakes, applies controlled strain, and characterizes the resulting material properties using Raman spectroscopy. This is powered by an agentic ecosystem where research agents access SoTA analysis tools and real-world hardware to run end-to-end experiments.

The platform connects three layers:

  1. Custom hardware -- A graphene stamping/straining jig with stepper motors controlled by a Raspberry Pi running a local VLM. The researcher talks to the jig through natural language prompts.
  2. Intelligent vision -- A two-stage CV + Claude Vision hybrid pipeline for real-time flake detection. Local computer vision optimizes contrast and finds candidates in ~30ms, then Claude Sonnet 4 verifies and classifies each one in ~3 seconds.
  3. Autonomous orchestration -- Rather than simply automating steps, AutoLab reasons about experiments. Agents can trigger measurements, analyze spectra, detect anomalies, and iterate on experimental parameters -- transforming a traditional lab workflow into a closed-loop intelligent system.

How we built it

1. Experiment Design (Frontend)

The researcher types an experiment description in plain English:

"Prepare 5 graphene samples at 0-4% strain and characterize each with Raman spectroscopy."

The orchestrator agent parses this, generates a step-by-step plan, and presents it for approval.
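As a sketch, the structured plan the orchestrator produces for the prompt above might look like the following (the field names and schema here are illustrative assumptions, not AutoLab's exact format):

```python
import json

# Hypothetical plan for "5 graphene samples at 0-4% strain + Raman each";
# field names are placeholders, not AutoLab's real schema.
plan = {
    "experiment": "strain-series graphene Raman study",
    "requires_approval": True,
    "steps": [
        # One synthesis step per strain level (0%..4%)...
        {"agent": "synthesis", "action": "exfoliate_and_strain",
         "sample": i, "strain_pct": i}
        for i in range(5)
    ] + [
        # ...then a Raman characterization step for each sample.
        {"agent": "characterization", "action": "raman", "sample": i}
        for i in range(5)
    ],
}

print(json.dumps(plan["steps"][0]))
```

Presenting this as JSON (rather than free text) is what lets the frontend render a step-by-step plan for approval before any hardware moves.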

2. Multi-Agent Execution (Backend)

Once approved, the orchestrator dispatches tasks to specialized sub-agents:

  • Orchestrator -- plans experiments, coordinates sub-agents, and tracks progress; handles all sub-agent dispatch.
  • Synthesis -- controls hardware for sample preparation (exfoliation, stamping); motor control and hardware interfacing.
  • Characterization -- runs and analyzes Raman spectroscopy; spectrum fitting, peak detection, and material ID.
  • Theory -- searches literature and builds theoretical models; Semantic Scholar API and calculations.

All agents use Claude Sonnet 4 with tool use and stream their thinking in real-time to the frontend via WebSocket.
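A minimal sketch of the dispatch-and-stream pattern (handler names and the event fields are hypothetical; the real system drives Claude tool use and a WebSocket, which are omitted here):

```python
import json

# Illustrative orchestrator dispatch: each plan step is routed to a
# specialist sub-agent, and a progress event is emitted for the frontend.
def run_step(step, agents):
    handler = agents[step["agent"]]       # e.g. "synthesis" -> handler fn
    result = handler(step)
    # Shape of the event streamed over the WebSocket (hypothetical fields):
    return {"type": "agent_update", "agent": step["agent"],
            "action": step["action"], "result": result}

# Stand-in handlers; in AutoLab these are Claude-backed agents with tools.
agents = {
    "synthesis": lambda s: f"stamped sample {s['sample']}",
    "characterization": lambda s: f"raman on sample {s['sample']}",
}

event = run_step({"agent": "synthesis", "action": "exfoliate", "sample": 3},
                 agents)
print(json.dumps(event))
```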

3. Flake Detection (CV + VLM Pipeline)

The microscope feed runs through a two-stage detection pipeline:

Stage 1 -- Local CV (~30ms):

  • Automatic contrast optimization (alpha/beta sweep to maximize flake-substrate separation)
  • CLAHE enhancement + Otsu thresholding
  • Contour analysis with area and edge-density filtering
  • Generates candidate bounding boxes
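The thresholding step above can be sketched in pure Python (the real pipeline uses OpenCV's CLAHE and `cv2.THRESH_OTSU`; this hand-rolled Otsu is just to show the idea of maximizing flake-substrate separation):

```python
def otsu_threshold(pixels):
    """Pick the grayscale cut that maximizes between-class variance.

    `pixels` is a flat list of 0-255 intensities. Hand-rolled illustration;
    in practice use cv2.threshold(img, 0, 255, cv2.THRESH_OTSU).
    """
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))

    best_t, best_var = 0, -1.0
    w_bg = sum_bg = 0
    for t in range(256):
        w_bg += hist[t]                 # background (substrate) weight
        if w_bg == 0:
            continue
        w_fg = total - w_bg             # foreground (flake) weight
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Two well-separated intensity clusters -> threshold lands between them.
t = otsu_threshold([20] * 50 + [200] * 50)
assert 20 <= t < 200
```

Contour extraction and the area/edge-density filters then run on the resulting binary mask to produce candidate bounding boxes.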

Stage 2 -- Claude Vision (~2-3s):

  • Sends contrast-optimized image (512px, 70% JPEG) to Claude Sonnet 4
  • Claude independently detects flakes using its own vision
  • CV candidates are provided as optional hints, not hard constraints
  • Returns verified detections with confidence scores, bounding boxes, and reasoning
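The Stage 2 request body can be sketched as below, following the Anthropic Messages API's base64 image content-block format; the model id, prompt text, and hint encoding are placeholders rather than our exact values:

```python
import base64
import json

def build_vision_request(jpeg_bytes, cv_candidates):
    """Assemble a Messages-API-style request for flake verification.

    Sketch only: model id, prompt, and hint format are placeholders.
    """
    return {
        "model": "claude-sonnet-4-20250514",   # placeholder model id
        "max_tokens": 1024,
        "messages": [{
            "role": "user",
            "content": [
                {"type": "image",
                 "source": {"type": "base64",
                            "media_type": "image/jpeg",
                            "data": base64.b64encode(jpeg_bytes).decode()}},
                {"type": "text",
                 "text": "Detect graphene flakes. CV candidates (hints only, "
                         "not constraints): " + json.dumps(cv_candidates)},
            ],
        }],
    }

req = build_vision_request(b"\xff\xd8", [{"bbox": [10, 10, 64, 48]}])
```

Keeping the CV boxes in the text block as hints (rather than forcing Claude to accept them) is what lets the VLM catch flakes the local pipeline missed.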

4. Hardware Control (Raspberry Pi + Stepper Motors)

A custom graphene stamping/straining jig with:

  • ThorLabs KDC101 motor controller + MTS25-Z8 linear translation stage
  • Direct USB communication via pyftdi using the APT binary protocol (bypasses macOS FTDI VCP driver issues)
  • Precision: 0.001mm (34,304 encoder counts/mm)
  • Raspberry Pi runs a local VLM that accepts natural language commands ("exfoliate at position 3", "apply 2% strain")
  • The RPi agent reports completion back to the web platform, triggering the next pipeline step automatically
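The mm-to-encoder-count conversion and a "move absolute" command can be sketched as follows. The message ID and packet layout are taken from the Thorlabs APT protocol documentation (long-form message, 6-byte header plus 6-byte payload, little-endian); treat the constants as assumptions to verify against the spec for your controller:

```python
import struct

COUNTS_PER_MM = 34304            # MTS25-Z8 encoder scaling (0.001 mm steps)

def mm_to_counts(mm: float) -> int:
    """Convert a stage position in mm to integer encoder counts."""
    return round(mm * COUNTS_PER_MM)

def move_absolute_packet(position_mm: float, chan: int = 1,
                         dest: int = 0x50, source: int = 0x01) -> bytes:
    """Build an APT MGMSG_MOT_MOVE_ABSOLUTE packet.

    Header: <msg id u16> <payload len u16> <dest | 0x80> <source>;
    payload: <channel u16> <absolute position i32>, all little-endian.
    Constants per the APT protocol document; verify before driving hardware.
    """
    MGMSG_MOT_MOVE_ABSOLUTE = 0x0453
    payload = struct.pack("<Hi", chan, mm_to_counts(position_mm))
    header = struct.pack("<HHBB", MGMSG_MOT_MOVE_ABSOLUTE, len(payload),
                         dest | 0x80, source)
    return header + payload

pkt = move_absolute_packet(1.0)   # move to 1.000 mm -> 34,304 counts
assert len(pkt) == 12
```

Writing these raw bytes over pyftdi is what let us sidestep the macOS FTDI VCP driver entirely.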

5. Raman Spectroscopy Analysis

Automated spectral analysis pipeline:

  • Asymmetric Least Squares (ALS) baseline correction
  • scipy.signal.find_peaks for peak detection
  • Multi-Gaussian fitting with scipy.optimize.curve_fit
  • Material-specific labeling (Graphene D/G/2D bands, MoS2 E2g/A1g)
  • LLM-powered interpretation and comparison to literature values
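The core of the fitting model can be sketched in plain Python (the real pipeline uses `scipy.signal.find_peaks` and `scipy.optimize.curve_fit`; the naive peak picker below is only a stand-in to show the shapes involved):

```python
import math

def gaussian(x, amp, center, width):
    """Single Gaussian line shape -- the per-peak fitting model."""
    return amp * math.exp(-((x - center) ** 2) / (2 * width ** 2))

def multi_gaussian(x, peaks):
    """Sum of Gaussians; `peaks` is a list of (amp, center, width)."""
    return sum(gaussian(x, *p) for p in peaks)

def find_peaks_naive(y, min_height):
    """Toy stand-in for scipy.signal.find_peaks: strict local maxima
    above a height threshold."""
    return [i for i in range(1, len(y) - 1)
            if y[i] > min_height and y[i] > y[i - 1] and y[i] > y[i + 1]]

# Synthetic graphene-like spectrum: G band near 1580 cm^-1,
# 2D band near 2700 cm^-1 (amplitudes/widths are made up).
xs = range(1200, 3000)
spectrum = [multi_gaussian(x, [(1.0, 1580, 15), (1.8, 2700, 20)])
            for x in xs]
peaks_cm = [1200 + i for i in find_peaks_naive(spectrum, min_height=0.5)]
# Peaks recovered near the G and 2D band positions
```

In the real pipeline, the positions and widths recovered by the fit are then compared against literature band positions to quantify strain-induced shifts.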

Challenges we ran into

  • Strain control: Applying repeatable strain without tearing ultrathin flakes required precise mechanical design.
  • Signal-to-noise in Raman: Distinguishing meaningful peak shifts (G and 2D bands) from noise required calibration and careful preprocessing.
  • Feedback, not pipelines: Detecting outliers (e.g., “Sample 3 looks wrong”) and triggering resynthesis required building a feedback loop rather than a linear pipeline.
  • Stopping criteria: Deciding when to stop iterating vs. gather more samples was a core scientific design challenge.

Accomplishments that we're proud of

  • Built a fully closed-loop autonomous materials discovery system.
  • Achieved reliable graphene flake detection in real time.
  • Implemented automated Raman peak fitting and strain quantification.
  • Created an agent architecture capable of iterative experimental reasoning.
  • Successfully integrated real-world hardware with AI orchestration.

What we learned

  • Autonomy is primarily a systems engineering challenge, not just an AI problem.
  • Scientific workflows are loops, not pipelines.
  • Structured data exchange between agents dramatically improves reliability.
  • Grounding AI reasoning in physics and experimental constraints is essential.

What's next for AutoLab

  • Closed-loop optimization of strain parameters using adaptive experimental design.
  • Expansion to additional 2D materials (e.g., MoS2, WS2, heterostructures).
  • Real-time experiment visualization dashboard.
  • Higher-throughput parallel exfoliation modules.
  • Moving toward a fully autonomous self-driving materials lab.
