Inspiration

We started by pondering some of the challenges and problems proposed in G1. Residential solar power is great, but in especially sunny climates it comes with the challenge of "Duck Curve" saturation: solar energy production spikes across the entire grid simultaneously at midday, which destabilizes the grid.

At the exact same time, the AI industry is facing a massive energy bottleneck. Training and running models requires immense amounts of power, often straining national power grids and increasing carbon footprints globally.

Thus, we decided to try to solve both problems at the same time by harnessing this excess solar energy and turning it into high-value AI compute. We think that by giving both excess-solar owners and AI developers and researchers the choice to contribute to an open, decentralized network, we can create positive impact and a more sustainable market.

What it does

Solaris.ai is a decentralized physical infrastructure network (DePIN) that enables residential solar owners to maximize the utility of their renewable energy.

Instead of feeding excess power into an already saturated grid, the Solaris.ai system routes that energy into a local, high-efficiency Neural Processing Unit (NPU). Homeowners become "Compute Nodes," contributing to a distributed network of green AI inference.

  • For Homeowners: It visualizes the true potential of their energy assets, showing the ROI difference between selling $kWh$ (raw power) vs. selling $TOPS$ (Trillions of Operations Per Second).
  • For Researchers: It democratizes access to compute. Instead of relying solely on centralized infrastructure, they can access a distributed, community-powered network of "Green Environments" that are carbon-neutral by design.

How we built it

Since this is a vision of future infrastructure, we focused on building a High-Fidelity Interface Prototype to simulate the user experience of a peer-to-peer economy.

We built the frontend using Next.js and React to ensure a responsive, accessible experience.

  • The Simulation: We used Recharts to model the mathematical relationship between solar irradiance curves and NPU power draw.
  • The Experience: We designed distinct flows for the "Provider" (homeowner) and the "Researcher" (user), using Lucide React icons and glassmorphism effects to differentiate the "hardware" status from the "software" controls.
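The irradiance-vs-draw relationship we chart in Recharts boils down to a midday-peaking curve. A minimal sketch of how such a series could be generated (the half-sine model, the 5 kW peak, the 6:00–18:00 daylight window, and the base-load figure are all illustrative simplifications of our own):

```typescript
// Simplified clear-sky solar model used to drive the dashboard charts.
// A half-sine between sunrise and sunset approximates daily irradiance;
// all constants here are illustrative assumptions, not measured data.

const SUNRISE = 6;   // hour of day
const SUNSET = 18;
const PEAK_KW = 5;   // peak array output in kW

function solarOutputKw(hour: number): number {
  if (hour <= SUNRISE || hour >= SUNSET) return 0;
  const t = (hour - SUNRISE) / (SUNSET - SUNRISE); // 0..1 across daylight
  return PEAK_KW * Math.sin(Math.PI * t);
}

// Surplus available to the NPU after the household's base load.
function surplusKw(hour: number, baseLoadKw = 0.8): number {
  return Math.max(0, solarOutputKw(hour) - baseLoadKw);
}

// Hourly series in the shape a Recharts <LineChart> consumes.
const series = Array.from({ length: 24 }, (_, h) => ({
  hour: h,
  solar: solarOutputKw(h),
  surplus: surplusKw(h),
}));
```

Feeding `series` into a chart gives the familiar midday bulge, with the `surplus` line marking the energy window an NPU job can occupy.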

Challenges we ran into

The biggest challenge was Visualization of Abstract Concepts. How do you show "AI Inference" happening on a screen? How do you make a complex "Reverse Auction" look as simple as a standard checkout flow?

We struggled initially with the "Researcher Dashboard." It was too technical, filled with hashes and network stats. We had to iterate on the UX to create the "Auto-Cluster" concept, which abstracts away the complexity of distributed networking so a user can simply toggle a switch to connect to available solar nodes instantly.

Mathematically, we also had to grapple with the "Energy Arbitrage" logic to make the demo realistic. We had to calculate:

$$Value = (Compute_{rate} \times Time) - (Grid_{sell\_price} \times Energy_{used})$$

Ensuring our UI represented this formula accurately—highlighting the efficiency gains without overwhelming the user—took several design revisions.
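The arbitrage formula above can be sketched in a few lines; all the rates and field names below are hypothetical placeholders, not real market prices:

```typescript
// Illustrative sketch of the "Energy Arbitrage" check behind the demo.
// Rates are hypothetical placeholders, not real tariffs or prices.

interface NodeEconomics {
  computeRateUsdPerHour: number; // what researchers pay for NPU time
  gridSellUsdPerKwh: number;     // feed-in tariff for exporting power
  npuDrawKw: number;             // NPU power draw while running a job
}

// Net value of running compute for `hours` instead of selling that energy.
function arbitrageValue(node: NodeEconomics, hours: number): number {
  const computeRevenue = node.computeRateUsdPerHour * hours;
  const foregoneGridRevenue = node.gridSellUsdPerKwh * node.npuDrawKw * hours;
  return computeRevenue - foregoneGridRevenue;
}

const node: NodeEconomics = {
  computeRateUsdPerHour: 0.30,
  gridSellUsdPerKwh: 0.05,
  npuDrawKw: 0.015, // a 15 W edge NPU
};

console.log(arbitrageValue(node, 4)); // net value of a 4-hour job
```

Because an edge NPU draws so little power, the foregone grid revenue is tiny relative to the compute revenue, which is exactly the efficiency gain the UI needs to highlight.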

Accomplishments that we're proud of

We are incredibly proud of the "plug-and-play" UX. Decentralized networks are often viewed as complex or difficult to use. We managed to design an interface that makes participating in a distributed compute grid feel as easy as setting up a smart home device.

We are also proud of the "Sandboxed SSH" concept. Solving the logic for how a developer could securely connect to a remote, distributed node was a design hurdle, but we found a solution that prioritizes security and isolation, ensuring trust between the node provider and the user.

What we learned

  1. The Power of NPUs: We learned that for this distributed model to work effectively, efficiency is key. We researched hardware like Rockchip and Hailo, learning that high $TOPS/Watt$ ratios are the secret to truly sustainable AI inference.
  2. Frontend Architecture: We deepened our understanding of React Hooks to manage the state of the "simulated" network, ensuring that when a user "deploys" a job, the dashboard updates instantly across the UI.
  3. The "Inference" Shift: We learned that the vast majority of future AI energy demand will come from inference (running models) rather than training them, confirming that distributed, low-power nodes are a viable solution for the industry's growth.
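The efficiency point above is easy to quantify. A quick comparison sketch (the TOPS and wattage figures are rough approximations of public spec sheets and should be treated as illustrative, not authoritative):

```typescript
// Rough TOPS-per-watt comparison; figures approximate public spec sheets
// and are only meant to illustrate why edge NPUs suit solar-powered inference.

interface Accelerator {
  name: string;
  tops: number;  // trillions of operations per second (INT8)
  watts: number; // typical power draw
}

const candidates: Accelerator[] = [
  { name: "Hailo-8", tops: 26, watts: 2.5 },
  { name: "Jetson Orin Nano", tops: 40, watts: 15 },
  { name: "Desktop GPU (for contrast)", tops: 300, watts: 350 },
];

function topsPerWatt(a: Accelerator): number {
  return a.tops / a.watts;
}

for (const a of candidates) {
  console.log(`${a.name}: ${topsPerWatt(a).toFixed(2)} TOPS/W`);
}
```

The order-of-magnitude gap in TOPS/Watt between edge NPUs and desktop GPUs is why a 15 W device behind a residential inverter can be a meaningful inference node at all.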

What's next for Solaris.ai

The demo proves the experience; now we need to build the pipe.

  • Phase 1: Benchmarking physical hardware (Jetson Orin Nano vs. Coral TPU) to find the perfect "Supernova" device configuration.
  • Phase 2: Writing the actual container orchestration layer (using Docker and K3s) to enable secure, distributed job processing.
  • Phase 3: Launching a pilot with 5 local university students to test the latency and reliability of the "Auto-Cluster" feature in a real-world, decentralized environment.

Built With
