Inspiration

While AI has exploded and transformed every industry, the blockchain space has yet to feel its full impact. Current chat and voice models—like Grok, Google Gemini, Perplexity, and ChatGPT—still can’t deliver real-time financial or on-chain data. They’re blind to what’s actually happening on blockchains.

This gap has left Web3 sidelined in the AI revolution, blocking users from interacting seamlessly with decentralized networks. The opportunity is massive: we can be the ones to bring AI to blockchain in real-time, unlocking a new era of on-chain intelligence.

While exploring Google Gemini Studio and its models, I realized something profound: this wasn't just the future; it was here, now, and fully possible. Harnessing the Gemini ecosystem, I found a way to bring real-time intelligence directly onto the blockchain, creating a bridge between AI and on-chain data that the world hasn't seen before. This is more than innovation; it's the next evolution of Web3 interaction. Using the Avalanche blockchain as the starting point, Avalanchai was born.

What does Avalanchai do?

Avalanchai is a real-time OnchainGPT that lets users interact with the Avalanche blockchain using natural speech. It can explore AVAX token contracts, analyze Avalanche protocols and subnets, fetch real-time market data (prices, market caps), and check wallet balances for native AVAX and specific tokens. By turning complex on-chain data into instant, actionable insights, it makes the Avalanche ecosystem as easy to navigate as having a conversation.

Agent Image

Functionalities of Avalanchai

Natural Language Interface

  1. Text-based chat interface
  2. Optional speech-to-text and text-to-speech
  3. Multi-turn conversation memory
  4. Context persistence per user/session
  5. Prompt grounding to avoid hallucinations

Real-Time Blockchain Interaction Layer

  1. Contract address reading for Mainnet and Testnet
  2. Token Balance Fetching for Mainnet and Testnet
  3. Token Supply Fetching for Mainnet and Testnet
  4. Detection of Standard Contracts (ERC20, ERC721, ERC1155) for Mainnet and Testnet
  5. Analysis of Blockchain Protocols for Mainnet
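Standard-contract detection (item 4 above) can be approximated with a well-known heuristic: the deployed bytecode usually embeds the 4-byte function selectors that the standard mandates. A minimal Rust sketch, assuming hex-encoded bytecode as input; the selector constants are real keccak-derived values, but fetching the bytecode (`eth_getCode` over the RPC) is stubbed out here:

```rust
// Heuristic ERC-20 detection: check whether the deployed bytecode contains
// the 4-byte selectors of the functions the standard requires. In the real
// app the bytecode would be fetched via eth_getCode over the Alchemy RPC.

fn looks_like_erc20(bytecode_hex: &str) -> bool {
    // First 4 bytes of keccak256(signature) for each mandatory function.
    const SELECTORS: [&str; 3] = [
        "a9059cbb", // transfer(address,uint256)
        "70a08231", // balanceOf(address)
        "18160ddd", // totalSupply()
    ];
    SELECTORS.iter().all(|sel| bytecode_hex.contains(sel))
}

fn main() {
    // A real contract's bytecode is thousands of hex chars; this stub just
    // embeds the three selectors to demonstrate the check.
    let bytecode = "6080a9059cbb0070a082310018160ddd00";
    println!("erc20-like: {}", looks_like_erc20(bytecode));
}
```

The same idea extends to ERC-721 and ERC-1155 by swapping in their selector sets.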

Real-Time Market and Off-Chain Data

  1. Fetching of live price feed for specified token
  2. Real-time market cap and liquidity Fetching

How we built it

Client

The client side of this project was built entirely with Google Gemini Studio.

Audio-to-text transcription, and text-to-speech in the other direction, are powered by the Gemini 3 model.

We implemented the real-time voice-communication component following the requirement guidelines:

  1. Converting Voice to Text
  2. Converting Text to Voice
  3. Handling text-to-text chat alongside the voice modes

We also prompted the UI implementation to be responsive, with the voice-speaking indicator vibrating in time with the frequency of the OnchainGPT's response.

To enable the client to communicate with the API, we ensured that the endpoint and JSON structure on the client side match the backend API, whose AI engine is also powered by Google Gemini. We consumed the server API and deployed the application live on Google Cloud.
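As a rough illustration of that client/server contract, here is a hedged Rust sketch of the request and response shapes; field names like `session_id` and `reply` are assumptions for illustration, not the actual API:

```rust
// Illustrative shapes for the chat endpoint shared by the Gemini Studio
// client and the Axum backend. Field names are hypothetical.

#[derive(Debug)]
struct ChatRequest {
    session_id: String, // preserves multi-turn context per user/session
    message: String,    // the user's typed or transcribed prompt
}

#[derive(Debug)]
struct ChatResponse {
    reply: String,          // the final answer returned to the client
    tool_calls_made: usize, // how many on-chain/off-chain tools ran
}

// Placeholder handler standing in for the real Gemini + tool pipeline.
fn handle_chat(req: &ChatRequest) -> ChatResponse {
    ChatResponse {
        reply: format!("[session {}] echo: {}", req.session_id, req.message),
        tool_calls_made: 0,
    }
}

fn main() {
    let req = ChatRequest {
        session_id: "user-42".to_string(),
        message: "What is the AVAX price right now?".to_string(),
    };
    let resp = handle_chat(&req);
    println!("{}", resp.reply);
}
```

In the deployed app these structs would be serialized as JSON on both sides so the client and server agree on the wire format.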

Server

On the server side, we used pure Rust. We used a Gemini API key to let the engine reason over incoming messages, and the Alchemy API to access Avalanche blockchain RPC endpoints. Using the alloy_rs and genai Rust libraries, we implemented many tools and wrappers and used chain-of-thought (CoT) reasoning to guide the Gemini model into generating extractable, agentic tool calls that fetch data directly from the blockchain.

We implemented a processing agent and a report agent on the server side using Gemini. The processing agent processes the user's message, fetches the requested real-time data, and sends all the processed task outputs to the report agent, which composes an effective, comprehensive report.
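The two-agent flow can be sketched roughly as follows; the agent functions below are offline stand-ins for the Gemini calls and the Alchemy RPC tools, and all names and canned outputs are illustrative:

```rust
// Minimal sketch of the pipeline: processing agent -> tool execution ->
// report agent. Stand-ins replace the real Gemini and RPC calls.

#[derive(Debug)]
enum ToolTask {
    FetchBalance { address: String },
    FetchPrice { symbol: String },
}

// Processing agent (stand-in for a Gemini call): decide which tools to run.
fn processing_agent(message: &str) -> Vec<ToolTask> {
    let mut tasks = Vec::new();
    if message.contains("balance") {
        tasks.push(ToolTask::FetchBalance {
            address: "0x0000000000000000000000000000000000000000".to_string(),
        });
    }
    if message.contains("price") {
        tasks.push(ToolTask::FetchPrice { symbol: "AVAX".to_string() });
    }
    tasks
}

// Tool execution (stand-in for the Alchemy RPC / market-data fetchers).
fn run_tool(task: &ToolTask) -> String {
    match task {
        ToolTask::FetchBalance { address } => format!("balance({address}) = <rpc result>"),
        ToolTask::FetchPrice { symbol } => format!("price({symbol}) = <market feed result>"),
    }
}

// Report agent (stand-in for the second Gemini call): combine tool outputs.
fn report_agent(outputs: &[String]) -> String {
    format!("Report: {}", outputs.join("; "))
}

fn main() {
    let tasks = processing_agent("What is the AVAX price and my balance?");
    let outputs: Vec<String> = tasks.iter().map(run_tool).collect();
    println!("{}", report_agent(&outputs));
}
```

Separating the two agents keeps tool execution (which must be deterministic and parseable) apart from report writing (which benefits from free-form generation).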

The agentic Axum API was deployed to the cloud, and the API was consumed by the client built with Google Gemini Studio.

Challenges we ran into

Client Side

We ran into the following challenges:

  1. Implementing text to voice on the client side
  2. Implementing voice to text on the client side
  3. Implementing the text to text on the client side
  4. Ensuring all the implementations work correctly together, without one overwriting another's functionality.

Server Side

  1. Implementing the processing agent and the config file
  2. Implementing the parsers and the on-chain and off-chain tools
  3. Filtering the tool messages produced by the processing agent for execution before handing the results back to the report agent.
  4. Queuing multiple processing and execution tasks before handing the completed work to the report agent.
  5. Implementing the Axum API and fighting Rust bugs in general.
  6. Finally, wrestling with regex pattern matches to extract the processing agent's output for execution was a nightmare 😫.
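To illustrate the extraction problem in item 6, here is a hedged sketch that pulls a tool call out of free-form model text using plain string parsing; the `TOOL:`/`ARGS:` markers are invented for this example, and the real project used regex patterns over the Gemini output:

```rust
// Extract an agentic tool call from the model's free-form text. Scans for
// a line of the form: TOOL: <name> ARGS: <json>. Markers are illustrative.

fn extract_tool_call(output: &str) -> Option<(String, String)> {
    for line in output.lines() {
        let line = line.trim();
        if let Some(rest) = line.strip_prefix("TOOL:") {
            if let Some((name, args)) = rest.split_once("ARGS:") {
                return Some((name.trim().to_string(), args.trim().to_string()));
            }
        }
    }
    None // no tool call found; treat the output as a plain answer
}

fn main() {
    let model_output = "Reasoning about the request...\n\
                        TOOL: get_balance ARGS: {\"address\":\"0xabc\"}\n\
                        Done.";
    println!("extracted: {:?}", extract_tool_call(model_output));
}
```

Anchoring the marker at the start of a trimmed line is what makes the extraction robust against the model mentioning tool names mid-sentence.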

Accomplishments that we're proud of

We are really proud to have implemented all of this and gotten it working, from the voice client, which was mentally demanding, to consuming the API correctly.

What we learned

The major thing we learned is how capable Google Gemini Studio is at building real-time apps in a matter of minutes. We were astonished.

What's next for Avalanchai

Advancement

Regarding advancement, we are on our way to do the following:

  1. Implement and support connections to more protocols on Avalanche.

  2. Add more in-demand functionality.

  3. Work on documentation.

Community Growth

Regarding user growth we will do the following:

  1. Storm Avalanche-based communities and get users to use our product.

  2. Run promotions, gather community feedback, and reintegrate it.

Business

Regarding the business side of things, we intend to do the following:

  1. Create a business model around payments for an easy UX.
  2. Collaborate with protocols.

We will continue to grow this project and publicize and market it to the Avalanche ecosystem to make life better for Web3 users. It's time we used OnchainGPTs powered by Gemini models instead of blockchain explorers.

We will keep adding more tools, using Google Gemini Studio to update app features, and gathering user feedback to improve, all while we grow our network and craft marketing models to sustain our product.
