🌱 Inspiration
We were inspired by the growing awareness that AI queries, especially longer prompts, consume more energy and increase carbon emissions due to higher computational loads. We wanted to make users more conscious of their digital carbon footprint and empower them to chat more sustainably.

💬 What It Does
Our chatbot analyzes your query's token count to estimate how "green" it is, displaying a sustainability percentage for every prompt so users can see, and reduce, the carbon footprint of their AI usage.
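As a rough illustration of the idea, shorter prompts score closer to 100%. The sketch below is hypothetical, not our production scoring logic: the `MAX_TOKENS` cap and the characters-per-token heuristic are illustrative assumptions.

```typescript
// Hypothetical "green score": shorter prompts score closer to 100%.
// MAX_TOKENS and the ~4-characters-per-token heuristic are illustrative
// assumptions, not our final formula.
const MAX_TOKENS = 500;

function estimateTokenCount(prompt: string): number {
  // Rough heuristic: roughly 4 characters per token for English text.
  return Math.ceil(prompt.length / 4);
}

function greenScore(prompt: string): number {
  const tokens = estimateTokenCount(prompt);
  return Math.round(100 * (1 - Math.min(tokens, MAX_TOKENS) / MAX_TOKENS));
}

console.log(greenScore("What is recycling?"));                           // short prompt -> high score
console.log(greenScore("Explain in exhaustive detail... ".repeat(30)));  // long prompt -> low score
```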

🛠️ How We Built It
We built the chatbot using the Gemini API for natural language processing.

- Backend: Node.js and Express
- Frontend: React + Vite
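A minimal sketch of how the backend could wire these pieces together; the `/chat` route, payload shape, and model name are illustrative assumptions rather than our exact implementation.

```typescript
// Hypothetical /chat endpoint: the route, payload shape, and model name are
// illustrative assumptions; only the Express + Gemini SDK wiring reflects our stack.
import express from "express";
import { GoogleGenerativeAI } from "@google/generative-ai";

const app = express();
app.use(express.json());

const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY ?? "");
const model = genAI.getGenerativeModel({ model: "gemini-1.5-flash" }); // model choice is an assumption

// Same token heuristic as the scoring sketch above.
const greenScore = (p: string) =>
  Math.round(100 * (1 - Math.min(Math.ceil(p.length / 4), 500) / 500));

app.post("/chat", async (req, res) => {
  const prompt: string = req.body.prompt ?? "";
  const result = await model.generateContent(prompt); // forward the prompt to Gemini
  res.json({
    reply: result.response.text(),
    greenScore: greenScore(prompt), // sustainability percentage shown in the UI
  });
});

app.listen(3000, () => console.log("API listening on :3000"));
```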

⚙️ Challenges We Ran Into
One of our biggest challenges was defining how the chatbot should handle different query cases, specifically dead-end scenarios and inconsistent inputs. It took time to refine the flow, but the process helped us clarify our project's direction.

🏆 Accomplishments That We're Proud Of
Despite the early design hurdles, we successfully bridged our backend and frontend within just a few hours and built a fully functional MVP. Seeing real-time sustainability feedback from our chatbot was a major milestone.

📚 What We Learned
We learned that starting early, even with uncertainty, is far more productive than over-planning. Iterating through small mistakes taught us much more than endless hypothesizing ever could. Execution drives learning.

🚀 What's Next
We plan to turn our chatbot into a Chrome extension that integrates directly with popular platforms like ChatGPT, Claude, and Gemini, so users can instantly see the sustainability of their prompts with no extra steps required. Our goal is to make sustainability a seamless part of every AI interaction. We also plan to run statistical tests (t-test, F-test, and ANOVA) across different LLMs to see which has the greatest environmental impact. Not all LLMs have the same footprint, and we want to show that users who rely on LLMs can switch to alternatives that use fewer tokens and therefore have a smaller environmental impact; a sketch of one such comparison follows below.
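As a rough sketch of that comparison, a two-sample Welch's t-test on tokens used per response might look like the following. The sample arrays are made-up placeholders; real data would come from logged usage across models.

```typescript
// Illustrative Welch's two-sample t-test on tokens used per response by two LLMs.
// The sample arrays are made-up placeholders, not measured data.
function mean(xs: number[]): number {
  return xs.reduce((a, b) => a + b, 0) / xs.length;
}

function sampleVariance(xs: number[]): number {
  const m = mean(xs);
  return xs.reduce((a, b) => a + (b - m) ** 2, 0) / (xs.length - 1);
}

function welchT(a: number[], b: number[]): { t: number; df: number } {
  const va = sampleVariance(a) / a.length;
  const vb = sampleVariance(b) / b.length;
  const t = (mean(a) - mean(b)) / Math.sqrt(va + vb);
  // Welch–Satterthwaite approximation of the degrees of freedom.
  const df = (va + vb) ** 2 / (va ** 2 / (a.length - 1) + vb ** 2 / (b.length - 1));
  return { t, df };
}

// Placeholder samples: tokens per response for "LLM A" vs "LLM B".
const llmA = [312, 287, 401, 356, 290, 330];
const llmB = [205, 240, 198, 260, 221, 233];
console.log(welchT(llmA, llmB)); // compare the t statistic against a critical value / p-value
```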
