Inspiration

In the race to develop more advanced AI, companies often subscribe to the belief that bigger is better. GPT-4 boasts an estimated 1.8 trillion parameters, up from GPT-3's 175 billion and GPT-2's modest 1.5 billion. While improved performance is a given, this exponential scale-up has significant side effects that are rarely discussed, such as the energy and cooling resources required to train these models and run inference with them. The operational demands of a model like ChatGPT are estimated to equal the daily energy consumption of over 17 thousand U.S. households.

Energy demand is extreme for AI models like ChatGPT, which can consume up to 25 times more energy per operation than a typical Google search, with costs that scale with the number of tokens processed. Azalea is inspired by the need to mitigate these effects by promoting more sustainable AI usage, akin to adopting recycling as a societal norm. By providing a tool that helps users understand and reduce their AI-related environmental footprint, we aim to foster a more conscientious approach to AI utilization.

What it does

Azalea empowers users to make environmentally conscious decisions when interacting with AI. When a user submits a query, they can click the "Calculate" button to see a comprehensive score for each AI model. This score, calculated from the model's energy consumption and the query's complexity, indicates which model is the most practical and sustainable option for the given query. It also includes metrics that represent the environmental impact in familiar terms, such as the equivalent number of plastic bags used, liters of water consumed, and distance driven by an electric car. A score of 0-2 indicates a poor fit (high cost or not suitable for the complexity), a score of 2-4 is a moderate fit, and a score of 4-6 is a good fit (low cost and suitable for the complexity). With these insights, users can make informed decisions when selecting a model to answer their query. Over time, Azalea tracks the cumulative environmental impact of each model, allowing users to compare the sustainability of their choices and become more conscious of their energy savings.
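The 0-6 score and the familiar-terms equivalents can be sketched roughly as below. The conversion factors and weightings here are illustrative assumptions, not Azalea's actual calibrated values:

```python
# Minimal sketch of a 0-6 fit score plus environmental equivalents.
# All constants below are hypothetical placeholders, not measured values.

WH_PER_PLASTIC_BAG = 40.0   # assumed Wh embodied in one plastic bag
WH_PER_LITER_WATER = 0.5    # assumed Wh per liter of water consumed
WH_PER_EV_KM = 150.0        # assumed Wh per km driven by an electric car

def fit_score(energy_wh: float, model_complexity: float,
              query_complexity: float) -> float:
    """Score 0-6: higher means lower cost and better complexity match."""
    # Penalize mismatch between model capability and query needs (0..1 scale)
    mismatch = abs(model_complexity - query_complexity)
    suitability = max(0.0, 3.0 - 3.0 * mismatch)   # contributes 0..3
    efficiency = max(0.0, 3.0 - energy_wh / 2.0)   # contributes 0..3
    return round(suitability + efficiency, 2)

def impact_equivalents(energy_wh: float) -> dict:
    """Express an energy figure in familiar, everyday terms."""
    return {
        "plastic_bags": energy_wh / WH_PER_PLASTIC_BAG,
        "liters_water": energy_wh / WH_PER_LITER_WATER,
        "ev_km": energy_wh / WH_PER_EV_KM,
    }

# A cheap model well-matched to a simple query scores as a good fit (4-6)
print(fit_score(energy_wh=0.3, model_complexity=0.4, query_complexity=0.5))
```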

How we built it

We built Azalea with a React frontend and a Python FastAPI backend, storing energy and conservation data in MongoDB. To handle the various AI model interactions, we used the Google Custom Search API, the Mistral API, and the OpenAI API. For preprocessing, we used nltk, pandas, and numpy to clean the data. We also used the Hugging Face Transformers library to tokenize prompts for energy-consumption estimates, and scikit-learn to train a Random Forest model that predicts query complexity.
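The token-based energy estimate can be sketched as follows. To keep the example self-contained, a whitespace split stands in for the Transformers tokenizer we actually used, and the per-token energy figures are illustrative assumptions, not measurements:

```python
# Sketch of per-query energy estimation from token counts.
# Per-token costs below are hypothetical assumptions, not measured values.

ENERGY_WH_PER_TOKEN = {
    "gpt-4": 0.010,
    "mistral-7b": 0.001,
    "google-search": 0.0003,
}

def count_tokens(prompt: str) -> int:
    # Stand-in for a real Transformers tokenizer; true token counts differ.
    return len(prompt.split())

def estimate_energy_wh(prompt: str, model: str) -> float:
    return count_tokens(prompt) * ENERGY_WH_PER_TOKEN[model]

prompt = "Explain why the sky is blue in one sentence."
for model in ENERGY_WH_PER_TOKEN:
    print(model, round(estimate_energy_wh(prompt, model), 4))
```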

Our dataset for complexity scoring can be found on HuggingFace datasets "/deita-complexity-scorer-data".
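The complexity predictor follows the usual feature-extraction-plus-regressor pattern. Here is a sketch in that spirit: the toy prompts and hand-assigned labels stand in for the deita complexity data, and the handcrafted features are illustrative assumptions rather than our exact feature set:

```python
# Sketch of the query-complexity predictor: simple prompt features fed to
# a Random Forest regressor. Training data and features are toy stand-ins.
from sklearn.ensemble import RandomForestRegressor

def features(prompt: str) -> list:
    words = prompt.split()
    return [
        len(words),                                       # prompt length
        sum(len(w) for w in words) / max(len(words), 1),  # avg word length
        prompt.count("?"),                                # question marks
    ]

# Toy prompts with hand-assigned complexity scores in [0, 1]
train = [
    ("What is 2 + 2?", 0.1),
    ("Define photosynthesis.", 0.2),
    ("Summarize the causes of World War I in detail.", 0.6),
    ("Derive the gradient of the softmax cross-entropy loss.", 0.9),
]
X = [features(p) for p, _ in train]
y = [c for _, c in train]

model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
print(model.predict([features("Explain quantum entanglement simply.")]))
```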

Challenges we ran into

Finding a third model for comparison, beyond ChatGPT and Google Search, with solid question-answering capabilities and minimal energy consumption was challenging, but we eventually found the ideal candidate in Mistral 7B. Training our query-complexity model proved time-consuming because of the extensive preprocessing and feature-extraction steps involved, as well as the computational demands of fitting the Random Forest on a large dataset. Creating a scoring algorithm with dynamic weights was another significant challenge and required extensive testing to ensure accurate recommendations based on query complexity and energy-consumption metrics. Finally, Flask proved difficult to work with: its synchronous nature and limited scalability didn't align with our need to handle multiple concurrent AI model interactions efficiently. FastAPI, with its asynchronous capabilities and high performance, was a much better fit for our project.
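The concurrency point is the crux of the Flask-to-FastAPI switch: with an async backend, the three model APIs can be awaited in parallel rather than serially. A stdlib-only sketch, where sleeps simulate the network latency of real API calls (an assumption):

```python
# Why async helped: query several models concurrently instead of serially.
# asyncio.sleep stands in for real HTTP calls to the model APIs.
import asyncio
import time

async def query_model(name: str, latency_s: float) -> str:
    await asyncio.sleep(latency_s)  # simulated network latency
    return f"{name}: ok"

async def main() -> list:
    # All three "API calls" run concurrently, so total time is roughly
    # the max latency, not the sum a synchronous Flask view would pay.
    return await asyncio.gather(
        query_model("openai", 0.3),
        query_model("mistral", 0.2),
        query_model("google-search", 0.1),
    )

start = time.perf_counter()
results = asyncio.run(main())
elapsed = time.perf_counter() - start
print(results, round(elapsed, 1))
```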

Accomplishments that we're proud of

We're particularly proud of mastering and integrating new technologies such as FastAPI and MongoDB, which not only expanded our tech stack but also improved our project's performance and scalability. Another notable achievement was using the Transformers library and scikit-learn to develop a predictive model for prompt complexity. This was rewarding because it involved NLP and ML, areas that a couple of our team members were exploring for the first time.

Additionally, we developed a user-friendly React frontend that works seamlessly with our backend, making Azalea easy to use for everyone. This effort ensures that users can make informed choices about AI use with minimal effort. Overall, we are proud of our commitment to continuous learning and innovation throughout the development of Azalea.

What we learned

We learned about AI's significant and growing impact on energy consumption, and the powerful role small adjustments can play in reducing it. Implementing Azalea showed us how guiding choices toward energy-efficient models can have real benefits. Technically, we discovered the strengths of FastAPI: its speed and ease of use were game-changers for our backend development. Integrating various APIs also streamlined our processes and enhanced our application's functionality. This project broadened our technical understanding and deepened our commitment to sustainable AI practices.

What's next for Azalea

We next plan to incorporate the dollar cost of each query per model into the scoring system, giving users a sense of the financial impact of their model choice. We also plan to implement continuous chats and a chat history feature, so users can carry on conversations across multiple prompts and switch between models without losing context. We are also considering a Chrome extension for Azalea that integrates our cost and complexity algorithms directly into ChatGPT, making it easier for everyone to make informed choices about their AI use. Lastly, we look forward to deploying Azalea soon and bringing it into the daily lives of users.

Built With

React, Python, FastAPI, MongoDB, nltk, pandas, numpy, Hugging Face Transformers, scikit-learn, Google Custom Search API, Mistral API, OpenAI API
