Inspiration
Space exploration is fascinating, but for students and children, it can be overwhelming. Reading Wikipedia articles about "Orbital Mechanics" or "Neptune's Atmosphere" is boring and difficult. We wanted to change that.
We asked: What if you could just talk to a space expert like a friend? That inspired Planet Info: an interactive, voice-first AI guide that makes learning about the cosmos accessible, conversational, and fun.
What it does
Planet Info is a voice-activated mobile application that serves as your personal guide to the universe.
- Voice-First Interface: Users simply ask questions like "How far is Mars?" or "Tell me a fun fact about Black Holes."
- Real-Time AI Reasoning: The app understands complex queries and context using Google Gemini 2.5 Flash.
- Human-Like Response: Instead of reading text, the user hears a lifelike voice response generated instantly by ElevenLabs eleven_multilingual_v2.
- NASA Astronomy: Daily NASA astronomy news, providing students with accurate, scientific information.
- Visual Feedback: A custom-built waveform animation reacts in real-time to the voice, creating an immersive "Sci-Fi" experience.
How we built it
We built a modern, serverless architecture focused on speed and scalability (a simplified sketch of the request flow follows this list):
- Frontend: Built with Flutter to ensure a smooth, cross-platform mobile experience. We designed a custom animation engine for the audio visualizer.
- AI Brain: We used Google Gemini 2.5 Flash for its speed and reasoning capabilities. It processes the user's text and generates concise, educational answers.
- Voice Synthesis: The text response is sent to the ElevenLabs API (eleven_multilingual_v2), which returns high-fidelity audio in milliseconds.
- Backend: We went fully serverless with Google Firebase. We used Firebase Auth for secure user management and Firestore to store user history and planetary data.
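As a rough illustration of that flow (a simplified sketch, not our production code), the snippet below sends the user's question to Gemini 2.5 Flash over the Generative Language REST API and pipes the answer into the ElevenLabs text-to-speech endpoint. The API keys, the voice ID, and the prompt wording are placeholders.

```dart
// Sketch of the question -> Gemini -> ElevenLabs pipeline (placeholder keys).
import 'dart:convert';
import 'dart:typed_data';

import 'package:http/http.dart' as http;

const geminiApiKey = 'GEMINI_API_KEY';     // placeholder
const elevenApiKey = 'ELEVENLABS_API_KEY'; // placeholder
const voiceId = 'YOUR_VOICE_ID';           // placeholder

/// Ask Gemini 2.5 Flash for a short, student-friendly answer.
Future<String> askGemini(String question) async {
  final uri = Uri.parse(
      'https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash:generateContent?key=$geminiApiKey');
  final res = await http.post(
    uri,
    headers: {'Content-Type': 'application/json'},
    body: jsonEncode({
      'contents': [
        {
          'parts': [
            {'text': 'Answer in two friendly sentences for a student: $question'}
          ]
        }
      ]
    }),
  );
  final json = jsonDecode(res.body) as Map<String, dynamic>;
  return json['candidates'][0]['content']['parts'][0]['text'] as String;
}

/// Convert the answer to speech with ElevenLabs eleven_multilingual_v2.
Future<Uint8List> speak(String text) async {
  final uri = Uri.parse('https://api.elevenlabs.io/v1/text-to-speech/$voiceId');
  final res = await http.post(
    uri,
    headers: {'xi-api-key': elevenApiKey, 'Content-Type': 'application/json'},
    body: jsonEncode({'text': text, 'model_id': 'eleven_multilingual_v2'}),
  );
  return res.bodyBytes; // raw audio bytes, handed to an audio player plugin
}

Future<void> answerAloud(String question) async {
  final answer = await askGemini(question);
  final audio = await speak(answer);
  // Play `audio` with the app's audio player and drive the waveform from it.
}
```

The ElevenLabs response is raw audio, so any Flutter audio plugin that can play from bytes will work on the client side.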
Challenges we ran into
The biggest challenge was latency. Chaining two AI models (Gemini for text, then ElevenLabs for audio) can easily introduce a noticeable delay. To solve this, we optimized our API calls and chose Gemini 2.5 Flash specifically for its sub-second inference speed. We also had to ensure seamless integration between Flutter and Firebase so that data stays in sync without noticeable lag.
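To see where the time actually goes, a simple Stopwatch harness around each stage is enough; the sketch below reuses the hypothetical askGemini and speak helpers from the section above.

```dart
// Illustrative timing harness for the two-model pipeline.
// `askGemini` and `speak` are the hypothetical helpers sketched earlier.
Future<void> profilePipeline(String question) async {
  final total = Stopwatch()..start();

  final geminiWatch = Stopwatch()..start();
  final answer = await askGemini(question);
  geminiWatch.stop();

  final ttsWatch = Stopwatch()..start();
  final audio = await speak(answer);
  ttsWatch.stop();

  total.stop();
  print('Gemini: ${geminiWatch.elapsedMilliseconds} ms | '
      'ElevenLabs: ${ttsWatch.elapsedMilliseconds} ms | '
      'total: ${total.elapsedMilliseconds} ms '
      '(${audio.length} audio bytes)');
}
```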
Accomplishments that we're proud of
- Successfully integrating Google Gemini and ElevenLabs into a single, seamless workflow.
- Building a custom Waveform Animation in Flutter that syncs perfectly with the AI state (a simplified version is sketched after this list).
- Creating a fully serverless, scalable application using the Firebase ecosystem.
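For reference, the core of that waveform effect can be approximated with a CustomPainter whose amplitude and phase are fed from the audio/AI state. The widget below is a simplified sketch rather than the actual animation engine, and the input values are placeholders for whatever your audio layer reports.

```dart
import 'dart:math' as math;

import 'package:flutter/material.dart';

/// Simplified waveform: vertical bars whose height follows a sine wave
/// scaled by [amplitude] (0..1), e.g. the current playback level.
class Waveform extends StatelessWidget {
  const Waveform({super.key, required this.amplitude, required this.phase});

  final double amplitude; // 0..1, from the audio player / AI state
  final double phase;     // advanced by an AnimationController each tick

  @override
  Widget build(BuildContext context) {
    return CustomPaint(
      size: const Size(double.infinity, 80),
      painter: _WaveformPainter(amplitude, phase),
    );
  }
}

class _WaveformPainter extends CustomPainter {
  _WaveformPainter(this.amplitude, this.phase);

  final double amplitude;
  final double phase;

  @override
  void paint(Canvas canvas, Size size) {
    final paint = Paint()
      ..color = Colors.cyanAccent
      ..strokeWidth = 4
      ..strokeCap = StrokeCap.round;

    const bars = 24;
    final gap = size.width / bars;
    for (var i = 0; i < bars; i++) {
      final x = gap * (i + 0.5);
      // Each bar gets a slight phase offset for a "travelling" look.
      final h = (size.height / 2) *
          amplitude *
          (0.3 + 0.7 * math.sin(phase + i * 0.5).abs());
      canvas.drawLine(
        Offset(x, size.height / 2 - h),
        Offset(x, size.height / 2 + h),
        paint,
      );
    }
  }

  @override
  bool shouldRepaint(_WaveformPainter old) =>
      old.amplitude != amplitude || old.phase != phase;
}
```

Driving phase from an AnimationController that only runs while the assistant is speaking is one way to make the bars go flat the moment the voice stops.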
What we learned
We learned the power of Multimodal AI. By combining the intelligence of Gemini with the emotion of ElevenLabs, we created an experience that feels far more "human" than a standard chatbot. We also gained deep experience in structuring NoSQL data with Firebase Firestore.
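As a concrete example of what we mean by structuring the data (collection and field names below are illustrative, not necessarily the exact production schema), each user's conversation history can live in a sub-collection under their user document:

```dart
import 'package:cloud_firestore/cloud_firestore.dart';
import 'package:firebase_auth/firebase_auth.dart';

/// Save one question/answer pair under users/{uid}/history.
/// Collection and field names are illustrative.
Future<void> saveHistoryEntry(String question, String answer) async {
  final uid = FirebaseAuth.instance.currentUser!.uid;
  await FirebaseFirestore.instance
      .collection('users')
      .doc(uid)
      .collection('history')
      .add({
    'question': question,
    'answer': answer,
    'createdAt': FieldValue.serverTimestamp(),
  });
}

/// Load the 20 most recent entries for the history screen.
Future<List<Map<String, dynamic>>> loadRecentHistory() async {
  final uid = FirebaseAuth.instance.currentUser!.uid;
  final snapshot = await FirebaseFirestore.instance
      .collection('users')
      .doc(uid)
      .collection('history')
      .orderBy('createdAt', descending: true)
      .limit(20)
      .get();
  return snapshot.docs.map((d) => d.data()).toList();
}
```

Keeping history as a per-user sub-collection keeps reads cheap and lets Firestore security rules scope access to the signed-in user's uid.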
Built With
- dart
- elevenlabs
- firebase
- flutter
- gemini
- google-cloud