HOSTell: Making Important Updates Stand Out
Inspiration
Our key inspiration stemmed from a shared daily struggle: staying attentive to important events and news amid overwhelming information noise. Announcement chats constantly bombard our phones with teleblasts, diverting our attention so frequently that many of us end up muting them—often at the cost of missing out on important events.
This issue is further exacerbated by bots repeatedly reposting the same events to drum up sign-ups. Even when these messages are slightly reworded, their constant repetition leaves residents disengaged and less likely to notice truly time-sensitive announcements. Spam messages on news platforms compound the problem, diluting the value of genuinely important information and making it harder for residents to identify what truly matters.
What We Built
To address these challenges, we designed HOSTell, a system that delivers filtered, sorted, and prioritized updates tailored to users’ needs. By leveraging Large Language Models (LLMs) and AI platforms such as Google Gemini, HOSTell intelligently analyzes incoming messages to reduce information overload.
Beyond basic filtering, HOSTell is able to detect and suppress repeated or semantically similar event announcements, even when they are phrased differently. This ensures that users are not repeatedly shown the same information, allowing truly new and urgent updates to stand out.
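As an illustration of the idea (not HOSTell's actual implementation, which uses an LLM-based platform such as Google Gemini), semantic deduplication can be sketched by embedding each message and dropping any message that is too similar to one already kept. The bag-of-words vectors and the 0.8 threshold below are stand-ins for a real embedding model and a tuned cutoff:

```python
import math
from collections import Counter

def bow_vector(text: str) -> Counter:
    # Toy stand-in for a semantic embedding: lowercase bag-of-words counts.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def deduplicate(messages: list[str], threshold: float = 0.8) -> list[str]:
    # Keep a message only if it is not too similar to any message kept so far.
    kept, vectors = [], []
    for msg in messages:
        v = bow_vector(msg)
        if all(cosine(v, seen) < threshold for seen in vectors):
            kept.append(msg)
            vectors.append(v)
    return kept
```

With real embeddings the same greedy loop suppresses reworded reposts of an event while letting genuinely new announcements through.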
As a result, users can focus on what truly matters—leading to a better hall or hostel experience.
How We Built It
We developed HOSTell by integrating multiple components into a unified system:
- AI-driven contextual filtering to distinguish important updates from noise
- Semantic deduplication to reduce repetitive event announcements
- Sorting and prioritization mechanisms to surface timely and relevant information
These components work together to deliver concise, meaningful updates instead of overwhelming message streams.
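The sorting and prioritization step can be sketched as follows. This is a minimal illustration under an assumption the writeup does not spell out: that the LLM classifier assigns each message one of a small set of labels ("urgent", "event", "general"), after which ordinary sorting surfaces time-sensitive updates first:

```python
from dataclasses import dataclass
from datetime import datetime

# Assumed label set produced by an upstream LLM classifier (hypothetical).
PRIORITY = {"urgent": 0, "event": 1, "general": 2}

@dataclass
class Update:
    text: str
    label: str          # assumed LLM output: "urgent", "event", or "general"
    received: datetime  # when the message arrived

def prioritize(updates: list[Update]) -> list[Update]:
    # Urgent items first, then events, then general chatter;
    # within each tier, newest messages come first.
    return sorted(updates, key=lambda u: (PRIORITY[u.label],
                                          -u.received.timestamp()))
```

The tuple key keeps the two concerns separate: the label decides the tier, and recency only breaks ties within a tier.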
Challenges Faced
One of the main challenges we encountered was repeated failures during system development, particularly when integrating separate pieces of code into a single, cohesive pipeline. Debugging and aligning these components within a limited timeframe was demanding and required multiple iterations.
Despite these challenges, we successfully resolved the integration issues and delivered a working system within the hackathon’s time constraints.
What We Learned
Through this hackathon, we learned how to:
- Better train and apply LLMs to real-world information filtering tasks
- Integrate multiple code components based on a clear system vision
- Rapidly iterate and adapt when facing technical setbacks
Overall, the experience strengthened our understanding of how AI can be used to reduce cognitive overload and meaningfully improve everyday digital experiences.