Inspiration

Our inspiration came from the app ideas list shared by the Omi community on GitHub. We saw an opportunity to leverage Omi's potential as an AI wearable by integrating it with Perplexity, a rapidly growing AI-powered search engine. The idea was simple: bring instant, high-quality search results to users directly through their Omi device, enhancing accessibility and productivity.

What it does

SearchPerplexity lets Omi devices deliver instant answers from Perplexity. Omi's Real-Time Transcript Processors stream the user's speech to a webhook hosted by our Flask app, where an LLM (LLaMA 3 8B served via Groq) determines whether the user intends to run a Perplexity search. If so, it reformulates the utterance into a well-formed search query. Perplexity processes the query and returns results, which are displayed as notifications on the Omi device.
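As a minimal sketch of that flow, assuming Omi POSTs transcript segments as JSON to the webhook and surfaces the returned "message" field as a notification (both payload shapes are assumptions, not the project's exact code), the Flask endpoint might look like this; `detect_search_intent` and `ask_perplexity` are hypothetical helpers sketched in the next section:

```python
# Minimal webhook sketch: assumed Omi payload/response shapes, not the exact app code.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/webhook", methods=["POST"])
def webhook():
    payload = request.get_json(silent=True) or {}
    # Join the streamed transcript segments into one utterance to analyze.
    transcript = " ".join(seg.get("text", "") for seg in payload.get("segments", []))

    query = detect_search_intent(transcript)  # hypothetical helper, sketched below
    if not query:
        return jsonify({}), 200               # not a search command: stay silent

    answer = ask_perplexity(query)            # hypothetical helper, sketched below
    return jsonify({"message": answer}), 200  # surfaced as a notification on the device
```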

How we built it

  • Platform: Omi's Real-Time Transcript Processors stream user speech to our webhook.
  • Backend: a Flask app deployed to the cloud handles the incoming transcripts.
  • AI integration: Groq-hosted LLaMA 3 8B detects user intent and rewrites queries.
  • Search engine: rewritten queries are sent to Perplexity, and results are optimized for display as notifications.
  • Notifications: results are capped at roughly 40 tokens to fit iOS notification character limits.
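The two API calls could look roughly like the following. Both Groq and Perplexity expose OpenAI-compatible chat-completions endpoints; the model names, prompt wording, and the 40-token cap shown here are illustrative assumptions rather than our exact configuration:

```python
# Rough sketch of the Groq and Perplexity calls via their OpenAI-compatible endpoints.
import os
import requests

GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"
PPLX_URL = "https://api.perplexity.ai/chat/completions"

ROUTER_PROMPT = (
    "Decide whether the user is asking to search with Perplexity. "
    "If yes, reply with only a concise, well-formed search query. "
    "If not, reply with exactly NO_SEARCH."
)  # a more robust variant is sketched under Challenges below

def detect_search_intent(transcript: str) -> str | None:
    """Return a rewritten search query if the user asked for a search, else None."""
    resp = requests.post(
        GROQ_URL,
        headers={"Authorization": f"Bearer {os.environ['GROQ_API_KEY']}"},
        json={
            "model": "llama3-8b-8192",  # Groq-hosted LLaMA 3 8B
            "messages": [
                {"role": "system", "content": ROUTER_PROMPT},
                {"role": "user", "content": transcript},
            ],
            "temperature": 0,
        },
        timeout=10,
    )
    answer = resp.json()["choices"][0]["message"]["content"].strip()
    return None if answer == "NO_SEARCH" else answer

def ask_perplexity(query: str) -> str:
    """Send the rewritten query to Perplexity, capping the reply for a notification."""
    resp = requests.post(
        PPLX_URL,
        headers={"Authorization": f"Bearer {os.environ['PPLX_API_KEY']}"},
        json={
            "model": "sonar",  # assumed name of an online-search model
            "messages": [{"role": "user", "content": query}],
            "max_tokens": 40,  # keep the answer within iOS notification limits
        },
        timeout=15,
    )
    return resp.json()["choices"][0]["message"]["content"].strip()
```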

Challenges we ran into

  • Intent identification: ensuring accurate detection of when a user actually wants to run a Perplexity search. We solved this by crafting a robust LLM prompt that tolerates transcription errors and recognizes similar-sounding words (see the sketch after this list).
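An illustrative (not verbatim) version of that prompt, showing how noisy transcription and near-homophones of "Perplexity" can be tolerated:

```python
# Illustrative routing prompt; the exact wording used in the app differed.
INTENT_PROMPT = """You are routing voice commands taken from a live, imperfect
speech transcript. Decide whether the user is asking to search with Perplexity.

Treat near-homophones and common mis-transcriptions ("complexity", "per plexity",
"plexity") as the wake word when the rest of the sentence reads like a question
or a request for information.

If it is a search request, reply with ONLY a concise, well-formed search query.
Otherwise, reply with exactly NO_SEARCH."""
```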

Accomplishments that we're proud of

  • Successfully building and deploying a fully functional Omi-Perplexity integration in less than a day.
  • Tackling challenges like transcription errors and character limits with creative solutions.
  • Gaining hands-on experience with Omi's app development and Perplexity integration.

What we learned

  • The nuances of working with real-time transcription on Omi devices.
  • Effective use of LLMs for intent recognition and query optimization.
  • Challenges of integrating third-party APIs into wearable tech workflows.

What’s next for SearchPerplexity

We envision integrating Perplexity's SDK (once available) to allow Omi users to log into their Perplexity accounts. This would enable searches initiated on Omi devices to appear as threads in users’ Perplexity history, allowing them to revisit and explore results in greater depth later. Additionally, we plan to enhance query processing for even more intuitive interactions.

Built With

flask, groq, llama-3, perplexity-api, python, omi
