Inspiration

Imagine you're on a hike, fully immersed in nature and enjoying the moment. It's easy to lose track of potential dangers: fluctuating pressure, extreme temperatures, and even your own physical health, like your heart rate. That's why we've developed a Mixed Reality application that monitors both your well-being and the environmental conditions at your hiking location. By providing real-time warnings and guidance, we help hikers stay safe, worry-free, and fully supported on their adventures.

What it does

Using a combination of sensors, user-provided hiking locations, and real-time environmental data, our system continuously monitors your surroundings and physical health. By comparing conditions against predefined safety thresholds, we detect potential dangers before they become critical. If any threshold is exceeded, we provide immediate warnings and automatically notify your emergency contacts, ensuring your safety on every adventure.
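
To make the comparison concrete, here is a minimal sketch of that threshold check in Python; the limits, field names, and example values are hypothetical placeholders rather than our actual calibrated thresholds:

```python
# Sketch of the safety-threshold check; the limits and field names are
# hypothetical placeholders, not our calibrated values.
SAFE_RANGES = {
    "temperature_c": (-5.0, 35.0),    # ambient temperature
    "pressure_hpa": (900.0, 1050.0),  # barometric pressure
    "heart_rate_bpm": (50, 160),      # hiker's heart rate
}

def check_thresholds(reading: dict) -> list[str]:
    """Return a warning for every metric that falls outside its safe range."""
    warnings = []
    for metric, (low, high) in SAFE_RANGES.items():
        value = reading.get(metric)
        if value is not None and not (low <= value <= high):
            warnings.append(f"{metric} out of range: {value} (safe: {low}-{high})")
    return warnings

# Example: an elevated heart rate produces a single warning.
print(check_thresholds({"temperature_c": 21.0, "pressure_hpa": 1010.0, "heart_rate_bpm": 185}))
```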

How we built it

We developed a comprehensive hardware and software ecosystem for monitoring environmental and health data during hikes. At the core of the system is a Raspberry Pi running Ubuntu natively, which serves as our central data processing hub. We configured the Pi with SSH and VNC access for remote management, allowing us to deploy and troubleshoot the system even in remote locations. The Pi connects to a BME688 sensor to collect environmental data, including temperature, barometric pressure, and air quality (gas detection). To create a reliable network for all of our devices, we also configured the Pi to act as a WiFi access point, establishing a self-contained communication network that works independently of any external internet connection.

For health monitoring, we stream real-time heart rate (BPM) readings from an Apple Watch using Apple's HealthKit SDK, integrated into a custom Swift application we developed in Xcode. Data from the sensors and the watch is routed through ROS publishing and an MQTT (Mosquitto) broker, respectively, with our WebSocket server running in a Docker container on the local network. We chose this containerized approach to ensure the data processing pipeline correctly formats and synchronizes all inputs for real-time visualization.

The compiled environmental and health data is then transmitted to our Unity application running on the Meta Quest 3 headset via the MQTTnet C# framework. Using C# scripts in Unity, we visualize the data through an intuitive interface and implement an intelligent notification system that alerts hikers to potential dangers based on their physiological responses and environmental conditions. This complete end-to-end system, from sensors and wearables to mixed reality visualization, creates a seamless safety monitoring solution that operates reliably even in remote hiking locations with no cellular connectivity.
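
As a rough sketch of the Pi-side data flow described above, the snippet below reads the BME688 and publishes readings to the Mosquitto broker. It assumes the Pimoroni bme680 Python library (which also drives the BME688) and the paho-mqtt client; the broker address, topic name, and publish rate are placeholders rather than our exact configuration:

```python
# Minimal sketch of the Pi-side publisher: read the BME688 and push readings to
# the local Mosquitto broker. Gas/heater configuration is omitted for brevity;
# addresses, topic names, and intervals are placeholders.
import json
import time

import bme680
import paho.mqtt.client as mqtt

BROKER_HOST = "192.168.4.1"      # the Pi acting as access point + broker (placeholder)
TOPIC = "hikeybara/environment"  # placeholder topic name

sensor = bme680.BME680(bme680.I2C_ADDR_PRIMARY)
client = mqtt.Client()  # paho-mqtt 1.x style; on 2.x pass mqtt.CallbackAPIVersion.VERSION1
client.connect(BROKER_HOST, 1883)
client.loop_start()

while True:
    if sensor.get_sensor_data():
        payload = {
            "temperature_c": sensor.data.temperature,
            "pressure_hpa": sensor.data.pressure,
            "gas_ohms": sensor.data.gas_resistance,
            "timestamp": time.time(),
        }
        client.publish(TOPIC, json.dumps(payload))
    time.sleep(1.0)  # publish roughly once per second (placeholder rate)
```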
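
On the receiving side, the headset subscribes through the MQTTnet C# framework; purely to illustrate the message handling, the equivalent flow shown here in Python with paho-mqtt (placeholder topic and broker names) looks like this:

```python
# Illustration only: the headset actually subscribes via MQTTnet in C#, but the
# message handling is equivalent to this paho-mqtt subscriber (placeholder names).
import json

import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    # In the real system, this is where the Unity layer updates the HUD and runs
    # the threshold/alert logic against the incoming reading.
    print(f"{msg.topic}: {reading}")

client = mqtt.Client()  # paho-mqtt 1.x style; on 2.x pass mqtt.CallbackAPIVersion.VERSION1
client.on_message = on_message
client.connect("192.168.4.1", 1883)  # placeholder broker address
client.subscribe("hikeybara/#")      # environment and heart-rate topics (placeholders)
client.loop_forever()
```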

Challenges we ran into

One of the biggest challenges we faced was installing ROS on Windows, as multiple dependency issues made setup difficult. To overcome this, we pivoted to a Docker image to install and run the ROS WebSocket server more efficiently. Additionally, we initially relied on a ROS server to manage data exchange but later switched to an MQTT broker for better IoT compatibility, allowing for smoother data transmission between devices. Another major hurdle was receiving user health data from the Apple Watch, which proved challenging due to Apple's strict permission requirements, security restrictions, and complex Xcode setup; ensuring real-time data transmission required extensive debugging and configuration. Working with the Meta Quest 3 introduced further complications, as the headset's frequent updates meant documentation was often out of date, making it difficult to set up the environment and establish communication with our devices. Lastly, we ran into issues capturing data from our Raspberry Pi due to the configuration of our GPIO pins.

Accomplishments that we're proud of

Everyone on the team had the opportunity to work with a tech stack they were previously unfamiliar with, including networking protocols, server setup, IP addressing, Docker images, ROS, and integrating a VR/AR headset within Unity. Despite our initial lack of experience in these areas, we successfully navigated the complexities of system integration and got everything running. Overcoming these challenges has been a rewarding experience, and we’re all incredibly proud of what we accomplished.

What we learned

Throughout this project, we gained hands-on experience with a wide range of technologies, deepening our understanding of game engine development, networking fundamentals, IP configuration, server-client communication, containerization, and MQTT protocols. Setting up a ROS WebSocket server and managing Docker containers provided valuable insights into system architecture and deployment, while working with the BME688 sensor for environmental data collection and integrating Apple Watch health monitoring taught us how to bridge multiple data sources effectively. We also learned how to process and visualize real-time data in a Unity VR/AR environment on the Meta Quest 3, strengthening our ability to develop immersive experiences. Beyond the technical aspects, this project sharpened our team collaboration and problem-solving skills as we adapted to unfamiliar technologies, debugged complex integration issues, and worked efficiently as a team.

What's next for HikeyBara

The next step for HikeyBara is to refine the UI/UX, making the display more visually appealing for a seamless user experience. Additionally, we aim to transform our software into a shareable app or an installable OS that can run across multiple XR devices, including the Apple Vision Pro, Snap Lens, and RealWear, ensuring broader accessibility and more practical use cases. Expanding functionality with additional body metrics such as oxygen levels, hydration status, stress detection, and gait analysis will give hikers more comprehensive health insights. Furthermore, integrating AI agents will enhance real-time safety monitoring by analyzing historical user data, weather patterns, and terrain conditions to deliver personalized alerts and emergency response guidance. By combining environmental and physiological data with AI-driven insights, HikeyBara will evolve into a smart safety assistant for hikers, ensuring an optimized, secure, and intelligent outdoor experience.

Track

Health

Demo video part 1: https://www.youtube.com/watch?v=XnmwqHR3qcw
Demo video part 2: https://youtu.be/CTZhI9MMjqo
