Inspiration

In today’s world, wearable devices can track heart rate or steps, but they still lack contextual intelligence — the ability to understand why something is happening or what to do next. We wanted to create a system that goes beyond passive monitoring to active health reasoning at the edge.

Inspired by agentic AI and the growing need for personalized, continuous, and privacy-preserving diagnostics, our team envisioned a wearable diagnostic assistant that doesn’t just collect data, but interprets, learns, and acts.

Our idea combines multi-sensor edge processing, agentic intelligence, and AI-driven diagnostics, enabling health insights without relying entirely on cloud connectivity.

What it does

AI-Enabled Diagnostic Assistant is a multi-sensor wearable system that uses edge-based agentic AI to monitor key physiological parameters and provide real-time diagnostic insights.

Each onboard sensor — temperature, SpO₂, and ECG — functions as a local agent, performing edge-level reasoning. These agents collaborate through an on-device hub (Jetson Orin Nano) that fuses sensor data and communicates with a central AI diagnostic model for deeper insights.

The assistant can:

Continuously track vital signs (heart activity, temperature, SpO₂).

Detect early health anomalies such as stress, fatigue, and overheating.

Provide on-device AI suggestions (e.g., “Take a break,” “Unusual ECG pattern detected”).
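To make the suggestion step concrete, here is a minimal sketch of a rule-based suggestion layer. The thresholds and messages are illustrative assumptions for this sketch, not clinical values or the project's actual model:

```python
# Illustrative rule-based suggestion layer. Thresholds are assumptions
# for demonstration only, not clinical values or the deployed model.

def suggest(temp_c: float, spo2_pct: float, ecg_anomaly: bool) -> list[str]:
    """Map fused vital-sign readings to human-readable suggestions."""
    tips = []
    if temp_c > 37.8:          # assumed overheating threshold
        tips.append("Elevated body temperature - take a break and hydrate.")
    if spo2_pct < 94.0:        # assumed low-oxygen threshold
        tips.append("Low SpO2 reading - rest and re-measure.")
    if ecg_anomaly:
        tips.append("Unusual ECG pattern detected - monitor closely.")
    return tips or ["All vitals within normal range."]
```

In the real system this logic would sit behind the hub's fusion model; the sketch only shows how fused readings can map to the kinds of messages described above.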

How we built it

We built the architecture around agentic AI on edge devices — where every sensor is an autonomous agent capable of local reasoning and communication.
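As a rough illustration of what "every sensor is an autonomous agent" can mean in code, here is a sketch of a per-sensor agent that keeps a sliding window of readings and flags anomalies locally with a z-score test. The class name, window size, and threshold are assumptions for this sketch, not the project's actual implementation:

```python
from collections import deque
from statistics import mean, stdev

class SensorAgent:
    """Sketch of a per-sensor edge agent: buffers recent readings and
    flags anomalies locally via a z-score test. Window size and
    threshold are illustrative assumptions."""

    def __init__(self, name: str, window: int = 50, z_thresh: float = 3.0):
        self.name = name
        self.buf = deque(maxlen=window)
        self.z_thresh = z_thresh

    def observe(self, value: float) -> dict:
        """Ingest one reading and return this agent's local verdict."""
        anomaly = False
        if len(self.buf) >= 10:  # warm-up period before scoring
            mu, sigma = mean(self.buf), stdev(self.buf)
            if sigma > 0 and abs(value - mu) / sigma > self.z_thresh:
                anomaly = True
        self.buf.append(value)
        return {"agent": self.name, "value": value, "anomaly": anomaly}
```

Each agent reports a small verdict dictionary rather than raw samples, which is what lets the hub reason over lightweight, pre-digested signals.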

Hardware Setup:

Temperature Sensor: LM35 – underarm/chest.

ECG Sensor: AD8232 (Analog) – inner left chest (fabric electrodes).

Edge Hub: Jetson Orin Nano.

Software & AI Architecture:

Local models perform anomaly detection (e.g., ECG irregularities).

Hub-level fusion model aggregates readings and performs contextual reasoning.

Cloud dashboard (optional) provides visualization, trend analytics, and AI model retraining.
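A minimal sketch of what the hub-level fusion step could look like, assuming each agent emits a small verdict message (the message format and escalation rules here are illustrative assumptions):

```python
# Sketch of hub-level fusion on the Jetson: combine per-agent verdicts
# into one contextual assessment. The message format and escalation
# rules are illustrative assumptions, not the deployed fusion model.

def fuse(agent_reports: list[dict]) -> dict:
    """Aggregate local agent verdicts into a single hub-level status."""
    flagged = [r["agent"] for r in agent_reports if r.get("anomaly")]
    if not flagged:
        status = "normal"
    elif len(flagged) == 1:
        status = "watch"   # a single flag may be sensor noise or artifact
    else:
        status = "alert"   # corroboration across sensors raises severity
    return {"status": status, "flagged_agents": flagged}
```

Requiring corroboration before escalating is one simple form of the contextual reasoning described above: a spike on one sensor is treated differently from agreement across several.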

Challenges we ran into

Sensor synchronization — aligning temperature, heart rate, and SpO₂ readings collected on an ESP32 SoC and streamed over Wi-Fi to the Jetson Orin Nano, so that samples arriving at different rates could be fused coherently.

Data fusion complexity — combining multi-sensor signals for contextual reasoning.

On-device explainability — allowing agents to provide interpretable outputs rather than raw numbers.
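One common way to tackle the synchronization challenge is timestamp-based alignment on the hub. The sketch below pairs multi-rate sensor streams by nearest timestamp; the (timestamp, value) message format and the 50 ms tolerance are assumptions for illustration:

```python
# Sketch of timestamp-based alignment of multi-rate sensor streams.
# The (timestamp, value) format and 50 ms tolerance are assumptions.

def align(streams: dict[str, list[tuple[float, float]]],
          tolerance_s: float = 0.05) -> list[dict]:
    """Align streams of (timestamp, value) samples onto the slowest
    stream's timestamps by nearest-neighbor matching in time."""
    # Use the stream with the fewest samples as the reference clock.
    ref_name = min(streams, key=lambda k: len(streams[k]))
    fused = []
    for t_ref, v_ref in streams[ref_name]:
        row = {"t": t_ref, ref_name: v_ref}
        for name, samples in streams.items():
            if name == ref_name:
                continue
            # Nearest sample in time; skip it if nothing is close enough.
            t, v = min(samples, key=lambda s: abs(s[0] - t_ref))
            if abs(t - t_ref) <= tolerance_s:
                row[name] = v
        if len(row) == len(streams) + 1:  # every sensor matched this row
            fused.append(row)
    return fused
```

Dropping rows where any sensor lacks a nearby sample trades throughput for consistency, which matters once the fused rows feed a contextual reasoning model.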

Accomplishments that we're proud of

Successfully deployed multi-agent AI models across distributed sensors.

Achieved on-device inference for ECG and temperature anomaly detection.

Designed a plug-and-play architecture for modular sensor integration.

Created a low-latency data fusion pipeline.

What we learned

How agentic AI principles can decentralize decision-making in IoT systems.

The importance of hardware-software co-design in constrained environments.

That collaboration between multiple AI agents (sensors) leads to more adaptive and fault-tolerant systems.

What's next for Agentic AI Unleashed

Integrate LLM-based reasoning for contextual health conversations (“Why am I feeling dizzy?”).

Add cloud retraining pipeline with user feedback loops.

Implement secure data federation across devices for population-level learning.

Expand to Smart Clinics and Home Diagnostics, creating a distributed health network powered by edge AI agents.
