Pulse Sense: Unmasking the Whispers of Wellness

Inspiration

Medical records are often snapshots in time—discrete data points like blood pressure or heart rate taken in a clinical setting. What they miss are the "weak signals": the subtle shifts in our mood, energy, and physical state that happen in the 99% of life spent outside the clinic.

We were inspired by the idea that our natural language is a rich, untapped biometric. A simple phrase like "I've been feeling a bit jumpy lately" or "I'm having trouble focusing after lunch" contains predictive value. Pulse Sense was created to build a "Neural Link" between daily conversation and clinical awareness.

What it does

Pulse Sense is a futuristic health monitoring prototype that decodes human experience into actionable data.

Life Signal Logging: Users log their state via text or voice. Our AI scans for markers across three core systems: Metabolic, Psycho-Emotional, and Cardiovascular.

The Live Life Score: An aggregate "Health Pulse" providing a real-time perspective of systemic stability.

Proactive Recovery: Dynamic advice (from "Metabolic Stabilization" to "Clinical Consultation Required") triggered by biometric variance.
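The marker scan can be sketched as a keyword heuristic. The marker lists and names below are illustrative stand-ins, not the app's real vocabulary:

```typescript
type System = "Metabolic" | "PsychoEmotional" | "Cardiovascular";

// Hypothetical marker vocabulary; the production lists are richer.
const MARKERS: Record<System, string[]> = {
  Metabolic: ["trouble focusing", "after lunch", "sluggish"],
  PsychoEmotional: ["jumpy", "anxious", "low mood"],
  Cardiovascular: ["racing heart", "palpitations", "short of breath"],
};

// Return every core system whose markers appear in a logged entry.
function detectSignals(entry: string): System[] {
  const text = entry.toLowerCase();
  return (Object.keys(MARKERS) as System[]).filter((system) =>
    MARKERS[system].some((marker) => text.includes(marker))
  );
}
```

For example, "I've been feeling a bit jumpy lately" flags the Psycho-Emotional system, while "trouble focusing after lunch" flags Metabolic.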

How we built it

We built a high-fidelity dashboard using React 19, Vite, and Tailwind CSS 4. For interactivity, we integrated the Web Speech API for context-aware voice analysis. The "Simulated Paradigm" aesthetic was crafted with Framer Motion for liquid animations.

The Math Behind the Signal

The Live Life Score ( L ) is the arithmetic mean of the three primary metrics: $$ L = \frac{M + P + C}{3} $$ where ( M ) is Metabolic, ( P ) is Psycho-Emotional, and ( C ) is Cardiovascular.
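In code, the aggregation is a one-liner (field names here are illustrative):

```typescript
interface Metrics {
  metabolic: number;       // M, 0-100
  psychoEmotional: number; // P, 0-100
  cardiovascular: number;  // C, 0-100
}

// Live Life Score L = (M + P + C) / 3
function liveLifeScore({ metabolic, psychoEmotional, cardiovascular }: Metrics): number {
  return (metabolic + psychoEmotional + cardiovascular) / 3;
}
```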

When a risk signal ( R ) is detected, the corresponding metric ( S ) is updated: $$ S_{new} = S_{prev} - \frac{R}{5} $$

For daily calibration, we nudge the Psycho-Emotional score by a sleep-derived stabilization delta, clamped to the 0-100 range: $$ P_{new} = \min\left(100, \max\left(0, P_{prev} + \frac{Sleep_{score} - 50}{5}\right)\right) $$
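The two update rules above can be sketched together (function names are ours, not the codebase's):

```typescript
// Clamp a score to the 0-100 range used by all metrics.
const clamp = (x: number): number => Math.min(100, Math.max(0, x));

// Risk signal R lowers the affected metric: S_new = S_prev - R / 5
function applyRisk(sPrev: number, r: number): number {
  return sPrev - r / 5;
}

// Daily calibration: P_new = clamp(P_prev + (sleepScore - 50) / 5),
// so sleep above 50 nudges the score up and sleep below 50 nudges it down.
function calibrate(pPrev: number, sleepScore: number): number {
  return clamp(pPrev + (sleepScore - 50) / 5);
}
```

A risk signal of 25 therefore costs 5 points, and even a perfect night's sleep can only lift the Psycho-Emotional score by 10, keeping single-day swings modest.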

Challenges we ran into

Integrating the Web Speech API's asynchronous nature with React's state management was a hurdle, especially ensuring the AI didn't interrupt itself during follow-up questions. We also faced challenges in designing SVG-based concentric rings that accurately reflect dynamic percentages without visual glitches.
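Conceptually, our fix for the self-interruption problem was a small speech queue: a new utterance never starts while one is in flight, so follow-up questions play in order instead of cutting each other off. A sketch, with the `speak` callback injected so the logic is testable outside the browser (in the app it would wrap `speechSynthesis.speak`):

```typescript
class SpeechQueue {
  private queue: string[] = [];
  private speaking = false;

  // `speak` resolves when one utterance has finished playing.
  constructor(private speak: (text: string) => Promise<void>) {}

  async enqueue(text: string): Promise<void> {
    this.queue.push(text);
    if (this.speaking) return; // already draining; don't interrupt
    this.speaking = true;
    while (this.queue.length > 0) {
      await this.speak(this.queue.shift()!);
    }
    this.speaking = false;
  }
}
```

Because `speaking` is flipped synchronously before the first `await`, a follow-up question enqueued mid-utterance is appended rather than spoken over the current response.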

Accomplishments that we're proud of

Immersive UX: We successfully created a dashboard that feels "alive" with heartbeat-synced backgrounds and kinetic orbs.

Functional Voice Loop: The transition from speech to AI analysis to spoken response feels seamless.

High-Fidelity Aesthetics: We pushed Tailwind CSS 4 to its limits to create a premium, sci-fi feel.

What we learned

We learned the nuances of browser-based voice interaction and the importance of Biometric UX Design—balancing high data density with actionable simplicity. We also mastered advanced Framer Motion techniques for high-performance layout transitions.

What's next for Pulse Sense

Pulse Sense is the first step toward conversational biometrics. Our roadmap includes:

Gemini Integration: Moving from keyword heuristics to full LLM-driven sentiment analysis.

Wearable Sync: Correlating voice signals with real-time heart rate and SpO₂ data.

Medical Portal: A secure dashboard for healthcare professionals to view "Weak Signal" trends.

Built With

React 19, Vite, Tailwind CSS 4, Framer Motion, Web Speech API
