Project Story

About the Project

JamDeck was inspired by a simple idea: listening to music is passive, but playing music is difficult to get into. We wanted to create something in between, a system that lets anyone interact with music instantly, without needing prior musical training.

Coming from an embedded systems background, we were interested in turning physical inputs into meaningful outputs. Instead of using a traditional instrument, we explored how motion, pressure, and touch could be used to control sound in real time. The goal was to design a device that feels natural to use while still being expressive enough to resemble a real instrument.


How We Built It

The system is built around an ESP32 that reads multiple sensors and communicates with a server over WiFi. Each component of the hardware contributes to a different aspect of musical control:

  • Buttons are used to trigger notes based on scale degrees
  • A force sensor (FSR) controls how strongly a note is played (velocity)
  • A flex sensor is used to switch between musical modes
  • A gyroscope (MPU6050) adds vibrato through motion
  • An OLED display provides real-time feedback to the user

The ESP32 continuously reads sensor data and sends structured JSON events to a backend server. The server processes these events, maps scale degrees to actual notes based on the current key, and sends display updates back to the device.
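The write-up doesn't show the event schema, but the idea can be sketched as a small JSON payload per interaction. The field names below are illustrative guesses, not JamDeck's actual protocol:

```python
import json

def make_note_event(degree, velocity, vibrato):
    """Build a JSON note event for the backend server.

    Field names and ranges here are illustrative -- the actual schema
    used by the firmware is not shown in the write-up.
    """
    return json.dumps({
        "type": "note_on",
        "degree": degree,      # scale degree pressed (1-7)
        "velocity": velocity,  # 0-127, derived from the FSR
        "vibrato": vibrato,    # 0.0-1.0, derived from the gyroscope
    })

event = make_note_event(3, 96, 0.25)
```

Keeping events small and structured like this makes the server-side mapping (degree to note, key handling) independent of the hardware details.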


System Design and Logic

The system operates in a non-blocking loop to maintain responsiveness. Instead of using delays, we rely on timed intervals to handle different tasks such as sensor polling, communication, and display updates.
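This is the classic `millis()`-style scheduling pattern from embedded programming. A minimal Python sketch of the idea (task names and intervals are illustrative, not the firmware's actual timings):

```python
import time

class IntervalTask:
    """Run a callback at most once per `interval` seconds without
    blocking the main loop -- the millis()-based scheduling pattern."""
    def __init__(self, interval, callback):
        self.interval = interval
        self.callback = callback
        self.last_run = 0.0

    def poll(self, now):
        if now - self.last_run >= self.interval:
            self.last_run = now
            self.callback()

# Illustrative intervals; the real firmware's values are not given.
tasks = [
    IntervalTask(0.005, lambda: None),  # sensor polling
    IntervalTask(0.050, lambda: None),  # network send
    IntervalTask(0.100, lambda: None),  # display update
]

def loop_once():
    now = time.monotonic()
    for t in tasks:
        t.poll(now)
```

Because no task ever calls a blocking delay, a fast task (sensor polling) is never starved by a slow one (display updates).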

Sensor inputs are mapped to musical parameters. For example, the force sensor is normalized into a velocity value:

$$ v = v_{\min} + \left( \frac{F - F_{\min}}{F_{\max} - F_{\min}} \right)(v_{\max} - v_{\min}) $$

where $F$ is the raw sensor reading, $F_{\min}$ and $F_{\max}$ are the calibrated sensor bounds, and $v$ represents how strongly a note is played.

For vibrato, we compute the magnitude of angular velocity:

$$ \text{mag} = |g_x| + |g_y| + |g_z| $$

and apply smoothing:

$$ \text{vibrato} = \alpha \cdot \text{prev} + (1 - \alpha)\cdot \text{normalized(mag)} $$

This ensures that the vibrato effect feels continuous rather than noisy.
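The two formulas combine into a few lines. The smoothing factor and normalization ceiling below are illustrative placeholders for the project's tuned values:

```python
def gyro_magnitude(gx, gy, gz):
    """L1 magnitude of angular velocity: |gx| + |gy| + |gz|."""
    return abs(gx) + abs(gy) + abs(gz)

def smooth_vibrato(prev, gx, gy, gz, alpha=0.8, mag_max=500.0):
    """Exponential smoothing of the normalized gyro magnitude.

    alpha and mag_max are illustrative tuning values, not the
    firmware's actual constants.
    """
    normalized = min(gyro_magnitude(gx, gy, gz) / mag_max, 1.0)
    return alpha * prev + (1 - alpha) * normalized
```

A higher `alpha` weights the previous value more heavily, which trades responsiveness for stability; that trade-off is exactly the tuning described above.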

By mapping buttons to scale degrees instead of fixed notes, the system guarantees that all output stays within a musical key, making it easier for users to produce pleasant sounds.
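The degree-to-note step might look like the following sketch. The write-up doesn't specify the scale handling, so a major scale and middle-C root are assumptions here:

```python
MAJOR_STEPS = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of the major scale

def degree_to_midi(degree, key_root=60):
    """Map a 1-based scale degree to a MIDI note in the given key.

    key_root=60 is middle C. The major-scale assumption and the
    octave wrapping are guesses at the server's logic, which the
    write-up doesn't detail.
    """
    octave, step = divmod(degree - 1, 7)
    return key_root + 12 * octave + MAJOR_STEPS[step]
```

Because the buttons only ever select a `degree`, every note the user can produce lands on a scale tone of the current key; there is no way to play a "wrong" note.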


Challenges We Faced

One of the biggest challenges was maintaining real-time performance while using WiFi communication. Sending too many HTTP requests caused noticeable latency, so we implemented throttling and only transmitted significant changes in sensor values.
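One way to sketch that throttling: send only when the value has changed by more than a threshold and a minimum interval has elapsed. The threshold and interval values below are illustrative, not the project's tuned numbers:

```python
import time

class ThrottledSender:
    """Transmit only significant changes, rate-limited by a minimum
    send interval. Threshold values here are illustrative."""
    def __init__(self, send, min_interval=0.05, min_delta=5):
        self.send = send
        self.min_interval = min_interval
        self.min_delta = min_delta
        self.last_value = None
        self.last_sent = 0.0

    def update(self, value, now=None):
        now = time.monotonic() if now is None else now
        changed = (self.last_value is None
                   or abs(value - self.last_value) >= self.min_delta)
        if changed and now - self.last_sent >= self.min_interval:
            self.send(value)
            self.last_value = value
            self.last_sent = now
            return True
        return False
```

This collapses a stream of noisy per-loop readings into a handful of meaningful network events, which is what keeps the HTTP latency out of the audible path.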

Another challenge was dealing with noisy sensor data. Buttons required non-blocking debouncing to prevent false triggers. The force sensor and gyroscope produced fluctuating readings, which required filtering and threshold tuning to achieve stable and predictable behavior.
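Non-blocking debouncing can be sketched as accepting a new button state only after it has been stable for a short window, checked each loop iteration rather than with a delay. The 20 ms window below is a typical illustrative value, not necessarily the one used:

```python
class Debouncer:
    """Non-blocking debounce: accept a new button state only after it
    has held steady for `debounce_s` seconds. The window length is
    an illustrative choice."""
    def __init__(self, debounce_s=0.02):
        self.debounce_s = debounce_s
        self.stable = False       # last accepted (debounced) state
        self.candidate = False    # most recent raw reading
        self.changed_at = 0.0

    def update(self, raw, now):
        if raw != self.candidate:
            self.candidate = raw
            self.changed_at = now
        if (self.candidate != self.stable
                and now - self.changed_at >= self.debounce_s):
            self.stable = self.candidate
            return True   # a debounced edge occurred
        return False
```

Each contact bounce resets `changed_at`, so a press registers exactly once, and the main loop never stalls waiting for the contacts to settle.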

We also had to carefully design the interaction model. If the mappings were too sensitive, the system felt chaotic. If they were too rigid, it felt unresponsive. Finding the right balance between expressiveness and control took multiple iterations.

Finally, synchronizing user input with music required us to think in terms of musical structure rather than raw notes. Using scale degrees allowed us to keep everything in key without requiring the user to understand music theory.


What We Learned

This project taught us how to build a real-time embedded system that interacts with external software layers. We gained experience in handling multiple sensors simultaneously while maintaining a responsive, non-blocking architecture.

We also learned how important signal processing is when working with physical inputs. Raw sensor data is rarely usable directly, and transforming it into meaningful control signals requires calibration, filtering, and thoughtful mapping.

In addition, we learned how system design impacts user experience. Small implementation details, such as debounce timing or smoothing factors, significantly affected how intuitive and responsive the system felt.

Finally, we gained experience integrating hardware, networking, and software into a single cohesive system, which required careful coordination between different components.


Conclusion

JamDeck transforms music from a passive experience into an interactive one. By combining embedded systems with motion, pressure, and touch, we created a system that allows users to engage with music in a natural and immediate way.

The result is a controller that lowers the barrier to musical interaction while still providing a level of expressiveness typically associated with traditional instruments.

Built With

  • audio
  • embedded
  • esp32
  • gemini
  • hardware
  • imu
  • music
  • sensor