Inspiration

The FurReact team is made up of tech-savvy furries who share a passionate desire to possess adorable animal ears. XR allows users to have an embodied experience with the ears and feel that they are a true extension of their body.

What it does

FurReact ears use face and eye tracking from the Samsung Galaxy XR headset to drive motorized animal ears that twitch and move to reflect the user's facial expressions and emotions.
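
As a rough illustration of the idea (not the team's actual Unity code), a mapping from expression weights to ear angles might look like the C++ sketch below; the expression names, neutral angle, and ranges are invented for the example.

```cpp
// Illustrative only: one possible mapping from facial-expression weights
// (0.0–1.0) to ear servo angles in degrees. The expression names and
// angle ranges are assumptions; the real processing runs in Unity on the
// headset's face-tracking output.
#include <algorithm>

struct EarPose {
    float leftDeg;
    float rightDeg;
};

EarPose earsFromExpression(float browRaise, float smile) {
    // Raised brows perk the ears up; a smile relaxes them into a gentle droop.
    float perk  = std::clamp(browRaise, 0.0f, 1.0f) * 60.0f;  // up to 60 deg upright
    float droop = std::clamp(smile,     0.0f, 1.0f) * 30.0f;  // up to 30 deg relaxed
    float angle = 90.0f + perk - droop;                       // 90 deg = neutral
    return { angle, angle };
}
```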

How we built it

This project was born from love and passion, as well as servos, microcontrollers, and hot glue. Two MG90S servos and an ESP-32 microcontroller are mounted to a 3D-printed headband. The ESP-32 is powered by a battery bank and communicates wirelessly with the Samsung Galaxy XR using the MIT Singularity VR/AR Unity SDK. The headset provides face-tracking data, which is sent to Unity, processed, and turned into instructions for the servos. The whole device loops onto the arms of the headset and adjusts to fit via elastic Velcro.
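
A minimal sketch of what the servo-side firmware could look like, assuming the Unity app streams two bytes (left and right ear angle, 0–180 degrees) per UDP packet; the Wi-Fi credentials, port number, and servo pins are placeholders, and the actual wireless transport the team used may differ.

```cpp
// Minimal ESP-32 sketch (not the team's firmware): receive two servo angles
// over UDP and drive the MG90S ears. Uses the ESP32Servo library.
#include <WiFi.h>
#include <WiFiUdp.h>
#include <ESP32Servo.h>

const char*   WIFI_SSID = "your-ssid";      // placeholder
const char*   WIFI_PASS = "your-password";  // placeholder
const uint16_t UDP_PORT = 4210;             // assumed port
const int LEFT_EAR_PIN  = 18;               // assumed servo signal pins
const int RIGHT_EAR_PIN = 19;

WiFiUDP udp;
Servo leftEar, rightEar;

void setup() {
  // Standard 50 Hz hobby-servo signal for the MG90S
  leftEar.setPeriodHertz(50);
  rightEar.setPeriodHertz(50);
  leftEar.attach(LEFT_EAR_PIN);
  rightEar.attach(RIGHT_EAR_PIN);

  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(100);
  udp.begin(UDP_PORT);
}

void loop() {
  // Each packet carries two angles in degrees: [left, right]
  if (udp.parsePacket() >= 2) {
    uint8_t angles[2];
    udp.read(angles, 2);
    leftEar.write(constrain(angles[0], 0, 180));
    rightEar.write(constrain(angles[1], 0, 180));
  }
}
```

Sending plain angles keeps the microcontroller logic trivial and leaves all of the expression processing on the Unity side, which is where the face-tracking data already lives.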

Challenges we ran into

The biggest challenge was getting the many interconnecting physical and software elements to work together. MIT Singularity needed to be adapted to work with the Samsung Galaxy XR (the adapted code was posted to the Reality Hack Discord). There was also a lot of back-and-forth measuring and adjusting of the headband components to get the best possible fit, both for the user's comfort and for the functionality of the mechanisms.

Accomplishments that we're proud of

We're very proud of working on a project we all have a personal interest in and driving it to a satisfying prototype. Keeping the scope simple and manageable for our minimum viable product meant we could complete the core vision with enough time left to expand the functionality and refine the details.

What we learned

We gained experience with rapid prototyping and physically fabricating peripherals to work with VR headsets, learned how to use and develop for the Samsung Galaxy XR, and all got better at working as a team to create a satisfying end product.

What's next for FurReact

The ideal end product would be supported by advanced lightweight AR glasses with eye and/or facial tracking. This would let the ears become more of an everyday wearable accessory, furthering the concept of using XR to embody a fantastical extension of the body. We would also love to add more degrees of movement for the ears and more programmed gestures for an even more holistic animal-ear experience. There is also interest in adding functionality for controlling the ear and face movement of avatars in social XR apps like VRChat.

Built With

  • android-extensions-1.2.0
  • android-xr
  • arduino
  • c#
  • c++
  • esp-32
  • micro-servo
  • mit-singularity
  • samsung-galaxy-xr
  • unity