WE ALL LOVE BASKETBALL, except for Brian, who has always had terrible shooting form. Brian has always wanted to shoot like LeBron James, but without a model he could never accurately mimic his favorite basketball player. That's when we discovered Myo and its ability to record complex gestures like shooting a basketball. We quickly realized that we could use the Myo's gesture-sensing capabilities to compare gestures between two different people: Brian and LeBron James. LeBron could record his MVP-caliber shots, and Brian could then use the same Myo data model to compare his style against that legendary form, viewing statistical data as scatterplots that show his accuracy in pitch, roll, and yaw.
This app is as much a test of the Myo's gesture-recording capabilities on highly complicated gesture combinations as it is a demonstration of the many factors a single Myo considers every hundredth of a second. RoleModel takes in up to 3 seconds of Myo gestural information, creating over 200 gesture timestamps, and then statistically determines the mean absolute error over that sample of 200+ readings across 7 factors: pitch, roll, yaw, acceleration.x, acceleration.y, acceleration.z, and gestures. That error is then presented in a detailed scatterplot showing the differences between the model recording and the test recording. The technology we developed has the potential to serve as statistical modeling for motion, one step closer to the Iron Man suit.
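The per-factor comparison described above can be sketched roughly as follows. This is a minimal illustration in Python (the app itself is written in Objective-C with Core Plot); the function and factor names are hypothetical stand-ins for the app's internal representation, and it assumes both recordings are sampled at matching timestamps.

```python
# Hypothetical sketch of RoleModel's comparison step: given two recordings
# (model vs. test) sampled at matching timestamps, compute the mean absolute
# error for each of the 7 tracked factors.

FACTORS = ["pitch", "roll", "yaw",
           "accel_x", "accel_y", "accel_z", "gesture"]

def mean_absolute_error(model, test):
    """model, test: lists of per-timestamp dicts keyed by factor name."""
    n = min(len(model), len(test))  # compare only the overlapping samples
    errors = {}
    for factor in FACTORS:
        total = sum(abs(model[i][factor] - test[i][factor]) for i in range(n))
        errors[factor] = total / n  # average absolute difference per sample
    return errors

# Tiny example with one sample each (a real run uses 200+ over 3 seconds)
model = [{"pitch": 0.1, "roll": 0.0, "yaw": 0.2,
          "accel_x": 0.0, "accel_y": 9.8, "accel_z": 0.0, "gesture": 1}]
test  = [{"pitch": 0.3, "roll": 0.1, "yaw": 0.2,
          "accel_x": 0.1, "accel_y": 9.6, "accel_z": 0.0, "gesture": 1}]
errs = mean_absolute_error(model, test)
```

The resulting per-factor errors are what a scatterplot layer (one point per factor, or per timestamp) would then visualize.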
Built With
- coreplot
- myo
- objectivec
