Inspiration
With brainrot and memes taking over the internet, we thought being able to classify memes and hand gestures based on arm movement would be fun and interesting to implement.
What it does
Neural Rot is a model that classifies and labels brainrot and meme hand movements, as well as common hand signs and gestures.
How we built it
We built this project with OpenCV, MediaPipe, and scikit-learn: OpenCV for video capture, MediaPipe to track finger, wrist, arm, and head movement, and scikit-learn to train the classifier that labels the hand/arm movement.
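As a rough sketch of the middle step: MediaPipe Hands reports 21 landmarks per hand, each with normalized x, y, z coordinates, and those can be flattened into one feature row for a scikit-learn classifier. The function below is a minimal illustration (the name `landmarks_to_features` and the wrist-relative normalization are our assumptions, not the project's exact code); translating every point relative to the wrist makes the features independent of where the hand sits in the frame.

```python
import numpy as np

def landmarks_to_features(landmarks):
    """Flatten MediaPipe-style hand landmarks into one feature row.

    `landmarks` is a list of 21 (x, y, z) tuples, in the normalized
    image coordinates MediaPipe Hands uses. Subtracting the wrist
    (landmark 0) from every point removes dependence on the hand's
    position in the frame.
    """
    pts = np.asarray(landmarks, dtype=float)  # shape (21, 3)
    pts = pts - pts[0]                        # wrist-relative coordinates
    return pts.flatten()                      # shape (63,)

# Example: one dummy frame's worth of landmarks
frame = [(0.5 + 0.01 * i, 0.5, 0.0) for i in range(21)]
features = landmarks_to_features(frame)
print(features.shape)  # (63,)
```

In the real pipeline the tuples would come from a MediaPipe detection result on each OpenCV frame rather than being hard-coded.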
Challenges we ran into
Finding datasets applicable to our project was difficult, so we spent a lot of time creating our own data by recording hundreds of videos of ourselves performing each action. The model is currently trained on 6 labels with 700 samples each.
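In scikit-learn terms, training on such a dataset could look like the sketch below. The data here is synthetic (clustered random vectors standing in for the recorded landmark features, with fewer samples per label than the real 700), and the choice of `RandomForestClassifier` is an assumption for illustration, not necessarily the model the project used.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_labels, per_label, n_features = 6, 100, 63  # 63 = 21 landmarks x 3 coords

# Synthetic stand-in: each gesture class clusters around its own centroid.
X = np.vstack([rng.normal(loc=i, scale=0.3, size=(per_label, n_features))
               for i in range(n_labels)])
y = np.repeat(np.arange(n_labels), per_label)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(round(clf.score(X_te, y_te), 2))
```

With real landmark data the main extra work is the recording and pre-processing the section above describes, not the fit call itself.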
Accomplishments that we're proud of
We're proud that we learned how to use MediaPipe, OpenCV, and scikit-learn, how to create high-quality datasets, and how to pre-process them to minimize inaccuracies caused by poor data quality.
What we learned
We learned how to use MediaPipe, OpenCV, and scikit-learn, along with data-creation and data pre-processing techniques.
What's next for Neural Rot
The next step for us is to expand the dataset and then pivot to a neural network model to classify the data.
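One low-friction path for that pivot, staying inside scikit-learn before reaching for a deep-learning framework, is `MLPClassifier`, a small feed-forward network with the same `fit`/`score` interface. The sketch below is our assumption about what a first version could look like (synthetic data again, and the hidden-layer sizes are arbitrary):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
n_labels, per_label, n_features = 6, 100, 63

# Same synthetic stand-in as before: one cluster per gesture label.
X = np.vstack([rng.normal(loc=i, scale=0.3, size=(per_label, n_features))
               for i in range(n_labels)])
y = np.repeat(np.arange(n_labels), per_label)

# Two hidden layers; drop-in replacement for the classical classifier.
mlp = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
mlp.fit(X, y)
print(round(mlp.score(X, y), 2))
```

Because the interface matches, the rest of the pipeline (landmark extraction, pre-processing, prediction loop) would not need to change to try this out.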
Built With
- css
- flask
- html
- javascript
- joblib
- matplotlib
- mediapipe
- numpy
- opencv
- python
- rest-api
- scikit-learn