MemEmotion
We wanted to bring Microsoft's emotion detection API to life by pushing its boundaries while having some fun. Using the Microsoft Project Oxford API, we took video frames and analyzed the emotions displayed by the person, with particular interest in how those emotions change over time. One interesting use case we thought about is tracking emotions during a live video conference and giving feedback to the participants. Another relevant use case is an app that records people's reactions to videos and recognizes which emotions they are experiencing.
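Tracking the change of emotion over time boils down to reducing each frame's per-face scores to a dominant emotion and watching for flips. A minimal sketch, assuming the Emotion API's per-face `scores` object is a dict of emotion names to confidence values (the exact response shape is an assumption based on the Project Oxford docs):

```python
# Sketch: track the dominant emotion across video frames.
# `frame_scores` mimics the Emotion API's per-face "scores" object
# (emotion name -> confidence); the exact shape is an assumption.

def dominant_emotion(scores):
    """Return the emotion with the highest confidence score."""
    return max(scores, key=scores.get)

def emotion_timeline(frame_scores):
    """Collapse per-frame score dicts into a list of dominant emotions."""
    return [dominant_emotion(s) for s in frame_scores]

def emotion_changes(timeline):
    """List (frame_index, old, new) whenever the dominant emotion flips."""
    return [(i, timeline[i - 1], timeline[i])
            for i in range(1, len(timeline))
            if timeline[i] != timeline[i - 1]]
```

For a video-conference feedback tool, the `emotion_changes` output is what you would surface to participants.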
We were equally interested in having a little fun and playing around with databases. To that end, we built a MongoDB database of memes collected from Google, ran them through the Emotion API, and scored the emotions shown in each meme. We then matched each photo or video frame to the meme expressing the same emotion as the person in the image.
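One simple way to pick the meme whose Emotion API scores best match a photo is a nearest-neighbor lookup over the emotion vectors. A sketch under assumptions: the eight emotion names match the Project Oxford response, meme documents (e.g. fetched from MongoDB) carry their scores under a hypothetical `"scores"` field, and Euclidean distance stands in for whatever metric the project actually used:

```python
import math

# Sketch: match a photo's emotion scores to the closest-scoring meme.
# The emotion names, the "scores" field on meme documents, and the
# Euclidean metric are all illustrative assumptions.

EMOTIONS = ("anger", "contempt", "disgust", "fear",
            "happiness", "neutral", "sadness", "surprise")

def distance(a, b):
    """Euclidean distance between two emotion-score dicts."""
    return math.sqrt(sum((a.get(e, 0.0) - b.get(e, 0.0)) ** 2
                         for e in EMOTIONS))

def closest_meme(photo_scores, memes):
    """Return the meme document whose scores are nearest the photo's."""
    return min(memes, key=lambda m: distance(photo_scores, m["scores"]))
```

In practice the meme collection would be queried from MongoDB and the photo scores would come straight from an Emotion API response.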
Built With
- github
- json
- jupyter-notebook
- microsoft-project-oxford
- mongodb
- python