Disclaimer: Don’t dab if you have epilepsy.
To use it, make sure you are in a well-lit environment, accept the camera permission request, and stand far enough back that your whole body is visible in the frame.
It uses PoseNet, an ml5 model for real-time human pose estimation. With the data provided by the model, it performs a really simple check to decide whether the pose should count as a Dab: it checks if the right hand is close to the left ear, and whether the elbow angles are above or below a specified threshold (and vice versa for the left side). This is a very naive approach, and it wasn't what I originally intended to do, but after implementing it as a prototype, I realized it worked much better than expected. You can see the function used in the code below.
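The check described above can be sketched roughly like this. This is my own reconstruction, not the project's actual code: the function names (`isDab`, `angleAt`) and the thresholds are hypothetical, but the keypoint format (`{ part, position: { x, y } }`) matches what PoseNet returns in ml5.

```javascript
// Angle at joint b formed by points a-b-c, in degrees (0–180).
function angleAt(a, b, c) {
  const ab = Math.atan2(a.y - b.y, a.x - b.x); // direction b -> a
  const cb = Math.atan2(c.y - b.y, c.x - b.x); // direction b -> c
  const deg = Math.abs((ab - cb) * 180 / Math.PI);
  return deg > 180 ? 360 - deg : deg;
}

function dist(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Hypothetical dab check: one wrist close to the opposite ear, that
// elbow bent past the threshold, and the other arm extended.
// maxEarDist and elbowAngle are made-up example thresholds in pixels/degrees.
function isDab(keypoints, maxEarDist = 60, elbowAngle = 120) {
  const p = part => keypoints.find(k => k.part === part).position;

  const rightArmDab =
    dist(p('rightWrist'), p('leftEar')) < maxEarDist &&
    angleAt(p('rightShoulder'), p('rightElbow'), p('rightWrist')) < elbowAngle &&
    angleAt(p('leftShoulder'), p('leftElbow'), p('leftWrist')) > elbowAngle;

  const leftArmDab =
    dist(p('leftWrist'), p('rightEar')) < maxEarDist &&
    angleAt(p('leftShoulder'), p('leftElbow'), p('leftWrist')) < elbowAngle &&
    angleAt(p('rightShoulder'), p('rightElbow'), p('rightWrist')) > elbowAngle;

  return rightArmDab || leftArmDab;
}
```

In the real sketch this would run inside PoseNet's pose callback on every frame, with the thresholds tuned by trial and error.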
My original plan was to train a simple classification model using ml5, so it's a bit of a shame that this version doesn't use one. I intend to make some other experiment that does use that feature.
The video, skeleton, and emojis are drawn using the p5.js rendering library, which pairs really well with ml5.
Uuuhhh... Because my sense of humor got stuck a couple years ago and I non-ironically think this is funny.
In all seriousness, because I wanted to try something with pose recognition, and the first interesting pose that came to mind was a Dab.