An exploration of novel ways to interact with your computer! This project uses on-device machine learning to track hands in your webcam feed. Try it out here: https://floating-hands-whiteboard.vercel.app/.
You can run this project locally with:

```sh
npm install
npm start
```
This project leverages Google's MediaPipe for hand tracking. On top of simple thresholding techniques, we use a trained gesture classification model. The classifier can be customized through a notebook found in this example.
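As a rough illustration (not the project's actual code), a thresholding check on MediaPipe hand landmarks might look like the sketch below. It assumes MediaPipe's 21-landmark hand model, where landmark 4 is the thumb tip and landmark 8 is the index fingertip, with normalized x/y coordinates; the threshold value is illustrative, not tuned.

```javascript
// Euclidean distance between two normalized landmarks (x/y only).
function distance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Hypothetical thresholding rule: classify a "pinch" when the thumb
// tip (landmark 4) and index fingertip (landmark 8) are close together.
// 0.05 is an illustrative threshold in normalized image coordinates.
function isPinching(landmarks, threshold = 0.05) {
  return distance(landmarks[4], landmarks[8]) < threshold;
}

// Mock landmark arrays (only indices 4 and 8 populated for the demo):
const open = [];
open[4] = { x: 0.3, y: 0.5 };
open[8] = { x: 0.6, y: 0.5 };

const pinch = [];
pinch[4] = { x: 0.40, y: 0.50 };
pinch[8] = { x: 0.42, y: 0.51 };

console.log(isPinching(open));  // false
console.log(isPinching(pinch)); // true
```

A trained classifier replaces hand-written rules like this with a model learned from labeled landmark examples, which generalizes better across hand sizes and camera angles.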
Gesture recognition could also be approached with K-Means clustering, as in this blog post.
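To make the K-Means idea concrete, here is a minimal, dependency-free sketch (again hypothetical, not taken from the linked post): landmark feature vectors are grouped into k clusters, and each cluster can then be mapped to a gesture label.

```javascript
// Plain K-Means over numeric feature vectors (e.g. flattened hand
// landmarks). Centroids are seeded from the first k points, which is
// a simplification; real use would prefer random or k-means++ seeding.
function kMeans(points, k, iters = 20) {
  let centroids = points.slice(0, k).map(p => p.slice());
  let labels = new Array(points.length).fill(0);
  for (let it = 0; it < iters; it++) {
    // Assignment step: each point goes to its nearest centroid
    // (squared Euclidean distance).
    labels = points.map(p => {
      let best = 0, bestD = Infinity;
      centroids.forEach((c, i) => {
        const d = p.reduce((s, v, j) => s + (v - c[j]) ** 2, 0);
        if (d < bestD) { bestD = d; best = i; }
      });
      return best;
    });
    // Update step: move each centroid to the mean of its members.
    centroids = centroids.map((c, i) => {
      const mine = points.filter((_, n) => labels[n] === i);
      if (mine.length === 0) return c;
      return c.map((_, j) => mine.reduce((s, p) => s + p[j], 0) / mine.length);
    });
  }
  return { centroids, labels };
}

// Toy example: two well-separated groups of 2-D feature vectors.
const pts = [[0, 0], [0.1, 0], [0, 0.1], [5, 5], [5.1, 5], [5, 5.1]];
const { labels } = kMeans(pts, 2);
console.log(labels); // [0, 0, 0, 1, 1, 1]
```

Unlike the trained classifier, this is unsupervised: clusters emerge from the data without labels, and you assign a gesture meaning to each cluster after the fact.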
Feel free to contribute to the project through a pull request if you have new features in mind!