robotics

Grip-X

Gesture-Controlled Robotic Claw Using ML5.js for Assistive Technology

ROLE
Software Engineer
Hardware Engineer
tools
Laser Cutting
Machine Learning ML5.js
JavaScript
Arduino + Circuits
timeline
Sept – Dec 2024
Introduction

The project combines physical computing with gesture recognition. Instead of relying on buttons or a joystick, Grip-X responds to hand gestures performed in front of a webcam. An ml5.js Handpose model detects finger positions, and a p5.js sketch translates them into commands sent to an Arduino, which drives three servo motors to move the robotic arm and claw.

The goal was to let the claw mimic human hand gestures as closely as possible.
Developing the Software + Hardware

I built a basic robotic claw driven by three servo motors: one raises and lowers the arm, one rotates it, and one opens and closes the claw. All three connect to an Arduino board. For gesture recognition, I used the open-source ml5.js Handpose model inside a p5.js sketch.
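The exact message format between the p5.js sketch and the Arduino isn't described above; as one possible sketch, the browser side could encode the three target servo angles into a single comma-separated serial line that the Arduino splits and applies. The protocol shape and the 0–180 angle range here are assumptions for illustration, not the project's actual wire format.

```javascript
// Hypothetical serial protocol: "<arm>,<rotation>,<claw>\n",
// one integer angle (0–180) per servo. Values are clamped and
// rounded so the Arduino always receives a valid command.
function encodeServoCommand(armAngle, rotationAngle, clawAngle) {
  const clamp = (a) => Math.min(180, Math.max(0, Math.round(a)));
  return `${clamp(armAngle)},${clamp(rotationAngle)},${clamp(clawAngle)}\n`;
}

// In the p5.js sketch this string would be written to the serial port,
// e.g. serial.write(encodeServoCommand(90, 45, 10));
```

Keeping the command a single newline-terminated line makes the Arduino side simple: it can read until `'\n'` and parse three integers per message.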

The model tracks finger positions in real time using the computer's webcam. I mapped gestures to claw actions (an open palm opens the claw, a pinch closes it) and sent the corresponding commands to the Arduino over serial communication.
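The open-palm vs. pinch mapping can be sketched as a distance check between the thumb-tip and index-tip landmarks in Handpose's 21-keypoint output (each keypoint an [x, y, z] position). The landmark indices follow the standard Handpose layout, but the pixel threshold is an assumed tuning value, not one taken from the project.

```javascript
// Classify a gesture from Handpose's 21 [x, y, z] landmarks:
// a small thumb-tip (index 4) to index-tip (index 8) distance
// reads as a pinch ("close" the claw); otherwise the hand is
// treated as an open palm ("open" the claw).
// The 40-pixel default threshold is an assumption to be tuned.
function classifyGesture(landmarks, pinchThreshold = 40) {
  const [tx, ty] = landmarks[4];
  const [ix, iy] = landmarks[8];
  const dist = Math.hypot(tx - ix, ty - iy);
  return dist < pinchThreshold ? "close" : "open";
}
```

In the p5.js draw loop, the result of this check would decide which serial command gets sent to the Arduino on each detected hand.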

User Testing

Presented at the NYU ITP 2024 Fall/Winter Exhibition.