Projects

A collection of things I've built — mostly at the crossroads of AI, computer vision, and robotics.

Real Time Emotion Recognition

View on GitHub
Python · pandas · scikit-learn · TensorFlow · OpenCV

A live emotion recognition system that reads a webcam feed, detects faces in real time, and overlays the predicted emotional state directly onto the video stream.

This project grew out of my interest in how machines can interpret human behavior — something that bridges my neuroscience background with AI. The system uses OpenCV to capture and process each video frame, then passes detected faces through a convolutional neural network (CNN) trained with Keras to classify expressions into categories like happy, sad, angry, surprised, and neutral. Getting the model to perform accurately on a live feed — rather than just static images — required careful attention to preprocessing, frame rate, and inference speed. It was a rewarding challenge in making a model feel responsive and real-world ready.
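To make the live-feed part concrete, here is a minimal sketch of the per-frame preprocessing step described above. The function names and the 48×48 grayscale input shape are assumptions (typical of FER-style Keras models), not the project's actual code; in the real pipeline the crop would come from OpenCV's face detector, and the resize here uses a simple nearest-neighbor index map so the sketch stays dependency-free.

```python
import numpy as np

def resize_nearest(gray_crop: np.ndarray, size: int = 48) -> np.ndarray:
    """Nearest-neighbor resize of a 2D grayscale image to (size, size)."""
    h, w = gray_crop.shape
    rows = np.arange(size) * h // size  # source row index for each output row
    cols = np.arange(size) * w // size  # source col index for each output col
    return gray_crop[rows[:, None], cols]

def preprocess_face(gray_crop: np.ndarray, size: int = 48) -> np.ndarray:
    """Scale pixels to [0, 1] and shape to (1, size, size, 1) for the CNN."""
    face = resize_nearest(gray_crop, size).astype(np.float32) / 255.0
    return face[np.newaxis, :, :, np.newaxis]  # add batch and channel dims

# Example: a fake 120x90 face crop stands in for a detector output.
crop = np.random.randint(0, 256, (120, 90), dtype=np.uint8)
batch = preprocess_face(crop)
print(batch.shape)  # (1, 48, 48, 1)
```

Keeping this step cheap matters for a live feed: the preprocessing runs once per detected face per frame, so it sits directly on the path between frame rate and inference speed.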

Hand Tracking Mirrored to Virtual Hand

View on GitHub
Python · Mediapipe · OpenCV · MuJoCo

A real-time hand tracking pipeline that captures finger and joint positions via webcam and mirrors the motion onto a Shadow hand model inside a MuJoCo physics simulation.

This project sits right at the intersection of computer vision and robotics — two areas I'm deeply interested in through my Robotics and Human-Centered AI concentration. Mediapipe's hand landmark model tracks 21 key points on the hand from a live camera feed, and those coordinates are then mapped to the corresponding degrees of freedom on a Shadow robotic hand model running in MuJoCo. The most interesting part was translating raw 2D landmark positions into meaningful joint angles that looked natural in the simulation. It gave me a hands-on introduction to how teleoperation and robot control systems work, and sparked a lot of ideas for where I want to take robotics projects next.
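The landmark-to-joint-angle translation mentioned above can be sketched as follows. This is an illustrative version, not the project's actual code: given three Mediapipe landmarks along a finger (e.g. around the PIP joint), the flexion angle is estimated from the two bone vectors meeting at the middle landmark. MuJoCo joints take radians, so a value like this could be written into the control vector for the matching Shadow hand degree of freedom.

```python
import numpy as np

def joint_angle(p_prev, p_joint, p_next):
    """Interior angle (radians) at p_joint between segments to p_prev and p_next."""
    v1 = np.asarray(p_prev, dtype=float) - np.asarray(p_joint, dtype=float)
    v2 = np.asarray(p_next, dtype=float) - np.asarray(p_joint, dtype=float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))  # clip guards rounding

def flexion(p_prev, p_joint, p_next):
    """Flexion command: 0 when the segments are collinear (straight finger),
    growing toward pi/2 and beyond as the joint bends."""
    return np.pi - joint_angle(p_prev, p_joint, p_next)

# A straight finger gives ~0 flexion; a right-angle bend gives ~pi/2.
print(round(flexion((0, 0), (0, 1), (0, 2)), 3))  # 0.0
print(round(flexion((0, 0), (0, 1), (1, 1)), 3))  # 1.571
```

Computed per joint across the 21 tracked landmarks, angles like these give the simulation a pose that bends naturally even though the raw landmarks are only 2D image coordinates.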