Vision-Based Interaction for AMRs
Gesture recognition for enhanced human-robot collaboration in intralogistics applications, with seamless ROS2 integration.
View Source
Features:
- Engineered a custom, production-grade gesture recognition system
- Created a modular ROS2-based system architecture for gesture commands
- Implemented multi-object detection algorithms using the ZED 2i stereo camera
- Built a continuous integration pipeline and software stack in the FAPS GitLab
- Designed and documented the system architecture in Obsidian
- Developed a Behavior Tree for modular, reusable gesture commands
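The Behavior Tree idea above can be sketched in plain Python (rather than a BT library such as py_trees or BehaviorTree.CPP, which the project may actually use); the node names and the gesture-to-command mapping are illustrative assumptions, not the project's real API:

```python
from enum import Enum

class Status(Enum):
    SUCCESS = 1
    FAILURE = 2
    RUNNING = 3

class Node:
    """Base class for behavior tree nodes."""
    def tick(self, blackboard: dict) -> Status:
        raise NotImplementedError

class Sequence(Node):
    """Ticks children in order; returns on the first non-SUCCESS child."""
    def __init__(self, children):
        self.children = children
    def tick(self, blackboard):
        for child in self.children:
            status = child.tick(blackboard)
            if status is not Status.SUCCESS:
                return status
        return Status.SUCCESS

class GestureDetected(Node):
    """Condition leaf: succeeds if the perception stack wrote a gesture."""
    def tick(self, blackboard):
        return Status.SUCCESS if blackboard.get("gesture") else Status.FAILURE

class MapGestureToCommand(Node):
    """Action leaf: translates a recognized gesture into an AMR command.
    The mapping below is purely illustrative."""
    COMMANDS = {"stop": "halt_motors", "follow": "follow_operator"}
    def tick(self, blackboard):
        cmd = self.COMMANDS.get(blackboard["gesture"])
        if cmd is None:
            return Status.FAILURE
        blackboard["command"] = cmd
        return Status.SUCCESS

# Compose reusable leaves into a tree and tick it once.
tree = Sequence([GestureDetected(), MapGestureToCommand()])
bb = {"gesture": "stop"}
print(tree.tick(bb), bb.get("command"))
```

Because each gesture command is an independent leaf, new commands can be added or recombined without touching the rest of the tree, which is what makes the approach modular and reusable.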
Tech Stack:
C++
ROS2
ZED 2i
Python
Behavior Tree
GitLab
Obsidian