Enhancing Humanoid Robot functionality through vision-based navigation with fall recovery and object manipulation
DOI:
https://doi.org/10.3126/jiee.v8i1.82571

Keywords:
Autonomous navigation, Degree of Freedom, Distance estimation, Humanoid Robot

Abstract
The robotics sector struggles to integrate vision-based navigation on a bipedal humanoid robot capable of performing human-like tasks. Although ultrasonic and infrared sensors are the traditional means of object detection, they have significant drawbacks, including limited range, high cost, and sensitivity to the environment. "Enhancing Humanoid Robot Functionality Through Vision-Based Navigation with Fall Recovery and Object Manipulation" proposes to give the robot vision, making it capable of transporting objects from one location to another. Two ESP32-CAMs serve as a stereo camera pair for image capture, with YOLOv11 used for object detection and the stereo-camera principle applied for depth calculation. By employing one of the most robust and accurate object detection algorithms available, the project aims to enhance object transportation within the robot's visual range. The final robot can navigate intelligently and grab objects using image processing. The developed humanoid robot also features automatic fall recovery in simulation and natural human movement patterns derived from kinematic calculations, showcasing potential applications in hazardous environments, industrial automation, and interplanetary exploration.
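The depth calculation mentioned above follows the standard stereo-camera principle: for a rectified pair, distance is proportional to the baseline and focal length and inversely proportional to the pixel disparity between the two views. The sketch below is illustrative only, not the authors' code; the focal length and baseline values are hypothetical placeholders for the ESP32-CAM rig.

```python
# Illustrative stereo depth estimation (not from the paper).
# Assumes rectified pinhole cameras with known focal length f (pixels)
# and baseline B (metres) between the two ESP32-CAM modules.

def stereo_depth(x_left: float, x_right: float,
                 focal_px: float, baseline_m: float) -> float:
    """Return distance Z = f * B / d, where d is the horizontal
    disparity (pixels) of the same point in the left and right images."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return focal_px * baseline_m / disparity

# Example with assumed values: f = 600 px, B = 0.06 m, disparity = 12 px
# gives Z = 600 * 0.06 / 12 = 3.0 m.
print(stereo_depth(412.0, 400.0, focal_px=600.0, baseline_m=0.06))
```

In practice, the disparity would come from matching a detected object (e.g. the centre of a YOLOv11 bounding box) across the two camera images, and the accuracy degrades as disparity shrinks at longer ranges.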
Downloads
License
Copyright (c) 2026 JIEE and the authors

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Upon acceptance of an article, the copyright for the published work remains with JIEE, Thapathali Campus and the authors.