Date of Award

Fall 2024

Document Type

Open Access Thesis

Department

Mechanical Engineering

First Advisor

Ramy Harik

Abstract

In the era of Industry 4.0, robots have become integral to automated production lines, performing complex tasks with high precision and adapting to changing production needs in real time. Ensuring their precise and reliable operation is therefore paramount for maintaining high quality standards and operational efficiency. This has driven the need for an automated motion validation tool that guarantees accurate execution of programmed tasks by detecting deviations or anomalies that may indicate mechanical faults or software errors. Given rapid advances in AI, particularly in computer vision, a vision-based solution is well suited to verifying a robot's positioning in real time. Because it operates independently of internal sensors and mechanical equipment, this approach is robust to the errors that can affect those traditional systems. To address this need, our project develops a framework that uses a 2D pose estimation algorithm (YOLOv8) to detect the robot's joints in real time from two Intel RGB cameras and validates them against the corresponding sensor joint coordinates. The model was trained on a custom dataset of over 24,000 labeled images. To avoid the labor-intensive process of manual labeling, we implemented an automatic labeling scheme that combines forward kinematics, camera calibration, and a custom joint-visibility check that determines whether a joint is occluded in a given image. Additionally, we developed a custom stereo triangulation algorithm that fuses the 2D predictions from both camera views to recover the 3D coordinates of each joint. We also designed a Graphical User Interface (GUI) that lets operators monitor and analyze the error between predictions and sensor values through time-series plots based on user-selected metrics and thresholds.
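The stereo triangulation step described above can be sketched as a standard linear direct linear transform (DLT) solve. The thesis implements its own custom algorithm, so the function name, projection-matrix setup, and solver choice below are illustrative assumptions, not the author's implementation:

```python
import numpy as np

def triangulate_joint(P1, P2, pt1, pt2):
    """Recover a 3D joint position from its 2D detections in two camera views.

    P1, P2 : 3x4 camera projection matrices (intrinsics @ extrinsics,
             obtained from camera calibration).
    pt1, pt2 : (u, v) pixel coordinates of the same joint in each view.
    Returns the 3D point in the world frame.
    """
    # Each view contributes two linear constraints on the homogeneous 3D point.
    A = np.vstack([
        pt1[0] * P1[2] - P1[0],
        pt1[1] * P1[2] - P1[1],
        pt2[0] * P2[2] - P2[0],
        pt2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize
```

With calibrated projection matrices for both Intel cameras, this solve is run once per detected joint to produce the triangulated pose that is compared against the sensor-reported pose.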
The GUI also provides features to save error logs, view camera specifications and current frames, and display 3D and 2D plots of the triangulated and reference robot poses in real time. It visualizes the predicted joints alongside the sensor coordinate values, offering a comprehensive tool for real-time monitoring and validation of robot positioning. This development enables validation of robotic entities in industrial settings and facilitates adoption in critical manufacturing industries.
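The per-joint comparison the GUI plots can be sketched as a Euclidean error check against an operator-chosen threshold. The function names and the default 5 cm threshold are illustrative assumptions, not values from the thesis:

```python
import numpy as np

def joint_errors(predicted, reference):
    """Per-joint Euclidean distance between triangulated and sensor 3D joints.

    predicted, reference : (N, 3) arrays of joint coordinates in meters.
    """
    return np.linalg.norm(np.asarray(predicted) - np.asarray(reference), axis=1)

def validate_pose(predicted, reference, threshold_m=0.05):
    """Return per-joint errors and flags for joints exceeding the threshold."""
    errors = joint_errors(predicted, reference)
    return errors, errors > threshold_m
```

Errors computed this way at each frame form the time series shown in the GUI, and flagged joints indicate potential mechanical faults or software errors in program execution.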

Rights

© 2025, Jad Samaha

Included in

Robotics Commons
