Preliminary Results on Screw Detection and Localization in 3D Space
The purpose of this task was to:
- Check the depth accuracy of the 3D sensors (RealSense modules)
- Estimate the error introduced by the transformations applied to convert the object detections from 2D to 3D coordinates
- Evaluate the impact of the perspective transformations (camera to robot base and its inverse).
The following setup has been created in our lab (Figure 1).
- An ArUco marker that represents the robot base has been placed on the lab’s table. “ArUco point” refers to its center point.
- For precision purposes, screws of 7 mm diameter with a 4 mm cross in the center have been printed (Points 1-6). This matches the real size of the screws on the microwave oven.
Real World Distances
- Grid paper has been used as a guide for measuring the distances between every pair of points. Moreover, a tape measure and a ruler were used for confirmation purposes. At this stage, all the measurements have been taken with respect to the camera location.
ROS Calculated Distances
- Using the mouse pointer in the RGB frame of the official RealSense Viewer, the center (2D pixel coordinates) of each screw (point) has been located. These pixels have been converted to 3D coordinates using the RealSense deproject function, taking into account the intrinsic parameters of the camera. Based on the 3D coordinates, the distance between each pair of points has been calculated using the Euclidean distance formula:

d = sqrt((x2 - x1)^2 + (y2 - y1)^2 + (z2 - z1)^2)
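The deprojection and distance steps above can be sketched as follows. This is a minimal illustration of the pinhole model that the RealSense deproject function applies for undistorted streams; the intrinsic values and pixel/depth inputs below are placeholders, not the actual camera parameters or lab measurements.

```python
import math

def deproject(pixel, depth, intrin):
    """Back-project a pixel with known depth to a 3D camera-frame point
    (pinhole model, as applied for undistorted streams)."""
    u, v = pixel
    x = (u - intrin["ppx"]) / intrin["fx"] * depth
    y = (v - intrin["ppy"]) / intrin["fy"] * depth
    return (x, y, depth)

# Placeholder intrinsics; use the values reported by the camera.
intrin = {"fx": 615.0, "fy": 615.0, "ppx": 320.0, "ppy": 240.0}

# Two hand-picked screw-center pixels at 0.50 m depth (illustrative).
p1 = deproject((410, 250), 0.50, intrin)
p2 = deproject((455, 250), 0.50, intrin)

# Euclidean distance between the two deprojected screw centers.
dist = math.dist(p1, p2)
```

In practice the depth value comes from the aligned depth frame at the picked pixel, which is why hand-picked pixels introduce small errors.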
A sample chart of the differences between the Real World and ROS Calculated distances for Point 2 is shown above.
- For visualization purposes, PointStamped ROS messages have been created for each point using the aforementioned 3D coordinates. Figure 2 illustrates the PointStamped messages derived from the above procedure.
Robot Base perspective transformation
- The same procedure and lab setup have been followed for this approach as well. The only difference is that every coordinate is transformed with respect to the ArUco marker (robot base). This procedure has been used to estimate the perspective transformation error. Figure 3 shows the transformed PointStamped messages in RViz.
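The frame change above can be sketched with a homogeneous transform. This assumes T holds the marker pose in the camera frame, as ArUco pose estimation returns it; the rotation and translation values below are placeholders, not the calibrated lab values.

```python
import numpy as np

# Placeholder marker pose in the camera frame: rotation R, translation t.
R = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0, 0.0]])  # 90 degrees about x, illustrative only
t = np.array([0.10, -0.05, 0.60])

T = np.eye(4)          # 4x4 homogeneous transform: marker -> camera
T[:3, :3] = R
T[:3, 3] = t

def to_marker_frame(p_cam):
    """Express a camera-frame point in the marker (robot base) frame."""
    return (np.linalg.inv(T) @ np.append(p_cam, 1.0))[:3]

def to_camera_frame(p_marker):
    """Inverse mapping: marker frame back to camera frame."""
    return (T @ np.append(p_marker, 1.0))[:3]
```

Because the transform is rigid, inter-point distances are preserved, so any distance error observed after the transform comes from the estimated marker pose rather than the mapping itself.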
Additional Notes and Conclusion
- Supplementary charts that demonstrate important relationships or patterns between the data points can be provided upon request. These charts mainly highlight the offset between each pair of points.
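The offsets those charts highlight reduce to simple absolute differences between the two distance sets. The values below are placeholders for illustration, not the actual lab measurements.

```python
# Placeholder distance values in metres, NOT the actual measurements.
real_world = {"P1-P2": 0.040, "P1-P3": 0.080, "P2-P3": 0.060}
ros_calc   = {"P1-P2": 0.041, "P1-P3": 0.078, "P2-P3": 0.061}

# Per-pair offset and its mean: the quantities the charts visualize.
offsets = {pair: abs(real_world[pair] - ros_calc[pair]) for pair in real_world}
mean_offset = sum(offsets.values()) / len(offsets)
```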
- In general, the procedure that has been followed is prone to millimeter-level errors due to the hand-measured distances and the hand-picked pixel values. Nevertheless, the results appear to be quite accurate.
- A real scenario with the microwave oven that will be used in ECORESET during the 1st pilot of the project has been examined as well. In this approach, screws have been detected in 2D using the enhanced screw detector module developed by CERTH. The middle (cross-center) pixel of each screw has been deprojected to 3D space. A PointStamped message has been created and visualized in RViz alongside the scene’s point cloud. Using the attached .rviz configuration again, the precision of the approach followed in our lab can easily be observed. Figure 5 offers a screenshot of this visualization.