
# November 2021

Preliminary Results on Screw Detection and Localization in 3D Space

The purpose of this task was to:

• Check the depth accuracy of the 3D sensors (RealSense modules).
• Estimate the error introduced by the transformations applied to convert the 2D object detections into 3D coordinates.
• Evaluate the impact of the perspective transformations (camera to robot base and its inverse).

The following setup has been created in our lab (Figure 1).

• An ArUco marker representing the robot base has been placed on the lab’s table. “ArUco point” refers to its center point.
• For precision purposes, screws of 7 mm diameter with a 4 mm cross in the center have been printed (Points 1–6). This is the real size of the screws on the microwave oven.

## Real-world distances

• Grid paper has been used as a guide for measuring the distances between every pair of points; a hand meter and a ruler were used for confirmation. At this stage, all measurements have been taken with respect to the camera location.

## ROS-calculated distances

• Using the mouse pointer in the RGB frame of the official RealSense viewer, the center (2D pixel coordinates) of each screw (point) has been found. These points have been converted to 3D coordinates using the RealSense deproject function, taking into account the intrinsic parameters of the camera. From the 3D coordinates, the distance between each pair of points has been calculated with the Euclidean formula:

  d = √((x₂ − x₁)² + (y₂ − y₁)² + (z₂ − z₁)²)

  A sample chart of the differences between the real-world and ROS-calculated distances for Point 2 is shown above.
• For visualization purposes, PointStamped ROS messages have been created for each point using the aforementioned 3D coordinates. Figure 2 illustrates the PointStamped messages derived from the above procedure.

## Robot-base perspective transformation

• The same procedure and lab setup have been followed for this approach as well. The only difference is that every coordinate is now expressed with respect to the ArUco marker (robot base). This procedure has been used to estimate the perspective-transformation error. Figure 3 shows the transformed PointStamped messages in RViz.
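The pipeline described above (deprojecting a pixel to a 3D camera-frame point, measuring point-to-point distances, and re-expressing points relative to the ArUco marker) can be sketched in plain Python. This is only an illustration: the intrinsics and the marker pose below are made-up placeholder values, the deprojection mirrors what librealsense's `rs2_deproject_pixel_to_point` computes for a distortion-free stream, and in the real pipeline the frame change would be handled by ROS tf2 rather than by hand.

```python
import math

# Illustrative pinhole intrinsics (placeholders; the real values come from
# the RealSense factory calibration, e.g. the camera_info topic).
FX, FY = 615.0, 615.0      # focal lengths in pixels (assumed)
PPX, PPY = 320.0, 240.0    # principal point (assumed)

def deproject(u, v, depth_m):
    """Pixel (u, v) plus measured depth -> 3D point in the camera frame.
    Same math as rs2_deproject_pixel_to_point for an undistorted model."""
    return ((u - PPX) / FX * depth_m,
            (v - PPY) / FY * depth_m,
            depth_m)

def distance(p, q):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def make_transform(yaw_deg, t):
    """4x4 rigid transform (rotation about Z, then translation) standing in
    for the camera->marker pose that ArUco detection would estimate."""
    c, s = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    return [[c,  -s,  0.0, t[0]],
            [s,   c,  0.0, t[1]],
            [0.0, 0.0, 1.0, t[2]],
            [0.0, 0.0, 0.0, 1.0]]

def invert(T):
    """Inverse of a rigid transform: [R | t]^-1 = [R^T | -R^T t]."""
    R_T = [[T[j][i] for j in range(3)] for i in range(3)]
    t = [T[i][3] for i in range(3)]
    ti = [-sum(R_T[i][k] * t[k] for k in range(3)) for i in range(3)]
    return [R_T[0] + [ti[0]], R_T[1] + [ti[1]], R_T[2] + [ti[2]],
            [0.0, 0.0, 0.0, 1.0]]

def apply(T, p):
    """Apply a 4x4 homogeneous transform to a 3D point."""
    return tuple(sum(T[i][k] * p[k] for k in range(3)) + T[i][3]
                 for i in range(3))

# Example: two screw centers picked in the image, deprojected, measured,
# and re-expressed relative to the ArUco marker (robot base).
p1 = deproject(320.0, 240.0, 0.50)           # image center, 50 cm away
p2 = deproject(420.0, 240.0, 0.50)
d = distance(p1, p2)                          # compare against grid paper

cam_T_marker = make_transform(90.0, (0.1, 0.2, 0.5))   # assumed marker pose
p1_in_marker = apply(invert(cam_T_marker), p1)         # robot-base coords
```

The transformed coordinates (`p1_in_marker`) are what would be packed into the `PointStamped` messages shown in Figure 3, with the frame id set to the marker/robot-base frame.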