
I'm working on SLAM using Nav2 and slam_toolbox, and I'm attempting to create a 2D occupancy map.

My setup is as follows:

  • Jetson Orin AGX
  • ROS2 Humble
  • Ouster OS1-32 Lidar
  • pointcloud_to_laserscan to convert my Lidar's 3D point cloud into a 2D laser scan
  • I have an odometry node that's publishing successfully

The Problem

I'm attempting to create a map of my apartment using slam_toolbox. However, when I run pointcloud_to_laserscan and view the /scan topic as a LaserScan in RViz, the left/right sides of the map are flipped and the robot is facing backwards.

Here's what the correct map looks like in RViz using my Ouster lidar driver:

[Screenshot: map in RViz from the Ouster lidar driver]

And here's what my laser scan looks like in RViz (if you compare the following laser scan screenshots to the Ouster driver screenshot above, you'll see that the left/right sides of the laser scan map are flipped and the robot is facing backwards):

[Screenshot: laser scan map in RViz]

[Screenshot: laser scan map in RViz, second view]

TF Tree

Below is a PDF of my TF tree, generated by running: ros2 run tf2_tools view_frames

[PDF: TF tree from tf2_tools view_frames]

Here's an image of my actual robot:

[Photo: the robot]

My Code

  • Here's my static transforms file:
from launch import LaunchDescription
from launch_ros.actions import Node


def static_tf(x, y, z, roll, pitch, yaw, frame_id, child_frame_id):
    # NOTE: static_transform_publisher's (deprecated) positional interface
    # expects "x y z yaw pitch roll frame_id child_frame_id", so the angles
    # are passed yaw-first here.
    return Node(
        package="tf2_ros",
        executable="static_transform_publisher",
        name=f"static_tf_{child_frame_id.replace('/', '_')}",
        arguments=[str(x), str(y), str(z),
                   str(yaw), str(pitch), str(roll),
                   frame_id, child_frame_id],
        output="screen",
    )


def generate_launch_description():
    return LaunchDescription([
        static_tf(0, 0.287655, 0, 0, 0, 0, "base_footprint", "left_wheel_link"),
        static_tf(0, -0.287655, 0, 0, 0, 0, "base_footprint", "right_wheel_link"),
        static_tf(0.41275, 0, 0, 0, 0, 0, "base_footprint", "caster_link"),
        static_tf(0.47625, 0, 0.24655, 0, 0, 0, "base_footprint", "oak_d_base_frame"),
        static_tf(0.02, 0, 0, 0, 0, 0, "oak_d_base_frame", "oak_d_frame"),
        static_tf(0, 0, 0, -1.5708, 0, -1.5708, "oak_d_frame", "oak_rgb_camera_optical_frame"),
        static_tf(0, 0, 0.3912, 0, 0, 0, "base_footprint", "os_sensor"),
        static_tf(0, 0, 0.025, 0, 0, 3.1416, "os_sensor", "os_lidar"),
    ])
  • Here's my pointcloud_to_laserscan command:
ros2 run pointcloud_to_laserscan pointcloud_to_laserscan_node --ros-args \
  -r cloud_in:=/ouster/points -r scan:=/scan \
  -p target_frame:=base_link \
  -p transform_tolerance:=0.01 \
  -p min_height:=-1.0 -p max_height:=1.0 \
  -p angle_min:=-3.14 -p angle_max:=3.14 -p angle_increment:=0.003 \
  -p range_min:=0.3 -p range_max:=10.0 \
  -p scan_time:=0.1 \
  -p use_header_stamp:=true
  • Here's my slam_toolbox launch command (I haven't changed any of the default params here):
ros2 launch slam_toolbox online_async_launch.py 
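I haven't needed a custom config yet, but if I do, online_async_launch.py accepts a slam_params_file argument; this is just a sketch, and the YAML path is a placeholder:

ros2 launch slam_toolbox online_async_launch.py slam_params_file:=/path/to/my_mapper_params.yaml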

What I've Done to Debug Thus Far

A very helpful Stack Exchange member looked through the Ouster lidar coordinate-frames documentation and suggested that I modify my pointcloud_to_laserscan command to this:

ros2 run pointcloud_to_laserscan pointcloud_to_laserscan_node --ros-args \
  -r cloud_in:=/ouster/points -r scan:=/scan \
  -p target_frame:=base_footprint \
  -p transform_tolerance:=0.01 \
  -p min_height:=-1.0 -p max_height:=1.0 \
  -p angle_min:=0.0 -p angle_max:=6.2832 -p angle_increment:=0.003 \
  -p range_min:=0.3 -p range_max:=10.0 \
  -p scan_time:=0.1 \
  -p use_header_stamp:=true

So I ran that command, and now the robot seems to be facing the correct way and the orientation of the map looks correct. However, half the map is being cut off, as you can see from the images below:

[Screenshot: map in RViz with half cut off]

[Screenshot: laser scan in RViz showing only ~180 degrees of points]
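
To sanity-check the conversion itself rather than eyeballing RViz, here's a minimal sketch of a node that reports the angular span and the fraction of valid returns on /scan. It's illustrative only and assumes the topic remapping from the commands above:

import math

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan


class ScanCoverage(Node):
    """Log the angular span and the fraction of finite, in-range returns."""

    def __init__(self):
        super().__init__("scan_coverage")
        self.create_subscription(LaserScan, "/scan", self.on_scan, 10)

    def on_scan(self, msg: LaserScan):
        valid = sum(1 for r in msg.ranges
                    if math.isfinite(r) and msg.range_min <= r <= msg.range_max)
        span_deg = math.degrees(msg.angle_max - msg.angle_min)
        self.get_logger().info(
            f"span: {span_deg:.1f} deg, valid returns: {valid}/{len(msg.ranges)}")


def main():
    rclpy.init()
    try:
        rclpy.spin(ScanCoverage())
    except KeyboardInterrupt:
        pass


if __name__ == "__main__":
    main()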

If anyone has any suggestions, it'd be much appreciated!

  • Hey, glad things are getting better! It's getting late for me, so I won't be able to look at things in more detail until tomorrow, but for the moment I would urge you to visualize the map, the laser scan, and the point cloud all together. The fact that it looks like half your laser scan output is missing is highly suspect, and I would investigate this as some issue with the point cloud to laser scan conversion before considering it a failure in the mapping node. Looking at it again, the top image shows laser scan points 360 degrees around, but the bottom image doesn't. Look into that first. Commented Jun 30 at 1:43
  • One thing I'm wondering is if your scan time is appropriate. Looking at things with fresh eyes (and on a bigger screen) I see that your point cloud and laser_scan are not aligned. I'm assuming you did not move the robot between images, but as I mentioned above, the best thing to do is to visualize all the topics at the same time, which will give you immediate feedback about the correctness of the conversion. Commented Jun 30 at 13:12
  • A timing issue would not explain the difference between your earlier laser_scan, which shows 360 degrees of points, and the later image, which shows 180 degrees of points. The timing issue I'm thinking of is that I thought the Ouster provided data at 320 Hz, with "wedges" of data coming at something like 32 wedges per revolution. The point cloud header you gave earlier shows 32x1024 points, which is the correct size for a wedge and would not be a full revolution. Commented Jun 30 at 13:26
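
To check the cloud dimensions that last comment refers to, here's a minimal subscriber sketch along the same lines, assuming the /ouster/points topic from the question:

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import PointCloud2


class CloudShape(Node):
    """Log the height x width of each incoming PointCloud2 message, so it
    can be compared against the sensor's rings x horizontal resolution."""

    def __init__(self):
        super().__init__("cloud_shape")
        self.create_subscription(PointCloud2, "/ouster/points", self.on_cloud, 10)

    def on_cloud(self, msg: PointCloud2):
        self.get_logger().info(f"cloud: {msg.height} x {msg.width} points")


def main():
    rclpy.init()
    try:
        rclpy.spin(CloudShape())
    except KeyboardInterrupt:
        pass


if __name__ == "__main__":
    main()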

1 Answer


After a bit more digging, I found this question where the user is only getting 180 degrees out of pointcloud_to_laserscan with the angular limits set to 0 to 2 pi. The accepted answer is to use -pi to pi, which did get the full scan that you had earlier, but that also seemed to be causing a 180 degree rotation in your output data.

The cropping-to-180-degrees issue seems like a bug to me, but I think it's possible to work around this by putting the laser_scan data in its own frame, and then you can use static transforms to adjust that frame.

Consider the following static transform:

ros2 run tf2_ros static_transform_publisher 0 0 0 3.14159 0 0 os_lidar laser_scan_output 

This gets you a frame, laser_scan_output, that is coincident with os_lidar but rotated 180 degrees (pi radians) about the z-axis. Then you can use the target_frame parameter to transform the point cloud into that frame before converting to the laser scan. I believe this should wind up rotating the point cloud 180 degrees (the mismatch between the Ouster's 0-to-360 scan and the laser scan's -180-to-180 scan), and then rotating it 180 degrees again (the difference between the os_lidar and laser_scan_output frames), netting you the same orientation.

The parameter specifically:

-p target_frame:=laser_scan_output 

The full command you're using, reverting the min/max angles and changing the target frame:

ros2 run pointcloud_to_laserscan pointcloud_to_laserscan_node --ros-args \
  -r cloud_in:=/ouster/points -r scan:=/scan \
  -p target_frame:=laser_scan_output \
  -p transform_tolerance:=0.01 \
  -p min_height:=-1.0 -p max_height:=1.0 \
  -p angle_min:=-3.14 -p angle_max:=3.14 -p angle_increment:=0.003 \
  -p range_min:=0.3 -p range_max:=10.0 \
  -p scan_time:=0.1 \
  -p use_header_stamp:=true
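
If you'd rather not retype that, here's a sketch of a launch file that brings up the extra static transform and the conversion node together. Everything in it comes from the commands above except the node names, which are my own choice:

from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        # laser_scan_output: coincident with os_lidar, yawed 180 degrees.
        # (The positional arguments are x y z yaw pitch roll frame child.)
        Node(
            package="tf2_ros",
            executable="static_transform_publisher",
            name="static_tf_laser_scan_output",
            arguments=["0", "0", "0", "3.14159", "0", "0",
                       "os_lidar", "laser_scan_output"],
            output="screen",
        ),
        # Same parameters as the command line above, with the new target frame.
        Node(
            package="pointcloud_to_laserscan",
            executable="pointcloud_to_laserscan_node",
            name="pointcloud_to_laserscan",
            remappings=[("cloud_in", "/ouster/points"), ("scan", "/scan")],
            parameters=[{
                "target_frame": "laser_scan_output",
                "transform_tolerance": 0.01,
                "min_height": -1.0,
                "max_height": 1.0,
                "angle_min": -3.14,
                "angle_max": 3.14,
                "angle_increment": 0.003,
                "range_min": 0.3,
                "range_max": 10.0,
                "scan_time": 0.1,
                "use_header_stamp": True,
            }],
            output="screen",
        ),
    ])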
  • That worked! Thank you so much @Chuck - you're a legend! Really appreciate all your help. Now that the orientation is fixed and I'm getting a full map, when I drive my robot around my apartment the quality of the map is very poor. The map seems to jump out of place and starts mapping over the current map. I'm not sure if this is loop closure trying to sort itself out. I made a new post here: robotics.stackexchange.com/questions/116947/… If you have any suggestions, it'd be much appreciated! Commented Jul 1 at 5:02
