
Picking up where I left off in ekf_localization-node-not-responding, I am trying to fuse x and y position from 2 sonars (Tritech Micron MK III) with (double-integrated accelerometer) position from an IMU (xsens MTi 30).
This happens inside a rectangular pool, so each sonar
- will track a side wall and measure the absolute distance to it (one sonar looking at the short wall, the other at the long wall),
- gives us a distance with a heading attached to it (a rough sketch of this projection follows below),
- and reports an x and/or y position, either to use as
  - absolute position, with the x axis parallel to the long side of the pool and y parallel to the short side,
  - or as position relative to the starting point (differential_integration == true).

(The point being to get a quicker update rate compared to doing machine vision on a full scan.)
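Roughly, the projection I have in mind looks like this (just a sketch; the angle convention, names and numbers are illustrative, not the actual code of my processing node):

```python
import math

# Sketch only: project a slant range to the tracked wall onto the pool axis
# that is normal to that wall. Assumes the beam angle is already expressed in
# pool-aligned axes (vehicle yaw + sonar head angle).
def wall_range_to_coordinate(range_m, beam_angle_rad):
    return range_m * math.cos(beam_angle_rad)

# e.g. a 5.0 m return with the beam 20 deg off the wall normal -> ~4.7 m
print(wall_range_to_coordinate(5.0, math.radians(20.0)))
```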
The problem now is: How can I account for the sonar's offset from base_link since the measurements aren't really in the body frame?
I could either
- report the data with frame_id `odom` or `map`, but how could I then account for the mounting offset, especially when the sub is not aligned with the side walls?
  - Would that be possible with a single instance of robot_localization, or should I then treat them as 'GPS'?
- or report the data in a `Sonar1` and `Sonar2` frame and use the current heading of the sonar head (plus the fixed translation) to publish a transform from `Sonar1` to `base_link`.
  - This means that I will have to use the attitude solution to track the wall, but that's outside the scope of robot_localization.
I believe the latter is the better approach. Does this make sense or how could it be improved?
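As a rough sketch of that second option (topic name, message type and the mounting offset below are placeholders, not my actual node):

```python
#!/usr/bin/env python
# Sketch of option 2: broadcast a base_link -> Sonar1 transform from the
# current sonar head angle plus the fixed mounting translation.
import rospy
import tf
from std_msgs.msg import Float32

SONAR1_MOUNT = (0.3, 0.1, 0.0)  # assumed mounting position of Sonar1 in base_link [m]

def head_angle_cb(msg):
    # Rotation about z by the sonar head angle, fixed translation to the mount.
    q = tf.transformations.quaternion_from_euler(0.0, 0.0, msg.data)
    br.sendTransform(SONAR1_MOUNT, q, rospy.Time.now(), "Sonar1", "base_link")

if __name__ == "__main__":
    rospy.init_node("sonar1_tf_broadcaster")
    br = tf.TransformBroadcaster()
    rospy.Subscriber("sonar1/head_angle", Float32, head_angle_cb)
    rospy.spin()
```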
Thanks,
Rapha
Wrap up on how to fuse the sonars
I am still working through implementing some of these steps, but I will document the process here.
What are the frames and transforms needed for fusing the sonars:
- Frames:
  - `odom`
  - `pool`
  - `sonar1` (will be fused for x only)
  - `sonar2` (will be fused for y only)
  - `base_link`
  - `imu`
- Transforms:
  - `odom` -> `base_link` (generated by ekf_localization)
  - `odom` -> `pool` (static transform, accounting for the rotational offset of the pool walls/axes with respect to ENU)
  - `pool` -> `sonar1` (broadcasts a translation of the sonar mounting position relative to `base_link`, rotated/trigonometried by the yaw angle of the `pool` -> `base_link` transform; see the sketch after this list)
  - `pool` -> `sonar2`
  - `base_link` -> `imu`
- Data:
  - /sonar/position/x in frame `sonar1`
  - /sonar/position/y in frame `sonar2`
  - /imu/data in frame `imu`
  - /svs/depth in frame `svs`
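For reference, a minimal sketch of how I picture the `pool` -> `sonar1` broadcast (frame names, the mounting offset and the update rate are placeholders):

```python
#!/usr/bin/env python
# Sketch: take the yaw of pool -> base_link, rotate the fixed mounting offset
# of the sonar by it, and publish the result as a translation-only transform.
import math
import rospy
import tf

SONAR1_MOUNT = (0.3, 0.1, 0.0)  # assumed sonar1 mounting position in base_link [m]

def broadcast(event):
    try:
        _, rot = listener.lookupTransform("pool", "base_link", rospy.Time(0))
    except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException):
        return
    _, _, yaw = tf.transformations.euler_from_quaternion(rot)
    # Rotate the mounting offset into the pool frame.
    x = SONAR1_MOUNT[0] * math.cos(yaw) - SONAR1_MOUNT[1] * math.sin(yaw)
    y = SONAR1_MOUNT[0] * math.sin(yaw) + SONAR1_MOUNT[1] * math.cos(yaw)
    # Translation only; the rotation stays identity so the sonar1 axes stay
    # aligned with the pool axes.
    br.sendTransform((x, y, SONAR1_MOUNT[2]), (0.0, 0.0, 0.0, 1.0),
                     rospy.Time.now(), "sonar1", "pool")

if __name__ == "__main__":
    rospy.init_node("pool_to_sonar1_broadcaster")
    listener = tf.TransformListener()
    br = tf.TransformBroadcaster()
    rospy.Timer(rospy.Duration(0.02), broadcast)  # ~50 Hz, faster than the sonar updates
    rospy.spin()
```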
WORK IN PROGRESS
EDIT (regarding 1st edit in Answer):
I have set it up the way you recommended and I publish a transform from odom-->sonar every time I publish a new sonar reading (just before). The rotational part of that transform uses the current attitude as published by the robot_localization node.
Robot_localization sees the following input at the moment:
- attitude from the IMU (imu/data)
- position from the sonars (sonar/position0 and sonar/position1)
However, after a few seconds the robot_localization node slows down to about 1 update every 5 seconds... EDIT: By 'slows down' I mean the message broadcast rate slows down (rostopic hz /odometry/filtered)
I wasn't sure if using the odom attitude was the right thing to use, so I tried it with the direct attitude solution from the imu.
The robot_localization node still slows down.
My launch file is below and I've uploaded a rosbag here. I tried to create a debug file, but no file was created anywhere. Any ideas? Thank you so much,
Rapha
EDIT 2: I am running branch master 3c3da5a537
EDIT 3: I have investigated a little further. You said it might be to do with missing/broken transforms.
When initially posting this question I was publishing the transforms (odom -> sonar) upon every execution of the sonar processing node (whenever a raw message from the sonar arrived), i.e. immediately before publishing the sonar position (in frame sonar) itself. I wasn't sure how ROS handles timestamps and whether robot_localization would receive and process the transform before the position update. So I decided to publish the transform whenever the IMU attitude gets published. This happens at 400 Hz and is thus much more frequent than the sonar position (25-30 Hz).
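Roughly what that IMU-triggered broadcast looks like (sketch only; frames, topic and the mounting offset are placeholders, and I show the raw IMU orientation as the rotational part here):

```python
#!/usr/bin/env python
# Sketch: publish the odom -> sonar transform from the IMU callback (400 Hz),
# stamped with the IMU message time, so a transform is always available by the
# time a sonar position (25-30 Hz) arrives.
import rospy
import tf
from sensor_msgs.msg import Imu

SONAR_MOUNT = (0.3, 0.1, 0.0)  # assumed sonar mounting position in base_link [m]

def imu_cb(msg):
    q = (msg.orientation.x, msg.orientation.y, msg.orientation.z, msg.orientation.w)
    # Stamp with the IMU time so later lookups around the sonar stamp can interpolate.
    br.sendTransform(SONAR_MOUNT, q, msg.header.stamp, "sonar", "odom")

if __name__ == "__main__":
    rospy.init_node("odom_to_sonar_broadcaster")
    br = tf.TransformBroadcaster()
    rospy.Subscriber("imu/data", Imu, imu_cb)
    rospy.spin()
```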
This improved the problem: instead of slowing down virtually instantly, it now runs for a while before slowing down and eventually stopping altogether. Also, if I now stop the ekf_localization node and restart it, it will run again for about 45 s. Stopping the process with ctrl+c takes about 10 s; when the node is running normally, this is virtually instant.
Sometimes (1/5 tries perhaps) the ekf_localization complains:
`at line 240 in /tmp/buildd/ros-indigo-tf2-0.5.6-0trusty-20141014-0329/src/buffer_core.cpp Error: TF_NAN_INPUT: Ignoring transform for child_frame_id "base_link" from authority "unknown_publisher" because of a nan value in the transform (-nan -nan -nan) (-nan -nan -nan -nan)`

I believe there might be a connection with the sometimes erratic data from the sonars, but I can't quite pinpoint it. I'll try improving their accuracy in the meantime.
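One thing I will add to the sonar processing node is a sanity check on the ranges before anything gets published, along these lines (sketch only, names are placeholders):

```python
import math
import rospy

# Sketch: drop non-finite sonar ranges inside the processing node before they
# can end up in a pose or a transform and trigger TF_NAN_INPUT downstream.
def is_valid_range(range_m):
    return not (math.isnan(range_m) or math.isinf(range_m))

def handle_raw_range(range_m):
    if not is_valid_range(range_m):
        rospy.logwarn("Dropping invalid sonar range: %r", range_m)
        return
    # ... continue with the normal wall tracking / position publishing ...
```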
EDIT 4: After changing the covariance matrix robot_localization does not crash any more. But...
I have looked at more data sources working together and I came across another problem. This very much relates to the initial question: ekf_localization-node-not-responding.
I am fusing the absolute orientation from the IMU differentially. However, when I look at the attitude solution generated (and I've now looked at it in degrees for the first time), the value does not track the change in attitude.
I.e. a 90° change in yaw gets registered as only ~3 to 9°, and it's quite sluggish.
I've tried this with only the IMU publishing data and robot_localization running and without robot_localization.
Would you mind looking at the attached bag files? The files show a -90° rotation about yaw followed by +90° about yaw (ending up in the same orientation).
- The launch file below is still up to date.
- The IMU publishes in the `xsens` frame and (as discussed in the other thread) there is a static transform `base_link` -> `xsens` to account for the mounting.
Thank you very much for your help, Rapha
EDIT 5: Since fixing the covariance matrix robot_localization hasn't crashed anymore.
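For reference, the kind of fix I mean is giving the sonar pose messages a proper non-zero diagonal covariance instead of all zeros, roughly like this (the variance values are placeholders, not tuned numbers):

```python
from geometry_msgs.msg import PoseWithCovarianceStamped

# Sketch: fill a sane diagonal covariance instead of leaving it all zeros.
def make_sonar_pose(x, frame_id, stamp):
    msg = PoseWithCovarianceStamped()
    msg.header.stamp = stamp
    msg.header.frame_id = frame_id
    msg.pose.pose.position.x = x
    msg.pose.pose.orientation.w = 1.0
    cov = [0.0] * 36
    cov[0] = 0.05 ** 2                   # var(x): assumed sonar accuracy of ~5 cm
    cov[7] = cov[14] = 1e6               # y and z are not measured by this sensor
    cov[21] = cov[28] = cov[35] = 1e6    # orientation is not measured either
    msg.pose.covariance = cov
    return msg
```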
I am a little bit confused as to how the depth sensor (aka absolute z in odom) and the sonars (absolute x, y in odom) get fused. As per your suggestion I am publishing a transform with the offset of the sensor mounts to base_link and the current attitude solution from robot_localization as the rotation (I hope I understood you correctly here). I have set differential integration = true for the sonars (x, y in odom), but I don't want to do this for the depth sensor (z in odom).
When only fusing the depth sensor (and only using z in the ekf) I also get small values in the fused x, y. To me this is odd, as I thought only the z measurement gets fused, but I can kind of make sense of it: the ekf fuses the untransformed z, then transforms it to base_link, which produces x, y distances.
My question is: If I'm trying to fuse the x, y data ''differentially'' integrated, won't the fused x, y position be locked to the value generated from fusing the z measurements?
I could differentially fuse it, which should solve this, but the depth sensor does make for a very nice absolute measurement.
EDIT 5b: 2015-02-11-01-37-30.bag. I updated the ekf launch file below, but the rosbag was recorded without the ekf.
I can currently only move along heave (z) (and due to issues with the buoyancy not that much):
- We started off slightly submerged,
- went down along negative heave / z (no change in attitude intended)
- and up to the surface.
Z, aka /svs/depth, represents this exactly.
X aka sonar/positionBMT and Y aka sonar/positionUWE should not have changed, but there is a jump in X when surfacing (from ~4m to ~2m).
Thank you very much for your help.
EDIT 6: Below is the diagram. For the time being the sub's orientation is fixed in relation to the pool walls, but that will change. The sonars are each tracking one of the pool walls.
I also realised 2 things I am doing wrong:
- I assumed my `odom` frame is aligned with ENU as well as with the pool. Since the pool isn't pointing north this is not true.
  - The sonars measure distance in relation to the long and short side of the pool.
  - I therefore need a `pool` frame that is rotated from `odom`, but shares the same origin.
- The `odom` -> `sonarBMT`/`sonarUWE`/`SVS` transforms I am publishing are wrong, I believe:
  - I think I misunderstood Tom's instructions in the initial answer. At the moment my transform looks like this:
    - Rotational component has the rotation of `odom` -> `base_link`.
    - Translational component has the distance of `base_link` to `sonarBMT`/`sonarUWE`/`SVS` measured in `base_link`.
  - I think it should instead be a pure translational transform, where the translation is computed from the current robot heading (aka the rotation in `pool` -> `base_link`) and the mounting position offset in `base_link`, as in the small example below.
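A small worked example of what I mean (the offset numbers are made up): with a mounting offset of (0.3, 0.1, 0.0) m in `base_link` and a current yaw of 90° in `pool` -> `base_link`, the translation of the `pool` -> `sonarBMT` transform would be:

```python
from math import cos, sin, pi

yaw = pi / 2                        # current yaw of pool -> base_link
mx, my = 0.3, 0.1                   # assumed mounting offset in base_link [m]
tx = mx * cos(yaw) - my * sin(yaw)  # = -0.1
ty = mx * sin(yaw) + my * cos(yaw)  # =  0.3
# pool -> sonarBMT: translation (tx, ty, 0.0), identity rotation
```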

EDIT 6b: I have these new frames and transformations running and the pool->SONAR_UWE as well as pool->SONAR_BMT transforms look good. The depth data seems to get fused correctly, but the X and Y positions from the sonars do not.
Might the problem be that my X and Y data coming from the sonars is fused separately? After being rotated, each individual measurement is not 'just' x or y. Once in odom it has a non-zero x or y component respectively and is then fused absolute... I might be totally misunderstanding part of the process here though.
EDIT 6c: 6b might have some truth to it. My SONAR_BMT, aka the x axis (pool frame) sonar, wasn't outputting data. While that is the case, `rosrun tf tf_echo pool base_link` returns a fitting solution and so does `rosrun tf tf_echo odom base_link`. As soon as the sonar outputs data again, the ekf solution outputs rubbish again.
I will try to put the sonar measurements together in one message and try to fuse that.
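My rough plan for that (just a sketch; topic names, the message type, the slop value and the covariance handling are guesses):

```python
#!/usr/bin/env python
# Sketch: combine the two sonar readings into a single pose message with an
# approximate time synchronizer, so x and y get fused as one measurement.
import rospy
import message_filters
from geometry_msgs.msg import PoseWithCovarianceStamped

def pair_cb(pose_x, pose_y):
    combined = PoseWithCovarianceStamped()
    combined.header.stamp = pose_x.header.stamp
    combined.header.frame_id = "pool"
    combined.pose.pose.position.x = pose_x.pose.pose.position.x
    combined.pose.pose.position.y = pose_y.pose.pose.position.y
    combined.pose.pose.orientation.w = 1.0
    # Simplistic covariance merge: keep x variance from one sensor, y from the other.
    cov = list(pose_x.pose.covariance)
    cov[7] = pose_y.pose.covariance[7]
    combined.pose.covariance = cov
    pub.publish(combined)

if __name__ == "__main__":
    rospy.init_node("sonar_pose_combiner")
    pub = rospy.Publisher("sonar/position_combined", PoseWithCovarianceStamped, queue_size=10)
    sub_x = message_filters.Subscriber("sonar/positionBMT", PoseWithCovarianceStamped)
    sub_y = message_filters.Subscriber("sonar/positionUWE", PoseWithCovarianceStamped)
    sync = message_filters.ApproximateTimeSynchronizer([sub_x, sub_y], 10, 0.05)
    sync.registerCallback(pair_cb)
    rospy.spin()
```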
Originally posted by Raphael Nagel on ROS Answers with karma: 15 on 2015-01-16
Post score: 1
Original comments
Comment by Tom Moore on 2015-01-20:
Any chance you can record a bag file without running robot_localization? I need the tf tree to not have the odom->base_link transform in it.
Comment by Raphael Nagel on 2015-01-20:
No problem: 2015-01-20-19-10-36.bag
Most of the dynamic and static transforms (intentionally) have got a translation of (0,0,0) for the time being
Comment by Tom Moore on 2015-02-02:
Re: edit 4, it sounds like it's going to be a covariance issue. I'll check it out. Any reason you're using the differential setting?
Comment by Tom Moore on 2015-02-11:
Sorry, when you say that X should not have changed when surfacing, do you mean the raw data should not have changed, or that you think the EKF jumped? In the bag file, the /sonar/positionBMT topic changes from ~4 meters to ~2 meters at around 34 seconds into the bag file.
Comment by Raphael Nagel on 2015-02-12:
Sorry, that was a little unclear. It is a sonar processing artifact. The sonar has a vertical beamwidth of 30° and unless the robot is perfectly horizontal it will get a reflection off the water surface if the sub is not fully submerged.