
Hello!!
I have to fuse two sources: IMU data (accelerometer and gyroscope) and another source that provides the difference between the previous position and the current one, given as a translation vector and a quaternion.
The IMU data is in the base_link frame and the other source is expressed in the reference frame (I selected odom). I thought of using a PoseWithCovarianceStamped for the second source, but I'm not sure whether robot_localization works with variations in pose, so my questions are:
Is there any way I can fuse them with robot_localization? Since the second source does not give a complete pose but a change in pose, do I have to modify this data? Should I compose all the previous estimations with the current one to get the "global" pose (instead of the variation) and send that to the EKF along with the IMU? (A rough sketch of what I mean is below.)
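By "composing the estimations" I mean something like the sketch below. This is only a rough, untested idea: it assumes each incoming message holds the change since the previous pose (translation vector + quaternion) expressed in the odom frame, and the names (deltaCallback, pose_pub, acc_position, ...) are made up for the example.

#include <ros/ros.h>
#include <geometry_msgs/PoseWithCovarianceStamped.h>
#include <tf2/LinearMath/Quaternion.h>
#include <tf2/LinearMath/Vector3.h>
#include <tf2_geometry_msgs/tf2_geometry_msgs.h>

// Running "global" pose in the odom frame, starting at the origin.
tf2::Vector3 acc_position(0, 0, 0);
tf2::Quaternion acc_orientation(0, 0, 0, 1);
ros::Publisher pose_pub;

// Each message is assumed to be the delta from the previous pose to the
// current one, expressed in the odom frame.
void deltaCallback(const geometry_msgs::PoseWithCovarianceStamped& delta)
{
  tf2::Vector3 t(delta.pose.pose.position.x,
                 delta.pose.pose.position.y,
                 delta.pose.pose.position.z);
  tf2::Quaternion q;
  tf2::fromMsg(delta.pose.pose.orientation, q);

  // For deltas expressed in the fixed (odom) frame: add the translation and
  // left-multiply the rotation. If the delta were expressed in the body frame
  // instead, it would be acc_position += rotated translation and
  // acc_orientation = acc_orientation * q.
  acc_position += t;
  acc_orientation = q * acc_orientation;
  acc_orientation.normalize();

  geometry_msgs::PoseWithCovarianceStamped out;
  out.header.stamp = delta.header.stamp;
  out.header.frame_id = "odom";
  out.pose.pose.position.x = acc_position.x();
  out.pose.pose.position.y = acc_position.y();
  out.pose.pose.position.z = acc_position.z();
  out.pose.pose.orientation = tf2::toMsg(acc_orientation);
  out.pose.covariance = delta.pose.covariance;  // just reusing the delta covariance here
  pose_pub.publish(out);
}

That accumulated pose is what I would send to the EKF instead of the raw deltas, if that is the right approach.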
Thank you.
EDIT:
I was trying to fuse just my estimation to see how it performed. This is the yaml config:
frequency: 5
sensor_timeout: 2
two_d_mode: false
transform_time_offset: 0.0
transform_timeout: 0.0
print_diagnostics: false
debug: true
debug_out_file: /home/javier/pruebas/MasPruebas/Data/file.txt
publish_tf: true
publish_acceleration: false

map_frame: map                        # Defaults to "map" if unspecified
odom_frame: odom                      # Defaults to "odom" if unspecified
base_link_frame: erlecopter/imu_link  # Defaults to "base_link" if unspecified
world_frame: odom                     # Defaults to the value of odom_frame if unspecified

odom0: /Camera_estimation
odom0_config: [false, false, false,
               false, false, false,
               true,  true,  true,
               true,  true,  true,
               false, false, false]
odom0_differential: false
odom0_relative: true
odom0_queue_size: 2

#imu0: /erlecopter/imu
#imu0_config: [false, false, false,
#              false, false, false,
#              false, false, false,
#              true,  true,  true,
#              true,  true,  true]
#imu0_relative: true
#imu0_queue_size: 100

process_noise_covariance: [diag(small values)]
initial_estimate_covariance: [diag(1e-6)]

And the main code is something like this:
#include <ros/ros.h>
#include <image_transport/image_transport.h>
#include <nav_msgs/Odometry.h>

// (Globals such as tras, angles, Prev_angles, Prev_time, stamp, seq,
//  Covariance_default and Pub_estimation are declared elsewhere, and the
//  imuCallback/ScaleCallback bodies are omitted here.)

void imageCallback(const sensor_msgs::ImageConstPtr& msg)
{
    /* Estimate displacements */
    double interval = (stamp - Prev_time).toSec();

    nav_msgs::Odometry pub_msg;
    pub_msg.header.seq = seq;
    pub_msg.header.stamp = stamp;
    pub_msg.header.frame_id = "odom";
    pub_msg.child_frame_id = "odom";

    // Convert the estimated displacement over the interval into velocities
    pub_msg.twist.twist.linear.x = tras.at<double>(0) / interval;
    pub_msg.twist.twist.linear.y = tras.at<double>(1) / interval;
    pub_msg.twist.twist.linear.z = tras.at<double>(2) / interval;
    pub_msg.twist.twist.angular.x = (angles.roll - Prev_angles.roll) / interval;
    pub_msg.twist.twist.angular.y = (angles.pitch - Prev_angles.pitch) / interval;
    pub_msg.twist.twist.angular.z = (angles.yaw - Prev_angles.yaw) / interval;
    pub_msg.twist.covariance = Covariance_default;

    Pub_estimation.publish(pub_msg);
    Prev_time = stamp;
}

int main(int argc, char **argv)
{
    ros::init(argc, argv, "image2angles");
    ros::NodeHandle nh;
    image_transport::ImageTransport it(nh);
    image_transport::Subscriber sub_img = it.subscribe("/erlecopter/bottom/image_raw", 1, imageCallback);
    ros::Subscriber sub_h = nh.subscribe("/erlecopter/imu", 1, imuCallback);
    ros::Subscriber sub_h2 = nh.subscribe("/odometry/filtered", 1, ScaleCallback);
    Pub_estimation = nh.advertise<nav_msgs::Odometry>("Camera_estimation", 1);
    ros::spin();
}

I've checked the Camera_estimation topic and the messages are being published, but odometry/filtered isn't publishing anything.
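(Side note, not related to the publishing problem: one thing I have not handled in the callback above is wrap-around when differencing the Euler angles; if yaw crosses ±π between two frames, the raw difference gives a huge spurious angular velocity. A small guard like the following is probably needed; wrapAngle is not in my code yet, it is just what I would add.)

#include <cmath>

// Wrap an angle difference into (-pi, pi] so that differencing Euler angles
// across the +/-pi boundary does not produce a spurious angular velocity.
static double wrapAngle(double a)
{
  while (a > M_PI)   a -= 2.0 * M_PI;
  while (a <= -M_PI) a += 2.0 * M_PI;
  return a;
}

// Usage inside imageCallback(), replacing the raw differences:
// pub_msg.twist.twist.angular.z = wrapAngle(angles.yaw - Prev_angles.yaw) / interval;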
Originally posted by Aquas on ROS Answers with karma: 32 on 2020-06-19
Post score: 0