Quote: "The robot_localization package is used to provide fused, locally accurate, and smooth
odometry information from the data provided by N odometry sensor inputs. This information
can be provided to the package through nav_msgs/Odometry, sensor_msgs/Imu,
geometry_msgs/PoseWithCovarianceStamped, and geometry_msgs/TwistWithCovarianceStamped
messages."
A typical robot setup consists of at least wheel encoders and an IMU as its odometry sensor
sources.
When multiple sources are provided to robot_localization, it is able to fuse the odometry
information given by the sensors through the use of state estimation nodes. These nodes make use
of either an Extended Kalman filter (ekf_node) or an Unscented Kalman Filter (ukf_node)
to implement this fusion. In addition, the package also implements a
navsat_transform_node
which transforms geographic coordinates into the robot's world frame when working with GPS.
Fused sensor data is published by the robot_localization package through the
odometry/filtered
and the accel/filtered topics, if enabled in its configuration. In addition, it can also
publish the odom => base_link transform on the /tf topic.
If your robot is only able to provide one odometry source, the use of
robot_localization
would have minimal effects aside from smoothing. In this case, an alternative approach is to
publish transforms through a tf2 broadcaster in your single source of odometry node.
Nevertheless, you can still opt to use robot_localization to publish the transforms
and some smoothing properties may still be observed in the output.
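As a rough sketch of that alternative, a single-source odometry node can broadcast the odom => base_link transform itself using tf2_ros. The node and topic names below are illustrative (demo/odom matches the topic used later in this tutorial), and this is only a minimal outline, not a complete odometry node:

```python
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import TransformStamped
from nav_msgs.msg import Odometry
from tf2_ros import TransformBroadcaster


class OdomTfBroadcaster(Node):
    """Re-publishes a single odometry source as the odom => base_link transform."""

    def __init__(self):
        super().__init__('odom_tf_broadcaster')
        self.broadcaster = TransformBroadcaster(self)
        # 'demo/odom' is the example odometry topic used elsewhere in this tutorial.
        self.sub = self.create_subscription(Odometry, 'demo/odom', self.on_odom, 10)

    def on_odom(self, msg: Odometry):
        # Copy the odometry pose directly into a transform; no filtering or
        # smoothing is applied, unlike robot_localization's state estimation nodes.
        t = TransformStamped()
        t.header.stamp = msg.header.stamp
        t.header.frame_id = 'odom'
        t.child_frame_id = 'base_link'
        t.transform.translation.x = msg.pose.pose.position.x
        t.transform.translation.y = msg.pose.pose.position.y
        t.transform.translation.z = msg.pose.pose.position.z
        t.transform.rotation = msg.pose.pose.orientation
        self.broadcaster.sendTransform(t)


def main():
    rclpy.init()
    rclpy.spin(OdomTfBroadcaster())


if __name__ == '__main__':
    main()
```

Note that a raw pass-through like this simply mirrors the single sensor; any noise in that source goes straight into the transform.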
We will use an Extended Kalman Filter (ekf_node) to fuse odometry information and publish the odom => base_link transform.
We need to set the parameters for the ekf_node using a YAML file. In a folder called config, create a file named ekf.yaml.
```yaml
### ekf config file ###
ekf_filter_node:
    ros__parameters:
        # The frequency, in Hz, at which the filter will output a position estimate. Note that the filter will not begin
        # computation until it receives at least one message from one of the inputs. It will then run continuously at the
        # frequency specified here, regardless of whether it receives more measurements. Defaults to 30 if unspecified.
        frequency: 30.0

        # ekf_localization_node and ukf_localization_node both use a 3D omnidirectional motion model. If this parameter is
        # set to true, no 3D information will be used in your state estimate. Use this if you are operating in a planar
        # environment and want to ignore the effect of small variations in the ground plane that might otherwise be detected
        # by, for example, an IMU. Defaults to false if unspecified.
        two_d_mode: false

        # Whether to publish the acceleration state. Defaults to false if unspecified.
        publish_acceleration: true

        # Whether to broadcast the transformation over the /tf topic. Defaults to true if unspecified.
        publish_tf: true

        # 1. Set the map_frame, odom_frame, and base_link frames to the appropriate frame names for your system.
        # 1a. If your system does not have a map_frame, just remove it, and make sure "world_frame" is set to the value of odom_frame.
        # 2. If you are fusing continuous position data such as wheel encoder odometry, visual odometry, or IMU data, set "world_frame"
        #    to your odom_frame value. This is the default behavior for robot_localization's state estimation nodes.
        # 3. If you are fusing global absolute position data that is subject to discrete jumps (e.g., GPS or position updates from landmark
        #    observations) then:
        # 3a. Set your "world_frame" to your map_frame value.
        # 3b. MAKE SURE something else is generating the odom->base_link transform. Note that this can even be another state estimation node
        #     from robot_localization! However, that instance should *not* fuse the global data.
        map_frame: map                   # Defaults to "map" if unspecified
        odom_frame: odom                 # Defaults to "odom" if unspecified
        base_link_frame: base_link       # Defaults to "base_link" if unspecified
        world_frame: odom                # Defaults to the value of odom_frame if unspecified

        odom0: demo/odom
        odom0_config: [true,  true,  true,
                       false, false, false,
                       false, false, false,
                       false, false, true,
                       false, false, false]

        imu0: demo/imu
        imu0_config: [false, false, false,
                      true,  true,  true,
                      false, false, false,
                      false, false, false,
                      false, false, false]
```
For more information on the other parameters you can modify, see Parameters of state estimation nodes. A sample ekf.yaml can be found here.
To add a sensor input to the ekf_filter_node, add the next number in the sequence to its base name (odom, imu, pose, twist). In our case, we have one nav_msgs/Odometry and one sensor_msgs/Imu as inputs to the filter, thus we use odom0 and imu0. We set the value of odom0 to demo/odom, which is the topic that publishes the nav_msgs/Odometry. Similarly, we set the value of imu0 to the topic that publishes sensor_msgs/Imu, which is demo/imu.
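For instance, if you later added a second odometry source (say, visual odometry on a hypothetical demo/vo topic), you would number it odom1 alongside the existing inputs:

```yaml
odom0: demo/odom
odom1: demo/vo     # hypothetical second nav_msgs/Odometry source
imu0: demo/imu
```

Each numbered input also needs its own _config (and optionally _differential, etc.) parameters.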
You can specify which values from a sensor are to be used by the filter using the _config parameter. The order of the values of this parameter is x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az. In our example, we set everything in odom0_config to false except the 1st, 2nd, 3rd, and 12th entries, which means the filter will only use the x, y, z, and the vyaw values of odom0.
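To make these 15-element vectors easier to read, here is a small illustrative Python helper (not part of robot_localization) that builds the boolean list from field names, using the ordering given above:

```python
# Order of entries in a robot_localization *_config vector.
STATE_FIELDS = ['x', 'y', 'z', 'roll', 'pitch', 'yaw',
                'vx', 'vy', 'vz', 'vroll', 'vpitch', 'vyaw',
                'ax', 'ay', 'az']


def make_config(*enabled):
    """Return the 15-element boolean vector with the named fields set to true."""
    unknown = set(enabled) - set(STATE_FIELDS)
    if unknown:
        raise ValueError(f'unknown state fields: {sorted(unknown)}')
    return [field in enabled for field in STATE_FIELDS]


# Reproduces odom0_config from the ekf.yaml above: x, y, z, and vyaw enabled
# (the 1st, 2nd, 3rd, and 12th entries).
odom0_config = make_config('x', 'y', 'z', 'vyaw')

# Reproduces imu0_config: only roll, pitch, and yaw.
imu0_config = make_config('roll', 'pitch', 'yaw')
```

Writing the vector out by field name like this makes it easy to double-check which measurements a given sensor contributes to the filter.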
In the imu0_config matrix, you'll notice that only roll, pitch, and yaw are used. Typical mobile robot-grade IMUs will also provide angular velocities and linear accelerations. For robot_localization to work properly, you should not fuse in multiple fields that are derivatives of each other. Since angular velocity is fused internally by the IMU to produce the roll, pitch, and yaw estimates, we should not also fuse in the angular velocities used to derive that information. We likewise do not fuse in linear acceleration because of its noisy characteristics when not using exceptionally high-quality (and expensive) IMUs.
For more advice on configuring input data for robot_localization, see Preparing Your Data for Use with robot_localization, and Configuring robot_localization.
Now, let us add the ekf_node to the launch file (see the .ZIP file available via the link above for the complete launch file):
```python
robot_localization_node = launch_ros.actions.Node(
    package='robot_localization',
    executable='ekf_node',
    name='ekf_filter_node',
    output='screen',
    parameters=[os.path.join(pkg_share, 'config/ekf.yaml'),
                {'use_sim_time': LaunchConfiguration('use_sim_time')}]
)
```
Next, add the following launch argument within the return launch.LaunchDescription([ ... ]) block:

```python
launch.actions.DeclareLaunchArgument(name='use_sim_time', default_value='True',
                                     description='Flag to enable use_sim_time'),
```
Lastly, add robot_localization_node above the rviz_node line to launch the robot localization node.
```python
    robot_state_publisher_node,
    spawn_entity,
    robot_localization_node,
    rviz_node
])
```
Next, we need to add the robot_localization dependency to our package definition. Open package.xml and add the line <exec_depend>robot_localization</exec_depend> below the last <exec_depend> tag.

Next, open CMakeLists.txt and append the config directory inside the install(DIRECTORY ...) command, as shown below.

```cmake
install(
  DIRECTORY src launch rviz config
  DESTINATION share/${PROJECT_NAME}
)
```