Overview
The ZED ROS wrapper provides access to all camera sensors and parameters through ROS topics, parameters and services. The zed-ros-wrapper is available for all ZED stereo cameras: ZED 2, ZED Mini and ZED.
Getting Started
Follow the official Stereolabs "Getting Started" guide to install the ZED ROS wrapper with all the latest features.
Github repositories
zed-ros-wrapper: this is the main repository. It contains the source code of the ZED Wrapper node and the instruction about how to compile and run it.
zed-ros-examples: this repository is a collection of examples and tutorials to illustrate how to better use the ZED cameras in the ROS framework.
Start the node
The ZED Wrapper node can be started with the default parameters using one of the three launch files, according to the camera model:
ZED: $ roslaunch zed_wrapper zed.launch
ZED Mini: $ roslaunch zed_wrapper zedm.launch
ZED 2: $ roslaunch zed_wrapper zed2.launch
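Once one of these launch files is running, you can verify that the node is publishing. The following minimal Python sketch (not part of the wrapper) waits for a single image; the full topic path assumes the default namespace created by zed.launch (camera name "zed", node name "zed_node"), so adjust it to your configuration.

    #!/usr/bin/env python
    # Quick check that the ZED node is up and publishing images.
    # The topic path below assumes the default "/zed/zed_node" namespace
    # created by zed.launch; change it if you renamed the camera or the node.
    import rospy
    from sensor_msgs.msg import Image

    rospy.init_node("zed_startup_check")
    msg = rospy.wait_for_message("/zed/zed_node/rgb/image_rect_color", Image, timeout=10.0)
    rospy.loginfo("Received a %dx%d image (encoding: %s)", msg.width, msg.height, msg.encoding)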
Published Topics
Left Camera
rgb/image_rect_color (sensor_msgs/Image)
- Color rectified image (left RGB image by default).
- Grayscale rectified image (left RGB image by default).
- Color camera calibration data.
- Color unrectified image (left RGB image by default).
- Grayscale unrectified image (left RGB image by default).
- Color unrectified camera calibration data.
- Left camera color rectified image.
- Left camera grayscale rectified image.
- Left camera calibration data.
- Left camera color unrectified image.
- Left camera grayscale unrectified image.
- Left unrectified camera calibration data.
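As an illustration of how these image topics are consumed, here is a minimal rospy subscriber sketch for the rectified color image. The full topic path assumes the default /zed/zed_node namespace; the tutorials in zed-ros-examples cover the same pattern in more detail.

    #!/usr/bin/env python
    # Minimal subscriber for the rectified color image.
    # The topic path assumes the default "/zed/zed_node" namespace.
    import rospy
    from sensor_msgs.msg import Image
    from cv_bridge import CvBridge

    bridge = CvBridge()

    def image_callback(msg):
        # Convert the ROS image message to an OpenCV BGR matrix.
        frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
        rospy.loginfo_throttle(1.0, "Image received: %dx%d" % (frame.shape[1], frame.shape[0]))

    rospy.init_node("zed_rgb_listener")
    rospy.Subscriber("/zed/zed_node/rgb/image_rect_color", Image, image_callback, queue_size=1)
    rospy.spin()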
Right Camera
right/image_rect_color (sensor_msgs/Image)
- Color rectified right image.
- Grayscale rectified right image.
- Right camera calibration data.
- Color unrectified right image.
- Grayscale unrectified right image.
- Right unrectified camera calibration data.
Sensors
imu/data (sensor_msgs/Imu)
- Accelerometer, gyroscope, and orientation data in Earth frame [only ZED-M and ZED 2].
- Accelerometer and gyroscope data in Earth frame [only ZED-M and ZED 2].
- Calibrated magnetometer data [only ZED 2].
- Atmospheric pressure data [only ZED 2].
- Temperature of the IMU sensor [only ZED 2].
- Temperature of the left camera sensor [only ZED 2].
- Temperature of the right camera sensor [only ZED 2].
- Transform from left camera to IMU sensor position.
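A minimal sketch of a sensor subscriber, assuming the default /zed/zed_node namespace and a ZED-M or ZED 2 camera:

    #!/usr/bin/env python
    # Minimal IMU subscriber (ZED-M and ZED 2 only).
    # The topic path assumes the default "/zed/zed_node" namespace.
    import rospy
    from sensor_msgs.msg import Imu

    def imu_callback(msg):
        a = msg.linear_acceleration
        rospy.loginfo_throttle(1.0, "Accel [m/s^2]: x=%.2f y=%.2f z=%.2f" % (a.x, a.y, a.z))

    rospy.init_node("zed_imu_listener")
    rospy.Subscriber("/zed/zed_node/imu/data", Imu, imu_callback, queue_size=10)
    rospy.spin()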
Stereo Pair
stereo/image_rect_color (sensor_msgs/Image)
- Stereo rectified pair images side-by-side.
- Stereo unrectified pair images side-by-side.
To retrieve the camera parameters you can subscribe to the topics left/camera_info, right/camera_info, left_raw/camera_info and right_raw/camera_info.
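For example, a short sketch that reads the left camera intrinsics once (assuming the default /zed/zed_node namespace):

    #!/usr/bin/env python
    # Read the left camera intrinsics once from camera_info.
    # The topic path assumes the default "/zed/zed_node" namespace.
    import rospy
    from sensor_msgs.msg import CameraInfo

    rospy.init_node("zed_camera_info_reader")
    info = rospy.wait_for_message("/zed/zed_node/left/camera_info", CameraInfo, timeout=10.0)
    fx, fy, cx, cy = info.K[0], info.K[4], info.K[2], info.K[5]
    rospy.loginfo("fx=%.2f fy=%.2f cx=%.2f cy=%.2f (%dx%d)", fx, fy, cx, cy, info.width, info.height)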
Depth and Point Cloud
depth/depth_registered (sensor_msgs/Image)
- Depth map image registered on left image (32-bit float in meters by default).
- Depth camera calibration data.
- Registered color point cloud.
- Confidence map (floating point values to be used in your own algorithms).
- Disparity map.
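The sketch below shows one way to read the distance at the image center from the depth topic; it assumes the default /zed/zed_node namespace and the default 32-bit float depth encoding (not the 16-bit millimeter mode).

    #!/usr/bin/env python
    # Print the measured distance at the center of the depth map.
    # Assumes the default "/zed/zed_node" namespace and the default
    # 32-bit float (meters) depth encoding.
    import rospy
    import numpy as np
    from sensor_msgs.msg import Image

    def depth_callback(msg):
        depth = np.frombuffer(msg.data, dtype=np.float32).reshape(msg.height, msg.width)
        center = depth[msg.height // 2, msg.width // 2]
        rospy.loginfo_throttle(1.0, "Distance at center: %.2f m" % center)

    rospy.init_node("zed_depth_listener")
    rospy.Subscriber("/zed/zed_node/depth/depth_registered", Image, depth_callback, queue_size=1)
    rospy.spin()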
Positional Tracking
odom (nav_msgs/Odometry)
- Camera position and orientation in free space relative to the Odometry frame (pure visual odometry for the ZED, visual-inertial for the ZED 2 and ZED-M).
- Camera position and orientation in free space relative to the Map frame (given by the sensor fusion + SLAM + loop closure algorithm).
- Camera pose relative to Map frame with covariance.
- Trajectory of the camera in Map frame without loop closures (continuous).
- Trajectory of the camera in Map frame with loop closures (may contain jumps).
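A minimal sketch of an odometry subscriber, assuming the default /zed/zed_node namespace (the pose topic can be handled the same way with geometry_msgs/PoseStamped):

    #!/usr/bin/env python
    # Track the camera position from the odometry topic.
    # The topic path assumes the default "/zed/zed_node" namespace.
    import rospy
    from nav_msgs.msg import Odometry

    def odom_callback(msg):
        p = msg.pose.pose.position
        rospy.loginfo_throttle(1.0, "Odom position: x=%.2f y=%.2f z=%.2f" % (p.x, p.y, p.z))

    rospy.init_node("zed_odom_listener")
    rospy.Subscriber("/zed/zed_node/odom", Odometry, odom_callback, queue_size=10)
    rospy.spin()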
Mapping
mapping/fused_cloud (sensor_msgs/PointCloud2)
- 3D point cloud map generated from fusion of point cloud over the camera trajectory.
Published only if mapping is enabled, see mapping/mapping_enabled parameter and start_3d_mapping service.
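Once mapping is enabled, the fused cloud can be consumed like any other PointCloud2 topic. A minimal sketch, assuming the default /zed/zed_node namespace:

    #!/usr/bin/env python
    # Count the points of each fused map update (mapping must be enabled).
    # The topic path assumes the default "/zed/zed_node" namespace.
    import rospy
    import sensor_msgs.point_cloud2 as pc2
    from sensor_msgs.msg import PointCloud2

    def map_callback(msg):
        n_points = sum(1 for _ in pc2.read_points(msg, skip_nans=True))
        rospy.loginfo("Fused cloud update: %d valid points", n_points)

    rospy.init_node("zed_map_listener")
    rospy.Subscriber("/zed/zed_node/mapping/fused_cloud", PointCloud2, map_callback, queue_size=1)
    rospy.spin()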
Object Detection [only ZED 2]
objects (zed_interfaces/object_stamped)
- Array of the detected/tracked objects for each camera frame [only ZED 2].
- Array of markers of the detected/tracked objects to be rendered in Rviz [only ZED 2].
Published only if Object Detection is enabled, see object_detection/od_enabled parameter and start_object_detection service.
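To quickly verify that detections are being published without building against the zed_interfaces message definitions, you can subscribe with rospy.AnyMsg, as in the sketch below (the topic path assumes the default /zed/zed_node namespace). The zed_obj_det_sub_tutorial in zed-ros-examples shows how to work with the full message.

    #!/usr/bin/env python
    # Check that the Object Detection output is arriving, without importing
    # the zed_interfaces message definition. The topic path assumes the
    # default "/zed/zed_node" namespace and a ZED 2 camera.
    import rospy

    def objects_callback(msg):
        # rospy.AnyMsg delivers the raw serialized buffer; only its size is logged here.
        rospy.loginfo_throttle(1.0, "Objects message received (%d bytes)" % len(msg._buff))

    rospy.init_node("zed_objects_check")
    rospy.Subscriber("/zed/zed_node/objects", rospy.AnyMsg, objects_callback, queue_size=10)
    rospy.spin()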
Diagnostic
diagnostics (diagnostic_msgs/DiagnosticStatus)
- ROS diagnostic message for ZED cameras.
Node Parameters
You can specify the parameters to be used by the ZED node by modifying the values in the following files:
params/common.yaml: common parameters to all camera models
params/zed.yaml: parameters for the ZED camera
params/zedm.yaml: parameters for the ZED Mini camera
params/zed2.yaml: parameters for the ZED 2 camera
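The values from these files are loaded onto the ROS parameter server in the node's namespace, so they can be inspected at runtime. A minimal sketch, assuming the default /zed/zed_node namespace:

    #!/usr/bin/env python
    # Read a value loaded from the YAML files on the parameter server.
    # The parameter path assumes the default "/zed/zed_node" namespace;
    # adjust it if you changed the camera or node name.
    import rospy

    rospy.init_node("zed_param_check")
    camera_name = rospy.get_param("/zed/zed_node/general/camera_name", "zed")
    rospy.loginfo("general/camera_name = %s", camera_name)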
General Parameters
general/camera_name (string, default: "zed")
- A custom name for the ZED camera. Used as namespace and prefix for camera TF frames
- Type of Stereolabs camera. Used to load the correct camera model in the RViz examples and to start the correct SDK modules.
- Select a ZED camera by its ID. IDs are assigned by Ubuntu. Useful when multiple cameras are connected. ID is ignored if an SVO path is specified
- Select a ZED camera by its Serial Number
- Set ZED camera resolution [0: HD2K, 1: HD1080, 2: HD720, 3: VGA]
- Set ZED camera video framerate
- Select a GPU device for depth computation
- Frame_id of the frame that indicates the reference base of the robot
- Enable/disable the verbosity of the SDK
- Set SVO compression mode for saving [0: LOSSLESS (PNG/ZSTD), 1: H264 (AVCHD), 2: H265 (HEVC)]
- Enable/disable self-calibration at startup
- Flip the camera data if it is mounted upside down
Video Parameters
video/img_downsample_factor (double, default: 1.0)
- Resample factor for images [0.01,1.0]. The SDK works with native image sizes, but publishes rescaled images.
- If false, the extrinsic parameters in camera_info use the ROS native frame (X FORWARD, Z UP) instead of the camera frame (Z FORWARD, Y DOWN) [true uses the old behavior of versions < v3.1]
Sensor Parameters [only ZED-M and ZED 2]
sensors/sensors_timestamp_sync (bool, default: false)
- Synchronize Sensors message timestamp with latest received frame
Depth Parameters
depth/quality (int, default: 1)
- Select depth map quality [0: NONE, 1: PERFORMANCE, 2: MEDIUM, 3: QUALITY, 4: ULTRA]
- Select depth sensing mode (change only for VR/AR applications) [0: STANDARD, 1: FILL]
- Enable depth stabilization. Stabilizing the depth requires an additional computation load as it enables tracking
- Convert 32-bit float depth in meters to 16-bit unsigned depth in millimeters [0: 32-bit float meters, 1: 16-bit millimeters]
- Minimum value allowed for depth measures [Min: 0.3 (ZED), 0.1 (ZED-M), 0.2 (ZED 2); Max: 3.0]. Note: reducing this value requires more computational power and RAM. In cases of limited computational power, increasing this value can provide better performance.
- Maximum value allowed for depth measures [Min: 1.0, Max: 30.0 - Values beyond this limit will be reported as TOO_FAR]
- Resample factor for depth data matrices [0.01,1.0]. The SDK works with native data sizes, but publishes rescaled matrices (depth map, point cloud, ...)
Position Parameters
pos_tracking/publish_tf (bool, default: true)
- Enable/disable publishing of the transforms between odom → base_link and imu_link → base_link. Note: The IMU frame can be published at a higher frequency than odom and map.
- Enable/disable publishing of the map TF frame relative to the position of the base frame (i.e. base_link) in the map frame. Note: the value is ignored if publish_tf is false.
- Frame_id of the pose message
- Frame_id of the odom message
- Enable area learning to correct odometry drift with loop closures.
- Enable smoothing of pose correction when drift is detected with loop closures. Only available if area_memory is enabled.
- Path of the database file for loop closure and relocalization that contains learnt visual information about the environment.
- If true, it detects the floor plane during position tracking initialization. The initial pose of zed_camera_center in map frame is estimated relative to the floor plane and the initial pose of base_link is recomputed according to the TF tree.
- Set the initial pose of base_link in map frame -> [X, Y, Z, R, P, Y]
- Set the initial pose of the odometry with the first valid pose received from positional tracking. Format: "x y z roll pitch yaw".
- Frequency (Hz) of publishing of the path messages
- Maximum number of poses kept in the pose arrays (-1 for infinite)
Mapping Parameters
mapping/mapping_enabled (bool, default: false)
- Enable/disable the mapping module
- Resolution of the fused point cloud [0.01, 0.2]
- Maximum depth range used for mapping, in meters (-1 to use recommended range depending on the selected resolution) [2.0, 20.0]
- Publishing frequency (Hz) of the 3D map as fused point cloud
Object Detection Parameters [only ZED 2]
object_detection/od_enabled (bool, default: false)
- Enable/disable the Object Detection module
- Minimum value of the detection confidence of an object [0,100]
- Enable/disable the tracking of the detected objects
- Enable/disable the detection of persons
- Enable/disable the detection of vehicles
Dynamic Parameters
Dynamic parameters cannot have a namespace, so they cannot be placed within their correct module. A note is added to the description of each parameter to indicate the module it belongs to.
pub_frame_rate (float, default: 15.0)
- [General] Frequency of the publishing of Video and Depth images (equal to or less than the grab_frame_rate value) [0.1,60.0]
- [Depth] Threshold to reject depth values based on their confidence. Each depth pixel has a corresponding confidence value. A lower threshold means more confidence and precision (but less density); a higher threshold reduces filtering (more density, less certainty). A value of 100 allows values from 0 to 100 (no filtering); 90 allows values from 10 to 100 (filtering out the lowest confidence values); 30 allows values from 70 to 100 (keeping only the highest confidence values and lowering the density of the depth map). The value should be in [1,100]. By default the confidence threshold is set to 100, meaning that no depth pixel is rejected.
- [Depth] Threshold to reject depth values based on their textureness confidence. A lower threshold means more confidence and precision (but less density); a higher threshold reduces filtering (more density, less certainty). The value should be in [1,100]. By default the threshold is set to 100, meaning that no depth pixel is rejected.
- [Depth] Frequency of the point cloud publishing (equal to or less than the grab_frame_rate value) [0.1,60.0]
- [Camera] Defines the brightness control [0,8]
- [Camera] Defines the contrast control [0,8]
- [Camera] Defines the hue control [0,11]
- [Camera] Defines the saturation control [0,8]
- [Camera] Defines the sharpness control [0,8]
- [Camera] Defines the gamma control [1,9] Note: available with ZED SDK >= v3.1
- [Camera] Defines if the Gain and Exposure are in automatic mode or not
- [Camera] Defines the gain control [only if auto_exposure_gain is false] [0,100]
- [Camera] Defines the exposure control [only if auto_exposure_gain is false] [0,100]
- [Camera] Defines if the White balance is in automatic mode or not
- [Camera] Defines the color temperature value (x100) [42,65]
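Dynamic parameters can be changed at runtime with rqt_reconfigure or programmatically with a dynamic_reconfigure client. A minimal sketch, assuming the default /zed/zed_node namespace:

    #!/usr/bin/env python
    # Change a dynamic parameter at runtime via dynamic_reconfigure.
    # The server name assumes the default "/zed/zed_node" namespace.
    import rospy
    import dynamic_reconfigure.client

    rospy.init_node("zed_dynrec_client")
    client = dynamic_reconfigure.client.Client("/zed/zed_node", timeout=10)
    # Lower the publishing rate of Video and Depth topics to 15 Hz.
    config = client.update_configuration({"pub_frame_rate": 15.0})
    rospy.loginfo("pub_frame_rate is now %.1f", config["pub_frame_rate"])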
Services
start_svo_recording (zed_interfaces/start_svo_recording)
- Starts recording an SVO file. If no filename is provided the default zed.svo is used. If no path is provided with the filename the default recording folder is ~/.ros/
- Stops an active SVO recording
- Starts streaming over network to allow processing of ZED data on a remote machine. See Remote streaming
- Stops streaming over network
- Sets the current camera pose in map frame to the value passed as parameter. For example, can be used with GPS position corrections.
- Resets position in /map frame to the initial pose value available in the param server.
- Resets the odometry to the last pose in map frame received from positional tracking. It can be used to reset to the current "map position" when the odometry drift is large.
- Sets the status of the blue LED -> True: LED ON, False: LED OFF
- Toggles the status of the blue LED, returning the new status
- Starts the Spatial Mapping processing. See Spatial Mapping
- Stops the Spatial Mapping processing (works also with automatic start from configuration file)
- Starts the Object Detection processing. See Object Detection. Note: returns an error if not using a ZED 2 camera
- Stops the Object Detection processing (works also with automatic start from configuration file)
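Services can be called from the command line with rosservice or programmatically. The sketch below toggles the LED; it assumes the default /zed/zed_node namespace and that the service type follows the same zed_interfaces/<service_name> pattern as start_svo_recording, so check the exact names and request fields with rosservice info on your installation.

    #!/usr/bin/env python
    # Toggle the camera LED through the wrapper services.
    # Assumptions: default "/zed/zed_node" namespace and a service type named
    # "toggle_led" exported by zed_interfaces; verify with `rosservice info`.
    import rospy
    from zed_interfaces.srv import toggle_led

    rospy.init_node("zed_led_toggler")
    rospy.wait_for_service("/zed/zed_node/toggle_led")
    toggle = rospy.ServiceProxy("/zed/zed_node/toggle_led", toggle_led)
    response = toggle()
    rospy.loginfo("toggle_led response: %s", str(response))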
Nodelets
The zed_wrapper package contains the standalone ZED Wrapper node that can be started as is using the provided launch files, as described in the ZED Wrapper Node documentation.
However, the ZED Wrapper has been designed to take full advantage of the nodelet package, in order to run multiple algorithms in the same process with zero-copy transport between them.
The core of the ZED Wrapper is indeed the ZEDWrapperNodelet nodelet, available in the zed_nodelets package, which provides the interface between the ZED SDK and the ROS environment.
zed_nodelets/ZEDWrapperNodelet
Same usage as the zed_wrapper_node.
Available as: zed_nodelets/ZEDWrapperNodelet
zed_nodelets/RgbdSensorsSyncNodelet
Sometimes it is useful to synchronize and multiplex a few topics into one single topic using their timestamps, for example to save a rosbag or to guarantee that all the topics published at the same instant are received together on the other side of a network. To perform this operation we provide the zed_nodelets/RgbdSensorsSyncNodelet nodelet, which synchronizes RGB, Depth, IMU and Magnetometer information into a single custom RGBDSensors topic.
Subscribed Topics
rgb/image_rect_color (sensor_msgs/Image) - RGB image stream.
- Registered depth image stream.
- RGB camera metadata.
- Inertial data.
- Magnetometer data.
Published Topics
rgbd_sync (zed_interfaces/RGBDSensors) - The sync topic with RGB, Depth, IMU and Magnetometer data.
Parameters
~zed_nodelet_name (string, default: zed_nodelet) - The name of the ZED nodelet publishing the topics to be synchronized.
- Use approximate synchronization for the input topics. If false, all the messages must have the same timestamp; this is almost impossible when also subscribing to the IMU and Magnetometer topics while the sensors_timestamp_sync parameter is false in the ZED nodelet.
- Size of message queue for each synchronized topic.
- Synchronize IMU messages.
- Synchronize Magnetometer messages.
zed_nodelets/RgbdSensorsDemuxNodelet
An RGBDSensors topic message cannot be used by a standard ROS node. You must create your own node that subscribes to it and extracts all the available data. To make it easier to access the information stored in the RGBDSensors topic, we provide a demux nodelet: RgbdSensorsDemuxNodelet.
Subscribed Topics
rgbd_sync (zed_interfaces/RGBDSensors) - The sync topic with RGB, Depth, IMU and Magnetometer data to be demuxed.
Published Topics
rgb/image_rect_color (sensor_msgs/Image) - RGB image stream.
- Registered depth image stream.
- RGB camera metadata.
- Inertial data (only if published by zed_nodelets/RgbdSensorsSyncNodelet)
- Magnetometer data (only if published by zed_nodelets/RgbdSensorsSyncNodelet)
Examples and Tutorials
With the v3.0 release of the ZED Wrapper, the examples and tutorials have been removed from the main repository and moved to a dedicated repository.
Examples
zed_nodelet_example: shows how to use the nodelet intraprocess communication.
zed_rtabmap_example: shows how to use the ZED with RTAB-Map to generate a 3D map using the rtabmap_ros package.
zed_ar_track_alvar_example: shows how to use the ZED with the ar_track_alvar package to detect and track the position of AR tags.
Tutorials
zed_video_sub_tutorial: in this tutorial you will learn how to write a simple node that subscribes to messages of type sensor_msgs/Image to retrieve the Left and Right rectified images published by the ZED node.
zed_depth_sub_tutorial: in this tutorial you will learn how to write a simple node that subscribes to messages of type sensor_msgs/Image to retrieve the depth images published by the ZED node and to get the measured distance at the center of the image.
zed_tracking_sub_tutorial: in this tutorial you will learn how to write a simple node that subscribes to messages of type geometry_msgs/PoseStamped and nav_msgs/Odometry to retrieve the position and the orientation of the ZED camera in the map and in the odometry frames.
zed_obj_det_sub_tutorial: in this tutorial you will learn how to write a simple node that subscribes to messages of type zed_interfaces/objects to retrieve the list of objects detected by the Object Detection module of a ZED 2 camera.
zed_sensors_sub_tutorial: in this tutorial you will learn how to write a simple node that subscribes to messages of type sensor_msgs/Imu, sensor_msgs/MagneticField, sensor_msgs/Temperature and sensor_msgs/FluidPressure published by ZED Mini and ZED 2.
Output visualization
zed_display_rviz: launch files to start many preconfigured rviz instances to display all the information provided by ZED2, ZED Mini and ZED.
zed_sensors_sub_tutorial: launch files to display sensor data with plotjuggler (more info)
Report an Issue
If you encounter an issue with the zed-ros-wrapper, please refer to the dedicated issue list on Github.