IMU position relative to cameras
Hi there,
Our company owns a Nerian Scarlet (25 cm) and we want to use it for real-time stereo vision + IMU SLAM, which it is advertised for. We were able to obtain both visual data (left and right images) and IMU data using a slightly modified ROS1 node and record a bag. However, we were unable to find the transformation (rotation and translation) between the coordinate systems of the IMU and (at least one of) the cameras, which we need in order to use a SLAM algorithm. Where can we find this information? I checked both the Scarlet manual and this forum without success. Please help.
Kind regards,
Jan
Re: IMU position relative to cameras
If no such data is available (i.e., no IMU-camera calibration is performed before the Scarlet is sold), I'll just perform the calibration myself. I was just hoping I could avoid it.
Re: IMU position relative to cameras
Hi,
The rotation axes should be aligned with the coordinate system of the 3D point cloud (that is, the z-axis is the depth axis). I measured the offset of the IMU relative to the center of the left image sensor. It is:
Along the baseline (X): 145.6 mm
Vertically (Y): 13.1 mm above the image sensor center
Depth (Z): 23 mm behind the image sensor
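From these numbers, a minimal sketch of the left-camera-to-IMU extrinsics as a homogeneous matrix. It assumes an identity rotation (axes aligned, as stated) and the usual optical convention of x right, y down, z forward; the signs are my interpretation of "above" and "behind" and should be verified:

    import numpy as np

    # Homogeneous transform placing the IMU in the left camera frame,
    # built from the offsets above (converted to meters). Assumptions:
    # identity rotation, optical convention x right / y down / z forward,
    # so "above" maps to -y and "behind" maps to -z.
    t_cam_imu = np.array([0.1456,    # 145.6 mm along the baseline (+x)
                          -0.0131,   # 13.1 mm above the sensor center (-y)
                          -0.0230])  # 23 mm behind the sensor (-z)

    T_cam_imu = np.eye(4)
    T_cam_imu[:3, 3] = t_cam_imu     # rotation block stays identity
    print(T_cam_imu)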
Best regards,
Konstantin
Re: IMU position relative to cameras
Hi Konstantin,
thank you very much.
Kind regards,
Jan
Re: IMU position relative to cameras
Hi Konstantin,
A follow-up question: using the offsets from your previous message, we were able to run stereo visual SLAM with our Nerian Scarlet together with the IMU, but the algorithm diverges for some reason. It converges with stereo vision alone, yet as soon as we add IMU data it always diverges, and we got the same results with other stereo visual SLAM algorithms. We therefore suspected something might be wrong with the IMU data, so we recorded a ROS bag while the camera was stationary and noticed strange behavior (see the attached image of the IMU data):
1) The angular velocity data is much smoother (roughly 4x) than the linear acceleration data: the linear acceleration shows flat segments in all axes where the value stays constant. Both are sampled at the same frequency, but the linear acceleration behaves as if it receives a new value only every few samples (a quick way to check this from the bag is sketched after this list).
2) Changes in the linear acceleration data always happen at the same moment in all three axes.
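A minimal sketch of such a check; the bag path and IMU topic name below are placeholders:

    import rosbag

    # Count how often consecutive accelerometer samples are bit-identical
    # in a stationary recording. Bag path and topic name are placeholders.
    bag = rosbag.Bag('stationary.bag')
    prev, total, repeats = None, 0, 0
    for _, msg, _ in bag.read_messages(topics=['/nerian_stereo/imu']):
        accel = (msg.linear_acceleration.x,
                 msg.linear_acceleration.y,
                 msg.linear_acceleration.z)
        if prev is not None:
            total += 1
            if accel == prev:      # unchanged since the previous message
                repeats += 1
        prev = accel
    bag.close()
    print('%d of %d consecutive samples unchanged' % (repeats, total))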
Any ideas why this is happening? Also, have you or anyone else tested stereo visual + IMU SLAM with the Nerian Scarlet? If so, can you share a repository?
Kind regards,
Jan
- Attachments
- nerian_imu.png (IMU data recorded while the camera was stationary)
Re: IMU position relative to cameras
Hi Jan,
this is the data sheet for the IMU inside Scarlet:
https://www.bosch-sensortec.com/media/boschsensortec/downloads/datasheets/bst-bno055-ds000.pdf
The output characteristics for the accelerometer (linear accelerations) and gyroscope (angular velocities) can be found on pages 14 to 16.
For the accelerometer, there is an expected 2% cross-axis sensitivity. In the real world, I would expect even stronger coupling, as it is difficult to stimulate only one axis perfectly.
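As a rough sense of scale (assuming a 1 g stimulus on a single axis):

    # 2% cross-axis sensitivity in numbers: a 1 g (9.81 m/s^2) stimulus on
    # one axis can leak roughly 0.2 m/s^2 into the other axes' readings.
    g = 9.81            # m/s^2
    print(0.02 * g)     # ~0.196 m/s^2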
We have other customers who are using Scarlet or Ruby in SLAM applications (both use the same IMU), but unfortunately we have no insight into their software.
If you are using our ROS node, please be aware that it transforms the 3D point cloud from our coordinate system (z pointing forward) to the ROS coordinate system (z pointing upwards) if the parameter ros_coordinate_system is set to true. If you use this feature, you need to make sure that the IMU data is transformed accordingly.
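A minimal sketch of that axis remapping, assuming the usual optical convention (x right, y down, z forward) and the ROS convention (x forward, y left, z up); please verify against the node source:

    import numpy as np

    # Standard remapping from the optical convention (x right, y down,
    # z forward) to the ROS convention (x forward, y left, z up).
    R_ros_opt = np.array([[ 0,  0,  1],   # ROS x (forward) =  optical z
                          [-1,  0,  0],   # ROS y (left)    = -optical x
                          [ 0, -1,  0]])  # ROS z (up)      = -optical y

    def to_ros(v_opt):
        """Rotate a gyro or accelerometer vector into the ROS frame."""
        return R_ros_opt.dot(np.asarray(v_opt))

    # A vector pointing forward along the optical z-axis ends up on ROS x:
    print(to_ros([0.0, 0.0, 1.0]))   # -> [ 1.  0.  0.]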
Best regards,
Konstantin