RGBD and 2d lidar fusion #1261
If you use that configuration, it is explained in this paper:
So in your case it is a two-step process:
The robot can also be localized by lidar alone using proximity detection (described in that paper). Fusing two global localization approaches (not odometry) is more complicated, as they would fight each other from their respective map frames.
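The spatial gating step of proximity detection mentioned above can be sketched as follows. This is a simplified illustration, not RTAB-Map's actual code: `proximity_candidates` is a hypothetical helper, and the real implementation also gates on time/graph distance before running scan ICP against the candidates.

```python
import numpy as np

def proximity_candidates(current_pose, graph_poses, radius=1.0):
    """Return ids of past graph nodes within `radius` metres of the
    current pose -- the spatial gating step of proximity detection.
    (Illustrative only; the retained candidates would then be verified
    by registering the current lidar scan against their scans.)"""
    cur = np.asarray(current_pose[:2])
    ids = []
    for node_id, pose in graph_poses.items():
        if np.linalg.norm(np.asarray(pose[:2]) - cur) <= radius:
            ids.append(node_id)
    return sorted(ids)

# Toy pose graph: node id -> (x, y) position in the map frame.
graph = {1: (0.0, 0.0), 2: (0.4, 0.3), 3: (5.0, 5.0)}
proximity_candidates((0.2, 0.1), graph)  # nodes 1 and 2 are nearby
```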
Hello, I have a small question about the principle; I hope you can help me. If I use an RGB-D camera and a 2D lidar as sensors with RTAB-Map, how does RTAB-Map fuse the two types of data? Does it perform fusion for localization, and if so, what method does it use? Is it extended Kalman filtering? If I want to further improve localization accuracy by combining RTAB-Map with other lidar SLAM systems through a UKF, is that feasible?
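For intuition on Kalman-style fusion of two pose estimates, the core operation is a covariance-weighted combination (the information-filter form of the Kalman update). This is a generic sketch, not RTAB-Map's internal method; `fuse_estimates` is a hypothetical helper, and it assumes the two estimates are independent Gaussians expressed in the same frame, which is exactly what two separate global localizers do not guarantee:

```python
import numpy as np

def fuse_estimates(x1, P1, x2, P2):
    """Fuse two independent Gaussian estimates (mean x, covariance P)
    via the information-filter form: weight each mean by its inverse
    covariance, then normalize."""
    I1 = np.linalg.inv(P1)
    I2 = np.linalg.inv(P2)
    P = np.linalg.inv(I1 + I2)          # fused covariance
    x = P @ (I1 @ x1 + I2 @ x2)         # covariance-weighted mean
    return x, P

# Visual estimate: precise in x, noisy in y; lidar estimate: the reverse.
x_vis, P_vis = np.array([1.0, 2.0]), np.diag([0.01, 1.0])
x_lid, P_lid = np.array([1.2, 2.1]), np.diag([1.0, 0.01])
x_fused, P_fused = fuse_estimates(x_vis, P_vis, x_lid, P_lid)
# Each axis of the fused mean leans toward the more certain sensor.
```

The "fighting" the answer warns about shows up here: if the two inputs are full global localizations anchored to different map frames, their means disagree systematically, and this weighted average just splits the difference instead of converging.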