Hello. I am currently working on using the 4D Lidar L1, which is pre-installed on the lower part of the head of the Unitree Go2 EDU, to perform SLAM and measurements.
I have a question regarding this: while the Go2 is in operation, its localization becomes inaccurate and the map ends up distorted.
Could you suggest any ways to move the robot or any methods that could help achieve more accurate measurements?
Additionally, I have never used the calibration feature in the Go2 app, so if that is a useful tool, I would appreciate it if you could guide me on how to use it.
If you are using our SLAM or localization stack, I would suggest tuning the AMCL parameters in the nav2 params file; if you are using a custom one, I would suggest the same (see the sketch below for the parameters I would look at first).
I would also recommend trying other solutions such as LIO-SAM, FAST-LIO, or perhaps a VSLAM approach.
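As a rough sketch of where to start, here is a minimal ROS 2 Python launch file that brings up the Nav2 AMCL node with the parameters I would tune first. The values shown are illustrative starting points rather than Go2-specific numbers, and the node name (`amcl`) and scan topic (`scan`) are just the standard Nav2 defaults; in a normal bringup these same keys live under the `amcl:` section of your `nav2_params.yaml` and the node is managed by the Nav2 lifecycle manager together with a map server.

```python
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    # Illustrative AMCL settings; in a full Nav2 bringup these keys normally
    # live under the "amcl:" section of nav2_params.yaml.
    amcl_params = {
        'use_sim_time': False,
        # Odometry noise model: a walking robot slips more than a wheeled base,
        # so the alpha values typically need to be larger than the defaults.
        'alpha1': 0.3,  # rotation noise caused by rotation
        'alpha2': 0.3,  # rotation noise caused by translation
        'alpha3': 0.3,  # translation noise caused by translation
        'alpha4': 0.3,  # translation noise caused by rotation
        # Filter update thresholds: update more frequently while moving.
        'update_min_d': 0.1,   # metres travelled between filter updates
        'update_min_a': 0.1,   # radians rotated between filter updates
        # Particle count: more particles are more robust but cost more CPU.
        'min_particles': 1000,
        'max_particles': 5000,
        'laser_model_type': 'likelihood_field',
        'scan_topic': 'scan',
    }

    return LaunchDescription([
        # Note: nav2_amcl is a lifecycle node; in practice it is activated by
        # the Nav2 lifecycle manager and needs a map server providing /map.
        Node(
            package='nav2_amcl',
            executable='amcl',
            name='amcl',
            output='screen',
            parameters=[amcl_params],
        ),
    ])
```

Raising the alpha values and lowering the update thresholds is usually where I would start for a legged platform; the particle counts can then be increased until localization is stable within your CPU budget.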
Thank you for your response. I am currently working from other documentation, so I plan to look into the relevant parameters.
Also, I would like to clarify something: regarding the qre_go2 repository that I asked about the other day, would it be difficult for me to get access? I would appreciate it if you could confirm this.
To my knowledge, the Go2 app has two calibration functions, leg calibration and IMU calibration, neither of which is likely to help with the problem you are facing.
The qre_go2 repository is exclusively for customers who have bought the robot from us. If you have purchased the robot from MyBotShop, please send your purchase ID together with your GitHub username to support@mybotshop.de to get access to the repository.
It seems that my question was slightly off. SLAM is not actually involved at the moment; I am focusing on eliminating distortion in the point cloud data itself. For example, do you have any solutions for cases where a room appears tilted or an object (a chair) is rendered multiple times? I understand that errors can accumulate when the robot walks for a long time, but these misalignments occur even after walking just one loop around the chair.