[Unitree Go1] Safety Low-level Robot Suspension Question in LAN mode

Dear @Sohail,

In your GO1 documentation, you have mentioned this:

What do you mean by “suspend the robot”? Does it mean we must first use the GO1 bars to build the safety frame and then hang the robot from it?

Best regards,
Amir Mahdi

Dear @Amir_Mahdi,

By suspend, we mean the setup shown at the end of the page. You can of course use the robot without suspending it, but there is a risk of damaging it if wrong values are sent.

Regards,
S. O. Sohail

Dear @Sohail

Thank you for your email. I checked the link, and there was no tutorial for installing the rack used to suspend the robot. Is there any source explaining how to install the rack correctly?

I have another question. I want access to the value of “end-effector joint force sensing.” Is it accessible in low-level mode?

Regards,
Amir Mahdi

Dear @Amir_Mahdi,

I believe we built the rack ourselves. Unitree also offers one as a product, I believe. Otherwise, you can construct a suspension rig yourself.

> I have another question. I want access to the value of “end-effector joint force sensing.” Is it accessible in low-level mode?

This feature was discontinued in most of the latest GO1 units due to unreliability and false readings. I am not sure if your unit has it.

Dear @Sohail

Thank you for your reply. I checked, and our robot has foot-force sensing.

In your packages, do you have any package for accessing this value?
If you haven’t developed any support for this, can you guide me on how I can access this value?

If I recall correctly, the data should be coming in on one of the topics when the driver is running, provided the data is available.

Dear @Sohail

Thank you for your reply.

Two short questions:

  1. Is this data available through high-level mode?
  2. In your software packages, has your team developed support for publishing this data on a topic, or should it be available through the Unitree packages?

Regards,
Amir Mahdi

Dear @Amir,

  1. Yes
  2. Both

Regards,
S. O. Sohail

Dear @Sohail

Thank you for your reply.

If your team has developed packages for accessing this data, how can I get it? Should it be available by running roslaunch go1_bringup ...? Or should I roslaunch another package?

Best regards,
Amir Mahdi

When you launch the high_level or low_level driver from go1_bringup, one of the published topics should contain the requested information. Once the driver is launched, execute:

rostopic list

and the information should appear in one of the topics. To check a topic's data stream, you can echo it via:

rostopic echo /topic_name

Additionally, you can check to see if foot force is mentioned via

rostopic info /topic_name
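
If it helps, here is a small Python sketch (not part of our packages, just a quick way to automate the same search) that goes through the currently published topics and flags any whose message fields mention "foot" or "force". It only needs a running ROS master with the driver launched; the search terms are an assumption on my part.

# find_force_topics.py: scan published topics for foot/force related fields.
# Run while the go1 driver is up; the message packages must be installed locally.
import rospy
import roslib.message

def find_force_topics():
    for topic, msg_type in rospy.get_published_topics():
        msg_class = roslib.message.get_message_class(msg_type)
        if msg_class is None:
            continue  # message definition not available on this machine
        fields = [f for f in msg_class.__slots__
                  if 'foot' in f.lower() or 'force' in f.lower()]
        if fields:
            print('%s (%s): %s' % (topic, msg_type, ', '.join(fields)))

if __name__ == '__main__':
    find_force_topics()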

Dear @Sohail

I only see the following topics:

/cmd_odom
/cmd_vel
/cmd_vel_2
/diagnostics
/e_stop
/go1_controller/cmd_vel
/go1_controller/imu/data
/go1_controller/odom
/go1_controller/state
/joint_states
/joy_teleop/cmd_vel
/joy_teleop/joy
/joy_teleop/joy/set_feedback
/lcm_node/obs_env
/lcm_node/ultrasonic_env
/motor_states
/move_base_simple/goal
/pointcloud_process/ground_pointcloud
/range_front
/range_left
/range_right
/range_ultrasonic_face
/range_ultrasonic_left
/range_ultrasonic_right
/ros2udp/odom
/ros2udp_motion_mode_adv/joystick
/rosout
/rosout_agg
/tf
/tf_static
/twist_marker_server/cmd_vel
/twist_marker_server/feedback
/twist_marker_server/update
/twist_marker_server/update_full
/ukd_triple/pose
/ukd_triple/state
/ukd_triple_2_goal/path_tag_line
/ukd_triple_2_goal/path_tag_window

Does any of them explicitly mention /force_sensing?

It would be either in /go1_controller/state or in /diagnostics.
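
Once you know which topic carries it, a subscriber is straightforward. Below is only a rough sketch under the assumption that the state topic publishes a message with a four-element footForce array (for example unitree_legged_msgs/HighState); please verify the actual message type with rostopic info and rosmsg show before relying on it.

# foot_force_listener.py: hypothetical subscriber; topic name and message type are assumptions.
import rospy
from unitree_legged_msgs.msg import HighState  # assumed message type, check yours

LEGS = ['FR', 'FL', 'RR', 'RL']  # typical Unitree leg ordering (assumed)

def state_callback(msg):
    # msg.footForce holds one raw force reading per foot
    readings = ', '.join('%s=%d' % (leg, f) for leg, f in zip(LEGS, msg.footForce))
    rospy.loginfo('Foot forces: %s' % readings)

if __name__ == '__main__':
    rospy.init_node('foot_force_listener')
    rospy.Subscriber('/go1_controller/state', HighState, state_callback)
    rospy.spin()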

Dear @Sohail

I appreciate your help :slightly_smiling_face:

As you mentioned, the force sensing needs to be calibrated to get a fair approximation. May I ask if you know of any approach for calibration?

Here is one approach I was thinking about: whenever I want to check the force sensing data, I could suspend the robot in the rack (off the ground) and use rosbag record to save the data. Then I would calculate the average value for each leg (call this the drift or noise) and subtract it from the actual data while the robot is on the ground. This way we would get an approximation of the true force sensing (a rough code sketch follows after the formula below).

Corrected Force = Raw Force − Average Drift

In this formula:

  • Raw Force is the force sensing data when the robot is on the ground.
  • Average Drift is the average value calculated for each leg while holding the robot in the rack.
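
Here is the rough sketch I mentioned, just to make the subtraction concrete. It reuses the same assumptions as before (topic name, footForce field); the drift numbers are placeholders that would come from averaging a rosbag recorded while the robot hangs in the rack.

# foot_force_calibration.py: rough sketch only, with placeholder drift values.
import rospy
from unitree_legged_msgs.msg import HighState  # assumed message type

# Per-leg averages computed offline from a rosbag recorded while suspended.
# These numbers are placeholders; replace them with your own measurements.
AVERAGE_DRIFT = [0.0, 0.0, 0.0, 0.0]  # FR, FL, RR, RL

def state_callback(msg):
    # Corrected Force = Raw Force - Average Drift, applied per leg
    corrected = [raw - drift for raw, drift in zip(msg.footForce, AVERAGE_DRIFT)]
    rospy.loginfo('Corrected foot forces: %s' % corrected)

if __name__ == '__main__':
    rospy.init_node('foot_force_calibration')
    rospy.Subscriber('/go1_controller/state', HighState, state_callback)
    rospy.spin()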

Dear @Amir_Mahdi,

We have not worked much with the foot sensors. The proposed approach seems sensible, and it might work.