[Unitree H1-2] Unitree SDK Low Level

I have several questions about the Unitree SDK (low-level control):

  1. Switching modes with MotionSwitcherV2

    How can I switch back to MFC mode (high-level) after being in low-level mode?

    I couldn’t find any examples showing how to return to high-level control once low-level is active.

  2. Mixed control of the H1-2

    Is it possible to control only the arms + hands of the H1-2 in low-level mode while keeping the lower body under high-level control?

  3. Parameter limits

    Are there known limits for the parameters dq, kp, kd, and tau?

    I only found documentation for the maximum motor angles, but nothing about these other parameters.

  4. Clarification of MotorCmd parameters

    What exactly do tau and dq do in the MotorCmd?

    I tested both on the Go2 EDU but couldn’t see a clear effect.

Hi,

The state machine chart is shown on the Unitree website; you can verify the allowed transitions there. I believe that once you go into debug (low-level) mode you have to reboot to go back to normal (high-level) mode.
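For reference, this is the pattern used in the public unitree_sdk2py low-level examples for checking and releasing the active mode before sending low-level commands. The import path, the (code, result) return convention, the mode name "normal", and the network interface are assumptions that may differ between SDK versions, and on some firmware a reboot may still be the only reliable way back to high-level control:

```python
# Sketch only: query/release the active high-level mode around a low-level
# session. Import path, return convention, mode name and interface name are
# assumptions taken from the public unitree_sdk2py examples -- verify them
# against your SDK version; a reboot may still be required on some firmware.
import time

from unitree_sdk2py.core.channel import ChannelFactoryInitialize
from unitree_sdk2py.comm.motion_switcher.motion_switcher_client import MotionSwitcherClient

ChannelFactoryInitialize(0, "enp2s0")   # replace with your network interface

msc = MotionSwitcherClient()
msc.SetTimeout(5.0)
msc.Init()

# Before sending LowCmd: make sure no high-level controller owns the joints.
code, result = msc.CheckMode()
while result.get("name"):
    msc.ReleaseMode()
    time.sleep(1.0)
    code, result = msc.CheckMode()

# ... low-level control runs here ...

# Hypothetical attempt to re-enter high-level control without a reboot.
msc.SelectMode("normal")
```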

Yes, by default you can use the arms in low-level mode while the legs remain in high-level mode.

There is no official documentation for these limits, but you can find sensible values for kp, kd, dq, and tau in Isaac Lab and in Unitree's RL deployment repositories on GitHub, such as xr_teleoperate for the H1.

On the Go2/SDK2, each joint’s MotorCmd is a PD-with-feedforward command:

  • q — desired joint position (rad)
  • dq — desired joint velocity (rad/s)
  • kp, kd — position and velocity gains
  • tau — feed-forward torque (N·m), added on top of the PD term

The motor driver computes something equivalent to:

τ_out = kp * (q_des − q_meas) + kd * (dq_des − dq_meas) + tau

So:

  • Changing dq only has an effect if kd > 0 (or if you’re explicitly running a velocity mode).
  • Changing tau biases the output torque (e.g., for gravity compensation), but it can be clipped by safety limits and will be overshadowed if kp/kd are large.

However, I cannot confirm this exactly, as Unitree's low-level drivers are closed source.
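To make the formula concrete, below is a minimal sketch of a single-joint command on the Go2, following the pattern of the public low-level example in unitree_sdk2_python. The joint index, gains, and network interface name are placeholders, and the import paths should be checked against your SDK version:

```python
# A minimal single-joint sketch on the Go2, following the pattern of the
# public go2 low-level example in unitree_sdk2_python. The joint index,
# gains and network interface are placeholders; verify the import paths
# against your SDK version, release the high-level controller first, and
# only run this with the robot secured.
import time

from unitree_sdk2py.core.channel import ChannelFactoryInitialize, ChannelPublisher
from unitree_sdk2py.idl.default import unitree_go_msg_dds__LowCmd_
from unitree_sdk2py.idl.unitree_go.msg.dds_ import LowCmd_
from unitree_sdk2py.utils.crc import CRC

ChannelFactoryInitialize(0, "enp2s0")        # replace with your interface

pub = ChannelPublisher("rt/lowcmd", LowCmd_)
pub.Init()

crc = CRC()
cmd = unitree_go_msg_dds__LowCmd_()
cmd.head[0], cmd.head[1] = 0xFE, 0xEF
cmd.level_flag = 0xFF                        # low-level command flag

JOINT = 3                                    # placeholder joint index
m = cmd.motor_cmd[JOINT]
m.mode = 0x01                                # servo (PMSM) mode
m.q = 0.0                                    # desired position [rad]
m.dq = 0.0                                   # desired velocity [rad/s]; only acts through kd
m.kp = 20.0                                  # position gain
m.kd = 0.5                                   # velocity gain
m.tau = 0.0                                  # feed-forward torque [N*m] added to the PD term

# All other joints are left with kp = kd = tau = 0, i.e. passive.
while True:
    cmd.crc = crc.Crc(cmd)                   # LowCmd must carry a valid CRC
    pub.Write(cmd)
    time.sleep(0.002)                        # ~500 Hz command loop
```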

For the H1-2:
Is it possible to access the camera from the dev board without access to the qre repo? I haven't found any ROS topic or unitree_sdk2 support for it, and the camera is not directly connected to the dev board. The website also says GStreamer streaming is not yet supported, so how can I get the video feed?
Thank you :smile:

Yes, there are two ways:

  1. Plug the Intel RealSense camera into the dev board; it should be one of the cables on the side of the robot. You can check via lsusb while plugging and unplugging it to find the correct one. Once you have it, you can just use the Intel RealSense library to view the image and depth streams (see the sketch after this list).

  2. The front video stream topic should contain the video stream coming from PC1, but we have noticed that on different H1s some are active and some aren’t, so option 1 is the best way to go about it.
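If you go with option 1, a minimal pyrealsense2 sketch like the one below is usually enough to confirm the color and depth streams; the resolutions and frame rates here are just common defaults, not H1-2-specific values:

```python
# Minimal sketch for reading the RealSense color/depth streams directly with
# pyrealsense2 once the camera is plugged into the dev board (option 1 above).
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
pipeline.start(config)

try:
    while True:
        frames = pipeline.wait_for_frames()
        depth = frames.get_depth_frame()
        color = frames.get_color_frame()
        if not depth or not color:
            continue
        # Example query: distance (in meters) at the image center.
        print("center depth [m]:", depth.get_distance(320, 240))
finally:
    pipeline.stop()
```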

We are currently working on migrating our control stack from Isaac Sim to the real H1-2 hardware using ROS2. We have run into a few issues with the low-level API, but right now we are facing a critical error that has stopped our progress.

  1. Elbow Joint Noise/Jitter after LowCmd: We recently tested the LowCmd interface to control the elbow pitch motor. Since that test, when we switch back to running the standard unitree_sdk2 examples, there is a persistent “buzzing” noise coming from those specific joints. It sounds like control-loop jitter or a frequency mismatch. Could this be a lingering issue from setting a specific mode via the LowCmd API? Does the motor state need to be explicitly reset to clear this behavior?
  2. Wrist Control and /arm_sdk Limitations: We noticed that the /arm_sdk interface does not seem to expose a direct way to control wrist movements. We need to control the hand/end-effector without triggering the high-level lower-limb mode (normal walking mode). What is the recommended workflow for actuating the wrists while keeping the legs in a passive or standing state via the SDK?
  3. Direct Devboard Connection: Is there any way to connect to the devboard on the robot using a monitor, keyboard, and mouse? So far we have only managed to connect over Ethernet and a Wi-Fi dongle.
  4. [BLOCKER] Calibration & “Transient Over Voltage” Error: How do we properly calibrate the robot? The video tutorial in the app is quite complicated and doesn’t fully explain exactly what needs to be done to ensure 100% accurate calibration.
    After we attempted the calibration, our robot’s motors went offline with the following error: firmware error: joint limit transient over voltage

We currently cannot do any development because of this. How can we fix this error to get the motors back online?

Thank you so much for your support!

You can use the robot in debug mode if you need the other limbs in a passive state.

It's better to install a VNC server of your choice on the robot for GUI access.

Perhaps you can add a recording, as we have not experienced this. Most likely it's a change in parameters; you may have to experiment or contact Unitree about this.

@Azib could you please assist with what could be wrong in the calibration step?

We have managed to fix the firmware error; it was indeed a calibration error. But we are still struggling with the arm_sdk. We wrote an interface to control the robot arm with MoveIt in ROS2. It works, and every joint is very smooth apart from the elbow joint, which is quite jittery.
Thank you so much, as there are very limited resources on the internet :blush: