ROS MoveIt Servo with Kinova Robot Arm

March 12, 2024

Author : Arthur Gomes

Read time: 5 mins

Robot arms are used in a wide range of applications, each differing in the type of action or motion performed by the robot’s end-effector. Applications involving pick-and-place tasks are essentially point-to-point navigation and do not necessitate a specific path during the transition between the start and goal states. However, some applications are more dynamic and require the end-effector to respond to a rapidly changing control signal in a more “real-time” way. Examples include gesture-controlled robots in mixed-reality applications, surgical robotics, remote control of a robotic arm in hazardous situations, and many more.

In such situations, servoing is used to great effect.

Servoing is simple in concept: the end-effector of the robot arm responds to incremental changes in spatial pose, directed either through velocities or through a constantly changing target pose to track. Such motion is achieved with velocity mapping via an inverse Jacobian. The Jacobian of a robotic arm relates the joint velocities (joint space) to the end-effector velocities (Cartesian space). There is no trajectory planning involved in this control scheme, which boosts computational speed.

End-Effector Velocities = Jacobian × Joint Velocities
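For concreteness, here is a minimal sketch of this velocity mapping in Python. The Jacobian below is a placeholder; in practice, the Jacobian at the current joint configuration comes from the robot's kinematics library (for example, KDL or MoveIt).

```python
import numpy as np

# Placeholder 6x6 Jacobian for a 6-DoF arm at its current configuration.
# In a real system this comes from the robot's kinematics library.
J = np.random.rand(6, 6)

# Forward mapping: joint velocities (rad/s) -> end-effector twist (m/s, rad/s).
q_dot = np.array([0.1, 0.0, -0.05, 0.0, 0.2, 0.0])
x_dot = J @ q_dot

# Inverse mapping used in servoing: desired end-effector twist -> joint velocities.
# The pseudo-inverse handles non-square or near-singular Jacobians more gracefully.
x_dot_desired = np.array([0.05, 0.0, 0.0, 0.0, 0.0, 0.0])  # move 5 cm/s along x
q_dot_cmd = np.linalg.pinv(J) @ x_dot_desired
print(q_dot_cmd)
```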

MoveIt Servo

Most robotic platforms such as Universal Robots, Kinova, and UFactory come equipped with servoing capability out of the box. However, these default servoing stacks lack one critical feature: environment awareness.

Environment awareness involves detecting the presence of external obstacles to avoid collisions. This function relies on perception and requires additional sensors such as depth cameras or LiDARs. MoveIt Servo is a ROS package that offers environment-aware servoing, preventing collisions while maintaining the low-latency control that servoing applications require.

Figure: System architecture involving MoveIt Servo.

Figure: MoveIt allows integration of perception data for obstacle avoidance (planning_scene).

Setting Up MoveIt Servo

One can get started with MoveIt Servo by referencing the official tutorial. It is a hands-on demonstration using ROS Noetic and a UR5 arm simulated in Gazebo Classic. However, it does not work out of the box and requires some changes to the launch files, the configuration YAML, and the README instructions.

At a conceptual level, configuring MoveIt Servo requires a suitable ROS interface to the robot driver (be it position, velocity, or effort), a depth stream, configuring the move_group node to use that depth stream and publish a planning scene, and configuring the MoveIt Servo node to listen to that planning scene, the joint_states from the robot, and user-provided inputs. The move_group setup can be skipped if obstacle detection is not needed. Here is a more detailed breakdown of these steps:

  1. Setting up move_group: MoveIt offers a GUI to quickly generate the configuration data required for both the main pipeline and the servo node. The move_group pipeline processes sensor data to inform MoveIt Servo of obstacles in the environment. A robotic project will likely require MoveIt’s point-to-point motion planning pipeline for ancillary tasks anyway, even if the primary goal of the project is servoing.
  2. Prepare Configuration Data: The MoveIt Servo node shares some ROS parameters with the MoveIt pipeline and adds some servo-specific parameters; both sets need to be set correctly. The required data includes the robot description, the robot semantic description, kinematic solver settings, and the servo parameters. The MoveIt Setup Assistant will generate the first three automatically; the fourth has to be written as per this example.
  3. Run the MoveIt Servo Node: The moveit_servo ROS package contains an executable, servo_server. This executable is a ROS node that takes input spatial-velocity or joint-velocity commands from one topic interface and outputs actuator commands over another. System feedback is provided by the joint state topic.
  4. Give Commands: Twist commands must be published at a constant rate to the servo node over the appropriate ROS topic, and they must be time-stamped for the servo node to process them reliably (see the sketch after this list).
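Below is a minimal sketch of such a command publisher for ROS Noetic. The topic name, publishing rate, and command frame are assumptions; they depend on the servo YAML (for example, the cartesian_command_in_topic parameter) and on the robot configuration.

```python
#!/usr/bin/env python3
# Minimal sketch: publish time-stamped TwistStamped commands to MoveIt Servo at a fixed rate.
import rospy
from geometry_msgs.msg import TwistStamped

rospy.init_node("servo_twist_publisher")
# Assumed topic; matches the default cartesian_command_in_topic of a node named servo_server.
pub = rospy.Publisher("/servo_server/delta_twist_cmds", TwistStamped, queue_size=1)

rate = rospy.Rate(100)  # publish at a constant rate, e.g. 100 Hz
while not rospy.is_shutdown():
    msg = TwistStamped()
    msg.header.stamp = rospy.Time.now()   # the servo node needs a valid time stamp
    msg.header.frame_id = "base_link"     # assumed command frame
    msg.twist.linear.x = 0.05             # drift the end-effector at 5 cm/s along x
    pub.publish(msg)
    rate.sleep()
```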

Gesture-Controlled Robot with MoveIt Servo

We implemented MoveIt Servo on a Kinova Gen3 robot arm, both in simulation and on real hardware. We used MediaPipe to implement a gesture recognition deep learning model that identifies the position of the moving hand in the image frame. We map these pixels into real-world coordinates using appropriate scaling and feed this information to MoveIt Servo. As a result, the robot’s end-effector follows the user’s hand. Collision detection is enabled with an external depth camera.
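As an illustration, here is a simplified sketch of that pipeline, with MediaPipe's hand tracking reduced to the wrist landmark. The gain, topic name, frame, and camera index are illustrative assumptions rather than our exact implementation.

```python
#!/usr/bin/env python3
# Simplified sketch: hand position in the image -> proportional twist command for MoveIt Servo.
import cv2
import mediapipe as mp
import rospy
from geometry_msgs.msg import TwistStamped

GAIN = 0.5  # hypothetical scaling from normalized image offset to m/s

rospy.init_node("hand_servo_commander")
pub = rospy.Publisher("/servo_server/delta_twist_cmds", TwistStamped, queue_size=1)

hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)
rate = rospy.Rate(30)

while not rospy.is_shutdown():
    ok, frame = cap.read()
    if not ok:
        continue
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    cmd = TwistStamped()
    cmd.header.stamp = rospy.Time.now()
    cmd.header.frame_id = "base_link"  # assumed command frame
    if result.multi_hand_landmarks:
        # Wrist landmark, normalized to [0, 1] in the image frame.
        wrist = result.multi_hand_landmarks[0].landmark[0]
        # Offset from the image center becomes a proportional velocity command.
        cmd.twist.linear.y = GAIN * (0.5 - wrist.x)
        cmd.twist.linear.z = GAIN * (0.5 - wrist.y)
    pub.publish(cmd)  # publishes zero twist when no hand is detected
    rate.sleep()
```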

Some Challenges with MoveIt Servo

MoveIt Servo, while effective, has its fair share of shortcomings and pitfalls. Over the course of development and extensive testing with MoveIt Servo, we identified some pointers for achieving the best results:

  1. Works best with velocity control: We found the behavior of the position-controller mode to be shaky, and we saw some instances of the robot arm drooping downward. The servo node outputs joint positions, which resulted in this downward-drooping motion when used with effort-based joint controllers. Our preliminary efforts to debug the position controller (and the simulation) were unsuccessful. As a simple workaround, we found the response from the velocity controller to be more reliable, both in simulation and on the physical robot.
  2. Avoid starting from singularity positions: A typical robot home configuration is fully extended, either horizontally or vertically. Starting a servoing process at either of these configurations will result in a singularity. A singularity is a robot configuration that constrains the robot’s ability to produce end-effector velocity in one or more axes. Simply put, singularities usually occur at the edges of an arm’s work envelope. Be sure to move the arm into a non-singular “ready” position using a point-to-point planner like MoveIt before issuing servoing commands (see the sketch below).
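A minimal sketch of that last point, assuming a MoveIt-configured arm with a planning group named "arm" and a non-singular named state called "ready" defined in the SRDF (both names are assumptions):

```python
#!/usr/bin/env python3
# Minimal sketch: use MoveIt's point-to-point planner to reach a non-singular
# "ready" pose before starting to servo.
import sys
import rospy
import moveit_commander

rospy.init_node("move_to_ready")
moveit_commander.roscpp_initialize(sys.argv)

group = moveit_commander.MoveGroupCommander("arm")  # assumed planning group name
group.set_named_target("ready")  # assumed non-singular named state from the SRDF
group.go(wait=True)             # plan and execute a point-to-point motion
group.stop()
```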

Conclusion

Considering the ubiquity of MoveIt in the articulated-arm sphere, MoveIt Servo is a useful yet lesser-known add-on. The framework makes it possible to deploy servoing on robotic manipulators with ease. At the same time, it takes some understanding of ROS/ROS 2, MoveIt, ROS controllers, simulation setups, and robotic arms to fully leverage the package. There is also room to incorporate more assistive teleoperation functionality, where the robot arm respects the user’s commands while safely avoiding obstacles. If you’re looking to build intelligent, perception-integrated solutions for robotic arms, and your requirements are more than what plug-and-play open source provides, reach out to us!
