Smoother linear motion with xArm 7 + Leap Motion

Hi there,

For context, I’m aware that Leap Motion was supported for uArm previously although I’m not very sure of its application.

However, we are currently trying to build a prototype that uses Leap Motion to control the xArm’s movement. Basically, we are trying to get the xArm to imitate a person’s hand motion: simply track the hand and translate its position to an X, Y, Z position for the xArm.

Is anyone working on something similar? Or does the xArm team have any recommendation on how to approach this?

Right now we are purely using set_position to set the Cartesian coordinates of the xArm whenever Leap Motion outputs a hand position. But because Leap Motion outputs about 120 frames per second, the xArm can’t keep up: it moves and stops from one set_position to the next, and I think we may have exceeded the number of commands the controller can cache.
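
For reference, this is roughly what our loop looks like right now (get_hand_position() is just a stand-in for our Leap Motion callback, and the IP address is a placeholder):

    from xarm.wrapper import XArmAPI

    arm = XArmAPI('192.168.1.xxx')   # controller IP placeholder
    arm.motion_enable(enable=True)
    arm.set_mode(0)                  # position (Cartesian) control mode
    arm.set_state(0)

    def get_hand_position():
        # stand-in for the Leap Motion callback (~120 frames per second);
        # returns a palm position already mapped to xArm workspace coordinates (mm)
        return 300, 0, 200

    while True:
        x, y, z = get_hand_position()
        # one Cartesian command per frame; the arm decelerates to a stop between
        # commands, which is exactly the start/stop motion we are seeing
        arm.set_position(x=x, y=y, z=z, speed=200, wait=False)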

I’m considering using move_arc_lines and specifying a path list instead.

Please advise! Thank you.

Yes, use move_arc_lines and set wait=False to let the controller do the path planning.

I’m trying it now, setting first_pause_time to zero. Can you help clarify what it does?

That parameter sets a time duration for the xArm controller to do the planning; you could try 0.1 s first.
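
Something like this (the waypoint values below are made up; check the parameter names against the Python SDK):

    from xarm.wrapper import XArmAPI

    arm = XArmAPI('192.168.1.xxx')   # controller IP placeholder
    arm.motion_enable(enable=True)
    arm.set_mode(0)
    arm.set_state(0)

    # each waypoint is [x, y, z, roll, pitch, yaw] in mm/degrees -- made-up values
    paths = [
        [300,   0, 200, 180, 0, 0],
        [300,  50, 220, 180, 0, 0],
        [300, 100, 200, 180, 0, 0],
    ]

    # wait=False returns immediately and leaves blending of the waypoints to the
    # controller; first_pause_time (seconds) gives it time to plan before moving
    arm.move_arc_lines(paths, speed=200, first_pause_time=0.1, wait=False)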

Hey Daniel,

I’ve tried move_arc_lines. It does interpolate and smooth the movements across the multiple paths passed in, but it needs to calibrate first.

With this, we can’t imitate an actual person’s hand motion in real time.

This is being done with the Python SDK. Would the ROS SDK be able to help us achieve what we want to do?

Don’t know if this will help you …

In my experience, the only way to make the xArm6 robot respond quickly (> 100Hz) is:

  1. Turn reporting off. When reporting is on, the robot controller is sending
    updates back to your application at about 10 Hz. This overhead can screw
    up timing severely.

  2. For very smooth motion, you need to use “servoj” commands at a high rate.
    In our experience, around 40 Hz starts looking smooth. The problem is that
    servoj commands want angles not xyz/rpy coordinates so IK (inverse kinematics)
    is up to you.

We have experimented with around 40Hz in house using a dedicated network and precomputed
IK. In other words, we precompute angles into a file and feed the robot with servoj angle commands at
about 40Hz.
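
To give you an idea, here is a rough sketch of that streaming loop, assuming "servoj" maps to set_servo_angle_j in the Python SDK (joint servo mode 1) and that the precomputed angles live in a hypothetical angles.csv with one row of joint angles (radians) per time step:

    import csv
    import time
    from xarm.wrapper import XArmAPI

    RATE_HZ = 40                     # command rate that looked smooth for us
    PERIOD = 1.0 / RATE_HZ

    arm = XArmAPI('192.168.1.xxx')   # controller IP placeholder
    arm.motion_enable(enable=True)
    arm.set_mode(1)                  # joint servo mode: stream joint commands
    arm.set_state(0)

    # hypothetical file of precomputed IK results, one row per time step
    with open('angles.csv') as f:
        trajectory = [[float(v) for v in row] for row in csv.reader(f)]

    next_t = time.time()
    for angles in trajectory:
        arm.set_servo_angle_j(angles, is_radian=True)
        next_t += PERIOD
        time.sleep(max(0.0, next_t - time.time()))   # hold a steady ~40 Hz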

In your case, your real-time needs mean you can’t precompute angles; you have to do it on the fly.
That probably means you have to use a fast IK implementation such as IKFAST (see the OpenRAVE site, http://openrave.org/docs/0.8.2/openravepy/ikfast/) so you can generate the appropriate angles
and feed them to the robot at a fast enough rate. I haven’t tried it though so don’t know how much
of a pain that might be.

CAVEAT EMPTOR: I’m no expert at this robot so perhaps UFactory can give you better advice! Just thought it would
be useful to you to hear what others are experiencing.

Good luck,

Andre

Not sure if this will work, but you could try ROS.

You could use set_mode(1) and set_servo_angle_j. You can use PyBullet for inverse kinematics (under 0.3 milliseconds) and visualization, and then send the joint commands to the xArm 6 (it should be similar for xArm 7; just replace the URDF file). Here is some example code that computes IK on the fly: xArm-Python-SDK/example/wrapper/xarm6 at master · erwincoumans/xArm-Python-SDK · GitHub
Here is a video of PyBullet IK in sim and on the real xArm: real xArm6 and PyBullet xArm visualizer and inverse kinematics - YouTube
(I briefly tried ikfast, but it was not reliable; it often returned no solutions.)
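
Roughly, the loop looks like this (the URDF path assumes the xArm 6 model bundled with pybullet_data, the end-effector link index is a guess you should verify, and the target pose is made up):

    import math
    import pybullet as p
    import pybullet_data
    from xarm.wrapper import XArmAPI

    # headless physics client used only for IK
    p.connect(p.DIRECT)
    p.setAdditionalSearchPath(pybullet_data.getDataPath())
    # xArm 6 model shipped with pybullet_data; swap in your own URDF for xArm 7
    robot = p.loadURDF("xarm/xarm6_robot.urdf", useFixedBase=True)
    eef_link = p.getNumJoints(robot) - 1   # assumption: last link is the tool frame

    arm = XArmAPI('192.168.1.xxx')         # controller IP placeholder
    arm.motion_enable(enable=True)
    arm.set_mode(1)                        # joint servo mode for set_servo_angle_j
    arm.set_state(0)

    def track(target_pos, target_orn):
        # on-the-fly IK, well under a millisecond per call
        angles = p.calculateInverseKinematics(
            robot, eef_link, target_pos, target_orn,
            maxNumIterations=50, residualThreshold=1e-4)
        arm.set_servo_angle_j(list(angles[:6]), is_radian=True)

    # made-up target: 40 cm forward, 30 cm up, tool pointing down
    track([0.4, 0.0, 0.3], p.getQuaternionFromEuler([math.pi, 0, 0]))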


Did you try PyBullet on xArm 7?

Yes, I just created a URDF file for xArm 7 (with optimized collision meshes) and inverse kinematics support:
xarm 7 using PyBullet inverse kinematics and dynamics simulation - YouTube and
xArm-Python-SDK/example/wrapper/xarm7 at master · erwincoumans/xArm-Python-SDK · GitHub
Accurate IK takes about 280 microseconds (0.28 milliseconds) for 50 iterations, using the null space to keep the arm upright.

You can try it out on Linux, Mac OSX and Windows using:
pip3 install pybullet
git clone https://github.com/erwincoumans/xArm-Python-SDK
cd xArm-Python-SDK/example/wrapper/xarm7
python3 loadxarm_sim.py
I didn’t try running it on a real xArm 7 (I don’t have one), but you could try it like this:
python3 xarm_real_ik.py

Thanks, let us try on xArm 7 and get back to you.

Please do so and report back so we can improve things if needed. Note that you can set useNullSpace to 0 in xarm_sim.py for higher precision, but without the null space the arm may eventually drift down.
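
For reference, the null-space option just passes joint limits and rest poses to PyBullet’s calculateInverseKinematics, roughly like this (the limit values below are placeholders, not the real xArm 7 limits):

    import math
    import pybullet as p

    # placeholder limits and rest pose for a 7-DoF arm -- take real values from the URDF
    lower_limits = [-2 * math.pi] * 7
    upper_limits = [ 2 * math.pi] * 7
    joint_ranges = [ 4 * math.pi] * 7
    rest_poses   = [0, 0, 0, 0, 0, 0, 0]   # posture the solver is biased towards

    def ik_with_null_space(robot, eef_link, target_pos, target_orn):
        # biases the solution towards rest_poses, which keeps the arm from slowly
        # drifting into a low, stretched-out configuration
        return p.calculateInverseKinematics(
            robot, eef_link, target_pos, target_orn,
            lowerLimits=lower_limits, upperLimits=upper_limits,
            jointRanges=joint_ranges, restPoses=rest_poses,
            maxNumIterations=50)

    def ik_without_null_space(robot, eef_link, target_pos, target_orn):
        # the useNullSpace = 0 case: usually a bit more precise at the target,
        # but the posture is unconstrained and may drift over time
        return p.calculateInverseKinematics(
            robot, eef_link, target_pos, target_orn, maxNumIterations=50)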

I ran the scripts on my PC connected to an xArm controller and a real xArm 7 robot, and it did work.
But the motion is a bit odd: the arm moves around 2-3 cm, then stops, then moves another 2-3 cm and stops again. Each stop is short, but it makes the robot look like it is shaking during the movement, and we cannot get the continuous motion that the real-time 3D model shows.
I think the biggest problem should be the communication delay between my PC and the xArm controller.
I will try to run the scripts on the controller directly, but I don’t think that is the best way: it would do the calculation in Python on Gentoo Linux (the system the xArm controller uses) and then pass the commands to the firmware, which sends them to the xArm robot, so there would still be a delay in handing the commands from Python to the firmware.
So the best way might be to integrate the calculation part into the firmware, which is based on C++. That means we need to make these scripts run on a C++ platform. The scripts are based on Python and Python libraries such as NumPy, and we are trying to figure out a way to run this in C++ but have no idea yet.
Any advice would be appreciated.

Thanks

Could you please share a video showing the motion on the real xArm 7? Perhaps your PC has slow graphics? You could try replacing p.connect(p.GUI) with p.connect(p.DIRECT).
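
Only the connection call changes, for example:

    import pybullet as p

    # p.GUI opens the OpenGL visualizer (can be slow in a VM without 3D acceleration);
    # p.DIRECT runs headless, which is enough when you only need the IK results
    cid = p.connect(p.DIRECT)   # instead of p.connect(p.GUI)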

I didn’t notice the delay on our xArm 6, using Windows/Python 3.

If I have time, I can create a quick C++ example using the cmake build system. PyBullet is all C++ with Python bindings.

Got it. I used a Linux virtual machine that I had not assigned many resources to; that may be one of the reasons. Let me try my Windows PC directly and post a video then.
Thanks very much.

It works! Let me share the video with you by email.