For context, I’m aware that Leap Motion was previously supported for the uArm, although I’m not sure how it was applied.
However, we are currently trying to build a prototype that uses Leap Motion to control the xArm’s movement. Basically, we want the xArm to imitate a person’s hand motion: track the hand and translate its position into X/Y/Z positions for the xArm.
Is anyone working on something similar? Or does UFactory have any recommendation on how to approach this?
Right now we simply call set_position with the Cartesian coordinates of the xArm every time Leap Motion outputs a hand position. But because Leap Motion outputs about 120 frames per second, the xArm can’t keep up: it moves and stops from one set_position to the next, and I think we may have exceeded the number of commands the controller can cache.
I’m considering using move_arc_lines with a specified path list instead.
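For reference, here is a stripped-down sketch of our loop with one mitigation we are considering, throttling the frames we forward to the arm (the Leap Motion callback shape and the controller IP are placeholders; set_position with wait=False is the xArm-Python-SDK call we already use):

import time
from xarm.wrapper import XArmAPI

arm = XArmAPI('192.168.1.100')  # placeholder controller IP
arm.motion_enable(enable=True)
arm.set_mode(0)  # position mode
arm.set_state(0)

MIN_PERIOD = 0.1  # throttle to ~10 Hz instead of pushing all 120 Leap frames
last_sent = 0.0

def on_leap_frame(palm_x, palm_y, palm_z):
    # Placeholder for the Leap Motion hand-tracking callback (~120 Hz); palm_* in mm.
    global last_sent
    now = time.time()
    if now - last_sent < MIN_PERIOD:
        return  # drop this frame so the controller's motion buffer doesn't fill up
    last_sent = now
    # Queue the move without blocking; wait=False returns immediately.
    arm.set_position(x=palm_x, y=palm_y, z=palm_z,
                     roll=180, pitch=0, yaw=0,  # fixed tool orientation (degrees)
                     speed=200, wait=False)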
In my experience, the only way to make the xArm 6 robot respond quickly (>100 Hz) is:
Turn reporting off. When reporting is on, the robot controller is sending updates back to your application at about 10 Hz. This overhead can screw up timing severely.
For very smooth motion, you need to use “servoj” commands at a high rate. In our experience, around 40 Hz starts looking smooth. The problem is that servoj commands want joint angles, not xyz/rpy coordinates, so IK (inverse kinematics) is up to you.
We have experimented with around 40 Hz in-house using a dedicated network and precomputed IK. In other words, we precompute the angles into a file and feed the robot servoj angle commands at about 40 Hz.
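For illustration, the feeding loop is conceptually just this (a sketch, not our exact code; angles.txt and its format are assumptions, and as far as I know the xArm equivalent of a servoj command is set_servo_angle_j in joint-servo mode):

import time
from xarm.wrapper import XArmAPI

arm = XArmAPI('192.168.1.100')  # placeholder IP
arm.motion_enable(enable=True)
arm.set_mode(1)  # joint-servo mode, required for streamed angle commands
arm.set_state(0)

period = 1.0 / 40.0  # 40 Hz

# angles.txt is a hypothetical file: one line per tick, six joint angles in radians.
with open('angles.txt') as f:
    trajectory = [[float(v) for v in line.split()] for line in f]

t_next = time.time()
for angles in trajectory:
    arm.set_servo_angle_j(angles, is_radian=True)  # streamed servo command
    t_next += period
    time.sleep(max(0.0, t_next - time.time()))  # hold a steady 40 Hz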
In your case, your real-time needs mean you can’t precompute the angles; you have to solve IK on the fly.
That probably means you have to use a fast IK implementation such as IKFAST (see the OpenRAVE site: http://openrave.org/docs/0.8.2/openravepy/ikfast/) so you can generate the appropriate angles and feed them to the robot at a fast enough rate. I haven’t tried it, though, so I don’t know how much of a pain that might be.
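If IKFAST turns out to be painful to set up, another option I’m aware of is PyBullet’s built-in IK solver, which is fast enough to call inside a tracking loop (again a sketch, untested on the real robot; the URDF filename and end-effector link index are assumptions for your model):

import pybullet as p

p.connect(p.DIRECT)  # headless: physics/IK only, no GUI
robot = p.loadURDF('xarm6.urdf', useFixedBase=True)  # assumed path to your robot's URDF
EE_LINK = 5  # assumed end-effector link index; check your URDF

def ik(x, y, z, orn):
    # Returns joint angles for a target position (meters) and orientation (quaternion).
    return p.calculateInverseKinematics(robot, EE_LINK, [x, y, z], orn)

# Example: a pose 40 cm in front of the base with the tool pointing down.
angles = ik(0.4, 0.0, 0.3, p.getQuaternionFromEuler([3.14159, 0, 0]))
print(angles)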
CAVEAT EMPTOR: I’m no expert at this robot, so perhaps UFactory can give you better advice! Just thought it would be useful to you to hear what others are experiencing.
You can try it out on Linux, Mac OSX and Windows using:
pip3 install pybullet
git clone https://github.com/erwincoumans/xArm-Python-SDK
cd xArm-Python-SDK/examples/wrapper/xarm7
python3 loadxarm_sim.py
I didn’t try running on a real xArm 7 (I don’t have one), but you could try it like this:
python3 xarm_real_ik.py
Please do so and report back so we can improve things if needed. Note that you can set useNullSpace to 0 in xarm_sim.py for higher precision, but without the null space the arm may drift down eventually.
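For context, the null-space variant gives the IK solver joint limits and a rest pose, so the redundant joint is biased toward a preferred configuration instead of drifting. Condensed, the idea looks roughly like this (not the exact code in xarm_sim.py; the limit and rest-pose values are placeholders):

import pybullet as p

p.connect(p.DIRECT)
robot = p.loadURDF('xarm7.urdf', useFixedBase=True)  # assumed URDF path
EE_LINK = 6  # assumed end-effector link index for the 7-DoF arm

lower = [-6.28] * 7  # placeholder lower joint limits (radians)
upper = [6.28] * 7   # placeholder upper joint limits
ranges = [u - l for u, l in zip(upper, lower)]
rest = [0, 0, 0, 1.57, 0, 1.57, 0]  # placeholder "comfortable" rest pose

# With the null-space arguments, the solver picks, among all solutions that
# reach the target, one close to the rest pose within the joint limits.
angles = p.calculateInverseKinematics(
    robot, EE_LINK, [0.3, 0.0, 0.4],
    p.getQuaternionFromEuler([3.14159, 0, 0]),
    lowerLimits=lower, upperLimits=upper,
    jointRanges=ranges, restPoses=rest)
print(angles)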
I ran the scripts on my PC, connected to an xArm controller with a real xArm 7 robot, and it did work.
But the motion is a bit odd: the arm moves around 2–3 cm and then stops, then moves another 2–3 cm and stops again. Each stop is short, but it makes the robot look like it is shaking during the movement; we cannot get the continuous motion that the real-time 3D model shows.
I think the biggest problem is probably the communication delay between my PC and the xArm controller.
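To check that theory, I will try timing pure state queries, which should isolate the network round trip from actual motion time; something like this (a rough sketch using the xArm-Python-SDK; the IP is a placeholder):

import time
from xarm.wrapper import XArmAPI

arm = XArmAPI('192.168.1.100')  # placeholder: your controller's address

# Time a batch of get_position queries; no motion involved, so the average
# approximates the PC <-> controller round-trip latency.
n = 100
t0 = time.time()
for _ in range(n):
    code, pos = arm.get_position()
avg_ms = (time.time() - t0) * 1000.0 / n
print('average round trip: %.2f ms per query' % avg_ms)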
I will try to run the scripts on the controller directly, but I don’t think that is the best way: the calculation would run in Python on Gentoo Linux (the system the xArm controller uses), the commands would then be passed to the firmware and sent on to the robot, so there would still be a delay in getting the commands from Python to the firmware.
So the best way might be to integrate the calculation part into the firmware, which is based on C++. That means the scripts would need to run on a C++ platform, but they are written in Python and rely on Python libraries such as NumPy, and we haven’t figured out a way to port them to C++ yet.
Any advice would be appreciated.
Could you please share a video showing the motion on the real xArm 7 arm? Perhaps your PC has slow graphics? You could try replacing p.connect(p.GUI) with p.connect(p.DIRECT).
I didn’t notice the delay on our xArm 6, using Windows/Python 3.
If I have time, I can create a quick C++ example using the cmake build system. PyBullet is all C++ with Python bindings.
Got it. I used a Linux virtual machine that I had not assigned many resources to, so that may be one of the reasons. Let me try my Windows PC directly and post a video then.
Thanks very much.
I appreciate the time you put into showing how to achieve that smooth motion.
Since I have a Lite 6 robot, the setup will differ a bit. Could you give me a quick list of the steps I need to take to adapt your project for the Lite 6?
Does the URDF need to be created in a certain way? Can I just do it with Blender?
Are there other aspects I need to change?
I mistook URDF for something else. How did you create the specific URDF file you use for the simulation?
I found the lite6 URDF on the UFactory GitHub, but it’s much shorter than yours.
When I try to create a URDF file with xacro from the lite6.urdf.xacro file, I get an empty URDF; see the console output below. I’m using a ROS Docker container to do this.
root@6978348f4773:~/catkin_ws/src/xarm_ros/xarm_description/urdf/lite6# rosrun xacro xacro --verbosity=3 lite6.urdf.xacro
set lite6_urdf: <xacro.Macro object at 0x7f259f306610> (lite6.urdf.xacro)
<?xml version="1.0" ?>
<!-- =================================================================================== -->
<!-- | This document was autogenerated by xacro from lite6.urdf.xacro | -->
<!-- | EDITING THIS FILE BY HAND IS NOT RECOMMENDED | -->
<!-- =================================================================================== -->
<robot>
</robot>
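From the set lite6_urdf: <xacro.Macro ...> line it looks like the file only defines a macro and nothing ever instantiates it, so the expanded robot element comes out empty. My guess is that a small top-level wrapper is needed, roughly like this (the macro name is taken from the output above; whether it needs arguments such as a prefix is an assumption):

<?xml version="1.0"?>
<robot xmlns:xacro="http://ros.org/wiki/xacro" name="lite6">
  <!-- Include the macro definition, then call it so the expanded URDF
       actually contains the links and joints. -->
  <xacro:include filename="lite6.urdf.xacro"/>
  <xacro:lite6_urdf/>
</robot>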
The contents of the lite6.urdf.xacro file are here: