
Project: Imitate me!

pib imitate me

The idea behind this project is to enable pib to imitate the movement of a human arm. That gives us an easy way to animate pib without having to program every movement by hand. Right now: wild idea 😁 Let’s see if we get it done… This project page documents our progress and, if we get it done, should turn into an entertaining tutorial on how to do it yourself.

What happened so far

First of all, we need a way to get the position of the human arm. There are many devices out there, and, as always, our choice should be reliable, available and affordable. One possible device choice is the LeapMotion Controller, of which we bought two quite a while ago:

Source: ultraleap.com

When we got it, we were really happy about the nice Python API that made it really simple to get our first app up and running. Well, that was some years ago, but what can go wrong here…

Uhh… A current search shows: our formerly so appreciated Python API is now deprecated! Ultraleap says that the new Gemini framework is more precise than the former API (yeah!), but there is now only a C API. And switching completely to C is not an option 😂 Well, let’s continue searching and see if we can get some Python wrapper running…

Searching on GitHub, we saw that other developers share our need and have already worked on this:

  • https://github.com/seanschneeweiss/RoSeMotion/tree/master/app
  • https://github.com/RomanILL/Leap-Motion-Python-3.8

After downloading and installing: neither of them read any data from the sensor. So, there is no other way than to do it ourselves… Jippee! 🤪

Python-wrapped API-access for LeapMotion

UltraLeap provides all its tools in one download package. After installing everything, you can use the UltraLeap Viewer to see the LeapMotion in action:

My hand gets detected at more than 100 fps

The first question is: will we be able to use the C API properly? After that, we’ll see how to call it from Python. So, let’s start our treasure hunt!

After installation, examples for the C API are provided in the samples folder of the SDK. To build the stuff on Windows, you also need CMake. After installing CMake, open a shell, go to the samples folder (maybe copy the whole SDK to a convenient location first…) and type:

cmake .
cmake --build .

That should do the trick and create a folder “Debug” with the executables in it. Running PollingSample.exe in a shell shows this:

Arm-coordinates taken programmatically: works!

Python ctypes problem: Help needed!

Our main steering language so far is Python, so an obvious thought is to use a little C program to retrieve data from our LeapMotion and get this data into Python with ctypes. We tried the thing on Windows with the VS tools, hm… It did not really work out. We will try this on Linux asap, but until then we came up with a hack:
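To make the ctypes idea concrete, here is a minimal sketch of what we are aiming for. Everything named here is an assumption for illustration: the `PalmPosition` struct layout and the `leapwrapper` library are placeholders for our own little C wrapper, not part of the real LeapC API.

```python
import ctypes

# Hypothetical layout of the palm-position struct our little C wrapper
# would return -- field names and types are our own assumptions, not
# the real LeapC structures.
class PalmPosition(ctypes.Structure):
    _fields_ = [("x", ctypes.c_float),
                ("y", ctypes.c_float),
                ("z", ctypes.c_float)]

# With a compiled wrapper in place, loading it would look like this
# (leapwrapper is a hypothetical name for our own C shim):
#
#   leap = ctypes.CDLL("./leapwrapper.so")        # .dll on Windows
#   leap.get_palm_position.restype = PalmPosition # declare the return type
#   pos = leap.get_palm_position()
#
# Without the hardware, we can at least check the struct mechanics:
pos = PalmPosition(12.5, 200.0, -30.25)
print(pos.x, pos.y, pos.z)
```

The point of the wrapper approach is that all the LeapC complexity stays on the C side, and Python only ever sees one flat struct of floats.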

LeapMotion-Data as json

Being impatient to test this stuff, the idea is (until we have something that does it properly) to collect all coordinates in a C program and write them as JSON into a file. Our pib-steering Python can use a watchdog to listen for updates to this file and, when a file-change event occurs, update pib’s motors. Dead simple as it is, this worked! Here is the C code of the polling approach. You can build it if you add the file to the samples folder and to CMakeLists.txt, then use cmake as shown above. An .exe will appear in your Debug folder…
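The Python side of this hack can be sketched like so. In our real setup we use the watchdog package for the file-change events; this standalone version polls the file’s mtime with the standard library instead, so you can run it without installing anything. The JSON key names (`palm`, `x`, `y`, `z`) are assumptions about our file layout, not a fixed format, and `on_update` is where the motor commands would go.

```python
import json
import os
import tempfile
import time

def read_coords(path):
    """Load the coordinate JSON written by the C polling program."""
    with open(path) as f:
        return json.load(f)

def watch(path, on_update, poll_interval=0.05, max_polls=None):
    """Call on_update with fresh coordinates whenever the file changes.

    A stdlib stand-in for the watchdog-based listener: we compare the
    file's modification time on every poll instead of receiving events.
    """
    last_mtime = 0.0
    polls = 0
    while max_polls is None or polls < max_polls:
        polls += 1
        mtime = os.path.getmtime(path)
        if mtime != last_mtime:
            last_mtime = mtime
            on_update(read_coords(path))
        time.sleep(poll_interval)

# Quick self-test with a fake coordinate file in place of the C program:
tmp = os.path.join(tempfile.mkdtemp(), "leap.json")
with open(tmp, "w") as f:
    json.dump({"palm": {"x": 1.0, "y": 2.0, "z": 3.0}}, f)

updates = []
watch(tmp, updates.append, poll_interval=0.01, max_polls=3)
print(updates[0]["palm"]["x"])
```

In the real pipeline, `on_update` would translate the coordinates into motor positions for pib instead of collecting them in a list.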

Wanna see what the results look like so far? pib will show you in our latest video!
