Getting each finger state and heading

Hello, I'm trying to use data from the VL.LeapOrion NuGet to identify each finger's state: whether it is stretched or not, and the orientation (XYZ) of the finger.

At the moment I only have general Hand information, and a kind of general pinch state that reports the same pinch with any finger joining the thumb.

I would like to recognize these different types of signals:

[image: examples of the hand gestures to recognize]

Thanks for any help; I haven't managed to get anywhere with the matrix so far.

Hey, I think I tried something similar for Kinect back in the day, it's a proper pain… Basically you can't use the positions of points, because the object may be a different size… If it gives you finger rotations, then there's a chance to store the finger rotations and test them against a shape…

Hm, I have found out how to get which finger is pinched by using the fingertip positions. But I'm still really missing which finger is straight or closed.
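For reference, the tip-distance test I mean looks roughly like this in C# (a sketch against the Orion C# API; the 30 mm threshold is my assumption and needs tuning):

```csharp
using System.Linq;
using Leap;

static class PinchDetector
{
    // Returns the type of the non-thumb finger whose tip is closest to the
    // thumb tip, or null when no tip is within the pinch threshold.
    public static Finger.FingerType? PinchedFinger(Hand hand, float thresholdMm = 30f)
    {
        Vector thumbTip = hand.Fingers
            .First(f => f.Type == Finger.FingerType.TYPE_THUMB)
            .TipPosition;

        Finger closest = hand.Fingers
            .Where(f => f.Type != Finger.FingerType.TYPE_THUMB)
            .OrderBy(f => f.TipPosition.DistanceTo(thumbTip))
            .First();

        return closest.TipPosition.DistanceTo(thumbTip) < thresholdMm
            ? closest.Type
            : (Finger.FingerType?)null;
    }
}
```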

maybe VL.DollarQ can work for you.

Thank you sunep, I will look at it. I was presuming this data was already furnished by the Leap?

I don't think so

Hm, well, in beta it is… or is that because beta uses the 4.00 driver version and not the Ultraleap one?

Even if the pinch gesture is supported, I thought that since you have so many gestures, you would need some training anyway, so why not just use it for everything.
I used it once to recognize poses with a Kinect; that was the old DollarP.

Ok, I will try it, it's a very good suggestion! Thank you sunep.

not sure if this is any help for you, but centuries ago i did a "vveekendvvorkshop" on the beta machine learning pack where i demoed hand gesture recognition with multiple gestures.

tbh. i have no idea if the pack would still work in beta, it surely wasn't ported to gamma.

the main thing to consider when doing these kinds of things is:

  • make your input data independent of position and rotation, so gesture samples can be compared to a prototype.
    • this means e.g. normalizing positions to zero and inverting the base rotation (see the sketch after this list)
  • like @antokhio said, it might make sense to work with joint angles instead of positions, since they are independent of different hand sizes.
  • you really want a machine learning mechanism to help with classification here; doing stuff like this manually is very error-prone when applied to different people (i.e. you tweak it to work wonderfully for you, but as soon as somebody else tries it, it fails).
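as a rough illustration of the first two points, a minimal sketch against the Orion C# API (the basis built from Direction/PalmNormal is my assumption for undoing the base rotation; note the thumb's metacarpal is zero-length in the Leap hand model, so its first angle is not meaningful):

```csharp
using System;
using System.Collections.Generic;
using Leap;

static class GestureFeatures
{
    // Tip positions relative to the palm, expressed in the hand's own basis,
    // so the features ignore where the hand is and how it is rotated.
    public static IEnumerable<Vector> NormalizedTipPositions(Hand hand)
    {
        Vector forward = hand.Direction.Normalized;    // palm towards fingers
        Vector up = hand.PalmNormal.Normalized * -1f;  // out of the back of the hand
        Vector right = forward.Cross(up).Normalized;   // completes the basis

        foreach (Finger f in hand.Fingers)
        {
            Vector p = f.TipPosition - hand.PalmPosition;
            // projecting onto the hand basis is the inverse of the base rotation
            yield return new Vector(p.Dot(right), p.Dot(up), p.Dot(forward));
        }
    }

    // Joint angles are size-independent: the angle between adjacent bone directions.
    public static IEnumerable<float> JointAngles(Finger finger)
    {
        for (int i = 1; i < 4; i++)
        {
            Vector a = finger.Bone((Bone.BoneType)(i - 1)).Direction;
            Vector b = finger.Bone((Bone.BoneType)i).Direction;
            float cos = Math.Max(-1f, Math.Min(1f, a.Dot(b)));
            yield return (float)Math.Acos(cos);
        }
    }
}
```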

unfortunately i cannot provide something helpful for you here…


Thank you motzi! This will surely help me understand the machine learning process. The main goal is that it could also help with the audience, whom I want to have participate on stage, so defining a gesture with them and training it quickly would be great.

You might also wanna have a look at VL.Wekinator


Thank you @sebescudie! I tried Wekinator with beta, and as I'm not familiar with these tools I didn't find a use for it on the last project. But I'm still puzzled that stretched/not-stretched fingers are not supported in the VL driver (the 4.0 version with beta is). I definitely need to learn how to use those tools.

From memory, I think a few more gestures are recognised by the Unity/Unreal Leap API than by the raw C# one we are dealing with.

To answer your specific first question: you can get the finger positions from the C# API.
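Something along these lines (a sketch against the Orion C# bindings; here polling a Controller directly, whereas in VL the frame comes from the Leap node):

```csharp
using System;
using Leap;

var controller = new Controller();   // connects to the Leap tracking service
Frame frame = controller.Frame();    // most recent tracking frame

foreach (Hand hand in frame.Hands)
    foreach (Finger finger in hand.Fingers)
        // Type runs thumb..pinky; TipPosition is in millimeters,
        // Direction is a unit vector pointing along the finger.
        Console.WriteLine($"{finger.Type}: tip={finger.TipPosition} dir={finger.Direction}");
```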

The problem I have had (even on the Ultraleap 2) is that the data reported for those finger positions can still be quite glitchy. It often reports the wrong finger. I think even with machine learning the underlying data isn't of good enough quality to achieve all the poses you hope for.

Probably some poses will be better than others.
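One cheap mitigation, independent of the Leap API: vote over the last few frames so single-frame glitches don't flip the state. A sketch:

```csharp
using System.Collections.Generic;
using System.Linq;

// Keeps a short history of bool samples (e.g. Finger.IsExtended per frame)
// and reports the majority value, so one-frame glitches are ignored.
class MajorityFilter
{
    readonly Queue<bool> history = new Queue<bool>();
    readonly int window;

    public MajorityFilter(int window = 9) => this.window = window;

    public bool Update(bool sample)
    {
        history.Enqueue(sample);
        if (history.Count > window) history.Dequeue();
        return history.Count(v => v) * 2 > history.Count;
    }
}
```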


Thank you @tobyk. From my experience on this show: distinguishing between fingers depends enormously on the angle of the hand and palm. Light may also interfere. My main goal is to know which finger is straight or not.

Could DollarQ or Wekinator be used for gesture recognition (swipes and so on) if that's not supported by the current driver in gamma?


Here is how to access some finger data; you can even break a finger down further with bones [finger] and then ForEach over the bone data to access it.
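In C# terms that breakdown looks roughly like this (a sketch; `hand` is assumed to be a Leap.Hand from the current frame):

```csharp
using System;
using Leap;

foreach (Finger finger in hand.Fingers)
{
    // four bones per finger, root to tip:
    // TYPE_METACARPAL, TYPE_PROXIMAL, TYPE_INTERMEDIATE, TYPE_DISTAL
    for (int i = 0; i < 4; i++)
    {
        Bone bone = finger.Bone((Bone.BoneType)i);
        Console.WriteLine($"{finger.Type}/{bone.Type}: " +
            $"from={bone.PrevJoint} to={bone.NextJoint} dir={bone.Direction}");
    }
}
```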


Nice show! I like the graphic style

Yes, if you just want to know which finger is straight, then the 'IsExtended' property is ok. As you say, accuracy can depend on angle and light.
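For example, collecting one flag per finger (a sketch; `hand` is assumed to be a Leap.Hand from the current frame):

```csharp
using System.Linq;
using Leap;

// One flag per finger, ordered thumb..pinky, telling whether it is straight.
bool[] extended = hand.Fingers
    .OrderBy(f => f.Type)
    .Select(f => f.IsExtended)
    .ToArray();
```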


Thank you tobyk.
I didn't find those 3 functions in the node inspektor. I'm on VL.Devices.LeapOrion 1.2.1, but I didn't install the SDK, just the Ultraleap software. Could that be the issue?
[image]

I have:
- Ultraleap tracking app 3.4.1
- NuGet VL.Devices.LeapOrion 1.2.1 (GitHub - vvvv/VL.Devices.LeapOrion: A package for using the Motion Controller by Ultraleap in VL (using the Orion SDK))
- the updated DLL as discussed in the Ultraleap Gemini Driver thread
- Leap Motion Controller 2 device

Nodes in the finger category are visible for me with the advanced aspect enabled in the node browser.


Ahhhhhhhhh, thank you! It seems I had a bug with the extended view of low-level nodes (needed to restart gamma)! Thank you so much TobyK!
