Hey, I think I did something similar for Kinect back in the day, it's a proper pain… Basically you can't use the positions of the points, because the object may be a different size… If it gives you finger rotations, then there's a chance to store the finger rotations and test them against the shape…
Even if it is just the pinch gesture: I thought that since you have so many gestures, you would need some training anyway, so why not just use it for everything.
I used it once to recognize poses with a Kinect; that was the old dollarP.
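In a nutshell, the $-family recognizers boil down to: resample the stroke to a fixed number of points, normalize translation and scale, then pick the template with the smallest average point-to-point distance. Here is a stripped-down C# sketch of that idea (no rotation search, unlike the real $1/$P/$Q, and all names here are mine, not from any of those libraries):

```csharp
using System;
using System.Collections.Generic;
using System.Numerics;

static class DollarLite
{
    // Resample to n points, move the centroid to the origin, scale to unit size.
    public static Vector2[] Normalize(IReadOnlyList<Vector2> stroke, int n = 32)
    {
        var pts = Resample(stroke, n);
        var centroid = Vector2.Zero;
        foreach (var p in pts) centroid += p;
        centroid /= n;
        float scale = 0f;
        for (int i = 0; i < n; i++)
        {
            pts[i] -= centroid;
            scale = MathF.Max(scale, pts[i].Length());
        }
        if (scale > 0)
            for (int i = 0; i < n; i++) pts[i] /= scale;
        return pts;
    }

    // Average point-to-point distance between two normalized strokes.
    public static float Distance(Vector2[] a, Vector2[] b)
    {
        float d = 0;
        for (int i = 0; i < a.Length; i++) d += Vector2.Distance(a[i], b[i]);
        return d / a.Length;
    }

    // Walk the polyline and drop evenly spaced points along it.
    static Vector2[] Resample(IReadOnlyList<Vector2> pts, int n)
    {
        float total = 0;
        for (int i = 1; i < pts.Count; i++) total += Vector2.Distance(pts[i - 1], pts[i]);
        var result = new List<Vector2> { pts[0] };
        if (total <= 0) // degenerate stroke: all points identical
        {
            while (result.Count < n) result.Add(pts[0]);
            return result.ToArray();
        }
        float step = total / (n - 1), acc = 0;
        var prev = pts[0];
        for (int i = 1; i < pts.Count && result.Count < n; i++)
        {
            var cur = pts[i];
            float d = Vector2.Distance(prev, cur);
            while (acc + d >= step && d > 0 && result.Count < n)
            {
                var q = Vector2.Lerp(prev, cur, (step - acc) / d);
                result.Add(q);
                prev = q;
                d = Vector2.Distance(prev, cur);
                acc = 0;
            }
            acc += d;
            prev = cur;
        }
        while (result.Count < n) result.Add(pts[pts.Count - 1]); // rounding leftovers
        return result.ToArray();
    }
}
```

To recognize a swipe you would Normalize the incoming stroke and each recorded template once, then take the template with the smallest Distance.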
Not sure if this is any help for you, but centuries ago I did a "vveekendvvorkshop" on the beta machine learning pack where I demoed hand gesture recognition with multiple gestures.
Tbh, I have no idea if the pack would still work in beta; it surely wasn't ported to gamma.
The main thing to consider when doing these kinds of things is: make your input data independent of position and rotation, so gesture samples can be compared to a prototype.
This means e.g. normalizing positions to zero and inverting the base rotation, as in the sketch below.
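A minimal sketch of that normalization step, assuming the driver gives you joint positions plus the palm's position and orientation as System.Numerics types (the names are illustrative, not any specific API):

```csharp
using System.Numerics;

static class GestureNormalization
{
    // Express all joints relative to the palm, with the palm's rotation removed,
    // so the same pose produces (roughly) the same numbers anywhere in space.
    public static Vector3[] Normalize(Vector3[] joints, Vector3 palmPosition, Quaternion palmRotation)
    {
        var invBase = Quaternion.Inverse(palmRotation); // "inverting the base rotation"
        var result = new Vector3[joints.Length];
        for (int i = 0; i < joints.Length; i++)
        {
            var local = joints[i] - palmPosition;        // positions normalized to zero
            result[i] = Vector3.Transform(local, invBase);
        }
        return result;
    }
}
```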
As @antokhio said, it might make sense to work with joint angles instead of positions, since those are independent of different hand sizes (sketched below).
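The joint-angle variant: angles between consecutive bones don't change with hand size, so one prototype can match different people. Again just a sketch over plain joint positions:

```csharp
using System;
using System.Collections.Generic;
using System.Numerics;

static class JointAngles
{
    // joints = positions along one finger, e.g. [knuckle, joint1, joint2, tip]
    public static IEnumerable<float> FingerAngles(Vector3[] joints)
    {
        for (int i = 2; i < joints.Length; i++)
        {
            var a = Vector3.Normalize(joints[i - 1] - joints[i - 2]);
            var b = Vector3.Normalize(joints[i] - joints[i - 1]);
            // angle between consecutive bone segments, in radians
            yield return MathF.Acos(Math.Clamp(Vector3.Dot(a, b), -1f, 1f));
        }
    }
}
```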
You really want a machine learning mechanism to help with the classification here; doing this kind of thing manually is very error-prone when applied to different people (i.e. you tweak it to work wonderfully for you, but as soon as somebody else tries it, it fails).
Unfortunately I cannot provide anything ready-made for you here…
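The closest I can offer is a toy illustration: a plain 1-nearest-neighbour match over the normalized feature vectors is already a minimal form of the "training" mentioned above (sketch only, nothing Leap-specific, all names made up):

```csharp
using System;
using System.Collections.Generic;

class NearestNeighbour
{
    // (feature vector, gesture label) pairs recorded during training
    private readonly List<(float[] Features, string Label)> _samples = new List<(float[], string)>();

    public void Train(float[] features, string label) => _samples.Add((features, label));

    // Returns the label of the closest training sample, or null if nothing
    // is within maxDistance (so unknown poses are rejected, not misread).
    public string Classify(float[] features, float maxDistance = 1.0f)
    {
        string best = null;
        float bestDist = maxDistance;
        foreach (var (f, label) in _samples)
        {
            float d = 0; // Euclidean distance, assuming equal-length vectors
            for (int i = 0; i < f.Length; i++) d += (f[i] - features[i]) * (f[i] - features[i]);
            d = MathF.Sqrt(d);
            if (d < bestDist) { bestDist = d; best = label; }
        }
        return best;
    }
}
```

Record a few labelled samples per gesture via Train, then call Classify once per frame; this per-person sample recording is also about as quick as gesture training gets.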
Thank you @motzi! This will surely help me understand the machine learning process. The main goal is for it to also work for the audience, whom I want to participate with on stage, so defining a gesture with them and training it quickly would be great.
Thank you @sebescudie! I tried Wekinator with beta, and as I'm not familiar with those tools, I didn't find it useful for the last project. But I'm still puzzled that stretched/not-stretched fingers are not supported in the VL driver (the 4.0 version with beta is). I definitely need to learn how to use those tools.
From memory, I think a few more gestures are recognised by the Unity/Unreal Leap API than by the raw C# one we are dealing with.
To answer your specific first question: you can get the finger positions from the C# API.
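Roughly like this, against the Leap Orion generation of the C# API (which is, as far as I know, what VL.Devices.LeapOrion wraps); take the exact member names with a grain of salt, newer Ultraleap bindings rename things:

```csharp
using System;
using Leap;

class DumpFingers
{
    static void Main()
    {
        var controller = new Controller();
        System.Threading.Thread.Sleep(1000); // crude wait for the tracking service

        Frame frame = controller.Frame();    // most recent tracking frame
        foreach (Hand hand in frame.Hands)
            foreach (Finger finger in hand.Fingers)
                // TipPosition is in millimetres, relative to the device origin
                Console.WriteLine($"extended={finger.IsExtended} tip={finger.TipPosition}");
    }
}
```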
The problem I have had (even on the Ultraleap 2) is that the data reported for those finger positions can still be quite glitchy; it often reports the wrong finger. I think even with machine learning, the underlying data is not of good enough quality to achieve all the poses you hope for.
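Smoothing won't fix the wrong-finger mislabels, but it does take the frame-to-frame jitter out before you feed positions into any classifier. A generic sketch, not tied to the Leap API:

```csharp
using System.Numerics;

class JointSmoother
{
    private Vector3? _state;
    private readonly float _alpha; // 0..1: lower = smoother but laggier

    public JointSmoother(float alpha = 0.3f) { _alpha = alpha; }

    public Vector3 Update(Vector3 raw)
    {
        // Exponential moving average: blend a bit of the new sample into the state.
        _state = _state.HasValue ? Vector3.Lerp(_state.Value, raw, _alpha) : raw;
        return _state.Value;
    }
}
```

You would keep one JointSmoother per tracked joint and feed it every frame.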
Thank you @tobyk. From my experience on this show: distinguishing between fingers depends enormously on the angle of the hand and palm, and light may also interfere. My main goal is to know which finger is straight or not.
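To be concrete, this is the kind of check I have in mind, assuming I can get the per-bone directions out of the driver somehow (pure sketch, names made up): if the finger bones all point in nearly the same direction, the finger is extended.

```csharp
using System;
using System.Numerics;

static class FingerStraightness
{
    // boneDirections = unit direction vectors of one finger's bones,
    // e.g. proximal, intermediate and distal, in base-to-tip order.
    public static bool IsStraight(Vector3[] boneDirections, float cosThreshold = 0.9f)
    {
        for (int i = 1; i < boneDirections.Length; i++)
        {
            // Dot product of unit vectors = cosine of the bend angle between bones
            float cos = Vector3.Dot(Vector3.Normalize(boneDirections[i - 1]),
                                    Vector3.Normalize(boneDirections[i]));
            if (cos < cosThreshold) return false; // bent more than ~25 degrees
        }
        return true;
    }
}
```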
Could dollarQ or Wekinator be used for gesture recognition (swipes and so on) if that is not supported by the actual driver in gamma?
Thank you @tobyk.
I didn't find those 3 functions in the node inspektor. I'm on VL.Devices.LeapOrion.1.2.1… but I didn't install the SDK, just the Ultraleap software. Could that be the issue?