Rigging, bones and kinematics in gamma/stride

I would love to get back to doing some live visuals with vvvv, but I feel super demotivated by the lack of a convenient pipeline for rigs. In modern asset workflows and creation, rigs are truly everywhere, and they have always been one of the parts of the vvvv pipeline that did not get a lot of love, imo.

There are a few threads around here:

Sadly, they ended with very little info, and the process they describe is extremely hard to work with.
There is also a channel on Element for skeletons:
https://matrix.to/#/#universalskeletonbible:matrix.org

What I would love to do with this thread is gather some experiences from other users, their expectations and workarounds, maybe some info on future plans from the devs, and just collect a general heading for what we could perhaps one day have in Stride/Gamma. I do understand there are more pressing issues and a lot of work to do; this is just a shout-out to this topic, as we are all skeletons inside, so we should give skeletons some love.

What are the biggest hurdles to this effort? For example, the fact that animations have to be imported from separate files, and that only files from up to a certain version of Blender work for this, is quite a big hurdle.

Maybe, if there is enough interest in this topic from people who have experience with it, we might be able to kickstart a project that would help us achieve a good skeleton implementation, with everybody pitching in some funds.

So what should a good skeleton implementation look like?
In my opinion, on one hand it should be a general system to load skeleton models, preferably working well with Blender, since we like to keep things in the realm of easy access. We should have tools to easily patch a state machine for the animations once in Gamma.
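To make the state-machine idea concrete, here is a minimal sketch in plain Python (not Gamma/VL code; the state and trigger names are made up for illustration):

```python
# Minimal animation state machine sketch: states map to animation clips,
# and named triggers drive transitions between them.

class AnimationStateMachine:
    def __init__(self, initial):
        self.state = initial
        self.transitions = {}  # (state, trigger) -> next state

    def add_transition(self, state, trigger, next_state):
        self.transitions[(state, trigger)] = next_state

    def fire(self, trigger):
        # Stay in the current state if no rule matches the trigger.
        self.state = self.transitions.get((self.state, trigger), self.state)
        return self.state


# Hypothetical clip names, just for the example:
sm = AnimationStateMachine("Idle")
sm.add_transition("Idle", "move", "Walk")
sm.add_transition("Walk", "stop", "Idle")
```

In a real setup, each state would trigger the corresponding clip on the character, and the triggers would come from patch inputs.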

On the other hand, we need a system to customize skeletons. It would be great to be able to open the Stride editor and prepare the structure of the rig: attach rigidbodies, change their sizes, and tweak some basic settings. There should be an easy way to create IK chains to make weird stuff: scorpions, alien shapes with many appendages, etc… But it does not have to be done in the Stride editor; maybe we could come up with the rigging structure directly in Gamma, since it's just a graph structure in the scene.

All of this should be made to work with the physics engine, to leverage the most out of IK.

Blender and Unity are the systems I am familiar with, and Unity especially has some kickass scripts, similar to Blender's bone constraints, that are super nice to work with. You have the option to create IK chains very easily, and it boosts content creation significantly. Perhaps we could make a Blender add-on that would export some additional data with the skeletons, allowing Blender to act as a sort of editor for skeletons prepared for use in Gamma.

Check out the Unity implementation here:
https://docs.unity3d.com/Packages/com.unity.animation.rigging@1.1/manual/RiggingWorkflow.html

https://docs.unity3d.com/Packages/com.unity.animation.rigging@1.1/manual/constraints/ChainIKConstraint.html

and to see it in action with live visuals, this is what the system allowed me to do:


Hi, so first of all, the IK solver isn't actually that much code… There was this character pack; you can reverse engineer it to see how it works…

Right now what I do is set up a class with animation blending in Stride and then trigger that from Gamma… I've only used the animation blending without real-time input handling, and I can say that's a proper nightmare: you have pretty much two options, play and blend, and that's about it…
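For context, the "blend" part conceptually boils down to crossfading per-bone values between two clips. A minimal sketch in plain Python (not Stride's API; poses reduced to flat value lists for illustration):

```python
def blend_poses(pose_a, pose_b, weight):
    """Linearly blend two poses (lists of per-bone values).
    weight 0.0 -> pose_a, weight 1.0 -> pose_b."""
    return [a * (1.0 - weight) + b * weight for a, b in zip(pose_a, pose_b)]


def crossfade_weight(elapsed, duration):
    """Normalized blend weight for a crossfade of the given duration,
    clamped to [0, 1] so the fade holds at the target pose when done."""
    return min(max(elapsed / duration, 0.0), 1.0)
```

Driving `weight` from a timer (or any control value from Gamma) gives you a basic crossfade between a playing clip and a target clip.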

All the examples I was able to find were mostly Stride template projects…

I know IK is not the hardest thing to do, but integrating it nicely with the Gamma workflow and making constraints that are flexible and empower designers to work with them would be a very cool feature.

"…, if a model has many different animations, you have to split them into single files to import them."
Is this still true for Stride?

I would guess Stride should be somewhat competent at playing animations, since it's a game engine.

we haven’t looked into model animations, skeletons, and skinning. for now, there aren’t any nodes provided. it is planned for this year, but no definitive ETA yet.

the “Modify Loaded Entities” help patch is using a stride project with a rigged model. so a workflow could be that you prepare the functionality in stride and then expose scripts that have control values that you can set in vvvv, as the help patch demonstrates.

and of course, you can hack around by importing the stride functionality directly… as @Hadasi and @antokhio are doing…

Meaning: import the model and write scripts in Stride that you later control in VL, right?

yes, that is something you can do. you can reference the DLL of the stride game and you will have the public methods and properties of your scripts available in VL and you can call them.

the instances of the scripts, if added to an entity, are just normal components of the entity. you can get them with the generic GetComponent node.

and stuff like parenting bones under rigidbodies and chaining them to make ragdolls would be in the realm of possibility?

In short, yes…
Sorry, if I had some time I would assemble an example for you…

The pipeline:
You store the T-pose rig in an FBX

So you should have one model, a few animations, and one skeleton
Animations should be set up like so

Then you make a prefab with the model entity that has an Animations component and a Script component.
The Animations component contains a key-value pair list of animations, and the script triggers them from the list.


It's possible already, with some work:
Stride Ragdoll Tutorial (beta) - YouTube


It's unfortunate that a project we're discussing will probably end up using Unity, because its 3D character animation is simply far more capable than Stride's, and it's cheaper for my company to use annoying but ready-made tools than to R&D a solution from scratch.

The animation system in Stride seems designed only for a pre-baked skeleton animation workflow, manipulating those animations through scripts with blends. Some procedural animation is possible, but it isn't documented to work with bones. That said, bones can be accessed and manipulated, even while baked animations play. There is also an example of IK in Stride by a user, so it isn't an official implementation: https://github.com/flipdp/Stride.IK It just demonstrates that if you need it, Stride will allow it, somehow.
IK itself doesn't seem too hard (in 2D), but I haven't spent any time working out constraints for that kind of thing in 3D.
Inverse Kinematics Fabrik.vl (109.7 KB)
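For anyone curious, the core of FABRIK really is short. Here is a 2D sketch in plain Python (not the attached VL patch), assuming a single chain with a fixed base and no joint constraints:

```python
import math

def fabrik(joints, target, tolerance=1e-3, max_iter=32):
    """FABRIK IK for a single 2D chain.
    joints: list of (x, y) tuples; joints[0] stays fixed (the base),
    joints[-1] (the end effector) chases target."""
    lengths = [math.dist(joints[i], joints[i + 1]) for i in range(len(joints) - 1)]
    base = joints[0]

    if math.dist(base, target) > sum(lengths):
        # Target unreachable: stretch the chain straight toward it.
        for i in range(len(joints) - 1):
            r = math.dist(joints[i], target)
            t = lengths[i] / r
            joints[i + 1] = tuple((1 - t) * j + t * g for j, g in zip(joints[i], target))
        return joints

    while math.dist(joints[-1], target) > tolerance and max_iter > 0:
        max_iter -= 1
        # Backward pass: snap the end effector to the target and
        # pull each parent joint toward it, preserving bone lengths.
        joints[-1] = target
        for i in range(len(joints) - 2, -1, -1):
            r = math.dist(joints[i + 1], joints[i])
            t = lengths[i] / r
            joints[i] = tuple((1 - t) * a + t * b for a, b in zip(joints[i + 1], joints[i]))
        # Forward pass: re-pin the base and propagate outward.
        joints[0] = base
        for i in range(len(joints) - 1):
            r = math.dist(joints[i], joints[i + 1])
            t = lengths[i] / r
            joints[i + 1] = tuple((1 - t) * a + t * b for a, b in zip(joints[i], joints[i + 1]))
    return joints
```

The 3D case is the same algorithm on 3-vectors; the hard part, as noted above, is adding joint constraints on top of it.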

More fundamentally, if you want to use vvvv as a 3D design tool, you'll need ways to select 3D objects and manipulate gizmos. Some of this work has been done previously by the devs, but I don't know what the status of that is.

Just to add, I would prefer a workflow in which I don't need to touch the Stride editor to set up the characters. I would like to load them, see the asset, and get to work. Because of how the model/mesh and animation clip system is set up in Stride, I'm not sure if that is possible. I think assets are compiled on import, which I think means that the keyframes and channels of the animation are extracted and saved to a separate file.


Oh cool, there is now a tutorial for some of this; I was working on this project last year and there was nothing on this topic back then.

I have no problem with preparing the project in the Stride editor and then controlling it in Gamma; I think it's a totally valid pipeline. I do, however, have a problem with the Stride editor in this case: I would expect it to be a counterpart to Gamma, which means good editing tools and a scene graph where I can see every bone and easily parent my shapes for physics, which, as I know now, is not the case at all. It feels to me that it would be better in the future to come up with a completely different system and not rely on the Stride editor at all.

Ultimately I think it is possible to design an interesting system for rigs, bones, and IK that would be included in the Gamma installation and offer some basic way to load models, attach entities, and do some IK; as I mentioned, this would also benefit things like robotics, etc… I just want to gather ideas from the community on how this could work, so the devs might have something to build on when they eventually get to these features. At this point I am more interested in what we could have in Gamma than in trying to replicate the system Unity or another huge corporate engine has.
I understand that a bigger studio that uses vvvv in any capacity would just use Unity for the rendering and ragdoll part and pipe the texture back to vvvv, or just use vvvv as an OSC controller (which is what I would do if I really needed to include vvvv in something like this).


didn’t check in detail, but does this help somehow? Procedural animation | Stride

Stride's animation system works just like Blender's or Maya's curve animation editor. Each bone/value is assigned a curve composed of several points that are interpolated in a linear, cubic, or constant fashion.
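For illustration, evaluating such a keyframe curve can be sketched like this in plain Python (linear and constant modes only, cubic omitted; key times are assumed sorted):

```python
def sample_curve(keys, t, mode="linear"):
    """Evaluate a keyframe curve at time t.
    keys: sorted list of (time, value) pairs.
    mode: 'linear' interpolates between keys; 'constant' holds the last key."""
    if t <= keys[0][0]:
        return keys[0][1]
    if t >= keys[-1][0]:
        return keys[-1][1]
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            if mode == "constant":
                return v0  # hold the previous key's value until the next key
            u = (t - t0) / (t1 - t0)  # normalized position within the segment
            return v0 + (v1 - v0) * u
```

A cubic mode would additionally use tangents at each key; the sampling structure stays the same.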

Yeah, it's mostly for editing components like transformations and colours, quite basic things we do in vvvv without this approach. It's indirect and a little bit confusing in my experience, requiring the user to make two animation clips and blend them. I couldn't figure out the time control (pause, seek, reverse play). I didn't find an easy way to do it outside of the Stride editor, but I could have missed something.

This forced me to hack the skeletonUpdater. I'll need to take another look at how the skeletonUpdater works for directly driving bones. Last time I tried, I could drive bones while an animation was playing on the same model, but it still felt a bit off.
