Some time ago I stumbled upon this post from SideFX's Instagram:
Just out of curiosity, how would that be doable in vvvv? I'm not talking about the animation, just the projected splines on the mesh, as seen in the screenshot above.
Drop your hints, shader gurus :)
You could fake it using a shader to cut out thin lines; the topographic shader in contributions would work for that. Making them actual 3D splines would be trickier… Although @everyoneishappy made something similar with particles and tangents, which might work if you could mesh them…
I can think of a couple of ways of doing this. For a general-purpose approach I'd create a distance field from your mesh surface, then cross the gradient of that with a noise gradient. At lower frequencies it will look pretty close to your reference.
Here’s what a FieldTrip graph for that would look like:
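For anyone who wants to see the cross-gradient trick outside of FieldTrip, here is a minimal numpy sketch. It is an illustration only: the "mesh" SDF is an analytic sphere and the noise is a cheap sin-based stand-in for a proper noise node. Crossing the two gradients gives directions that are tangent to the surface and follow the noise isocontours, which is exactly what makes the lines hug the mesh.

```python
import numpy as np

# Sample a 3D grid.
n = 32
ax = np.linspace(-1.0, 1.0, n)
x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")

# Assumed inputs: an analytic sphere SDF and a sin-based placeholder noise.
sdf = np.sqrt(x**2 + y**2 + z**2) - 0.5
noise = np.sin(3.0 * x) * np.cos(3.0 * y) + np.sin(2.0 * z)

# Finite-difference gradients of both scalar fields.
h = ax[1] - ax[0]
gs = np.stack(np.gradient(sdf, h), axis=-1)    # grad(sdf),  shape (n, n, n, 3)
gn = np.stack(np.gradient(noise, h), axis=-1)  # grad(noise)

# The line directions: tangent to both level sets.
field = np.cross(gs, gn)

# Sanity check: the field is perpendicular to the SDF gradient,
# so advecting particles along it keeps them on their iso-surface.
dot = np.einsum("...i,...i", field, gs)
print(np.abs(dot).max())  # ~0 up to rounding error
```

To draw the splines you would seed particles near the zero iso-surface and integrate them along `field`, then mesh the resulting trails.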
Otherwise, if you need it to be a bit more specific to your mesh, or don't want to have to convert your mesh to a volume first, you could probably create a couple of spline paths inside your mesh and project those radially onto the surface (use front culling rather than back culling). Like a bunch of 1px viewports.
Hope that helps for some ideas. Can pack up that patch if you want it.
what!? you can’t be serious… really impressive!
@catweasel @everyoneishappy thanks for the tips guys!
I haven't been into FieldTrip that much yet but this makes me want to dig into it :) I rebuilt your patch and am playing with it right now, that's massive :)
Thanks so much for sharing that resource!
@sebescudie Most welcome. I prefer a little noise, but you can also use a UniformVector (VF3D.Sources) rather than the noise if you want the lines to run around an axis. If you're interested in the cross-gradient trick, you can check out this paper:
It’s also the basis for the DivergenceFreeNoise (VF3D.source) node
@everyoneishappy quick jump-in regarding "virtual scanning": would it be feasible to use FieldTrip to resample the kinect mesh into a low-poly, more stable mesh? (As it stands, the kinect mesh has no corresponding information between frames, which makes it impossible to reduce the poly count.)
To get field stuff to work properly you need to convert the mesh to a distance field first, which isn't going to happen easily if you have holes in the mesh…
@antokhio he just showed a distance field from the dragon… Is the kinect mesh fundamentally different from that?
The dragon has been pre-processed into a Signed Distance Field, so you would need to turn the kinect mesh into an SDF for this to work.
@ggml You can look up sdf gen for an external tool. I actually have a conversion patch as well (also not real-time); I may clean it up and add it to the next version of FieldTrip, since a lot of people were asking about it. Honestly, I don't think it's the way to go for what you're talking about, though. Having said that, you can make an SDF from a depth sensor (IIRC that's actually how Kinect Fusion works), but it's more of a progressive-scan sort of thing, and quite finicky.
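For anyone wondering what the mesh-to-SDF bake actually involves, here is a crude, deliberately non-real-time numpy sketch. It uses nearest-vertex distance as a stand-in for true point-to-triangle distance, takes the sign from the nearest point's normal, and tests it on a toy point cloud on a unit sphere; all of that is an assumption for illustration, not how the FieldTrip patch or sdf gen does it.

```python
import numpy as np

def mesh_to_sdf(verts, normals, grid_pts):
    """Approximate signed distance from grid_pts to a surface sampled
    as vertices with outward normals. All arrays are (k, 3) / (m, 3)."""
    # Distance from every grid point to every vertex (brute force).
    d = np.linalg.norm(grid_pts[:, None, :] - verts[None, :, :], axis=-1)
    idx = d.argmin(axis=1)                 # nearest vertex per grid point
    to_pt = grid_pts - verts[idx]          # surface -> sample vector
    # Sign: positive if the sample lies on the normal's side (outside).
    sign = np.sign(np.einsum("ij,ij->i", to_pt, normals[idx]))
    return sign * d[np.arange(len(grid_pts)), idx]

# Toy "mesh": 2000 points on a unit sphere; normals point outward,
# so for a sphere the normals equal the vertex positions.
rng = np.random.default_rng(0)
v = rng.normal(size=(2000, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)

q = np.array([[0.0, 0.0, 0.0],   # inside the sphere
              [2.0, 0.0, 0.0]])  # outside the sphere
sdf = mesh_to_sdf(v, v, q)
print(sdf)  # roughly [-1, 1]: negative inside, positive outside
```

The brute-force distance matrix is what makes this hopeless in real time; proper tools use spatial acceleration structures and exact triangle distances, and holes in the mesh break the inside/outside sign, which is the problem with the raw kinect mesh.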
Sorry again for jumping in on this thread,
@everyoneishappy what do you mean by "you can make an SDF from a depth sensor"? Patch one?
Or do you mean it's just not a real-time task?
@ggml yes, probably better off as a separate thread. It's not a real-time task. It can sort of be, but that's more in the context of moving the sensor around a scene to build up a surface reconstruction from successive frames. Just look up Kinect Fusion if you want to know more about that.
@everyoneishappy I would love to see that patch running, I can't believe the 120fps at the top!
How do I convert an arbitrary mesh to an SDF?