I am currently experimenting with combining the Azure Kinect with VR in VVVV Gamma.
I would like to know if anyone has a hint on capturing the color stream from the Azure Kinect device and mapping it onto the point cloud accordingly. Should I modify anything within the Azure Kinect Device component?
Any hint will be helpful.
Thank you in advance to anyone who answers.
@dottore @tebjan We did what is asked for at the end of 2019 at a Christmas dinner, does anyone remember where we placed those patches? :)
Thank you @Elias for checking this out.
Any hint would be helpful!
Thank you in advance.
I think @tonfilm did a node for that. It was taking the images from the Kinect, computing the ray table and drawing it as a geometry.
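For anyone wondering what the "ray table" approach looks like, here is a minimal sketch in plain Python/numpy, assuming a simple pinhole camera model with made-up intrinsics (`fx`, `fy`, `cx`, `cy` are placeholders, not real Azure Kinect calibration values). The idea is to precompute, per pixel, the ray direction through that pixel; multiplying by the depth value then yields the 3D point:

```python
import numpy as np

# Hypothetical pinhole intrinsics for the depth camera (placeholders).
W, H = 4, 3                      # tiny image for illustration; real depth is e.g. 640x576
fx, fy = 2.0, 2.0
cx, cy = (W - 1) / 2, (H - 1) / 2

# Precompute the "ray table" (a.k.a. xy table): for every pixel,
# the slope (x/z, y/z) of the ray through that pixel.
u, v = np.meshgrid(np.arange(W), np.arange(H))
ray_x = (u - cx) / fx
ray_y = (v - cy) / fy

# A depth image (here a constant 2 m); multiplying the ray table by
# depth gives the 3D point cloud in depth-camera space.
depth = np.full((H, W), 2.0)
points = np.stack([ray_x * depth, ray_y * depth, depth], axis=-1)

print(points.shape)   # one XYZ point per pixel
```

On the GPU the same thing happens per pixel in a shader, with the ray table usually baked into a texture once at startup.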
Shouldn’t it be in the VL.Devices.AzureKinect pack?
Let us have a look in the afternoon. We probably need to do it again - whatever we did was done for Xenko and probably used nodes which aren’t available anymore. And yes, it should end up in the package itself now that VL.Stride is officially released.
yes, I thought it’s in the pack, or at least in the git history. just checked, and the shader which does the alignment is shipping with VL.Stride: https://github.com/vvvv/VL.Stride/blob/develop/packages/VL.Stride.Runtime/src/Effects/ShaderFX/AzureKinect/ReconstructPointCloud.sdsl
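The alignment step that shader performs can be sketched in plain Python/numpy: each 3D point from the depth camera is transformed into color-camera space and projected with the color intrinsics, giving the pixel to sample the color from. Everything here is a placeholder (identity extrinsics, made-up intrinsics); the real values come from the device calibration:

```python
import numpy as np

# Hypothetical color-camera intrinsics and depth->color extrinsics
# (rotation R, translation t). All placeholders, not real calibration.
fx_c, fy_c, cx_c, cy_c = 2.0, 2.0, 1.5, 1.0
R = np.eye(3)
t = np.array([0.0, 0.0, 0.0])

def color_uv_for_points(points):
    """Project 3D points (N, 3) from depth-camera space into
    color-image pixel coordinates (u, v)."""
    p = points @ R.T + t                       # into color-camera space
    u = fx_c * p[:, 0] / p[:, 2] + cx_c        # perspective divide + intrinsics
    v = fy_c * p[:, 1] / p[:, 2] + cy_c
    return np.stack([u, v], axis=-1)

pts = np.array([[0.5, 0.0, 2.0],
                [0.0, 0.0, 1.0]])
uv = color_uv_for_points(pts)
print(uv)
```

In the patch this means: feed the color texture plus the calibration data into the reconstruction shader, and it looks up the color per point, so no change to the Azure Kinect Device node itself should be needed.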
Thank you to all the experts looking into this question; I truly appreciate it.
However, please bear with my newbie question: can I do the color mapping within the Azure Kinect component? Or what kind of components/extra work do I need to make it work?
Do you have an example for this?
Of course, if you are still working on the result, please ignore my question for now.
Thank you all!!!
This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.