Assuming evvvverything is possible in vvvv, and that I love (other users') contributions: is there a way to get UV mapping from an arbitrary point of view? In Blender it is called "Project from View (Bounds)"; it fits a UV mapping of the visible faces as seen from the active camera.
S.
do you mean ProjectedTexture where projection == camera view?
you mean like that?
https://discourse.vvvv.org/
Hi, I guess that one is by Woei. DiMiX, wrong link. tx
So I think this is not what I need. I've found the proper link:
https://discourse.vvvv.org/
but this will just let me apply a texture from an arbitrary point of view; what I need is to retrieve the UV mapping from an arbitrary point of view.
Hi, when you apply projected texture mapping you do that at the per-pixel level, since if you go per-vertex you need a highly tessellated model to avoid distortions.
The distortions come from linear interpolation being incorrect here: the projected coordinates contain perspective as well…
You can still have all that sorted inside vvvv…
There is also a difference between a normal camera and the camera used by the calibration toolkit, so the projected-texture function is different if you want to use it in a mapping setup.
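To make the per-pixel point concrete, here is a rough sketch of the math in plain Python (not vvvv-specific; the matrix and values are made up for illustration): a "project from view" UV is the vertex position transformed by the projector's view-projection matrix, followed by the perspective divide and a remap from NDC [-1, 1] to UV [0, 1]. The divide by w is exactly the step that per-vertex linear interpolation gets wrong on coarse meshes.

```python
# Sketch: computing "project from view" UVs for a single vertex.
# The perspective divide (x/w, y/w) is the step that breaks naive
# per-vertex interpolation on coarsely tessellated geometry.

def project_uv(pos, view_proj):
    """pos: (x, y, z, 1) vertex position; view_proj: 4x4 row-major matrix."""
    clip = [sum(view_proj[r][c] * pos[c] for c in range(4)) for r in range(4)]
    ndc_x = clip[0] / clip[3]  # perspective divide
    ndc_y = clip[1] / clip[3]
    # remap NDC [-1, 1] to UV [0, 1]; flip v so the image isn't upside down
    return (ndc_x * 0.5 + 0.5, 1.0 - (ndc_y * 0.5 + 0.5))

# trivial "projector": with an identity matrix, a point at the view
# center lands in the middle of the texture
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
print(project_uv((0.0, 0.0, 0.0, 1.0), identity))  # (0.5, 0.5)
```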
and if you need the UV layout as a picture, I'm pretty sure it's easy to do inside a shader, so you get your pixel-perfect layout PNG without the 3D model ;]
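The trick behind rendering a UV layout in a shader is just a remap: output each vertex at its texture coordinate instead of its 3D position. A minimal sketch of that remap in plain Python (standing in for what the vertex shader would do; names are illustrative):

```python
# Sketch: a vertex shader that draws the UV layout itself would output
# each vertex at its UV coordinate, remapped to clip space.

def uv_to_clip(u, v):
    """Map UV in [0, 1] to clip-space XY in [-1, 1] (v flipped, D3D-style)."""
    return (u * 2.0 - 1.0, 1.0 - v * 2.0)

print(uv_to_clip(0.5, 0.5))  # (0.0, 0.0) -> center of the render target
```

Rendering the mesh with this remap, in wireframe, gives the familiar UV-layout image without any further 3D setup.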
Not the UV mapping as a picture, but the buffered geometry of the UV mapping, so I can use it for example for bump mapping etc.
@io: your link is also broken.
maybe this one: https://discourse.vvvv.org/
@vjc4 not sure, but thanks to Lasal, who pointed out the UV_coords node from the Noodle pack, I got it working.
Now a follow-up question: now that this is working, I was wondering how to keep the view-based UV mapping for texture-based effects like displacement mapping, but still apply a texture based on the object's original UV mapping. I'm not even sure that's possible; I tried to decompose the geometry with the GetVertexData and SetVertexData nodes, but it doesn't seem to work.
tx
S.
you can pass multiple sets of texture coordinates to the pixel shader. Or you just pass the original UVs and quickly calculate the projected ones from your vertex position data. It really depends on how your pipeline is set up.
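The two-UV-set idea can be sketched like this in plain Python (standing in for the per-fragment logic; the texture lookups and all names are illustrative, not a real vvvv/shader API): sample the displacement with the view-projected UVs, and the color with the object's original UVs.

```python
# Sketch: combining two UV sets per fragment, as a pixel shader would.
# uv_orig samples the object's own texture; uv_proj samples a
# view-based displacement map. Textures are dicts for illustration.

def shade(uv_orig, uv_proj, color_tex, disp_tex):
    displacement = disp_tex.get(uv_proj, 0.0)        # view-based lookup
    base_color = color_tex.get(uv_orig, (0, 0, 0))   # original-UV lookup
    return base_color, displacement

color_tex = {(0.25, 0.25): (255, 0, 0)}
disp_tex = {(0.5, 0.5): 0.1}
print(shade((0.25, 0.25), (0.5, 0.5), color_tex, disp_tex))
# ((255, 0, 0), 0.1)
```

The same split works whether the second set is passed in as an extra texcoord channel or computed on the fly from the vertex position, as described above.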