I’m trying to colour an array of boxes via a volume texture, so as I see it I need to use the xyz positions of the objects as the TexCd coordinates.
As my array is uniform, I thought maybe I could use the index of each object to work out its sample position in the pixel shader, or I could normalise the positions to generate the coordinates.
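The index idea above could look something like this, as a minimal sketch. It assumes you know the grid dimensions (countX/countY/countZ are names I've made up, not anything from the patch):

```hlsl
// Sketch: recover a normalized 3d sample position from a linear instance
// index, assuming a uniform grid of countX * countY * countZ boxes.
float3 IndexToTexCd(uint id, uint countX, uint countY, uint countZ)
{
    uint x = id % countX;
    uint y = (id / countX) % countY;
    uint z = id / (countX * countY);
    // +0.5 samples the centre of each cell rather than its corner
    return float3((x + 0.5) / countX,
                  (y + 0.5) / countY,
                  (z + 0.5) / countZ);
}
```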
Alternatively, in the shader semantics pdf from node there are:
• OBJUNITTRANS, float4x4: transform to move the model back into a unit box (-0.5 to 0.5)
• OBJSDFTRANS, float4x4: transform to move the model into a standard sdf space (0 to 1)
But neither of these seems to work in the shader.
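For reference, in principle the sdf semantic would be applied like this (a sketch only, assuming the host actually fills the semantic, which may be why it appears not to work):

```hlsl
// Hypothetical usage: if the host fills OBJSDFTRANS, multiplying the
// object-space position by it should land in 0..1 sdf space, usable
// directly as a volume texture coordinate.
float4x4 tSDF : OBJSDFTRANS;

float3 PosToVolumeCd(float3 posObject)
{
    return mul(float4(posObject, 1.0), tSDF).xyz;
}
```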
If I can do this in sm4 that would be nice, but it's not obligatory!
Can anyone suggest a way of doing this?
I’ve attached a quick patch of what I’m doing, but there’s not really anything there ATM, apart from an array of boxes and a volume texture to sample!
@sebl that’s where I started ;) @antokhio not sure how that’s giving me a 3d lookup…
The second patch shows a start, but still can’t work out the next stage…
Got another long train ride tomorrow, so maybe that will do it!
Thanks Antokhio, that really helped. Not sure it’s quite right yet, but attached is a version that seems to allow texturing of the cube array. I’ve manually created a set of 3d tex coordinates via a 3d buffer (rather than making a transform), and using your volume spheres it seems almost right…
Think what I need to do now is find a way of creating a volume texture out of a series of 2d slices of a 3d scene…
ah yea, it was looking like that
i think i already managed to do that
not sure if vertigo is gonna work, i think you can do a normal transform
and i would normalize the scene to 1x1x1 size
actually i was thinking about that, the only thing i’ve seen so far is this:
If you need to color every object the same way, you can do it in a really simple way (provided your object is centered).
To sample a volume texture, the old tex3Dlod is now: tex3d.SampleLevel(samplerState, float3 uvw, 0)
So what you need is to use a position/world transform for your object, apply it to a zero vector and sample the color from there (ideally with a point sampler). That means all vertices from your box will have the same color. Make sure to normalize your model into sdf space (using either semantic mentioned above).
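The zero-vector idea above could be sketched like this (all names here, tWorld/tSDF/volTex/sPoint, are assumptions; set the sampler to point filtering on the host side):

```hlsl
// Sketch of the "sample at object centre" idea: transform the origin by
// the object's world matrix, map into 0..1 sdf space, and fetch one
// color for the whole object.
Texture3D volTex;
SamplerState sPoint;        // point filtering, set from the host
float4x4 tWorld;            // per-object world transform
float4x4 tSDF : OBJSDFTRANS;

float4 CenterColor()
{
    // object centre in world space: world transform applied to a zero vector
    float3 center = mul(float4(0, 0, 0, 1), tWorld).xyz;
    // normalize into 0..1 volume space, then sample mip 0
    float3 uvw = mul(float4(center, 1.0), tSDF).xyz;
    return volTex.SampleLevel(sPoint, uvw, 0);
}
```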
To voxelize a mesh is also quite easy, even tho it’s a bit slower, so it works best with static elements:
- Send your triangles as a structured buffer to a compute shader (normalized to 0-1 space)
- For each voxel/triangle, compute the signed distance (do a batch of n triangles per frame to avoid a timeout, or flush the context). The sign is given by the triangle normal, so make sure you have good normals, it’s essential.
- Store the minimum distance in the volume (absolute minimum, but keep track of the sign)
- Now you have a volume with a signed distance field
- Use a buffer renderer with the append flag to remove any positive value in the volume (to only keep voxels inside the model). If you only want the shell, do a small threshold on the absolute value.
- Render as indirect with cubes
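The steps above could be skeletoned like this (a sketch only; buffer names are assumptions, and the point-triangle distance function is left as a stub, see e.g. Ericson's Real-Time Collision Detection for a full version):

```hlsl
// Skeleton of the voxelizer pass: each thread handles one voxel and
// scans a batch of triangles, keeping the smallest signed distance.
struct Tri { float3 a, b, c, n; };

StructuredBuffer<Tri> Tris;      // triangles normalized to 0..1 space
RWTexture3D<float> Volume;       // accumulates the signed distance field
uint TriStart, TriCount;         // batch range, advanced each frame

// Stub: unsigned point-triangle distance (implementation omitted)
float DistToTriangle(float3 p, Tri t);

[numthreads(8, 8, 8)]
void CS_Voxelize(uint3 vox : SV_DispatchThreadID)
{
    uint3 dim;
    Volume.GetDimensions(dim.x, dim.y, dim.z);
    float3 p = (vox + 0.5) / (float3)dim;   // voxel centre in 0..1

    float best = Volume[vox];               // init to a large value on frame 0
    for (uint i = TriStart; i < TriStart + TriCount; i++)
    {
        Tri t = Tris[i];
        float d = DistToTriangle(p, t);
        // sign from the triangle normal: negative means inside
        float s = sign(dot(p - t.a, t.n));
        if (d < abs(best)) best = s * d;    // keep absolute minimum, with sign
    }
    Volume[vox] = best;
}
```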
If you have a lot of moving objects then it’s suddenly more complex, have fun with gpu acceleration structures ;)
I’ll make a small patch for color lookup tomorrow )
I get undeclared identifier ‘tex3d’ when I try to use tex3d.SampleLevel.
Any examples you could give would be very welcome!
I’d love to be able to voxelise geometry, but it will be moving so maybe I should do it differently…
A few years ago I used near and far ranges on a perspective node to create slices of a 3d world, and with your texture array, I think that would probably be the easiest way now!
Can you access slices of textures in shaders?
In this project I only have a depth of 6 layers, so it might be easiest to work with multiple arrays, but TBH I’d love to get my head around volumes etc as there are many possibilities that open up with them, for example antokhio’s links looked ace :)
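On the slice question: yes, in a shader a texture array is sampled with the slice index as the third coordinate. A minimal sketch, assuming the 6 layers mentioned above (names are made up):

```hlsl
// Sampling a 2d texture array: the z component of the coordinate
// selects the slice (0..5 for a 6-layer array).
Texture2DArray sliceTex;
SamplerState sLinear;

float4 SampleSlice(float2 uv, float slice)
{
    return sliceTex.Sample(sLinear, float3(uv, slice));
}
```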
I’m using tex3d.SampleLevel(volumeSampler,wn,0) in my vs, where wn is float3 uv coordinates and I get undeclared identifier ‘tex3d’ as an error message. Doesn’t this suggest that the issue is the shader not liking the tex3d part of the code?
TBH this is my first real foray into dx11, so it’s not like I’m making it easy on myself ;)
My thinking is: sample a volume texture via a 3d coordinate, and apply the value retrieved to an object instance based on its array ID. For ease, maybe this should just be a scale, to keep it all in a vs. This seems fairly basic, it’s kind of what we do with GPU particles, but maybe I’m mistaken!
As I said, maybe I can do something with a number of 2d slices instead, as it’s kind of beyond me without some help! This is my first use of instanced batches…
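The "scale per instance" idea could be sketched like this, assuming a per-instance coordinate buffer (all names here are assumptions, not from the patch):

```hlsl
// Sketch: look up a per-instance 3d coordinate, sample the volume,
// and use the fetched value to drive the instance's scale in the VS.
Texture3D tex3d;
SamplerState sPoint;
StructuredBuffer<float3> texCd;   // one 3d coordinate per instance

float4 VS(float4 pos : POSITION, uint ii : SV_InstanceID) : SV_Position
{
    float v = tex3d.SampleLevel(sPoint, texCd[ii], 0).r;
    pos.xyz *= v;     // scale the instance by the sampled value
    // ...then apply world/view/projection as usual
    return pos;
}
```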
For an example on how to write into volume, please check girlpower/sm5/voxeliser.
Yes, tex3d is just a variable declared in the shader: Texture3D tex3d;
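In other words, the "undeclared identifier" error just means the declarations are missing from the top of the effect. A minimal example (the sampler name is arbitrary):

```hlsl
// Declarations that make tex3d.SampleLevel compile
Texture3D tex3d;
SamplerState volumeSampler
{
    Filter = MIN_MAG_MIP_POINT;   // point sampling
};

float4 SampleVolume(float3 wn)
{
    // wn: float3 uv coordinates, mip level 0
    return tex3d.SampleLevel(volumeSampler, wn, 0);
}
```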
Yes, blend is off by default, thank god, it should never have been on by default ;) Please note you can now also connect a render state directly to a group, no need to add one on every individual shader.
You should not need any lod bias in the sampler, since you have no mipmaps anyway ;) Make sure to use a point sampler tho.