How to make adaptive process nodes work

Warning: Rambling ahead!

Maybe it’s time to add a little (= really condensed) backstory. @lecloneur and I are currently looking into adding the (iirc) 100+ TextureFX he made to VL.Addons, since it doesn’t seem likely that they will all make it into VL.Stride in a timely fashion. We’ve been discussing the how for about 10 days as of now, and it’s not really a straightforward process.

Initially we just wanted to add them as “normal” TextureFX. While exploring this we encountered some issues, for example [1], [2].
Some needed ShaderFX were also missing, and after some more debate we decided to explore patching everything in Fuse instead.
There we encountered the “issue” that we had to go back and forth between Texture and ShaderNode. That’s when I thought about the adaptive approach, which I couldn’t get to work.
You were so kind to supply a solution for that.

While doodling with that, the “polluted node browser” issue came up, which again you solved for us. Finally there?

The next obstacle we hadn’t thought of before: when going to ShaderNode, some info is lost (TextureSize, Format and maybe Mips). So either every node that can output a texture needs inputs to be able to (re)set that info, or we add some kind of “meta data” (ITFX<object>).
Once again we couldn’t figure it out by ourselves. @gregsn to the rescue.
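To make the “meta data” idea a bit more concrete, here is a minimal C# sketch of what such a wrapper could carry. This is purely my assumption of the shape; only the ITFX<object> name comes from the discussion above, TextureMetadata and the interface members are made up for illustration.

```csharp
using Stride.Graphics;

// Assumption: the texture info that gets lost when switching to ShaderNode.
public record TextureMetadata(int Width, int Height, PixelFormat Format, int MipLevels);

// Hypothetical wrapper that travels through the graph alongside the shader-side
// value, so a downstream node can (re)create a texture with matching settings.
public interface ITFX<out T>
{
    T ShaderNode { get; }
    TextureMetadata Metadata { get; }
}
```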

Now we are here and don’t know how best to deal with other input parameters of type ShaderNode. Meaning: if we want to connect a texture, we have to Sample it first (which is what ValueMap/ColorMap are for in Stride). Or we use our ITFX<object>, but then we can’t have any Fuse operators in between.
@lecloneur proposed an approach earlier today but I am not quite sure if he has already implemented something.
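To illustrate that dilemma as two possible node signatures, here is a sketch with made-up stand-in types; these are not the real Fuse or Stride signatures.

```csharp
// Stand-ins only, to show the two shapes of the problem.
public interface IShaderNode<T> { }   // stand-in for Fuse's ShaderNode<T>
public interface ITFX<out T> { }      // the "meta data" wrapper sketched above

public static class InvertSketch
{
    // Style A: the input is a ShaderNode. A connected Texture must be sampled
    // first, which is the role ValueMap/ColorMap play in Stride's TextureFX.
    public static IShaderNode<float> Invert(IShaderNode<float> value)
        => value; // a real implementation would emit 1 - value

    // Style B: the input is the ITFX<object> wrapper, so the texture metadata
    // survives, but plain Fuse operators can no longer be patched in between.
    public static ITFX<object> Invert(ITFX<object> value)
        => value;
}
```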

Long story short, I don’t know yet where this is going.
But I’ll try to implement the BuildNode and CoVariantInterface approach for Blur and Invert soonish.
