Some considerations about alpha on FX Texture nodes

I was puzzled to find that after plugging a Logitech C920 camera into the Chroma node, I couldn't get the background as alpha, as I could with my cheap onboard camera.
Chroma sends out a texture reflecting the input texture; in the C920 case that is an RGB24 texture, so there was no alpha. Manually changing the output texture format did the trick, so first question:
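As background for the question: the whole point of chroma keying is to turn "closeness to the key colour" into an alpha value, so if the output format has no alpha channel there is nowhere for that result to go. A minimal sketch in plain Python (not vvvv; the `key` and `tolerance` parameters are made up for illustration):

```python
# Minimal chroma-key sketch: the keying result *is* the alpha value,
# which is why the output texture needs an alpha channel.

def chroma_alpha(pixel, key=(0, 255, 0), tolerance=80):
    """Return 0.0 (transparent) .. 1.0 (opaque) for one RGB pixel.

    Pixels close to the key colour become transparent; everything
    else stays opaque. `key` and `tolerance` are illustrative only.
    """
    dist = sum((c - k) ** 2 for c, k in zip(pixel, key)) ** 0.5
    return 0.0 if dist < tolerance else 1.0

print(chroma_alpha((10, 250, 20)))    # near the key green -> keyed out
print(chroma_alpha((200, 160, 140)))  # far from the key -> stays opaque
```

With an RGB24 output there is no channel to hold that 0.0/1.0 result, so the keyed pixels just come out black.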

-wouldn't it be better to detach the output texture mode from the input in this kind of node?

second question:

-how do I apply, for example, an Edge Glow filter to a texture with transparency so that the blur/glow expands beyond the limit of the non-alpha area? If that makes any sense…


Like this, maybe. In this patch you get blurred edges; then you just have to blend it on top of the input texture or anything else you want. I don't understand the first question; if you can explain it another way, I'm curious about it.


Blur Edge.v4p (7.1 kB)

Chroma FX takes a Texture input and uses the Input Texture format to set the Output Texture format, via a Texture (Info) node. So, if the Input Texture has no alpha channel, the Output Texture won't either.
If I use the RGB24 video stream of a video camera as the Input Texture, I won't get alpha on the chroma-keyed pixels but black instead.
The only way I got it working was to internally detach the connection between the Input and Output Texture (the format connection, of course, not the texture connection) and set it manually to A16R16G16B16.
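To make the format difference concrete, here is a small illustrative sketch (plain Python, not DirectX code) of the per-pixel storage of the two formats involved: RGB24 simply has no bits reserved for alpha, while A16R16G16B16 carries a 16-bit alpha channel.

```python
# Illustrative comparison of per-pixel storage for the two formats
# discussed above. The table is a sketch, not an API.

FORMATS = {
    # name: (bits per channel, channel layout)
    "R8G8B8 (RGB24)": (8, "RGB"),    # no alpha channel at all
    "A16R16G16B16":   (16, "ARGB"),  # 16-bit alpha available
}

for name, (bits, channels) in FORMATS.items():
    bytes_per_pixel = bits * len(channels) // 8
    print(f"{name}: {bytes_per_pixel} bytes/pixel, "
          f"alpha channel: {'A' in channels}")
```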

Then you have to use ChangeFormat(EX9.Texture)

+1. SetAlpha (EX9.Texture Join) suffers from the same symptom. It's not a big deal, but it is a bit disturbing.