Clearly PhongDirectional has done its duty, and a PBR shader should have taken its place some time ago.
There are many contributions floating around, but they are not really practical for a beginner and definitely too specialized and/or bloated to be called the new standard.
So the goals are:
- as easy as possible to use
- as few parameters as possible
- no extra textures/configuration required
- as light as possible on the GPU
- referenced as a .fxh to include in custom shaders
Sure, PBR thrives on textures, so there could be a second version that takes textures for the material parameters.
Let’s first discuss what parameters we need, which standard to follow and then look into some example shaders… Thoughts, opinions?
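To make it concrete, here is a minimal sketch of what such an interface could look like (all names hypothetical, up for discussion):

```hlsl
// PBR.fxh -- hypothetical minimal interface, names up for discussion
float4 Albedo    = { 1.0, 1.0, 1.0, 1.0 }; // base color
float  Roughness = 0.5;                    // 0 = mirror-like, 1 = fully diffuse
float  Metallic  = 0.0;                    // 0 = dielectric, 1 = metal

// the including shader would make a single call from its pixel shader, e.g.:
// col.rgb = PBR(normalW, viewDirW, lightDirW, lightColor);
```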
i haven’t researched the topic thoroughly, but it would be interesting if the shader were somehow compatible with Marmoset Toolbag and/or Substance Painter PBR rendering.
and i prefer the texture-based approach since it still gives you realtime control: you can generate/manipulate textures to increase/decrease reflections, for example. and a texture-based shader would be pretty simple, or am i wrong?
just adding one example of such a PBR texture collection, up for discussion
My personal favorite is the Disney BRDF, but that requires tangent information from the mesh to look good (this goes for all modern lighting though). you can get nice anisotropic speculars, clearcoat (an additional specular layer), sheen lighting and fake subsurface scattering as extras. My shader receives the following texture maps: Albedo+Alpha, Normalmap+Height or Relief, Roughness+Metallic+Anisotropy+Anisotropic Specular Rotation along the Surface Normal (it needs a catchier name), and an SSS map.
With #ifdef switches, velocity and the material buffer can be omitted, and you have a pretty PhongDirectional replacement. An extra feature could be importance-sampled environment maps, so we can have nice image-based lighting too.
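the packed layout above could be sampled roughly like this (a sketch with made-up names, not my exact code):

```hlsl
// sketch of the packed map layout (names made up, not the exact code)
Texture2D AlbedoAlphaMap;  // rgb = albedo, a = alpha
Texture2D NormalHeightMap; // rgb = tangent-space normal, a = height/relief
Texture2D RmarMap;         // r = roughness, g = metallic, b = anisotropy, a = aniso rotation
Texture2D SssMap;          // subsurface scattering

SamplerState LinearSampler { Filter = MIN_MAG_MIP_LINEAR; };

void SampleMaterial(float2 uv, out float4 albedoAlpha, out float3 normalTS,
                    out float height, out float4 rmar, out float4 sss)
{
    albedoAlpha = AlbedoAlphaMap.Sample(LinearSampler, uv);
    float4 nh   = NormalHeightMap.Sample(LinearSampler, uv);
    normalTS    = nh.xyz * 2.0 - 1.0; // unpack [0,1] -> [-1,1]
    height      = nh.w;
    rmar        = RmarMap.Sample(LinearSampler, uv); // split channels as needed
    sss         = SssMap.Sample(LinearSampler, uv);
}
```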
Maybe not the solution for a super simple standard shader, but Superphong will also be SuperPhysical for the next release, with Epic Games’ (Unreal Engine) PBR implementation. This means it is also based on a metalness workflow, which is compatible with Substance Designer output etc.
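For reference, the core of the metalness workflow boils down to two lerps; a sketch of the standard conversion (not SuperPhysical’s literal code):

```hlsl
// standard metalness workflow conversion (not SuperPhysical's literal code);
// dielectrics reflect roughly 4% at normal incidence, metals tint the reflection
void MetalnessToSpecular(float3 albedo, float metallic,
                         out float3 f0, out float3 diffuseColor)
{
    f0           = lerp(float3(0.04, 0.04, 0.04), albedo, metallic);
    diffuseColor = albedo * (1.0 - metallic); // metals get no diffuse term
}
```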
Yeah brohs, how about picking microdee’s brdf.fxh, sticking it in superphong and simplifying the whole shizz with defines and shit? or if mr burk now got physical, that’s pretty sick too broh broh.
Not sure we should remove maps stuff for a phong-lite version, I think maps are paramount to pbr shit. I think taking a look at unity’s available maps and settings for the “unity standard shader” is not a bad start. I mean it would be nice to have something a little streamlined with unity broh broh. To me one of the biggest misses of vvvv at the moment is not being able to use SUBSTANCE or, better imo, QUIXEL SUITE, as this is really important to asset development, i mean everyone making 3d models uses substance or quixel now, so it’s really important i think. (don’t just think substance, hey quixel ftw broh broh, sure they should work similarly anyway)…
Totally agree with having some fxh to import in custom shaders though, that would be sick.
So the new version of superphong will basically be exactly what @evvvvil suggested. I think the brdf implementation is very similar to @microdee’s, so there is no real reason to combine the two. Microdee’s version looks like a cleaner implementation of the basic algorithms, so if anyone would like to develop a new standard shader, I would recommend using it as the starting point.
Nonetheless superphong, or SuperPhysical, will support a workflow with substance designer, exactly like, for example, unity. So all the standard texture formats like roughness, metalness etc. are supported, also in combination with Image Based Lighting. Superphong also includes VSM shadows, which is the reason for much of the complexity of buffer handling etc. This will be much improved for the next release, but it is all way too much to be included in a “phong directional/point” replacement. I’m trying to do it all as simply and user-friendly as possible, but it will still rely on external modules (or at least one).
Maybe we should try to further define the requirements of a “standard shader”. Should there be several lights? Which inputs should be spreadable, etc.?
i’d suggest one simple pbr shader like the basic unity one, which exposes albedo, metallic and smoothness/roughness and maybe specular?
and one full version with all the fancy rest: subsurface, specular tint, anisotropy…
guess sticking to unity for the simple one would make sense, since there are nice tutorials out there.
splitting in two would also make sense performance-wise, since the feature-complete disney pbr shader takes quite a lot of ticks.
an open question would probably be whether it should be implemented for a linear or a gamma’d workflow. or an option for both?
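supporting both could be as cheap as one define; a sketch:

```hlsl
// sketch: one define to support both workflows
//#define LINEAR_WORKFLOW

float3 ToLinear(float3 c)
{
#ifdef LINEAR_WORKFLOW
    return c;               // inputs already linear
#else
    return pow(c, 2.2);     // cheap sRGB approximation
#endif
}

float3 ToOutput(float3 c)
{
#ifdef LINEAR_WORKFLOW
    return c;               // let the backbuffer/post chain handle encoding
#else
    return pow(c, 1.0 / 2.2);
#endif
}
```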
frostbite has a pretty extensive writeup on pbr online, and its diffuse normalization really looks a bit better than the disney one.
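for comparison, the renormalized disney diffuse from the frostbite course notes (“Moving Frostbite to PBR”) looks roughly like this, from memory, so details may differ:

```hlsl
// renormalized Disney diffuse, after the Frostbite course notes (from memory)
float F_Schlick1(float f0, float f90, float u)
{
    return f0 + (f90 - f0) * pow(1.0 - u, 5.0);
}

float DisneyDiffuse(float NdotV, float NdotL, float LdotH, float roughness)
{
    float energyBias   = lerp(0.0, 0.5, roughness);
    float energyFactor = lerp(1.0, 1.0 / 1.51, roughness); // keeps it energy conserving
    float fd90 = energyBias + 2.0 * LdotH * LdotH * roughness;
    float lightScatter = F_Schlick1(1.0, fd90, NdotL);
    float viewScatter  = F_Schlick1(1.0, fd90, NdotV);
    return lightScatter * viewScatter * energyFactor; // divide by PI per your convention
}
```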
no matter how the new standard shader turns out, i strongly opt for having the functions for diffuse, specular etc. split up (inside the shader). the monolithic phongpoint.fxh was a pain to work with. it would be much better to keep it modular, being able to just switch the diffuse or specular term depending on one’s needs.
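e.g. something along these lines, so one term can be swapped without touching the rest (just a sketch):

```hlsl
// sketch: keep each term its own function so it can be swapped out
float3 Diffuse_Lambert(float3 albedo) { return albedo / 3.14159265; }
// float3 Diffuse_Disney(...) { ... }  // drop-in alternative

float Specular_GGX(float NdotH, float roughness)
{
    float a  = roughness * roughness;
    float a2 = a * a;
    float d  = NdotH * NdotH * (a2 - 1.0) + 1.0;
    return a2 / (3.14159265 * d * d); // distribution term only, for brevity
}

float3 Shade(float3 albedo, float3 specColor, float NdotL, float NdotH, float roughness)
{
    return (Diffuse_Lambert(albedo) + specColor * Specular_GGX(NdotH, roughness)) * NdotL;
}
```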
Depending on what “Material” is called, several chunks get concatenated.
The user only sees these types:
For vvvv I imagine one shader with a huge collection of effects, in combination with several patches (= Materials), or configurations if you want, which switch parts of the monster on/off, or even load them on demand.
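As a sketch, such a configuration could simply set defines before the monster gets compiled (names made up):

```hlsl
// hypothetical defines a "Material" patch would set before compiling the monster
#define MAT_NORMALMAP 1
#define MAT_CLEARCOAT 0

float3 EvaluateMaterial(float3 baseColor, float3 normal)
{
    float3 col = baseColor;
#if MAT_NORMALMAP
    // perturb 'normal' with the normal map before lighting
#endif
#if MAT_CLEARCOAT
    // add a second specular lobe on top
#endif
    return col; // lighting omitted for brevity
}
```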
Another all-in-one approach:
Here’s Cinema4D’s shader manager:
C4D’s way is not ideal as a reference as it hides quite a lot of what’s going on (“color” = lambert or phong shading, “luminance” = constant shading, lights are defined globally in the scene)
What I like is that one can switch bits on and off; it’s also very intuitive.
I’d suggest maybe several shaders rather than one, with shared pins, so you could start from a simple-ish version and swap for one with extras depending on your needs, without losing connections.
Hi
The one that i have been putting together for vvvv.js is very slim, and it approximates tangents and binormals.
It’s leaned on a post from Simon’s Tech Blog which explains Cook-Torrance and related papers.
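The usual trick for approximating the tangent frame in the pixel shader is building a cotangent frame from screen-space derivatives (after Christian Schüler’s “Normal Mapping Without Precomputed Tangents”; a sketch, not necessarily the exact vvvv.js code):

```hlsl
// tangent frame from screen-space derivatives (a sketch, after Christian Schüler;
// not necessarily the exact vvvv.js code)
float3x3 CotangentFrame(float3 N, float3 p, float2 uv)
{
    // derivatives of position and uv across the pixel quad
    float3 dp1  = ddx(p),  dp2  = ddy(p);
    float2 duv1 = ddx(uv), duv2 = ddy(uv);

    // solve the linear system for tangent and bitangent
    float3 dp2perp = cross(dp2, N);
    float3 dp1perp = cross(N, dp1);
    float3 T = dp2perp * duv1.x + dp1perp * duv2.x;
    float3 B = dp2perp * duv1.y + dp1perp * duv2.y;

    // normalize consistently (scale-invariant)
    float invmax = rsqrt(max(dot(T, T), dot(B, B)));
    return float3x3(T * invmax, B * invmax, N);
}
```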
why not build a modular shader that can be put together at will? every feature gets its own node and in the end the shader code gets assembled by something like the “Shader (String)” node I built for the particles pack.
The idea of a modular and patchable shader system is marvellous, but I guess it would need some more thought than “just glue some strings together and done we are”. I think tmp’s template approach might be an excellent start though.
Issues will arise, because if you simply concatenate functionality that eventually ends up in a single hlsl function, we would have to take great care that variable names and types match everywhere. Usually the compiler will warn you if you write them by hand, but with a disjunct system it might just not.
So we want to keep the error reporting and syntax checking. For that it would be necessary to restrict output to (self-contained) functions from a plugin, basically like an fxh file. The final shader could then be composited of those very helper-function calls.
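In code, the generated effect might end up looking like this (a sketch; the fxh functions and names are made up):

```hlsl
// sketch: the generator emits only calls to self-contained fxh functions,
// so the compiler still type-checks every "link"
#include "BRDF.fxh" // hypothetical lib providing Diffuse_Lambert() and Specular_GGX()

float4 Albedo = { 1, 1, 1, 1 };
float Roughness = 0.5;

struct PSIn { float4 pos : SV_Position; float NdotL : TEXCOORD0; float NdotH : TEXCOORD1; };

float4 PS(PSIn i) : SV_Target
{
    // each local variable corresponds to one link in the patch
    float3 diff = Diffuse_Lambert(Albedo.rgb);
    float  spec = Specular_GGX(i.NdotH, Roughness);
    return float4((diff + spec) * i.NdotL, 1);
}
```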
This observation could be turned into a working system in a couple of ways; investigating the best one is surely a thing for the hive mind.
I think it might be promising to investigate whether a proper fxh util lib could be sufficient (in theory, and with some practical monkeywork), and later to look into adding a plugin factory for fxh files to make their functions and classes pluggable for any “shader patch”.
that said, and all pragmatism aside, I still have dreams of a system where patchers could decide if a patch runs on the gpu or the cpu (similar to ctrl+b, or with regions), and affected nodes switch their code generation between msil and hlsl…
velcrome’s vision can be achieved with regular function arguments as inputs and “out” arguments as outputs, creating local variables for the out arguments to serve as “links” between connected “nodes” (which are in fact function calls in the compiled shader).
now each function node should carry some metadata about which pipeline stage(s) it is compatible with. you shouldn’t be able to include a function node with ddx or ddy inside it in any pipeline stage other than the pixel shader.
Global uniform variables would be referenced directly in function nodes’ input arguments.
Special nodes would have to take care of structs exchanged between pipeline stages. Pipeline stages returning a struct can have their “output struct” constructor node, which in the patch would expose a signature about itself, to be used by pipeline-stage-specific “input struct” nodes.
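a sketch of what the generated code could look like under this scheme (all names made up):

```hlsl
// sketch of generated code under this scheme (all names made up)

// a "function node": inputs are arguments, outputs are out-arguments
void MulAdd(float3 a, float3 b, float3 c, out float3 result)
{
    result = a * b + c;
}

// "output struct" constructor node for the vertex stage
struct VS2PS { float4 pos : SV_Position; float3 col : COLOR0; };
VS2PS MakeVS2PS(float4 pos, float3 col)
{
    VS2PS o; o.pos = pos; o.col = col; return o;
}

VS2PS VS(float4 posIn : POSITION)
{
    float3 link0; // local variables act as the links between nodes
    MulAdd(posIn.xyz, float3(2, 2, 2), float3(0, 0, 0), link0);
    return MakeVS2PS(posIn, link0);
}
```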
Now the difficult part comes with branching and loops. Actually it’s only difficult if we want to implement such a thing inside the vvvv or vl patching interface (vvvv harder; vl might actually be good for this). With a custom interface like the one AutomataUI has, handling the graph and user interaction might be much easier.
Velcrome’s second vision, about intuitively switching between cpu and gpu, might be achieved with Cudafy, which to my understanding allows developers to run CUDA kernels written in C# ;)