
WebGLFundamentals.org


Is there the notion of a generalized vertex and fragment shader?

Question:

I am going to create a simple 2D, maybe 3D down the road, game system like Pixi.js. I notice that it has shaders for each type of effect, and a generic projection-matrix shader, but other than that, everything else happens in regular-code-land.

gl_Position = projection * model * vec4(position, 1.0);

Are things like ShaderToy just that, toys, seeing how much you can do with shaders alone? Or do real game engines need to implement significant functionality directly in shaders? Basically, is there the notion of a generic standard shader you can use for all rendering in a game engine, or do you have to write one-off shaders for this and that?

I am trying to get a sense of whether I can just find that keystone shader for the game engine, the one shader pair I need for a high-performance 2D engine in WebGL, rather than slowly figuring out on a case-by-case basis where shaders will come into play in the game engine.

For example, this is the default shader in Pixi.js:

attribute vec2 aVertexPosition;
attribute vec2 aTextureCoord;

uniform mat3 projectionMatrix;

varying vec2 vTextureCoord;

void main(void)
{
    gl_Position = vec4((projectionMatrix * vec3(aVertexPosition, 1.0)).xy, 0.0, 1.0);
    vTextureCoord = aTextureCoord;
}

https://github.com/allotrop3/four/tree/master/src/shaders

Answer:

The short answer is no, you cannot make one generic shader, unless your game is very simple and only needs one fixed set of features in all situations.

Game engines like Unity and Unreal generate thousands of shaders based on the features the game developers use. Even three.js, which is not quite as sophisticated as those engines, generates a different shader for each combination of features used: lights, textures, skinning, blend shapes, environment mapping, etc.

There is a notion of an "uber shader" that tries to do a lot of stuff. Usually it's something a game dev uses to experiment, because they know it's too slow for production. It's less common in modern engines because those engines are designed to generate the shaders either at runtime or at build time, so it's easy to specify the features you want and let the engine generate the shader. For engines that don't have a shader-generating system, a dev might make a shader that implements all the features. Once they get the look they want, they'll pare it down to only the features they need, and/or they'll add lots of conditional compilation macros to turn features on and off and then compile the shader into a different version for each combination of features they need.
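
To make the conditional-compilation idea concrete, here is a minimal sketch in plain WebGL JavaScript. The feature names (USE_TINT, USE_ALPHA_TEST), the helper names, and the caching scheme are all made up for illustration; the point is only that a different program gets compiled and cached for each combination of features.

// A hypothetical fragment shader whose optional features are guarded by #ifdef.
const fragmentSrc = `
precision mediump float;

varying vec2 vTextureCoord;
uniform sampler2D uTexture;
#ifdef USE_TINT
    uniform vec4 uTint;
#endif

void main() {
    vec4 color = texture2D(uTexture, vTextureCoord);
    #ifdef USE_TINT
        color *= uTint;
    #endif
    #ifdef USE_ALPHA_TEST
        if (color.a < 0.5) discard;
    #endif
    gl_FragColor = color;
}`;

// Compile one program per combination of features and cache it so each
// variant is only compiled and linked once.
const programCache = new Map();

function getProgram(gl, vertexSrc, features) {   // features e.g. ['USE_TINT']
    const key = features.slice().sort().join(',');
    if (!programCache.has(key)) {
        const defines = features.map(f => '#define ' + f).join('\n') + '\n';
        programCache.set(key, buildProgram(gl, vertexSrc, defines + fragmentSrc));
    }
    return programCache.get(key);
}

// The usual compile-and-link boilerplate (error checking omitted for brevity).
function buildProgram(gl, vsSrc, fsSrc) {
    const compile = (type, src) => {
        const shader = gl.createShader(type);
        gl.shaderSource(shader, src);
        gl.compileShader(shader);
        return shader;
    };
    const program = gl.createProgram();
    gl.attachShader(program, compile(gl.VERTEX_SHADER, vsSrc));
    gl.attachShader(program, compile(gl.FRAGMENT_SHADER, fsSrc));
    gl.linkProgram(program);
    return program;
}

Drawing a tinted sprite would then use getProgram(gl, vertexSrc, ['USE_TINT']) while an untinted one uses getProgram(gl, vertexSrc, []), and only two programs ever get compiled.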

You can get an idea of this by looking at three.js's shaders. Here is the shader generated by three.js for this program, which I used this helper to view. I'd have pasted it in the question, but it is 44k and S.O. only allows 30k per post. First, it was assembled from a large number of snippets. Second, you'll notice various conditional compilation directives throughout the code. For example:

#ifdef DITHERING
    vec3 dithering( vec3 color ) {
        float grid_position = rand( gl_FragCoord.xy );
        vec3 dither_shift_RGB = vec3( 0.25 / 255.0, -0.25 / 255.0, 0.25 / 255.0 );
        dither_shift_RGB = mix( 2.0 * dither_shift_RGB, -2.0 * dither_shift_RGB, grid_position );
        return color + dither_shift_RGB;
    }
#endif
#ifdef USE_COLOR
    varying vec3 vColor;
#endif
#if ( defined( USE_UV ) && ! defined( UVS_VERTEX_ONLY ) )
    varying vec2 vUv;
#endif
#if defined( USE_LIGHTMAP ) || defined( USE_AOMAP )
    varying vec2 vUv2;
#endif
#ifdef USE_MAP
    uniform sampler2D map;
#endif
#ifdef USE_ALPHAMAP
    uniform sampler2D alphaMap;
#endif
#ifdef USE_AOMAP
    uniform sampler2D aoMap;
    uniform float aoMapIntensity;
#endif
#ifdef USE_LIGHTMAP
    uniform sampler2D lightMap;
    uniform float lightMapIntensity;
#endif
#ifdef USE_EMISSIVEMAP
    uniform sampler2D emissiveMap;
#endif
#ifdef USE_ENVMAP
    uniform float envMapIntensity;
    uniform float flipEnvMap;
    uniform int maxMipLevel;
    #ifdef ENVMAP_TYPE_CUBE
        uniform samplerCube envMap;
    #else
        uniform sampler2D envMap;
    #endif

#endif

If you start turning on those features, for example by setting material.envMap in JavaScript, you'd see three.js insert #define USE_ENVMAP at the top of the shader, in addition to regenerating the shader from only the subset of snippets it needs.
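
Incidentally, if you want to see what source an engine actually handed to WebGL, you don't need a dedicated tool: the WebGL API can read the final source back. This is a minimal sketch, not necessarily the helper referenced above; it just dumps the generated source of whatever WebGLProgram you pass it.

// Log the final, generated source of the shaders attached to a program.
function dumpProgramSource(gl, program) {
    for (const shader of gl.getAttachedShaders(program)) {
        const type = gl.getShaderParameter(shader, gl.SHADER_TYPE);
        const name = type === gl.VERTEX_SHADER ? 'vertex' : 'fragment';
        console.log('---- ' + name + ' shader ----');
        console.log(gl.getShaderSource(shader));
    }
}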

This also shows the amount of work you save by using an existing engine. 44k of shader code is not a small amount to reproduce if you want all of the features three.js gives you. If you're set on doing things from scratch, it's at least good to be aware it can be a ton of work. Of course, if you're making something that only needs a small set of features and no combinations of them, you can get by with just a few hand-written shaders.

You also mentioned

if I can just find that keystone shader for the game engine, the one shader pair I need for a high-performance 2D engine in WebGL

There is arguably no such thing as a keystone shader for a high-performance 2D engine. If you want performance, you want each shader to do as little as possible, which is the opposite of one shader that does everything.

That said, it depends on the 2D game. If you want to make Angry Birds, the vertex shader you posted in your question is possibly the only vertex shader you need. Angry Birds has no special effects; it just draws simple textured quads. So just

attribute vec2 aVertexPosition;
attribute vec2 aTextureCoord;

uniform mat3 projectionMatrix;

varying vec2 vTextureCoord;

void main(void)
{
    gl_Position = vec4((projectionMatrix * vec3(aVertexPosition, 1.0)).xy, 0.0, 1.0);
    vTextureCoord = aTextureCoord;
}

and a fragment shader like

precision mediump float;

varying vec2 vTextureCoord;
uniform sampler2D texture;
uniform vec4 colorMult;    // per-sprite tint/multiply color (a vec4, not a sampler)

void main()
{
    gl_FragColor = texture2D(texture, vTextureCoord) * colorMult;
}

would be enough for almost all 2D games made before 2010. 2D games since then (I just picked an arbitrary date) often use custom shaders to achieve special effects or to optimize. For example, certain kinds of particle effects are easy to make with custom shaders. Every particle effect in this game is made with this shader. If you skip to 00:50 you'll see three examples: the two portals under the cake, the candles on the cake, and the fireworks. If you look closely in parts of the video you can also see particles where characters land on the ground after jumping. They all use the same stateless particle shader, since running the particles in JavaScript and uploading their state individually every frame would arguably be slow. Another example: the backgrounds are drawn with a tiling shader like this one, which was easier, IMO, than using the shader above and generating a mesh of vertices for the tiles.
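
To illustrate what "stateless" means here: each particle's position can be computed entirely in the vertex shader from a few per-particle attributes that are uploaded once at emission time plus the current time, so JavaScript never touches the particles again after creating them. This is only a sketch of the general idea, not the shader used in that game, and all of the attribute and uniform names are made up.

// A hypothetical stateless particle vertex shader. The only thing JavaScript
// updates per frame is the uTime uniform; everything else was written into the
// vertex buffer once when the particles were emitted.
const particleVertexSrc = `
attribute vec2 aStartPosition;   // where the particle was emitted
attribute vec2 aVelocity;        // its initial velocity
attribute float aStartTime;      // when it was emitted
attribute float aLifeTime;       // how long it lives

uniform mat3 projectionMatrix;
uniform vec2 uGravity;
uniform float uTime;

varying float vAge;              // 0.0 at birth, 1.0 at death (for fading out)

void main() {
    float t = uTime - aStartTime;
    vAge = t / aLifeTime;
    vec2 position = aStartPosition + aVelocity * t + 0.5 * uGravity * t * t;
    gl_Position = vec4((projectionMatrix * vec3(position, 1.0)).xy, 0.0, 1.0);
    gl_PointSize = 8.0;
}`;

The fragment shader can fade a particle out, or discard it entirely, once vAge passes 1.0; either way the per-frame CPU cost stays at a single uniform upload no matter how many particles are alive.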

Shadertoy is for the most part a toy. See this

The question and quoted portions thereof are CC BY-SA 4.0 by Lance Pollard from here