How To Create Cool Effects with Custom Shaders in OpenGL ES 2.0 and Cocos2D 2.X

This is a post by iOS Tutorial Team member Krzysztof Zablocki, a passionate iOS developer with years of experience.

How Do Built-In Shaders Work in Cocos2D?

Cocos2D 2.x uses OpenGL ES 2.0, so you need shaders for even the simplest rendering. That's why every CCNode has a shaderProgram instance variable, which points to the shader program that is used when the node is drawn.

Cocos2D also has its own CCShaderCache that lets you use the default shader programs, or cache your own programs so that you don't need to compile and load them multiple times. The key constants for accessing the predefined shader programs (such as kCCShader_PositionTextureColor) are defined in libs/cocos2d/CCGLProgram.h.
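For example, here's a minimal sketch of both uses. The built-in key comes straight from CCGLProgram.h, while myNode, the MyEffect.vsh/MyEffect.fsh filenames, and the @"MyEffect" cache key are hypothetical names for your own program (this also assumes ARC):

// Grab one of the predefined programs from the cache and assign it to a node:
myNode.shaderProgram = [[CCShaderCache sharedShaderCache] programForKey:kCCShader_PositionTextureColor];

// Or compile your own program once and cache it under your own key:
CCGLProgram *program = [[CCGLProgram alloc] initWithVertexShaderFilename:@"MyEffect.vsh"
                                                  fragmentShaderFilename:@"MyEffect.fsh"];
[program addAttribute:kCCAttributeNamePosition index:kCCVertexAttrib_Position];
[program addAttribute:kCCAttributeNameTexCoord index:kCCVertexAttrib_TexCoords];
[program link];
[program updateUniforms];
[[CCShaderCache sharedShaderCache] addProgram:program forKey:@"MyEffect"];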

You can find the default shaders Cocos2D 2.X uses in libs/cocos2d/ccShader_xxx.h.

In fact, do that now. Using the project navigator, open the Shaders group, then select the ccShader_PositionTexture_vert.h vertex shader. Note that in the file the shader is stored as a string (so Cocos2D can load it quickly), but we'll list the code below without the string formatting for readability.

Here’s the code for this simple shader:

//1
attribute vec4 a_position;
attribute vec2 a_texCoord;

//2
uniform mat4 u_MVPMatrix;

//3
#ifdef GL_ES
varying mediump vec2 v_texCoord;
#else
varying vec2 v_texCoord;
#endif

//4
void main()
{
//5
  gl_Position = u_MVPMatrix * a_position;
//6
  v_texCoord = a_texCoord;
}

Before we step through the code in detail, let's get a high-level overview of this shader and its goals. Every shader program takes input and generates output. For this shader:

  • As input it takes the position of each vertex, which for a sprite is the four corners of the sprite. It also takes the coordinate of the texture to display at each vertex (which will map to the four corners of the texture), and a transform to apply to the entire sprite to position/scale/rotate it. Cocos2D will pass in these input variables before running the shader.
  • As output it determines the final screen coordinate of the vertex (the input position with the transform applied), and the final texture coordinate for the vertex (same as input). The fragment shader will use these output variables, which we’ll cover after this.

Now that you have a high-level overview, let's look at the shader in greater detail, section by section:

  1. Defines the input vertex data structures. The attribute keyword tells the compiler that this is an input variable that comes with each vertex data structure. The types vec2 and vec4 declare that the data is a vector of floats; a vec can have 2 to 4 components. This shader declares two attributes: one for the position and one for the texture coordinate.
  2. When you need external values passed to the shader from your source code, you declare them as uniform. The type mat4 declares that the data is a 4×4 matrix of floats. If you're rusty on your linear algebra, remember that a matrix is a mathematical tool you can use to position, rotate, and scale vectors (among other things).
  3. The vertex shader needs to send some data to the fragment shader. To mark variables that you pass from a vertex shader to a fragment shader, you use the varying keyword (more on this below).
  4. Each shader has to have a main function, just as in Objective-C.
  5. Vertex shaders need to fill the built-in variable gl_Position with the transformed vertex position. This shader multiplies the input position by the ModelViewProjection matrix that Cocos2D automatically passes in (to apply the sprite's position/scale/rotation).
  6. This passes the input texture coordinates to the fragment shader unchanged by assigning them to the varying variable.

The cool thing about varying variables is that they are interpolated. That's a fancy way of saying that if you set a varying at vertex A to 0.0 and the same varying at vertex B to 1.0, then when the fragment shader runs for a pixel right between A and B, OpenGL will automatically set the value of the variable to 0.5. Each fragment gets an interpolated value calculated from the vertices that created it.

In the varying declaration above you can also see the precision specifier mediump. There are two other specifiers available: highp and lowp. They define the quality of the calculations and data storage: higher precision means more precise data types are used, but the calculations will be slower.
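To see how these pieces combine in a slightly richer shader, here is roughly what the built-in position/texture/color vertex shader (ccShader_PositionTextureColor_vert.h) looks like. This is a sketch reconstructed from memory rather than a verbatim copy, so check the real file in your project:

attribute vec4 a_position;
attribute vec2 a_texCoord;
attribute vec4 a_color;

uniform mat4 u_MVPMatrix;

#ifdef GL_ES
// Color doesn't need much precision, so lowp is enough for it
varying lowp vec4 v_fragmentColor;
varying mediump vec2 v_texCoord;
#else
varying vec4 v_fragmentColor;
varying vec2 v_texCoord;
#endif

void main()
{
  gl_Position = u_MVPMatrix * a_position;
  v_fragmentColor = a_color;
  v_texCoord = a_texCoord;
}

The only real difference from the shader above is the extra a_color attribute, which gets handed to the fragment shader as another varying.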

And here’s the code for the fragment shader counterpart of the above vertex shader – it’s in ccShader_PositionTexture_frag.h:

//1
#ifdef GL_ES
precision mediump float;
#endif

//2
varying vec2 v_texCoord;
//3
uniform sampler2D u_texture;

void main()
{
//4
  gl_FragColor = texture2D(u_texture, v_texCoord);
}

Remember that by the time you get to the fragment shader, OpenGL is calling this program for every single pixel that makes up the sprite. The goal of this program is to figure out how to color each pixel. The answer is simple for this default shader: just pick the spot in the texture that maps to that pixel.

Here’s a section-by-section breakdown:

  1. Setting the default precision for floats at the top of a fragment shader is mandatory in OpenGL ES, so this sets it to medium precision.
  2. Anything that the vertex shader passes as output needs to be defined here as input. The vertex shader is passing the texture coordinate, so it is defined again here.
  3. Fragment shaders can also have uniform variables, which are constant values sent through from code. Cocos2D will pass the texture to use in a uniform variable, so this defines a sampler2D variable for it, which is just a normal texture.
  4. gl_FragColor is the built-in variable that needs to be filled with the final color of the pixel. This samples the color from the uniform texture, using the interpolated texture coordinates from the vertex shader to determine which pixel of the texture to use.
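Just to show how much room that one line leaves for custom effects, here is a minimal sketch of a grayscale variant of this fragment shader. This is not one of the Cocos2D built-ins, just an example of changing how gl_FragColor gets computed:

#ifdef GL_ES
precision mediump float;
#endif

varying vec2 v_texCoord;
uniform sampler2D u_texture;

void main()
{
  // Sample the texture as before...
  vec4 texel = texture2D(u_texture, v_texCoord);
  // ...but collapse the color to a single luminance value (standard weights),
  // and write that value into the red, green, and blue channels
  float gray = dot(texel.rgb, vec3(0.299, 0.587, 0.114));
  gl_FragColor = vec4(gray, gray, gray, texel.a);
}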

There’s one more piece you should see to understand how everything fits together. These shaders happen to be used by CCGrid.m, so open it up to see how they are used.

First, in init the shader is loaded from the cache:

self.shaderProgram = [[CCShaderCache sharedShaderCache] programForKey:kCCShader_PositionTexture];

If you’re curious, you can look into the CCShaderCache code to see how the shaders are compiled and stored. Next, in blit, the grid passes its variables to the shader and runs the program:

-(void)blit
{
    NSInteger n = gridSize_.x * gridSize_.y;
  
    // Enable the vertex shader's "input variables" (attributes)
    ccGLEnableVertexAttribs( kCCVertexAttribFlag_Position | kCCVertexAttribFlag_TexCoords );

    // Tell Cocos2D to use the shader we loaded earlier
    [shaderProgram_ use];

    // Tell Cocos2D to pass the CCNode's position/scale/rotation matrix to the shader
    [shaderProgram_ setUniformForModelViewProjectionMatrix];

    // Pass vertex positions
    glVertexAttribPointer(kCCVertexAttrib_Position, 3, GL_FLOAT, GL_FALSE, 0, vertices);

    // Pass texture coordinates
    glVertexAttribPointer(kCCVertexAttrib_TexCoords, 2, GL_FLOAT, GL_FALSE, 0, texCoordinates);

    // Draw the geometry to the screen (this actually runs the vertex and fragment shaders at this point)
    glDrawElements(GL_TRIANGLES, (GLsizei) n*6, GL_UNSIGNED_SHORT, indices);

    // Just stat keeping here
    CC_INCREMENT_GL_DRAWS(1);
}

You may be wondering where the texture gets passed in. CCGrid takes a little shortcut here: if you never set a uniform it defaults to 0, and the first texture unit is also 0, so it simply binds the texture to the first texture unit in afterDraw and never sets the sampler uniform explicitly:

ccGLBindTexture2D( texture_.name );
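If you ever need a texture somewhere other than unit 0, you have to set the sampler uniform yourself. Here's a minimal sketch of the explicit version, assuming program is the GLuint handle of the compiled shader program and the program is already in use (CCGrid gets away without this because the uniform's default value is already 0):

// Put the texture into texture unit 0...
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, texture_.name);

// ...and point the u_texture sampler at that unit
GLint samplerLocation = glGetUniformLocation(program, "u_texture");
glUniform1i(samplerLocation, 0);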

Now that you have seen one example of how shaders are used in Cocos2D, you might want to dig around in CCSprite.m and work out for yourself how CCSprite does its rendering.
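Roughly, you should expect to find something like the following in its draw method. This is an outline from memory, not a verbatim copy, so the details in your version of Cocos2D may differ:

-(void) draw
{
    // Uses the node's shaderProgram_ and sets the MVP matrix uniform
    CC_NODE_DRAW_SETUP();

    // Set up blending and bind the sprite's texture to texture unit 0
    ccGLBlendFunc( blendFunc_.src, blendFunc_.dst );
    ccGLBindTexture2D( [texture_ name] );

    // Enable the position, color, and texture coordinate attributes, then point
    // them at the sprite's quad with glVertexAttribPointer calls (much like
    // the ones in CCGrid's blit above)...
    ccGLEnableVertexAttribs( kCCVertexAttribFlag_PosColorTex );

    // ...and draw the quad as two triangles
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
}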

When you’re done with that, enough analyzing – time to create your own shaders!