Shaders in Processing 2.0 – Part 1

The new OpenGL renderers in Processing 2.0 (P2D/P3D) rely extensively on GLSL shaders. Although in most common situations the use of shaders is invisible to the user, Processing includes a new PShader class that allows applying custom shaders to the drawing of the sketch. This post describes the shader architecture in Processing 2.0 and the common interface that custom GLSL shader code needs to implement in order to be accepted by the OpenGL renderers in Processing.
Update: With the release of Processing 2.0 final, some of the contents of this post are outdated; please check this tutorial for a detailed description of the finalized shader API.

The transition to a shader-based rendering architecture in Processing is motivated by the following reasons: codebase unification between the desktop and mobile versions of the OpenGL library (by relying on the common API between OpenGL ES 2 and OpenGL 2.1), increasing availability of programmable GPUs (even low-cost devices such as the Raspberry Pi include GLSL support), and the possibility to customize the rendering operations in Processing with user-provided shaders.

Shader types

In order to understand how to write custom shaders for Processing, it is helpful to first consider how the OpenGL renderers use GLSL shaders to generate the default rendering in a sketch. Every time Processing renders a piece of geometry to the screen, the rendering operation falls into one of six categories:

  1. rendering of filled polygons without lights and without textures
  2. rendering of filled polygons with lights but without textures
  3. rendering of filled polygons with textures but without lights
  4. rendering of filled polygons with lights and textures
  5. rendering of stroke geometry (lines)
  6. rendering of points

When a single frame is drawn to the screen, several of these operations might be called more than once. For example, if the draw() function of the Processing sketch looks like:

void draw() {
  translate(200, 200);

  lights();
  stroke(0);
  fill(160);
  sphere(50);

  noLights();
  noStroke();
  tint(255, 50);
  image(icon, 0, 0);
}

this would result in the rendering of the stroke lines applied to the sphere (5), the rendering of the filled and lit sphere itself (2), and the rendering of a filled, textured but unlit rectangle (3) as a result of the image() call.

Each one of these operations is handled by a different shader program, comprising a vertex and a fragment shader. The shader programs in Processing are encapsulated by an object of type PShader, which is part of the core API starting in the 2.0a7 release. By default, the OpenGL renderer in Processing instantiates 6 different PShader objects, one for each of the operations described above. The renderer will bind/unbind the corresponding PShader object depending on the type of geometry it needs to draw at each given time.

The default PShader objects that Processing uses to execute rendering operations 1 to 6 can be replaced by custom objects created from vertex and fragment shaders provided by the user. Since the type of operation that a given combination of vertex and fragment shaders is supposed to implement cannot be inferred automatically from the shader code itself, the user needs to explicitly indicate what type of operation they correspond to. At the time of this writing, the rendering operations are described by the following shader type constants:

  1. FLAT
  2. LIT
  3. TEXTURED
  4. FULL
  5. LINE
  6. POINT

These constants are valid in the current alpha release of Processing (2.0a7) but might be renamed for the beta release; once development reaches the beta stage, however, no further API changes should be expected.

In summary, to create a custom PShader object the user needs to provide the vertex and fragment shaders, written in valid GLSL code, and a shader type as specified by one of the constants above. The custom shader replaces the default shader of the given type when the shader(PShader sh) function is called with the newly created PShader object as the argument. The default shader is not lost, however, and can be restored at any time by calling resetShader(int type). The following code exemplifies the use of these functions:

  
PShader customLight;

void setup() {
  size(640, 360, P3D);
  customLight = loadShader(PShader.LIT, "lightFrag.glsl", "lightVert.glsl");
  shader(customLight);
}

void draw() {
  ...
}

void keyPressed() {
  if (key == 'r') {
    resetShader(PShader.LIT);
  }
}

Note that the shader() call doesn’t need the type constant, because the shader object already contains that information. More precisely, each shader type is implemented as a separate subclass of PShader, but the user never needs to deal with these subclasses explicitly, since the API of the PShader class is enough for all use scenarios.

Uniforms and attributes in Processing shaders

As indicated at the beginning of this post, custom GLSL shaders intended to replace the default shaders included in Processing must follow a common interface that the renderer expects in order to pass geometry data to the shaders. This common interface is defined by a set of uniform and attribute variables, which varies depending on the type of shader. The uniforms and attributes for each type, from FLAT to POINT, are listed below together with their types (in parentheses) and a brief description of the data they hold:

1) FLAT shader
Uniforms

  • projmodelviewMatrix (mat4): projection-modelview matrix (containing the geometry/camera and projection transformations from Processing)

Attributes

  • inVertex (vec4): xyzw coordinates of the incoming vertex
  • inColor (vec4): rgba color of the incoming vertex

2) LIT shader
Uniforms

  • modelviewMatrix (mat4): modelview matrix (containing the geometry/camera transformations from Processing)
  • projmodelviewMatrix (mat4): projection-modelview matrix
  • normalMatrix (mat3): normal matrix (the transpose of the inverse of the modelview)
  • lightCount (int): number of active lights (8 maximum)
  • lightPosition (vec4[8]): position of each light
  • lightNormal (vec3[8]): direction of each light
  • lightAmbient (vec3[8]): ambient component of light color
  • lightDiffuse (vec3[8]): diffuse component of light color
  • lightSpecular (vec3[8]): specular component of light color
  • lightFalloffCoefficients (vec3[8]): light falloff coefficients
  • lightSpotParameters (vec3[8]): light spot parameters (cosine of light spot angle and concentration)

Attributes

  • inVertex (vec4): xyzw coordinates of the incoming vertex
  • inColor (vec4): rgba color of the incoming vertex
  • inNormal (vec4): normal vector of incoming vertex
  • inAmbient (vec4): ambient color component of incoming vertex
  • inSpecular (vec4): specular color component of incoming vertex
  • inEmissive(vec4): emissive color component of incoming vertex
  • inShine (float): shininess component of incoming vertex
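
As an illustration of how these variables could fit together, here is a minimal, diffuse-only sketch of a LIT vertex shader. It is not the default shader shipped with Processing (which also handles the ambient, specular, emissive, falloff, and spot terms, as well as directional lights), just a hedged example that uses only uniforms and attributes from the list above:

```glsl
uniform mat4 modelviewMatrix;
uniform mat4 projmodelviewMatrix;
uniform mat3 normalMatrix;

uniform int lightCount;
uniform vec4 lightPosition[8];
uniform vec3 lightDiffuse[8];

attribute vec4 inVertex;
attribute vec4 inColor;
attribute vec4 inNormal;

varying vec4 vertColor;

void main() {
  gl_Position = projmodelviewMatrix * inVertex;

  // Vertex position and normal in eye coordinates:
  vec3 ecVertex = vec3(modelviewMatrix * inVertex);
  vec3 ecNormal = normalize(normalMatrix * vec3(inNormal));

  // Accumulate the diffuse contribution of each active light
  // (point lights only, for simplicity):
  vec3 diffuse = vec3(0.0);
  for (int i = 0; i < 8; i++) {
    if (i == lightCount) break;
    vec3 toLight = normalize(lightPosition[i].xyz - ecVertex);
    diffuse += lightDiffuse[i] * max(dot(ecNormal, toLight), 0.0);
  }
  vertColor = vec4(diffuse, 1.0) * inColor;
}
```

The loop runs over a constant bound of 8 and breaks at lightCount because GLSL ES restricts loop conditions to constant expressions.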

3) TEXTURED shader
Uniforms

  • projmodelviewMatrix (mat4): projection-modelview matrix
  • texcoordMatrix (mat4): texture coordinate matrix (takes care of rescaling and/or inverting the texture coordinates)
  • textureSampler (sampler2D): samples the currently bound texture
  • texcoordOffset (vec2): inverse of the texture resolution (displacement in normalized coords from one texel to the next)

Attributes

  • inVertex (vec4): xyzw coordinates of the incoming vertex
  • inColor (vec4): rgba color of the incoming vertex
  • inTexcoord (vec2): texture coordinates of incoming vertex
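
A minimal TEXTURED fragment shader mainly needs the sampler and the interpolated texture coordinates. The varying names below (vertColor, vertTexcoord) are just a convention used in this sketch; the matching vertex shader would be expected to apply texcoordMatrix to inTexcoord and pass the result along:

```glsl
#ifdef GL_ES
precision mediump float;
#endif

uniform sampler2D textureSampler;

varying vec4 vertColor;
varying vec2 vertTexcoord;

void main() {
  // Modulate the sampled texel with the interpolated vertex color,
  // which is how tint() can affect the output of image():
  gl_FragColor = texture2D(textureSampler, vertTexcoord) * vertColor;
}
```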

4) FULL shader
Uniforms

  • modelviewMatrix (mat4): modelview matrix
  • projmodelviewMatrix (mat4): projection-modelview matrix
  • normalMatrix (mat3): normal matrix
  • texcoordMatrix (mat4): texture coordinate matrix
  • textureSampler (sampler2D): samples the currently bound texture
  • texcoordOffset (vec2): inverse of the texture resolution
  • lightCount (int): number of active lights (8 maximum)
  • lightPosition (vec4[8]): position of each light
  • lightNormal (vec3[8]): direction of each light
  • lightAmbient (vec3[8]): ambient component of light color
  • lightDiffuse (vec3[8]): diffuse component of light color
  • lightSpecular (vec3[8]): specular component of light color
  • lightFalloffCoefficients (vec3[8]): light falloff coefficients
  • lightSpotParameters (vec3[8]): light spot parameters (cosine of light spot angle and concentration)

Attributes

  • inVertex (vec4): xyzw coordinates of the incoming vertex
  • inColor (vec4): rgba color of the incoming vertex
  • inTexcoord (vec2): texture coordinates of incoming vertex
  • inNormal (vec4): normal vector of incoming vertex
  • inAmbient (vec4): ambient color component of incoming vertex
  • inSpecular (vec4): specular color component of incoming vertex
  • inEmissive(vec4): emissive color component of incoming vertex
  • inShine (float): shininess component of incoming vertex

5) LINE shader
Uniforms

  • modelviewMatrix (mat4): modelview matrix
  • projectionMatrix (mat4): projection matrix (either perspective or orthographic)
  • viewport (vec4): dimensions of the viewing rectangle (x, y, width, height)
  • perspective (int): 0 if lines are not affected by perspective, 1 if they are
  • scale (vec3): scale factor that keeps lines on top of filled geometry (if < 1)

Attributes

  • inVertex (vec4): xyzw coordinates of the incoming line vertex
  • inColor (vec4): rgba color of the incoming line vertex
  • inLine (vec4): the xyz coordinates store the endpoint opposite to the current (incoming) vertex along the direction of the line segment, while w stores the displacement along the normal of the line segment (lines are rendered as sequences of rectangular segments that always face the screen)
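
To make the role of inLine more concrete, here is a rough, simplified sketch of what a LINE vertex shader can do with these variables. The exact pixel scaling, and the perspective and scale uniforms handled by the default shader, are omitted; treat the details as an approximation of the idea rather than the actual Processing implementation:

```glsl
uniform mat4 modelviewMatrix;
uniform mat4 projectionMatrix;
uniform vec4 viewport;

attribute vec4 inVertex;
attribute vec4 inColor;
attribute vec4 inLine;

varying vec4 vertColor;

void main() {
  // Clip-space positions of this vertex and the opposite endpoint:
  mat4 projmodelview = projectionMatrix * modelviewMatrix;
  vec4 clip0 = projmodelview * inVertex;
  vec4 clip1 = projmodelview * vec4(inLine.xyz, 1.0);

  // Screen-space direction of the segment, scaled by the viewport
  // size so the displacement is uniform in pixels:
  vec2 dir = normalize((clip1.xy / clip1.w - clip0.xy / clip0.w) * viewport.zw);

  // Offset the vertex along the screen-space normal of the segment
  // by the displacement stored in inLine.w, converted back to
  // normalized device coordinates:
  vec2 normal = vec2(-dir.y, dir.x);
  gl_Position = clip0 + vec4(normal * inLine.w * 2.0 / viewport.zw * clip0.w, 0.0, 0.0);

  vertColor = inColor;
}
```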

6) POINT shader
Uniforms

  • modelviewMatrix (mat4): modelview matrix
  • projectionMatrix (mat4): projection matrix

Attributes

  • inVertex (vec4): xyzw coordinates of the incoming point vertex
  • inColor (vec4): rgba color of the incoming point vertex
  • inPoint (vec2): displacement for each vertex along the perimeter of the point (since points are rendered as triangle fans)
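
One plausible sketch of a POINT vertex shader consistent with these variables follows; the default shader may differ in details such as the coordinate space in which the displacement is applied:

```glsl
uniform mat4 modelviewMatrix;
uniform mat4 projectionMatrix;

attribute vec4 inVertex;
attribute vec4 inColor;
attribute vec2 inPoint;

varying vec4 vertColor;

void main() {
  // Transform the point center to eye coordinates, then displace the
  // vertex along the perimeter of the point before projecting
  // (points are drawn as small triangle fans):
  vec4 pos = modelviewMatrix * inVertex;
  pos.xy += inPoint;
  gl_Position = projectionMatrix * pos;

  vertColor = inColor;
}
```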

Some of these variables can be better understood by looking at the source code of the GLSL shaders, particularly for the LIT, LINE, and POINT shaders.

As an example of how to write GLSL shaders for Processing, let’s look at the code for the FLAT shader program. Its vertex shader only needs to apply the projection-modelview transformation matrix to each incoming vertex, and pass the transformed position together with the color to the fragment shader. The color will then be linearly interpolated across the pixels that result from rasterizing the triangles formed by each batch of incoming vertices:

Vertex shader code:

uniform mat4 projmodelviewMatrix;

attribute vec4 inVertex;
attribute vec4 inColor;

varying vec4 vertColor;

void main() {
  // Applying modelview+projection transformation to incoming vertex:
  gl_Position = projmodelviewMatrix * inVertex;

  // Passing unmodified vertex color to the fragment shader.
  vertColor = inColor;
}

Fragment shader code:

#ifdef GL_ES
precision mediump float;
precision mediump int;
#endif

varying vec4 vertColor;

void main() {
  // Outputting pixel color (interpolated across triangle)
  gl_FragColor = vertColor;
}

The ifdef preprocessor directive in the fragment shader is required to make the shader compatible with OpenGL ES 2.0, since GLSL shaders in GL ES require specifying the precision of the float and int types. If the shader is intended to run only on the desktop, then the ifdef section is not needed.

If these two files are saved in the sketch data folder as, for instance, vert.glsl and frag.glsl, then we could load them into our custom PShader object as follows:

PShader flatShader;

void setup() {
  size(640, 360, P3D);
  flatShader = loadShader(PShader.FLAT, "frag.glsl", "vert.glsl");
  shader(flatShader);
}

void draw() {
  background(0);
  noStroke();
  fill(200, 30, 30);
  rect(10, 10, width - 20, height - 20);
}

Since these two sample shaders are identical to the ones included with the OpenGL library, the resulting rendering won’t differ from the default. Often, only the fragment shader of a rendering operation needs to be changed, in which case we can reuse the default vertex shader for the given type by calling loadShader() with only the filename of the fragment shader:

PShader flatShader;

void setup() {
  size(640, 360, P3D);
  flatShader = loadShader(PShader.FLAT, "frag.glsl");
  shader(flatShader);
}

void draw() {
  background(0);
  noStroke();
  fill(200, 30, 30);
  rect(10, 10, width - 20, height - 20);
}
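
For instance, the following frag.glsl would invert the fill color, producing a visible departure from the default rendering. This is a minimal sketch; note that the varying name vertColor must match the one declared by the default FLAT vertex shader shown earlier:

```glsl
#ifdef GL_ES
precision mediump float;
precision mediump int;
#endif

varying vec4 vertColor;

void main() {
  // Invert the rgb components of the interpolated vertex color,
  // leaving the alpha channel untouched:
  gl_FragColor = vec4(1.0 - vertColor.rgb, vertColor.a);
}
```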

For more examples, check the sample sketches included in the 2.0a7 release under the OpenGL/Shaders category.

Posted August 2, 2012 by ac in Programming
