Shaders in Processing 2.0 beta

Processing 2.0 beta was finally released a couple of weeks ago, with the 2.0b3 version being the latest release in the beta series at this time. The new shader API described in the previous posts (1, 2, and 3) received some tweaking and cleaning-up, so the examples mentioned earlier might not work properly. Check below for a more detailed description of the shader API updates in the beta.
Update: With the release of Processing 2.0 final, some of the contents in this post are outdated, please check this tutorial for a detailed description of the finalized shader API.

The basic structure of the shader mechanism in Processing 2.0 beta remains the same as in the latest alphas: shaders are classified by Processing into six types, depending on the class of geometry they are meant to render:

  1. filled polygons without lights and textures
  2. filled polygons with lights but without textures
  3. filled polygons with textures but without lights
  4. filled polygons with lights and textures
  5. stroke geometry (lines)
  6. points

In the alphas, the type of shader had to be specified explicitly when loading the GLSL code, by means of a PShader constant (FLAT, LIT, TEXTURED, FULL, LINE, POINT). This is no longer the case: the shader type is now determined automatically by Processing by examining the GLSL source code. As described in this previous post, each shader type must expose a certain set of uniform and attribute variables needed to handle the geometry. So, for example, if the vertex shader being loaded declares a vec4 attribute named “inLine”, then Processing will try to use that shader to render stroke lines.
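As a concrete illustration of this automatic detection, the minimal vertex shader sketched below would be classified as a stroke (line) shader simply because it declares the vec4 “inLine” attribute. The attribute and uniform names follow the conventions mentioned in this post, but the body is only a simplified placeholder, not the actual built-in line shader:

uniform mat4 projmodelviewMatrix;

attribute vec4 inVertex;
attribute vec4 inColor;
// Declaring this vec4 attribute is what makes Processing
// classify the shader as a stroke (line) shader.
attribute vec4 inLine;

varying vec4 vertColor;

void main() {
  // Simplified placeholder: a real line shader would also use
  // inLine to displace the vertex and build the line geometry.
  gl_Position = projmodelviewMatrix * inVertex;
  vertColor = inColor;
}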

A shader should be set immediately before the drawing calls it is meant to render, and afterwards resetShader() should be called to restore the default shaders:

PImage img;
PShader edges;
void setup() {
  size(640, 360, P2D);
  img = loadImage("leaves.jpg");      
  edges = loadShader("edges.glsl");
  ...
}

void draw() {  
  shader(edges);  
  // Render image using shader edges.  
  image(img, 0, 0);
  resetShader();

  // Further images will be rendered with the
  // default shader for textures.
  ...
}

In this code example, the edges shader should be capable of handling textures, which means that it needs to have a vec2 attribute named “inTexcoord” to store the texture coordinates of the image. If Processing doesn’t find this attribute in the GLSL code, it will print a warning message and fall back to its own default shader for rendering textures.
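For reference, a fragment-only edges.glsl could look something like the sketch below. This is only a plausible example: the texOffset uniform (holding 1/width and 1/height of the texture) and the edge-detection kernel are assumptions, while the textureSampler and vertTexcoord names match the conventions used by Processing’s default texture shaders:

uniform sampler2D textureSampler;
// Assumed uniform holding the size of one texel (1/width, 1/height);
// the sketch would need to set this, or it could be hard-coded.
uniform vec2 texOffset;

varying vec4 vertColor;
varying vec4 vertTexcoord;

void main() {
  // Simple edge detection: difference between each pixel and
  // its right and bottom neighbors.
  vec4 c  = texture2D(textureSampler, vertTexcoord.st);
  vec4 cx = texture2D(textureSampler, vertTexcoord.st + vec2(texOffset.x, 0.0));
  vec4 cy = texture2D(textureSampler, vertTexcoord.st + vec2(0.0, texOffset.y));
  vec4 edge = abs(cx - c) + abs(cy - c);
  gl_FragColor = vec4(edge.rgb, 1.0) * vertColor;
}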

Posted September 24, 2012 by ac in Programming, Software


  9 responses to “Shaders in Processing 2.0 beta”


  1. Hi Andrés – I really love the new shader capabilities of Processing, they make image processing work much easier. Thanks for your hard work! I’ve run into a problem with a project I’m working on, however, and was wondering if you could help. It looks like an issue with the “filter” call of the PGraphics object when used with a shader. I filed a bug here with the details: http://code.google.com/p/processing/issues/detail?id=1301

    • Hello Noel, thanks for your feedback, very happy to know that the new shader functionality is useful in your projects. I’m sorry for replying this late, but I got caught up with other projects recently, and had to take some time off from Processing development. I will go through the issues db shortly and look at the one you have entered. Cheers, Andres.

  2. Hi Andres, first of all, thanks a lot for taking the effort to write all this down. I have some difficulties though. I have currently compiled Processing revision r10784, and I am completely failing to load texture coordinates into the fragment shader. None of gl_TexCoord[0].st, gl_MultiTexCoord0, or even the suggested attribute “inTexcoord” returns anything useful. I am positive that textures are available to the shader, since passing some other vectors to texture2D() gives wild results, but the texture is definitely there. I am using an obj model for the geometry without any mtl attached; shaders without textures work great, and the default shader works fine even with textures.

    I am using 64-bit Debian Squeeze with an NVIDIA NVS 4200M, proprietary drivers via optirun.

    After a few days of messing around and reading everything on the web regarding similar issues, I could not find a solution, so any help, or just a pointer in the right direction, is much appreciated.

    Thanks a lot for any response,

    Krystof

    • Hello, I was away from the blog so I didn’t see your post until now. If you can draw images normally using the image() function, then it means that GLSL texturing is working ok (the P2D/P3D renderers use shaders to draw everything). As a reference, below are the shaders the renderers use to draw textured geometry (without lights):

      // Vertex shader:
      uniform mat4 projmodelviewMatrix;
      uniform mat4 texcoordMatrix;

      attribute vec4 inVertex;
      attribute vec4 inColor;
      attribute vec2 inTexcoord;

      varying vec4 vertColor;
      varying vec4 vertTexcoord;

      void main() {
        gl_Position = projmodelviewMatrix * inVertex;

        vertColor = inColor;
        vertTexcoord = texcoordMatrix * vec4(inTexcoord, 1.0, 1.0);
      }

      // Fragment shader:
      uniform sampler2D textureSampler;

      uniform vec2 texcoordOffset;

      varying vec4 vertColor;
      varying vec4 vertTexcoord;

      void main() {
        gl_FragColor = texture2D(textureSampler, vertTexcoord.st) * vertColor;
      }

      Note that the texture coordinates are passed to the fragment shader using the vertTexcoord varying, which is set in the vertex shader from the inTexcoord attribute. The old built-in gl_TexCoord variables are not used because they have been deprecated in recent versions of GLSL.

      • Thanks for the full explanation, it is working now. The problem was actually in combining shaders with SAITO’s obj loader: in this case you need to define at least one texture map in the material MTL file, otherwise the shader can’t handle textures. Now it is clear. Thanks a lot for the great work!

  3. Hi Andres, thanks a lot for this huge step forward in Processing.

    Does having these six different shader types mean that it is necessary to write a different shader for every class of geometry used in a sketch? Since the attributes of each shader class are different, is it not possible to reuse the shader code?

    I added a code snippet in the processing forum to describe my question better: https://forum.processing.org/topic/different-shaders-for-geometry-lines-and-text.

    Thanks in advance

    • No, you only need to write a shader for the type of geometry you want to handle separately. For the rest of the geometry, Processing will use the default shaders included in the OpenGL renderer. I will follow up on your question in the forum.
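    To illustrate this reply, the hypothetical sketch below replaces only the stroke (line) shader; the fill faces of the box are still rendered with Processing’s built-in default shaders. The wireFrag.glsl/wireVert.glsl file names are assumptions, and the vertex shader is expected to declare the vec4 “inLine” attribute so that Processing classifies it as a line shader:

    PShader wireShader;

    void setup() {
      size(640, 360, P3D);
      // Hypothetical shader files for stroke geometry only.
      wireShader = loadShader("wireFrag.glsl", "wireVert.glsl");
    }

    void draw() {
      background(0);
      translate(width/2, height/2);
      rotateY(frameCount * 0.01);

      // The custom shader only replaces the default line shader,
      // so the filled faces still use the default shaders.
      shader(wireShader);
      stroke(255);
      fill(150, 200, 150);
      box(100);
      resetShader();
    }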

  4. Pingback: Shaders : beautifulseams

  5. Pingback: Processing 2.0 / PAGE online
