18.1. Hatching Example
Bert Freudenberg of the University of Magdeburg in Germany was one of the first people outside 3Dlabs to come up with a unique OpenGL shader. His area of research has been to use programmable hardware to produce real-time NPR effects such as hatching and half-toning. He experimented with a prototype implementation of the OpenGL Shading Language in the summer of 2002 and produced a hatching shader that he agreed to share with us for this book.
This shader has a few unique features, and the steps involved in designing this shader are described in Bert's Ph.D. thesis, Real-Time Stroke-based Halftoning (2003). Bert's hatching shader is based on a woodblock printing shader by Scott Johnston that is discussed in Advanced RenderMan: Creating CGI for Motion Pictures by Tony Apodaca and Larry Gritz (1999).
The goal in a hatching shader is to render an object in a way that makes it look hand-drawn, for instance with strokes that look like they may have been drawn with pen and ink. Each stroke contributes to the viewer's ability to comprehend the tone, texture, and shape of the object being viewed. The effect being sought in this shader is that of a woodcut print. In a woodcut, a wooden block carved with small grooves is covered with ink and pressed onto a surface. The image left on the surface is the mirror image of the image carved into the wood block. No ink is left where the grooves are cut, only where the wood is left uncut. Lighting is simulated by varying the width of the grooves according to light intensity.
We face a number of challenges in developing a shader that simulates the look of a woodcut print. The first thing we need is a way of generating stripes that define the tone and texture of the object. Alternating white and black lines produce the highest-contrast edges and thus represent a worst-case scenario for aliasing artifacts, so antialiasing is a prime consideration. We also want our lines to "stay put" on the object so that we can use the object in an animation sequence. Finally, the lines in a woodcut print are not perfectly straight and uniform as though they were drawn by a computer. They are cut into the wood by a human artist, so they have some character. We'd like the lines our shader generates to have some character as well.
18.1.1. Application Setup
The application needs to send vertex positions, normals, and a single set of texture coordinates to the hatching shader. The normals are used in a simple lighting formula, and the texture coordinates are used as the base for procedurally defining the hatching pattern. The light position is passed in as a uniform variable, and the application also updates the value of the Time uniform variable at each frame so that the shader's behavior can vary slightly over time. What do you suppose we use to give our lines some character? You guessed it: the noise function. In this case, we have the application generate the noise values that are needed and store the results in a 3D texture. For this reason, the value for a uniform variable of type sampler3D is provided by the application to inform the fragment shader which texture unit should be accessed to obtain the noise values.
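In GLSL terms, the interface the application must supply boils down to three uniform variables. The names below match those used throughout this section, but the names themselves are a convention, not a requirement:

```glsl
// Application-supplied uniforms for the hatching shaders
uniform vec3      LightPosition; // light position for the diffuse term
uniform float     Time;          // updated each frame to animate the line wiggle
uniform sampler3D Noise;         // texture unit holding the precomputed 3D noise
```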
18.1.2. Vertex Shader
The hatch vertex shader is shown in Listing 18.1. The first line is the only line that looks different from shaders we've discussed previously. The varying variable ObjPos is the basis for our hatching stroke computation in the fragment shader. To animate the wiggle of the lines, the vertex shader adds the uniform variable Time to the z coordinate of the incoming vertex position. This makes it appear as though the wiggles are "flowing" along the z-axis. A scaling value is also used to make the hatching strokes match the scale of the object being rendered. (To accommodate a variety of objects, we should probably replace this value with a uniform variable.) The remainder of the vertex shader performs a simple diffuse lighting equation, copies the t coordinate of the incoming texture coordinate into the varying variable V, and computes the value of the built-in variable gl_Position.
Listing 18.1. Vertex shader for hatching
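A vertex shader consistent with the preceding description might look as follows. This is a sketch using GLSL 1.x built-ins (gl_Vertex, varying variables, ftransform); the 0.2 scale factor is an assumption standing in for the hard-coded scaling value mentioned above:

```glsl
uniform vec3  LightPosition; // position of the single diffuse light
uniform float Time;          // advanced each frame by the application

varying vec3  ObjPos;         // scaled, animated object-space position
varying float V;              // t texture coordinate, basis for the strokes
varying float LightIntensity; // diffuse lighting term

void main()
{
    // Animate the wiggle by sliding the position along z over time.
    // The 0.2 scale factor is an assumption; as noted in the text,
    // it should really be a uniform to accommodate different objects.
    ObjPos = (vec3(gl_Vertex) + vec3(0.0, 0.0, Time)) * 0.2;

    // Simple diffuse lighting
    vec3 pos      = vec3(gl_ModelViewMatrix * gl_Vertex);
    vec3 tnorm    = normalize(gl_NormalMatrix * gl_Normal);
    vec3 lightVec = normalize(LightPosition - pos);
    LightIntensity = max(dot(lightVec, tnorm), 0.0);

    // The t texture coordinate drives the hatching computation
    V = gl_MultiTexCoord0.t;

    gl_Position = ftransform();
}
```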
18.1.3. Generating Hatching Strokes
The purpose of our fragment shader is to determine whether each fragment is to be drawn as white or black in order to create lines on the surface of the object. As we mentioned, there are some challenges along the way. To prepare for the full-blown hatching shader, we develop some of the techniques we need and illustrate them on a simple object: a sphere.
We start with the same code that was presented in Section 17.4.1 for generating vertical stripes, namely,
float sawtooth = fract(V * 16.0);
float triangle = abs(2.0 * sawtooth - 1.0);
float square   = step(0.5, triangle);
Recall that V was a varying variable passed from the vertex shader. It was equal to the s texture coordinate if we wanted to generate vertical stripes and equal to the t texture coordinate if we wanted to generate horizontal stripes. We chose the number 16 to give us 16 white stripes and 16 black stripes. The result of this code is illustrated in Figure 18.1. We can modify the relative size of the white and black stripes by adjusting the threshold value provided in the step function.
Figure 18.1. A sphere with a stripe pattern generated procedurally based on the s texture coordinate (Courtesy of Bert Freudenberg, University of Magdeburg, 2002)
18.1.4. Obtaining Uniform Line Density
We now have reasonable-looking stripes, but they aren't of uniform width. They appear fatter along the equator and pinched in at the pole. We'd like to end up with lines that are of roughly equal width in screen space. This requires the use of the dFdx and dFdy functions:
float dp = length(vec2(dFdx(V), dFdy(V)));
As we learned in Section 17.4.3, this computation provides us with the gradient (i.e., how rapidly V is changing at this point on the surface). We can use this value to adjust the density of lines in screen space.
Computing the actual gradient with the length function involves a potentially costly square root operation. Why not use the approximation to the gradient discussed in Section 17.4.3? In this case we must compute the actual gradient because the approximation to the gradient isn't quite good enough for our purposes. We're not using this value to estimate the filter width for antialiasing; instead we're using it to compute the stripe frequency. This computation needs to be rotationally invariant so that our stripes don't jump around just because we rotate the object. For this reason, we need to compute the actual length of the gradient vector, not just the sum of the absolute values of the two components of the gradient vector.
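To make the distinction concrete, here are the two alternatives side by side; only the second is rotationally invariant:

```glsl
// Cheap estimate: fwidth(V) is defined as abs(dFdx(V)) + abs(dFdy(V)).
// Fine as a filter width for antialiasing, but its value changes as
// the object rotates in screen space.
float approx = fwidth(V);

// True gradient magnitude: costs a square root, but is rotationally
// invariant, so the stripe frequency stays put as the object rotates.
float dp = length(vec2(dFdx(V), dFdy(V)));
```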
So we use the base 2 logarithm of this value (shown applied to the sphere in Figure 18.2 (A)) to adjust the density in discrete steps: each time dp doubles, the number of lines doubles unless we do something about it, and the stripes become too thin and too dense. To counteract this effect (because we want a constant line density in screen space), we decrease the number of stripes when the density gets too high. We do this by negating the logarithm.
Figure 18.2. Adjusting the stripe frequency. The integer part of the logarithm of the gradient (A) is the basis for determining stripe frequency. First, the sphere is shown with a higher frequency of stripes (B). The integer part of the logarithm then adjusts the stripe frequency in (C), and the effect of tapering the ends is shown in (D). (Courtesy of Bert Freudenberg, University of Magdeburg, 2002)
float logdp     = -log2(dp);
float ilogdp    = floor(logdp);
float frequency = exp2(ilogdp);
float sawtooth  = fract(V * 16.0 * frequency);
A sphere with a higher stripe frequency is shown in Figure 18.2 (B). As you can see, the lines look reasonable in the lower part of the sphere, but there are too many at the pole. By applying the stripe frequency adjustment, we end up with stripes of roughly equal width across the sphere (see Figure 18.2 (C)). Notice the correlation between Figure 18.2 (A) and Figure 18.2 (C).
The next issue to address is the abrupt changes that occur as we jump from one stroke frequency to the next. Our eyes detect a distinct edge along these transitions, and we need to take steps to soften this edge so that it is less distracting. We can accomplish this by using the fractional part of logdp to do a smooth blend between two frequencies of the triangle wave. This value is 0 at the start of one frequency, and it increases to 1.0 at the point where the jump to the next frequency occurs.
float transition = logdp - ilogdp;
As we saw earlier, we can generate a triangle wave with double the frequency of a triangle wave t by computing abs(2.0 * t - 1.0). We can use the value of transition to linearly interpolate between t and abs(2.0 * t - 1.0) by computing (1.0 - transition) * t + transition * abs(2.0 * t - 1.0). This is exactly the same as if we did mix(abs(2.0 * t - 1.0), t, transition). But instead of using mix, we note that this is equivalent to abs((1.0 - transition) * t - transition). Using the previously computed value for our base frequency (triangle), we end up with the following code:
triangle = abs((1.0 - transition) * triangle - transition);
The result of drawing the sphere with uniform stripe density and tapered ends is shown in Figure 18.2 (D).
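Collecting the fragments developed so far, the uniform-density, tapered stripe computation reads as follows (this simply restates the snippets above in one place):

```glsl
// Screen-space rate of change of the stroke coordinate
float dp = length(vec2(dFdx(V), dFdy(V)));

// Discrete frequency steps, halving the stripe count as dp doubles
float logdp     = -log2(dp);
float ilogdp    = floor(logdp);
float frequency = exp2(ilogdp);

// Base stripe pattern at the adjusted frequency
float sawtooth = fract(V * 16.0 * frequency);
float triangle = abs(2.0 * sawtooth - 1.0);

// Fade smoothly between adjacent frequencies to taper the ends
float transition = logdp - ilogdp;
triangle = abs((1.0 - transition) * triangle - transition);
```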
18.1.5. Simulating Lighting
To simulate the effect of lighting, we'd like to make the dark stripes more prominent in regions that are in shadow and the white stripes more prominent in regions that are lit. We can do this by using the computed light intensity to modify the threshold value used in the step function. In regions that are lit, the threshold value is decreased so that black stripes get thinner. In regions that are in shadow, the threshold value is increased so that the black stripes get wider.
const float edgew = 0.2;    // width of smooth step

float edge0  = clamp(LightIntensity - edgew, 0.0, 1.0);
float edge1  = clamp(LightIntensity, 0.0, 1.0);
float square = 1.0 - smoothstep(edge0, edge1, triangle);
Once again, we use the smoothstep function to antialias the transition. Because our stripe pattern is a (roughly) constant width in screen space, we can use a constant filter width rather than an adaptive one. The results of the lighting effect can be seen in Figure 18.3.
Figure 18.3. Applying lighting to the sphere. In (A), the sphere is lit with a simple lighting model and no stripes. In (B), the light intensity modulates the width of the stripes to simulate the effect of lighting. (Courtesy of Bert Freudenberg, University of Magdeburg, 2002)
18.1.6. Adding Character
If a woodcut block is made by a human artist and not by a machine, the cuts in the wood aren't perfect in thickness and spacing. How do we add some imperfections into our mathematically perfect description of hatching lines? With noise (see Chapter 15). For this shader, we don't need anything fancy; we just want to add some "wiggle" to our otherwise perfect lines or perhaps some "patchiness" to our simple lighting equation. We can use a tileable 3D texture containing Perlin noise in the same way for this shader as for the shaders in Chapter 15.
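Fetching the stored noise value in the fragment shader is then a single texture lookup (the sampler name Noise matches the application-supplied sampler3D uniform described earlier):

```glsl
// ObjPos is the animated, scaled object-space position computed in the
// vertex shader; because the 3D noise texture is tileable, any position
// yields a valid, smoothly varying noise value.
float noise = texture3D(Noise, ObjPos).x;
```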
To add wiggle to our lines, we modify the sawtooth generation function:
float sawtooth = fract((V + noise * 0.1) * frequency * stripes);
The result of noise added to our stripe pattern is illustrated in Figure 18.4.
Figure 18.4. Adding noise to the hatching algorithm. In (A), the Perlin noise function is applied directly to the sphere's surface. In (B), it modulates the parameter that defines the frequency of the hatching strokes. (Courtesy of Bert Freudenberg, University of Magdeburg, 2002)
18.1.7. Hatching Fragment Shader
The pieces described in the preceding sections are put together in the general-purpose hatching shader shown in Listing 18.2. It bases its hatching stroke computation on the t texture coordinate, so the result is horizontal stripes rather than vertical ones. The results of applying this shader to the teapot model are shown in Figure 18.5.
Figure 18.5. Woodcut-style teapot rendered with the hatching shader (Courtesy of Bert Freudenberg, University of Magdeburg, 2002)
Listing 18.2. Fragment shader for woodcut-style rendering
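A fragment shader assembling the pieces from the preceding sections might look as follows. This is a sketch, not the verbatim listing: the base-density scale of 8.0 inside the log2, the stripe constant, and the final output color are assumptions; everything else follows the snippets developed above:

```glsl
varying vec3  ObjPos;         // animated object-space position
varying float V;              // t texture coordinate
varying float LightIntensity; // diffuse lighting term

uniform sampler3D Noise;      // precomputed tileable 3D noise

const float frequency = 1.0;

void main()
{
    // Uniform line density: discrete frequency steps from the gradient
    float dp      = length(vec2(dFdx(V), dFdy(V)));
    float logdp   = -log2(dp * 8.0);  // 8.0: assumed base-density scale
    float ilogdp  = floor(logdp);
    float stripes = exp2(ilogdp);

    // Noise adds "wiggle" so the strokes look hand-cut
    float noise = texture3D(Noise, ObjPos).x;

    float sawtooth = fract((V + noise * 0.1) * frequency * stripes);
    float triangle = abs(2.0 * sawtooth - 1.0);

    // Taper the stroke ends between adjacent frequencies
    float transition = logdp - ilogdp;
    triangle = abs((1.0 - transition) * triangle - transition);

    // Light intensity sets the black/white threshold; smoothstep
    // antialiases the edge with a constant filter width
    const float edgew = 0.2;
    float edge0  = clamp(LightIntensity - edgew, 0.0, 1.0);
    float edge1  = clamp(LightIntensity, 0.0, 1.0);
    float square = 1.0 - smoothstep(edge0, edge1, triangle);

    gl_FragColor = vec4(vec3(square), 1.0);
}
```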