
### 10.5. Another Environment Mapping Example

Another option for environment mapping is to use photographic methods to create a single 2D texture map, called an *equirectangular texture map* or a *lat-long texture map*. This type of texture can be obtained from a real-world environment by photographic techniques, or it can be created to represent a synthetic environment. An example is shown in Color Plate 9.

Whatever means are used to obtain an image, the result is a single image that spans 360° horizontally and 180° vertically. The image is also distorted as you move up or down from the center of the image. This distortion is done deliberately so that you will see a reasonable representation of the environment if you "shrink-wrap" this texture around the object that you're rendering.

The key to using an equirectangular texture as the environment map is to produce a pair of angles that index into the texture. We compute an altitude angle by determining the angle between the reflection direction and the XZ plane. This altitude angle varies from 90° (reflection is straight up) to −90° (reflection is straight down). The sine of this angle varies from 1.0 to −1.0, and we use this fact to get a texture coordinate in the range of [0,1].

We determine an azimuth angle by projecting the reflection direction onto the XZ plane. The azimuth angle varies from 0° to 360°, and this gives us the key to get a second texture coordinate in the range of [0,1].
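The altitude and azimuth computation described above can be sketched on the CPU in a few lines of Python. This is a hypothetical helper (the name `reflect_to_index` is illustrative) that mirrors the arithmetic the fragment shader below performs on a reflection direction:

```python
import math

def reflect_to_index(rx, ry, rz):
    """Map a reflection direction to (s, t) coordinates in an
    equirectangular environment map (illustrative helper)."""
    # Altitude: sine of the angle between the reflection and the XZ
    # plane, i.e., the y component of the normalized direction.
    length = math.sqrt(rx * rx + ry * ry + rz * rz)
    t = ry / length                                   # in [-1, 1]

    # Azimuth: project onto the XZ plane; the dot product with the
    # +X unit vector is the cosine of the azimuth angle.
    horiz = math.sqrt(rx * rx + rz * rz)
    s = (rx / horiz) * 0.5 if horiz > 0.0 else 0.0    # in [-0.5, 0.5]

    # Translate index values into the proper range, depending on
    # whether the reflection points toward the front or the rear.
    t = (t + 1.0) * 0.5
    if rz >= 0.0:
        s = (s + 1.0) * 0.5      # front: s in [0.25, 0.75]
    else:
        s = (-s) * 0.5 + 1.0     # rear:  s in [0.75, 1.25]
    return s, t

# A reflection straight toward the front lands in the texture center.
print(reflect_to_index(0.0, 0.0, 1.0))   # -> (0.5, 0.5)
```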

The following OpenGL shaders work together to perform environment mapping on an object by using an equirectangular texture map. These shaders are derived from a "bumpy/shiny" shader pair that was developed with John Kessenich and presented at SIGGRAPH 2002. The altitude and azimuth angles are computed to determine s and t values for indexing into our 2D environment texture. This texture's wrapping behavior is set so that it wraps in both s and t. (This supports a little trick that we do in the fragment shader.) Otherwise, the initial conditions are the same as described for the cube map environment mapping example.

Listing 10.7 comprises the vertex shader that does environment mapping with an equirectangular texture map. The only real difference between this shader and the one described in Section 10.4.2 is that this one computes Normal and EyeDir and passes them to the fragment shader as varying variables so that the reflection vector can be computed in the fragment shader.

##### Listing 10.7. Vertex shader used for environment mapping

```glsl
varying vec3  Normal;
varying vec3  EyeDir;
varying float LightIntensity;

uniform vec3  LightPos;

void main()
{
    gl_Position    = ftransform();
    Normal         = normalize(gl_NormalMatrix * gl_Normal);
    vec4 pos       = gl_ModelViewMatrix * gl_Vertex;
    EyeDir         = pos.xyz;
    LightIntensity = max(dot(normalize(LightPos - EyeDir), Normal), 0.0);
}
```

Listing 10.8 contains the fragment shader that does environment mapping by using an equirectangular texture map.

##### Listing 10.8. Fragment shader for doing environment mapping with an equirectangular texture map

```glsl
const vec3 Xunitvec = vec3(1.0, 0.0, 0.0);
const vec3 Yunitvec = vec3(0.0, 1.0, 0.0);

uniform vec3  BaseColor;
uniform float MixRatio;

uniform sampler2D EnvMap;   // = 4

varying vec3  Normal;
varying vec3  EyeDir;
varying float LightIntensity;

void main()
{
    // Compute reflection vector
    vec3 reflectDir = reflect(EyeDir, Normal);

    // Compute altitude and azimuth angles
    vec2 index;

    index.t = dot(normalize(reflectDir), Yunitvec);
    reflectDir.y = 0.0;
    index.s = dot(normalize(reflectDir), Xunitvec) * 0.5;

    // Translate index values into proper range

    if (reflectDir.z >= 0.0)
        index = (index + 1.0) * 0.5;
    else
    {
        index.t = (index.t + 1.0) * 0.5;
        index.s = (-index.s) * 0.5 + 1.0;
    }

    // if reflectDir.z >= 0.0, s will go from 0.25 to 0.75
    // if reflectDir.z < 0.0, s will go from 0.75 to 1.25, and
    // that's OK, because we've set the texture to wrap.

    // Do a lookup into the environment map.
    vec3 envColor = vec3(texture2D(EnvMap, index));

    // Add lighting to base color and mix
    vec3 base = LightIntensity * BaseColor;
    envColor = mix(envColor, base, MixRatio);

    gl_FragColor = vec4(envColor, 1.0);
}
```

The varying variables Normal and EyeDir are the values generated by the vertex shader and then interpolated across the primitive. To get truly precise results, these values should be normalized again in the fragment shader. However, for this shader, skipping the normalization gives us a little better performance, and the quality is acceptable for certain objects.

The constants Xunitvec and Yunitvec have been set up with the proper values for computing our altitude and azimuth angles. First, we compute our altitude angle by normalizing the reflectDir vector and performing a dot product with the Yunitvec constant. Because both vectors are unit vectors, this dot product computation gives us a cosine value for the desired angle that ranges from [−1,1]. Setting the y component of our reflection vector to 0 causes it to be projected onto the XZ plane. We normalize this new vector to get the cosine of our azimuth angle. Again, this value ranges from [−1,1]. Because the horizontal direction of our environment texture spans 360°, we multiply by 0.5 so that we get a value that maps into half of our environment map. Then we need to do a little more work to determine which half this is.

If the z portion of our reflection direction is positive, we know that the reflection direction is "toward the front" and we use the computed texture map indices directly. The index values are scaled and biased so that when we access the environment map texture, we get s values that range from [0.25,0.75] and t values that range from [0,1].

If z is negative, we do our calculations a little differently. The t value is still computed the same way, but the s value is scaled and biased so that it ranges from [0.75,1.25]. We can use these values directly because we've set our texture wrap modes to GL_REPEAT. s values between 1.0 and 1.25 will map to s values from 0 to 0.25 in our actual texture (the trick alluded to earlier). In this way, we can properly access the entire environment texture, depending on the reflection direction. We could compare s to 1.0 and subtract 1.0 if its value is greater than 1.0, but this would end up requiring additional instructions in the machine code and hence the performance would be reduced. By using the repeat mode trick, we get the hardware to take care of this for free.
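The repeat-mode trick described above is equivalent to taking each coordinate modulo 1.0, which a small Python sketch makes concrete:

```python
def repeat_wrap(s):
    """Emulate GL_REPEAT texture addressing: coordinates past 1.0
    fold back to the start of the texture."""
    return s % 1.0

# Rear-hemisphere s values land in [0.75, 1.25]; GL_REPEAT maps
# the part above 1.0 into [0, 0.25] without any shader branching.
print(repeat_wrap(1.10))   # -> 0.10 (approximately)
print(repeat_wrap(0.90))   # -> 0.90 (unchanged)
```

Doing this in hardware via the wrap mode avoids the extra compare-and-subtract instructions a shader-side fixup would cost.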

With our index values set, all we need to do is look up the value in the texture map. We compute a diffusely lit base color value by multiplying our incoming light intensity by BaseColor. We mix this value with our environment map value to create a ceramic effect. We then create a vec4 by adding an alpha value of 1.0 and send the final fragment color on for further processing. The final result is shown in Color Plate 11A. You can see the branches from the tree in the environment on the back and rear of the triceratops. For this example, we used a color of (0.4, 0.4, 1.0) (i.e., light blue) and a mix ratio of 0.8 (i.e., 80% diffuse color, 20% environment map value).
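The final blend uses GLSL's mix() function, which is simply a componentwise linear interpolation. A Python sketch of that last step (the environment color here is an illustrative placeholder, not a value from the book's scene):

```python
def mix(x, y, a):
    """Componentwise linear blend, as in GLSL: x*(1-a) + y*a."""
    return tuple(xi * (1.0 - a) + yi * a for xi, yi in zip(x, y))

env = (0.2, 0.5, 0.3)        # sampled environment color (placeholder)
base = (0.4, 0.4, 1.0)       # LightIntensity * BaseColor, intensity = 1.0
color = mix(env, base, 0.8)  # MixRatio = 0.8: 80% diffuse, 20% environment
```

With MixRatio at 0.8, the diffuse base dominates and the environment contributes a subtle sheen, producing the ceramic look described above.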

An example of environment mapping that assumes a mirror-like surface and adds procedural bumps is shown in Color Plate 11B.
