[11 Dec 2012]

Normal Mapping

Per-pixel lighting (Phong shading) uses normals that are automatically interpolated across triangles (as input to the fragment shader). The normal in each fragment depends on the three normals at the vertices of the triangle. The normal mapping method replaces the smoothly interpolated normal in each fragment with a unique normal, which is created by a designer or generated from a high-polygonal mesh. The perturbed normals change the lighting. The following pictures show a plain surface with simple interpolated normals and the same surface rendered with normal mapping. The model with perturbed normals looks like it has relief, but in reality it is flat.
[Pictures: simple lighting + normal map = lighting with normal mapping]

Information about the perturbed normals is passed to the fragment shader as a normal map. Each texel of this texture contains one normal. Usually, normals are sampled from the normal map with the same texture coordinates that are used to sample the diffuse texture. Normals in the normal map are stored in the texture's tangent space: the X and Y axes of tangent space point in the directions in which the image's width and height increase, and the Z axis is perpendicular to the image (points out of the screen). A separate article describes how to build a normal map.
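As a rough sketch of that idea (not the code from that article), a tangent-space normal map can be generated from a grayscale height map by taking central differences of the height and packing the resulting unit normal into RGB. The C++ helper below is an illustration only: the function name and the strength parameter are made up, and an 8-bit height image is assumed.

#include <vector>
#include <cstdint>
#include <cmath>

// Builds an RGB normal map (3 bytes per texel) from an 8-bit height map.
// 'strength' controls how pronounced the relief appears.
std::vector<uint8_t> heightMapToNormalMap(const std::vector<uint8_t>& height,
                                          int w, int h, float strength = 2.0f)
{
    auto at = [&](int x, int y) {
        x = (x + w) % w;  y = (y + h) % h;           // wrap at the borders
        return height[y * w + x] / 255.0f;
    };

    std::vector<uint8_t> rgb(w * h * 3);
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
        {
            // central differences along the S (width) and T (height) directions
            float dx = (at(x + 1, y) - at(x - 1, y)) * strength;
            float dy = (at(x, y + 1) - at(x, y - 1)) * strength;

            // tangent-space normal: X and Y from the slopes, Z points out of the image
            float nx = -dx, ny = -dy, nz = 1.0f;
            float len = std::sqrt(nx * nx + ny * ny + nz * nz);
            nx /= len; ny /= len; nz /= len;

            // remap from [-1,1] to [0,255] (the reverse of what the shader will do)
            size_t o = (size_t)(y * w + x) * 3;
            rgb[o + 0] = (uint8_t)((nx * 0.5f + 0.5f) * 255.0f);
            rgb[o + 1] = (uint8_t)((ny * 0.5f + 0.5f) * 255.0f);
            rgb[o + 2] = (uint8_t)((nz * 0.5f + 0.5f) * 255.0f);
        }
    return rgb;
}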
All vectors used in the lighting calculation must be in the same space. But the vectors in the vertex shader (for example, the direction from the vertex position to the light source) are in world space, while the normal from the normal map is in tangent space. Transforming the normal to world space is inefficient (a matrix multiplication in the fragment shader for every fragment), so instead we transform the other vectors (direction to the light, direction to the camera, etc.) to tangent space in the vertex shader and pass them to the fragment shader.
In this case, tangent space is aligned with the model's texture coordinates. Each vertex has its own unique tangent space. The Z axis of tangent space is equal to the normal at the vertex. The X and Y axes point in the directions in which the S and T texture coordinates grow at the vertex. The X and Y axes of tangent space are called the tangent and bitangent vectors.
Tangent space is defined by the normal, tangent and bitangent vectors, which form an orthonormal basis. These vectors also form a 3x3 rotation matrix that transforms vectors from world space to tangent space (the matrix projects an arbitrary vector onto the space defined by the three orthonormal vectors). We can calculate the normals of a mesh as cross products of the edges adjacent to each vertex, but the tangents and bitangents are unknown. A separate article describes how to calculate the tangent and bitangent vectors.
                   | Tx  Ty  Tz |
Vtangent_space  =  | Bx  By  Bz |  *  Vworld_space
                   | Nx  Ny  Nz |

(T, B, N - tangent, bitangent and normal vectors in world space)
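As an illustration of what that article computes (a minimal C++ sketch, assuming GLM and an indexed triangle mesh; the function name is hypothetical and this is not the article's exact code), the tangent and bitangent are obtained per triangle from the positions and texture coordinates, averaged over the triangles that share a vertex, orthogonalized against the normal, and the handedness is stored in the fourth component:

#include <vector>
#include <cmath>
#include <glm/glm.hpp>

// Produces one vec4 per vertex: xyz = tangent, w = handedness (+1 or -1).
std::vector<glm::vec4> computeTangents(const std::vector<glm::vec3>& positions,
                                       const std::vector<glm::vec3>& normals,
                                       const std::vector<glm::vec2>& uvs,
                                       const std::vector<unsigned>& indices)
{
    std::vector<glm::vec3> tan(positions.size(), glm::vec3(0.0f));
    std::vector<glm::vec3> bitan(positions.size(), glm::vec3(0.0f));

    for (size_t i = 0; i + 2 < indices.size(); i += 3)
    {
        unsigned i0 = indices[i], i1 = indices[i + 1], i2 = indices[i + 2];

        glm::vec3 e1 = positions[i1] - positions[i0];    // triangle edges
        glm::vec3 e2 = positions[i2] - positions[i0];
        glm::vec2 duv1 = uvs[i1] - uvs[i0];              // UV deltas along the edges
        glm::vec2 duv2 = uvs[i2] - uvs[i0];

        // solve  e1 = duv1.x*T + duv1.y*B,  e2 = duv2.x*T + duv2.y*B  for T and B
        float det = duv1.x * duv2.y - duv2.x * duv1.y;
        if (std::abs(det) < 1e-8f) continue;             // degenerate UV mapping
        float r = 1.0f / det;
        glm::vec3 T = (e1 * duv2.y - e2 * duv1.y) * r;
        glm::vec3 B = (e2 * duv1.x - e1 * duv2.x) * r;

        // accumulate, so shared vertices get an average over adjacent triangles
        tan[i0] += T;   tan[i1] += T;   tan[i2] += T;
        bitan[i0] += B; bitan[i1] += B; bitan[i2] += B;
    }

    std::vector<glm::vec4> tangents(positions.size());
    for (size_t v = 0; v < positions.size(); ++v)
    {
        const glm::vec3& N = normals[v];
        // Gram-Schmidt: make the tangent orthogonal to the normal
        glm::vec3 T = glm::normalize(tan[v] - N * glm::dot(N, tan[v]));
        // handedness: +1 if (N x T) points the same way as the accumulated bitangent
        float w = (glm::dot(glm::cross(N, T), bitan[v]) < 0.0f) ? -1.0f : 1.0f;
        tangents[v] = glm::vec4(T, w);
    }
    return tangents;
}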
Now we have the tangents and bitangents. The next step is to pass these vectors as vertex attributes to the vertex shader. The article about calculation of tangent space shows that the bitangent vector can be restored in the vertex shader as the cross product of the normal and tangent vectors, multiplied by a handedness value. So all data required for normal mapping fits into one four-component (vec4) vertex attribute: the xyz components store the tangent, and w stores the handedness value.
vec4 tangent;  // attribute passed to the vertex shader
tangent.xyz = tangentVector;             // tangent in model space
tangent.w   = determinant(Mto_tangent);  // handedness, +1 or -1 for an orthonormal basis
The bitangent vector is restored in the following way:
B = (N x T.xyz) * T.w
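The same reconstruction written on the CPU with GLM (a hypothetical helper, shown only to make the convention explicit) would be:

#include <glm/glm.hpp>

// B = (N x T.xyz) * T.w, exactly as the vertex shader below does it
glm::vec3 restoreBitangent(const glm::vec3& N, const glm::vec4& T)
{
    return glm::cross(N, glm::vec3(T)) * T.w;
}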
In the vertex shader, the matrix formed from the tangent, bitangent and normal vectors transforms the lighting vectors (direction to the light, half vector, reflected vector, etc.) from world space to tangent space. The new vectors are passed to the fragment shader and are automatically interpolated; vectors received from the vertex shader should be re-normalized in the fragment shader. The normal mapping vertex shader is shown in the next listing:
// VERTEX SHADER FOR NORMAL MAPPING
#version 330

// ATTRIBUTES
layout(location = 0) in vec3 i_position;
layout(location = 1) in vec3 i_normal;
layout(location = 2) in vec2 i_texcoord1;
layout(location = 3) in vec4 i_tangent; // xyz - tangent, w - sign

// UNIFORMS
uniform mat4 u_proj_mat;
uniform mat4 u_view_mat;
uniform mat4 u_model_mat;
uniform mat3 u_normal_mat;
uniform vec3 u_light_position;
uniform vec3 u_camera_position;

// TO FRAGMENT SHADER
out vec2 v_texcoord1;
out vec3 v_directionToLight;
out vec3 v_directionToCamera;

void main(void){
  vec4 worldPos = u_model_mat * vec4(i_position,1.0);
  gl_Position = u_proj_mat * u_view_mat * worldPos;

  // TRANSFORM NORMAL AND TANGENT TO WORLD SPACE
  vec3 modelNormal = normalize(u_normal_mat * i_normal);
  vec3 modelTangent = normalize(u_normal_mat * i_tangent.xyz);

  // CALCULATE OUTPUTS IN WORLD SPACE
  v_texcoord1 = i_texcoord1;
  v_directionToLight = normalize(u_light_position-worldPos.xyz);
  v_directionToCamera = normalize(u_camera_position-worldPos.xyz);

  // RESTORE BITANGENT
  vec3 bitangent = i_tangent.w*cross(modelNormal,modelTangent.xyz);

  // TRANSFORM DIRECTION TO LIGHT AND TO CAMERA TO TANGENT SPACE
  v_directionToLight = vec3( dot(modelTangent.xyz, v_directionToLight),
          dot(bitangent, v_directionToLight),
          dot(modelNormal, v_directionToLight));

  v_directionToCamera = vec3( dot(modelTangent.xyz, v_directionToCamera),
          dot(bitangent,v_directionToCamera),
          dot(modelNormal, v_directionToCamera));
}
The normal from the normal map is already in tangent space. It is encoded as an RGB value and must be remapped from the [0,255] range ([0,1] after sampling in the shader) back to the [-1,1] range:
N = RGB / 255 * 2 - 1
The fragment shader calculates lighting in tangent space. All calculations of diffuse and specular lighting are the same as for simple world-space lighting, but the vectors are in tangent space. The strength of the normal mapping effect can be scaled by mixing the uniform normal and the normal from the normal map in different proportions. The uniform (default, original) normal in tangent space is equal to (0,0,1).
// FRAGMENT SHADER FOR NORMAL MAPPING
#version 330

// UNIFORMS
uniform sampler2D u_colorTexture;
uniform sampler2D u_normalTexture;
uniform vec3 u_lightColor;

// FROM VERTEX SHADER
in vec2 v_texcoord1;
in vec3 v_directionToLight;
in vec3 v_directionToCamera;

// OUTPUT COLOR (gl_FragColor is not available in GLSL 3.30 core)
out vec4 o_fragColor;

// AMBIENT INTENSITY (assumed constant; could also be a uniform)
const float iambi = 0.05;

// SIMPLE LAMBERT DIFFUSE AND BLINN-PHONG SPECULAR TERMS
// (assumed implementations of the helpers used below)
float diffuseSimple(vec3 N, vec3 L){
  return clamp(dot(N, L), 0.0, 1.0);
}

float specularSimple(vec3 N, vec3 H){
  return pow(clamp(dot(N, H), 0.0, 1.0), 32.0);
}

void main(void){
  // RESTORE NORMAL FROM NORMAL TEXTURE
  vec3 norfromtex = normalize(texture(u_normalTexture, v_texcoord1).xyz * 2.0 - 1.0);

  // DECREASE DETAILS BY MIXING NORMAL WITH UNIFORM NORMAL
  float factor = 1.0; // 1.0 = full normal mapping, 0.0 = uniform normal only
  vec3 N = normalize(norfromtex*factor + vec3(0.0, 0.0, 1.0)*(1.0-factor));

  // PERFORM LIGHTING AS USUAL, BUT ALL VECTORS ARE IN TANGENT SPACE
  vec3 colfromtex = texture(u_colorTexture, v_texcoord1).xyz;
  vec3 L = normalize(v_directionToLight);
  vec3 H = normalize(v_directionToLight + v_directionToCamera);
  float idiff = diffuseSimple(N, L);
  float ispec = specularSimple(N, H);
  o_fragColor.xyz = (vec3(iambi) + idiff + ispec)*colfromtex*u_lightColor;
  o_fragColor.w = 1.0;
}
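On the application side, the only additions compared to plain per-pixel lighting are the fourth vertex attribute and the second texture unit. The following C++ sketch assumes OpenGL 3.3 core with an already created shader program, vertex buffer and textures (all names are placeholders) and an interleaved vertex layout matching the attribute locations of the vertex shader:

#include <cstddef>     // offsetof
#include <GL/glew.h>   // or any other OpenGL function loader

// interleaved vertex layout, locations 0..3 in the vertex shader
struct Vertex {
    float position[3];
    float normal[3];
    float texcoord[2];
    float tangent[4];   // xyz = tangent, w = handedness
};

void setupNormalMappingState(GLuint program, GLuint vbo, GLuint colorTex, GLuint normalTex)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);

    // vertex attributes (locations must match the shader)
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), (void*)offsetof(Vertex, position));
    glEnableVertexAttribArray(1);
    glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), (void*)offsetof(Vertex, normal));
    glEnableVertexAttribArray(2);
    glVertexAttribPointer(2, 2, GL_FLOAT, GL_FALSE, sizeof(Vertex), (void*)offsetof(Vertex, texcoord));
    glEnableVertexAttribArray(3);   // i_tangent
    glVertexAttribPointer(3, 4, GL_FLOAT, GL_FALSE, sizeof(Vertex), (void*)offsetof(Vertex, tangent));

    // diffuse texture on unit 0, normal map on unit 1
    glUseProgram(program);
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, colorTex);
    glUniform1i(glGetUniformLocation(program, "u_colorTexture"), 0);
    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, normalTex);
    glUniform1i(glGetUniformLocation(program, "u_normalTexture"), 1);
}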
The bump mapping method is similar to normal mapping and perturbs normals in the same way. The difference is that bump mapping uses a height map instead of a normal map, and tangent space is calculated on the fly. The method requires less memory, but is slower because it needs four or more samples from the height texture to reconstruct the normal for each fragment.


