[27 Aug 2012] Blur for environmental maps (blur for cube textures)
This tutorial shows how to blur a cube texture with the help of a shader. A blurred cube texture can be used for image-based lighting, in the environment mapping method for simulation of transparent and reflective objects, and in other effects.
Blurring a cube texture isn't as easy as blurring a 2D texture. You have to take into account texels from different faces of the cube texture, weight their values, and properly determine the influence of each texel. In image-based lighting, to make a cube map for diffuse lighting, each texel of the output cube texture is calculated as a weighted average of all texels of the input cube texture that lie in the hemisphere oriented along the direction of the current texel. Cube maps for specular lighting are less demanding on the number of averaged texels: only texels that lie in a cone oriented along the normal have to be taken into account. The following image shows the cone oriented along the normal and the hemisphere oriented along the normal:
In most cases it's impractical to handle all texels of the input cube texture that lie in the cone or the hemisphere. Suppose the input cube texture has a face size of 1024 texels. Then for each texel of the output cube texture it is required to sample all texels in the respective hemisphere (half of the texels in the input texture), and that is more than three million samples. Moreover, if the output cube texture has a face size of only 32 texels, the blur operation has to be repeated about six thousand times. In total, more than 18 billion samples from the cube texture are required, and unfortunately this is not a real-time process on most machines. If precision isn't mandatory, it's possible to take into account only a small part of the input texels, uniformly distributed over the hemisphere. Of course, such sparse sampling might miss small but very bright light sources that would considerably change the blurred cube texture if they were taken into account.
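The sample-count estimate above can be checked with a few lines of C++ (the function and parameter names here are illustrative, not part of the tutorial code):

```cpp
#include <cassert>
#include <cstdint>

// Rough cost of the brute-force hemisphere blur: inputSize and outputSize
// are the edge lengths of the input and output cube texture faces.
std::uint64_t bruteForceSampleCount(std::uint64_t inputSize, std::uint64_t outputSize)
{
    // a cube texture has 6 faces of size*size texels
    std::uint64_t inputTexels  = 6 * inputSize * inputSize;
    std::uint64_t outputTexels = 6 * outputSize * outputSize;
    // each output texel samples roughly half of the input texels (a hemisphere)
    return outputTexels * (inputTexels / 2);
}
```

For a 1024-texel input and a 32-texel output this gives roughly 19.3 billion samples, matching the "more than 18 billion" figure above.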
Uniformly distributed directions for sampling from the input cube texture can be saved in a two-dimensional texture (very similar to a normal map). This texture with normals is passed to a fragment shader, where each of its texels is sampled and its value is used as a direction for sampling from the input cube texture. Directions in the normal map are saved in tangent space, so they can be reused by any fragment. But to sample from the input cube texture, directions in tangent space have to be transformed to world space. The fragment shader builds the transformation matrix from orthonormal basis vectors (normal, tangent and bitangent) in order to transform from tangent space to world space. Normal and tangent vectors are passed to the vertex shader as vertex attributes, and the bitangent is restored in the fragment shader.
Generation of texture with directions in tangent space
To generate a texture with uniformly distributed sampling directions it is possible to project uniformly distributed positions from a 2D grid onto a sphere. 2D Cartesian coordinates are transformed to spherical coordinates, and then the spherical coordinates are transformed to 3D Cartesian coordinates that can be used as a direction for sampling.
The following image shows how the 2D grid of positions is interpreted. First of all, the 2D Cartesian coordinates are mapped from the [0, 1] range to the [-1, 1] range.
Spherical coordinates are determined by three values: r, theta and phi. Radius r in our case is equal to 1, as we are working with normalized directions.
Angle theta is the angle between the direction of the Z axis and the current direction. Let's find the value of theta. The 2D Cartesian space is divided into four sectors along the +X, +Y, -X and -Y axes. In each sector the absolute value of the projection of a point onto the axis of the sector is interpreted as sin(theta).
Angle phi is the angle around the Z axis, starting from the X axis in the counterclockwise direction. This angle is determined for each point as atan2(y, x).
Now we have spherical coordinates. The standard transformation from spherical coordinates to 3D Cartesian coordinates is the following:

x = cos(phi) * sin(theta)
y = sin(phi) * sin(theta)
z = cos(theta)
The x, y and z values determine the direction for sampling. The following image shows a texture with directions for sampling. Each texel of this texture was calculated as described previously.
Function that maps a position on the unit square to a direction for sampling:
glm::vec3 mapToHemisphere(const glm::vec2 & point, float maxVertAngle = M_PI/2)
{
   // point on 2D square in [0, 1] range
   glm::vec2 in = point;
   // map point to [-1, 1] range
   in = in * 2.0f - 1.0f;
   // perpendicular direction
   if(in.x == 0 && in.y == 0)
   {
      // in tangent space the perpendicular is parallel to the Z axis
      return glm::vec3(0, 0, 1);
   }
   // deviation from perpendicular in range [0, 1]
   float sinTheta;
   if(in.y > in.x) // above the line y = x
   {
      if(in.y < -in.x) // under the line y = -x
      {
         sinTheta = -in.x;
      }
      else // above the line y = -x
      {
         sinTheta = in.y;
      }
   }
   else // under the line y = x
   {
      if(in.y > -in.x) // above the line y = -x
      {
         sinTheta = in.x;
      }
      else // under the line y = -x
      {
         sinTheta = -in.y;
      }
   }
   // determine theta - angle with the vertical axis
   float theta = asinf(sinTheta);
   // scale the angle. By default theta is in range [0, M_PI/2]
   theta *= maxVertAngle / (M_PI/2);
   // normalized direction in 2D
   in = glm::normalize(in);
   // determine the angle around the vertical axis
   float phi = atan2(in.y, in.x);
   // transform spherical coordinates to 3D Cartesian
   glm::vec3 out;
   out.x = cos(phi) * sin(theta);
   out.y = sin(phi) * sin(theta);
   out.z = cos(theta);
   return out;
}
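As a sanity check, here is a standalone, glm-free sketch of the same mapping for a single point (the names here are hypothetical, not from the tutorial code). The four sector branches collapse to taking the larger of the two absolute coordinates as sin(theta):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Minimal version of the hemisphere mapping: px, py are in [0, 1].
Vec3 mapToHemisphereSimple(float px, float py, float maxVertAngle = float(M_PI) / 2)
{
    // map point from [0, 1] to [-1, 1]
    float x = px * 2.0f - 1.0f;
    float y = py * 2.0f - 1.0f;
    // center of the square maps to the perpendicular direction
    if (x == 0 && y == 0)
        return Vec3{0, 0, 1};
    // projection on the sector axis; equivalent to the four-branch sector test
    float sinTheta = std::fmax(std::fabs(x), std::fabs(y));
    // angle with the vertical axis, rescaled from [0, pi/2] to [0, maxVertAngle]
    float theta = std::asin(sinTheta) * maxVertAngle / (float(M_PI) / 2);
    // angle around the vertical axis
    float phi = std::atan2(y, x);
    // spherical to Cartesian
    return Vec3{ std::cos(phi) * std::sin(theta),
                 std::sin(phi) * std::sin(theta),
                 std::cos(theta) };
}
```

The center of the square maps to (0, 0, 1), and a point on the middle of the square's right edge maps to a direction along the +X axis.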
Function that creates and fills a texture with directions for sampling (in tangent space):
std::unique_ptr<Texture2D> getHemisphereNormalsTexture(int size, float hemiAngle)
{
   // allocate memory for the texture
   int components = 3;
   std::unique_ptr<GLubyte[]> data(new GLubyte[components * size * size]);
   // half texel offset
   float base = 0.5f / size;
   // calculate direction for each texel of the texture
   for(int x = 0; x < size; x++)
   {
      // position on 2D square along X axis
      float xx = float(x) / size;
      for(int y = 0; y < size; y++)
      {
         // position on 2D square along Y axis
         float yy = float(y) / size;
         // get 3D direction
         glm::vec3 res = mapToHemisphere(glm::vec2(xx + base, yy + base), hemiAngle);
         // location of the texel in the texture
         int offset = x * components + y * size * components;
         // map direction from range [-1, 1] to [0, 255] and save it to the texture
         data[offset++] = (res.x + 1.0f) / 2.0f * 255;
         data[offset++] = (res.y + 1.0f) / 2.0f * 255;
         data[offset++] = (res.z + 1.0f) / 2.0f * 255;
      }
   }
   // create OpenGL texture and fill it with data
   std::unique_ptr<Texture2D> tex(new Texture2D());
   tex->setup(size, size, GL_RGB, GL_RGB, GL_UNSIGNED_BYTE, data.get());
   return tex;
}
Shader that blurs cube texture
The following setup is required before rendering:
Set the output (blurred) cube texture as the render target.
The object to render is a sphere.
The camera is located inside the sphere. FOV = 90 degrees. Aspect ratio = 1.
Render the scene six times. Each time select the next face of the output cube texture and set up the view matrix appropriately.
Pass the input cube texture and the texture with directions to the fragment shader.
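For the six render passes, the view matrices can be built from the standard OpenGL cube map face orientations (ordered as GL_TEXTURE_CUBE_MAP_POSITIVE_X + i). A possible table, sketched as plain arrays (in practice these look/up pairs would be fed to a lookAt helper such as glm::lookAt with the camera at the origin):

```cpp
#include <cassert>

// Look and up directions for each cube map face, in standard OpenGL face order.
struct FaceView { float look[3]; float up[3]; };

const FaceView cubeFaceViews[6] = {
    { { 1,  0,  0}, {0, -1,  0} }, // +X
    { {-1,  0,  0}, {0, -1,  0} }, // -X
    { { 0,  1,  0}, {0,  0,  1} }, // +Y
    { { 0, -1,  0}, {0,  0, -1} }, // -Y
    { { 0,  0,  1}, {0, -1,  0} }, // +Z
    { { 0,  0, -1}, {0, -1,  0} }, // -Z
};
```

Each look vector is a unit vector orthogonal to its up vector, as a view matrix requires.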
The vertex shader simply passes normal and tangent vectors to the fragment shader.
In the fragment shader each direction from the texture with directions is transformed from tangent space to world space. Then the world-space direction is used to sample from the input cube texture. The sampled value is added to an accumulator variable. After all directions are handled, the color is averaged and saved to the render target (a face of the blurred cube map).
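The accumulation just described can be sketched on the CPU for a single color channel (the types and names here are illustrative, not the tutorial's code): each sample is weighted by the cosine of the angle between the sample direction and the normal, and the result is the weighted average.

```cpp
#include <cassert>
#include <vector>

// One sample: a color value and the cosine of the angle to the normal,
// i.e. the dot(NW, N) term computed in the fragment shader.
struct Sample { float color; float cosAngle; };

float weightedAverage(const std::vector<Sample>& samples)
{
    float accumulatedColor = 0, accumulatedWeight = 0;
    for (const Sample& s : samples)
    {
        // weight the color by Lambert's cosine term and accumulate
        accumulatedColor  += s.color * s.cosAngle;
        accumulatedWeight += s.cosAngle;
    }
    // normalize by the total weight
    return accumulatedColor / accumulatedWeight;
}
```

A sample at grazing angle (cosAngle near 0) contributes almost nothing, which is exactly the falloff the fragment shader applies.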
Vertex shader that passes normal and tangent vectors to the fragment shader:
#version 330

// attributes
layout(location = 0) in vec3 i_position; // xyz - position
layout(location = 1) in vec3 i_normal; // xyz - normal
layout(location = 3) in vec4 i_tangent; // xyz - tangent, w - handedness

// matrices
uniform mat4 u_modelViewProjectionMat;
uniform mat3 u_normalMat;

// data for fragment shader
out vec3 o_worldNormal;
out vec3 o_worldTangent;
out float o_handedness;

void main(void)
{
   // vertex in screen space
   gl_Position = u_modelViewProjectionMat * vec4(i_position, 1);
   // transform normal and tangent to world space
   o_worldNormal = normalize(u_normalMat * i_normal);
   o_worldTangent = normalize(u_normalMat * i_tangent.xyz);
   o_handedness = i_tangent.w;
}
Fragment shader that blurs the input cube texture:
#version 330

// data from vertex shader
in vec3 o_worldNormal;
in vec3 o_worldTangent;
in float o_handedness;

// texture that should be blurred (bound to texture unit 0 by the application)
uniform samplerCube u_colorTexture;
// texture with directions in tangent space (bound to texture unit 1)
uniform sampler2D u_normalsTexture;
// size of the texture with directions
uniform vec2 u_normalTextureDimensions;

// color to framebuffer
out vec4 resultingColor;

void main(void)
{
   // normalize normal and tangent after interpolation
   vec3 N = normalize(o_worldNormal);
   vec3 T = normalize(o_worldTangent);
   // restore orthogonality after interpolation (Gram-Schmidt)
   T = normalize(T - N * dot(N, T));
   // restore bitangent
   vec3 B = cross(N, T) * o_handedness;
   // offset with the size of one texel
   vec2 texStep = vec2(1.0, 1.0) / u_normalTextureDimensions;
   // offset with the size of half of a texel
   vec2 texOffset = vec2(0.5, 0.5) / u_normalTextureDimensions;
   // matrix that transforms from tangent space to world space
   mat3 toWorldSpace = mat3(T, B, N);
   // variables to accumulate color and weight
   vec3 accumulatedColor = vec3(0, 0, 0);
   float accumulatedWeight = 0;
   // for each direction in the texture with directions
   for(float x = 0; x < u_normalTextureDimensions.x; x += 1)
   {
      for(float y = 0; y < u_normalTextureDimensions.y; y += 1)
      {
         // texture coordinates for sampling from the texture with directions
         vec2 texCoords = texOffset + texStep * vec2(x, y);
         // direction from the texture with directions (tangent space), mapped back to [-1, 1]
         vec3 NT = normalize(texture(u_normalsTexture, texCoords).rgb * 2.0 - 1.0);
         // transform direction to world space
         vec3 NW = normalize(toWorldSpace * NT);
         // sample from the input cube texture with the world space direction
         vec3 sampleColor = texture(u_colorTexture, NW).rgb;
         // decrease influence according to Lambert's law
         float weight = dot(NW, N);
         // weight and add the color
         accumulatedColor += sampleColor * weight;
         accumulatedWeight += weight;
      }
   }
   // save the average color
   resultingColor.xyz = accumulatedColor / accumulatedWeight;
   resultingColor.a = 1;
}
Sun and Black Cat, Igor Dykhta © 2007-2014