
WebGL alpha blending modulation as texture layers

Alpha blending is a technique in computer graphics that enables combining two textures by modifying the opacity (alpha) of each pixel. Each pixel has RGB color values and an alpha value that represents transparency, where alpha values range from 0 (fully transparent) to 1 (fully opaque).

In WebGL, we can apply alpha blending without CPU-bound modulation of each color channel; instead, we implement it in shaders that run on the GPU and process each pixel in parallel (final source).


Vertex shader: position and texture coordinates #

The vertex shader defines the positions and texture coordinates of vertices, forming a rectangle that covers the screen, so the blend computed in the fragment shader is applied across the entire area. The attributes a_position and a_texcoord represent the vertex positions and the corresponding texture coordinates. For alpha blending, texture coordinates from (0,0) to (1,1) map the textures onto this rectangle:

attribute vec4 a_position;
attribute vec2 a_texcoord;
varying vec2 v_texcoord;

void main() {
  gl_Position = a_position;
  v_texcoord = a_texcoord;
}

Each position in a_position is mapped to Normalized Device Coordinates (NDC) in the range of −1 to 1 (which is the clip space configuration for WebGL, and OpenGL by extension), covering the full screen area. The a_texcoord values range from (0,0) to (1,1), so each texture is sampled across the rectangle formed by the vertices.

Fragment shader: color blending #

In the fragment shader, we blend the two texture layers. For each pixel, the fragment shader samples color and alpha values from the textures and applies the alpha blending relation. The colors color1 and color2 represent the colors from the base and overlay textures, respectively:

precision mediump float;

varying vec2      v_texcoord;
uniform sampler2D u_texture1;
uniform sampler2D u_texture2;
uniform float     u_alpha;

void main() {
  vec4 color1 = texture2D(u_texture1, v_texcoord);
  vec4 color2 = texture2D(u_texture2, v_texcoord);

  gl_FragColor = vec4(
    mix(color1.rgb, color2.rgb, color2.a * u_alpha),
    color1.a + color2.a * (1.0 - color1.a)
  );
}

The blending relation combines color1 and color2 based on their respective alpha values. The final color output C for each pixel is computed using the relation:

C = (1 − α₂) · C₁ + α₂ · C₂

where C₁ and C₂ are the colors sampled from the base and overlay textures, and α₂ is the overlay's alpha scaled by the global factor u_alpha.

This relation modulates the color of the base image C₁ by the transparency of the overlay image C₂, creating a composite color based on the relative opacity of each.

The alpha (transparency) of the final output is calculated as:

α = α₁ + α₂ · (1 − α₁)

This expression ensures that the result’s overall transparency depends on both textures' alpha values, with α₁ and α₂ representing the alpha values from the base and overlay textures. The term (1 − α₁) reduces the influence of the overlay alpha based on the base image’s opacity.
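
To sanity-check these relations on a single pixel, here is a minimal CPU-side sketch of the same blend in TypeScript. The blendPixel helper is hypothetical and mirrors the fragment shader; it is not part of the rendering path:

// CPU-side mirror of the fragment shader's per-pixel blend.
// base and overlay are RGBA values in [0, 1]; uAlpha is the global factor.
type RGBA = [number, number, number, number];

function blendPixel(base: RGBA, overlay: RGBA, uAlpha: number): RGBA {
  const t = overlay[3] * uAlpha; // effective overlay alpha (α₂)
  const mix = (a: number, b: number) => (1 - t) * a + t * b;
  return [
    mix(base[0], overlay[0]),
    mix(base[1], overlay[1]),
    mix(base[2], overlay[2]),
    base[3] + overlay[3] * (1 - base[3]), // α = α₁ + α₂ · (1 − α₁)
  ];
}

// Example: a half-opaque red overlay over an opaque blue base with u_alpha = 1
// yields [0.5, 0, 0.5, 1].
console.log(blendPixel([0, 0, 1, 1], [1, 0, 0, 0.5], 1.0));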

The textures and blending parameter (u_alpha) are prepared on the GPU before rendering. First, positions and texture coordinates are buffered, and the two textures are activated, binding each to a texture unit. The global blending factor u_alpha scales the overlay texture's transparency uniformly across all pixels.

function withAlphaBlending(
  gl: WebGLRenderingContext,
  alpha: number,
  texture1: WebGLTexture,
  texture2: WebGLTexture
) {
  const vertexShader = createShader(gl, gl.VERTEX_SHADER, vertexShaderSource);
  const fragmentShader = createShader(
    gl,
    gl.FRAGMENT_SHADER,
    fragmentShaderSource
  );
  const program = createProgram(gl, vertexShader, fragmentShader);
  gl.useProgram(program);

  // look up the attribute and uniform locations declared in the shader sources
  const positionLocation = gl.getAttribLocation(program, "a_position");
  const texcoordLocation = gl.getAttribLocation(program, "a_texcoord");
  const texture1Location = gl.getUniformLocation(program, "u_texture1");
  const texture2Location = gl.getUniformLocation(program, "u_texture2");
  const alphaLocation = gl.getUniformLocation(program, "u_alpha");
}
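
The createShader and createProgram helpers referenced above are the standard compile-and-link boilerplate and aren't shown in the snippet; a minimal sketch, assuming we simply throw on compile or link errors, looks like this:

function createShader(
  gl: WebGLRenderingContext,
  type: number,
  source: string
): WebGLShader {
  // compile a vertex or fragment shader from source
  const shader = gl.createShader(type)!;
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    const log = gl.getShaderInfoLog(shader);
    gl.deleteShader(shader);
    throw new Error(`shader compile failed: ${log}`);
  }
  return shader;
}

function createProgram(
  gl: WebGLRenderingContext,
  vertexShader: WebGLShader,
  fragmentShader: WebGLShader
): WebGLProgram {
  // link the compiled shaders into a program
  const program = gl.createProgram()!;
  gl.attachShader(program, vertexShader);
  gl.attachShader(program, fragmentShader);
  gl.linkProgram(program);
  if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
    const log = gl.getProgramInfoLog(program);
    gl.deleteProgram(program);
    throw new Error(`program link failed: ${log}`);
  }
  return program;
}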

Vertex position buffer #

const positionBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([
  -1, -1,  1, -1, -1,  1,
  -1,  1,  1, -1,  1,  1,
]), gl.STATIC_DRAW);

Texture position buffer #

const texcoordBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, texcoordBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([
  0, 0,  1, 0,  0, 1,
  0, 1,  1, 0,  1, 1,
]), gl.STATIC_DRAW);

Linking to shader attributes #

Once the buffers are filled, they are linked to the shader attributes a_position and a_texcoord. gl.vertexAttribPointer tells the GPU how to read the vertex positions and texture coordinates out of each bound buffer: two floats per vertex, tightly packed.

gl.enableVertexAttribArray(positionLocation);
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
gl.vertexAttribPointer(positionLocation, 2, gl.FLOAT, false, 0, 0);

gl.enableVertexAttribArray(texcoordLocation);
gl.bindBuffer(gl.ARRAY_BUFFER, texcoordBuffer);
gl.vertexAttribPointer(texcoordLocation, 2, gl.FLOAT, false, 0, 0);

gl.uniform1f(alphaLocation, alpha);

gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, texture1);
gl.uniform1i(texture1Location, 0);

gl.activeTexture(gl.TEXTURE1);
gl.bindTexture(gl.TEXTURE_2D, texture2);
gl.uniform1i(texture2Location, 1);

gl.drawArrays(gl.TRIANGLES, 0, 6);
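
With the buffers, uniforms, and textures bound, the draw call renders the blended quad. Assuming the two textures have already been created and uploaded (see the next section), usage might look like:

// assumed to be already-uploaded WebGLTextures
declare const baseTexture: WebGLTexture;
declare const overlayTexture: WebGLTexture;

const canvas = document.querySelector("canvas") as HTMLCanvasElement;
const gl = canvas.getContext("webgl");
if (!gl) throw new Error("WebGL not supported");

// blend the overlay over the base at 60% strength
withAlphaBlending(gl, 0.6, baseTexture, overlayTexture);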

Color channels as TEXTURE_2D buffers #

For this, we can use the WebGL context to call texImage2D with a color buffer and then generate mipmaps for it. Each color component is a gl.UNSIGNED_BYTE, so an RGBA pixel occupies 8 × 4 = 32 bits, i.e. one unsigned 32-bit integer per pixel, which matches the 8-bit-per-channel format of most displays.
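
As a sketch, assuming the source image is an already-loaded HTMLImageElement with power-of-two dimensions (WebGL 1 only allows generateMipmap on power-of-two textures), the upload might look like:

function createTextureFromImage(
  gl: WebGLRenderingContext,
  image: HTMLImageElement
): WebGLTexture {
  const texture = gl.createTexture()!;
  gl.bindTexture(gl.TEXTURE_2D, texture);
  // upload the RGBA color buffer, 8 bits (UNSIGNED_BYTE) per channel
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
  // generate the mipmap chain and sample it when minifying
  gl.generateMipmap(gl.TEXTURE_2D);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_LINEAR);
  return texture;
}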