WebGL alpha blending modulation as texture layers
Alpha blending is a technique in computer graphics that combines two textures by modulating the opacity (alpha) of each pixel. Each pixel has RGB color values and an alpha value that represents transparency, where alpha ranges from 0 (fully transparent) to 1 (fully opaque).
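As a minimal CPU-side sketch of the idea (blendPixel and the RGBA type are illustrative names of my own, not part of the WebGL code below), blending one overlay pixel onto one base pixel looks like this:

```typescript
type RGBA = { r: number; g: number; b: number; a: number }; // channels in [0, 1]

// Classic "over" compositing: the overlay's alpha decides how much of its
// color replaces the base color, and the alphas accumulate.
function blendPixel(base: RGBA, overlay: RGBA): RGBA {
  const t = overlay.a; // blend factor for the color channels
  return {
    r: base.r * (1 - t) + overlay.r * t,
    g: base.g * (1 - t) + overlay.g * t,
    b: base.b * (1 - t) + overlay.b * t,
    a: base.a + overlay.a * (1 - base.a),
  };
}

const red: RGBA = { r: 1, g: 0, b: 0, a: 1 };
const halfBlue: RGBA = { r: 0, g: 0, b: 1, a: 0.5 };
console.log(blendPixel(red, halfBlue)); // { r: 0.5, g: 0, b: 0.5, a: 1 }
```

Doing this per pixel on the CPU is exactly the work the shaders below move to the GPU.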
In WebGL, rather than modulating each color channel on the CPU, we can implement alpha blending in shaders running on the GPU, which process every pixel in parallel.
Vertex shader: position and texture coordinates #
The vertex shader defines the positions and texture coordinates of the vertices, forming a rectangle that covers the screen. This lets the blending (which we'll apply in the fragment shader) run across the entire area. The attributes a_position and a_texcoord hold the vertex positions and the corresponding texture coordinates; for alpha blending, each texture coordinate samples the same location in both textures.
attribute vec4 a_position;
attribute vec2 a_texcoord;

varying vec2 v_texcoord;

void main() {
  gl_Position = a_position;
  v_texcoord = a_texcoord;
}
Each position in a_position is given in Normalized Device Coordinates (NDC), which range from -1 to 1 on both axes, while a_texcoord values range from 0 to 1 across the texture.
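The two coordinate ranges are related by a simple affine map. As an illustrative sketch (texToNdc and ndcToTex are hypothetical helper names; WebGL performs no such conversion itself, the rasterizer just interpolates both attributes):

```typescript
// Map a texture coordinate in [0, 1] to NDC in [-1, 1], and back.
const texToNdc = (t: number): number => t * 2 - 1;
const ndcToTex = (n: number): number => (n + 1) / 2;

console.log(texToNdc(0)); // -1 (left/bottom edge)
console.log(texToNdc(1)); // 1 (right/top edge)
console.log(ndcToTex(0)); // 0.5 (center of the screen)
```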
Fragment shader: color blending #
In the fragment shader, we blend the two texture layers. For each pixel, the fragment shader samples color and alpha values from the textures and applies the alpha blending relation. The colors color1 and color2 represent the colors from the base and overlay textures, respectively:
precision mediump float;

varying vec2 v_texcoord;

uniform sampler2D u_texture1;
uniform sampler2D u_texture2;
uniform float u_alpha;

void main() {
  vec4 color1 = texture2D(u_texture1, v_texcoord);
  vec4 color2 = texture2D(u_texture2, v_texcoord);
  gl_FragColor = vec4(
    mix(color1.rgb, color2.rgb, color2.a * u_alpha),
    color1.a + color2.a * (1.0 - color1.a)
  );
}
The blending relation combines color1 and color2 based on their respective alpha values. The final RGB output C is:

C = C1 * (1 - a2 * u_alpha) + C2 * (a2 * u_alpha)

where:

- C1 and C2 are the RGB colors of the base and overlay textures, respectively
- a2 is the alpha of the overlay texture, scaled by u_alpha (a global blending factor)

This relation modulates the color of the base image toward the overlay in proportion to the overlay's scaled opacity; it is exactly what mix(color1.rgb, color2.rgb, color2.a * u_alpha) computes.

The alpha (transparency) of the final output A is calculated as:

A = a1 + a2 * (1 - a1)

This expression ensures that the result's overall transparency depends on both textures' alpha values, with the overlay contributing opacity only where the base is not already fully opaque.
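The same per-pixel math can be checked on the CPU. This is a reference sketch of what the fragment shader computes (fragmentBlend is a hypothetical name, not part of the shader source):

```typescript
type Vec4 = [number, number, number, number]; // [r, g, b, a], channels in [0, 1]

// Mirrors the fragment shader: mix() on the RGB channels,
// "over" compositing on the alpha channel.
function fragmentBlend(color1: Vec4, color2: Vec4, uAlpha: number): Vec4 {
  const t = color2[3] * uAlpha; // overlay alpha scaled by the global factor
  const mix = (a: number, b: number) => a * (1 - t) + b * t;
  return [
    mix(color1[0], color2[0]),
    mix(color1[1], color2[1]),
    mix(color1[2], color2[2]),
    color1[3] + color2[3] * (1 - color1[3]),
  ];
}

// Fully red base, fully green overlay, global factor 0.25:
console.log(fragmentBlend([1, 0, 0, 1], [0, 1, 0, 1], 0.25)); // [0.75, 0.25, 0, 1]
```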
The textures and blending parameter (u_alpha) are set up through the WebGL API before rendering. First, positions and texture coordinates are buffered, and the two textures are activated, each bound to its own texture unit. The global blending factor u_alpha controls the transparency of the overlay texture uniformly across all pixels, so its opacity can be adjusted consistently.
function withAlphaBlending(
  gl: WebGLRenderingContext,
  alpha: number,
  texture1: WebGLTexture,
  texture2: WebGLTexture
) {
  const vertexShader = createShader(gl, gl.VERTEX_SHADER, vertexShaderSource);
  const fragmentShader = createShader(
    gl,
    gl.FRAGMENT_SHADER,
    fragmentShaderSource
  );
  const program = createProgram(gl, vertexShader, fragmentShader);
  gl.useProgram(program);

  // look up the attribute and uniform locations declared in our shader sources
  const positionLocation = gl.getAttribLocation(program, "a_position");
  const texcoordLocation = gl.getAttribLocation(program, "a_texcoord");
  const texture1Location = gl.getUniformLocation(program, "u_texture1");
  const texture2Location = gl.getUniformLocation(program, "u_texture2");
  const alphaLocation = gl.getUniformLocation(program, "u_alpha");
}
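createShader and createProgram are helpers that are assumed above but not shown. A common minimal implementation looks roughly like this (a sketch, with error handling reduced to thrown exceptions):

```typescript
// Compile a single shader stage, throwing on compile errors.
function createShader(
  gl: WebGLRenderingContext,
  type: number,
  source: string
): WebGLShader {
  const shader = gl.createShader(type)!;
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    const log = gl.getShaderInfoLog(shader);
    gl.deleteShader(shader);
    throw new Error(`Shader compile failed: ${log}`);
  }
  return shader;
}

// Link the two compiled stages into a program, throwing on link errors.
function createProgram(
  gl: WebGLRenderingContext,
  vertexShader: WebGLShader,
  fragmentShader: WebGLShader
): WebGLProgram {
  const program = gl.createProgram()!;
  gl.attachShader(program, vertexShader);
  gl.attachShader(program, fragmentShader);
  gl.linkProgram(program);
  if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
    const log = gl.getProgramInfoLog(program);
    gl.deleteProgram(program);
    throw new Error(`Program link failed: ${log}`);
  }
  return program;
}
```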
Vertex position buffer #
- A buffer is created and filled with vertex data defining the rectangle's shape.
- The position buffer holds six vertices (twelve floats), specifying two triangles covering the screen with coordinates (-1, -1), (1, -1), (-1, 1) and (-1, 1), (1, -1), (1, 1).
- Each pair represents a vertex position in NDC, allowing WebGL to form two triangles that span the entire screen.
const positionBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([
-1, -1, 1, -1, -1, 1,
-1, 1, 1, -1, 1, 1,
]), gl.STATIC_DRAW);
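As a quick sanity check (a standalone sketch, not part of the rendering code), we can verify that these two triangles tile the full NDC square, whose area is 2 × 2 = 4:

```typescript
// The same vertex data as the position buffer: x, y pairs for two triangles.
const positions = [
  -1, -1, 1, -1, -1, 1,
  -1, 1, 1, -1, 1, 1,
];

// Absolute area of the triangle starting at index i, via the cross product.
function triangleArea(p: number[], i: number): number {
  const [x0, y0, x1, y1, x2, y2] = p.slice(i, i + 6);
  return Math.abs((x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0)) / 2;
}

const total = triangleArea(positions, 0) + triangleArea(positions, 6);
console.log(total); // 4 -- the triangles cover the [-1, 1] x [-1, 1] square
```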
Texture position buffer #
- Another buffer is created to store the texture coordinates, which map the entire texture onto the rectangle.
- The coordinate data corresponds to the texture's corners, i.e. bottom-left (0, 0), bottom-right (1, 0), top-left (0, 1), and top-right (1, 1).
- Each vertex of the screen-covering rectangle thus samples the texture precisely, with no cropping or stretching.
const texcoordBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, texcoordBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([
0, 0, 1, 0, 0, 1,
0, 1, 1, 0, 1, 1,
]), gl.STATIC_DRAW);
Linking to shader attributes #
Once the buffers are filled, they are linked to the shader attributes a_position and a_texcoord. gl.vertexAttribPointer tells the GPU how to read each buffer: two floats per vertex, interpreted as the vertex positions and texture coordinates.
gl.enableVertexAttribArray(positionLocation);
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
gl.vertexAttribPointer(positionLocation, 2, gl.FLOAT, false, 0, 0);

gl.enableVertexAttribArray(texcoordLocation);
gl.bindBuffer(gl.ARRAY_BUFFER, texcoordBuffer);
gl.vertexAttribPointer(texcoordLocation, 2, gl.FLOAT, false, 0, 0);

// set the global blending factor
gl.uniform1f(alphaLocation, alpha);

// bind each texture to its own texture unit and point the samplers at them
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, texture1);
gl.uniform1i(texture1Location, 0);

gl.activeTexture(gl.TEXTURE1);
gl.bindTexture(gl.TEXTURE_2D, texture2);
gl.uniform1i(texture2Location, 1);

// draw the two screen-covering triangles (six vertices)
gl.drawArrays(gl.TRIANGLES, 0, 6);
Color channels as TEXTURE_2D buffers #
For this, we can use the WebGL context to call texImage2D on a color buffer and then generate mipmaps for it. Each color component is a gl.UNSIGNED_BYTE, implying that we use 8 bits (values 0 to 255) per channel.
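A sketch of uploading such a buffer (solidColorPixels and createTextureFromPixels are hypothetical helper names; the texImage2D and generateMipmap calls and their parameters are standard WebGL 1):

```typescript
// Build a width x height RGBA pixel buffer, one UNSIGNED_BYTE (0-255) per channel.
function solidColorPixels(
  width: number,
  height: number,
  rgba: [number, number, number, number]
): Uint8Array {
  const pixels = new Uint8Array(width * height * 4);
  for (let i = 0; i < pixels.length; i += 4) pixels.set(rgba, i);
  return pixels;
}

// Upload the buffer as a TEXTURE_2D and generate its mipmap chain.
function createTextureFromPixels(
  gl: WebGLRenderingContext,
  width: number,
  height: number,
  pixels: Uint8Array
): WebGLTexture {
  const texture = gl.createTexture()!;
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texImage2D(
    gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
    gl.RGBA, gl.UNSIGNED_BYTE, pixels
  );
  // In WebGL 1, mipmaps require power-of-two texture dimensions.
  gl.generateMipmap(gl.TEXTURE_2D);
  return texture;
}
```

Either texture of withAlphaBlending could be produced this way, e.g. from decoded image data instead of a solid color.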