Volumetric Rendering in Three.js

Introduction to Volumetric Rendering

Volumetric rendering visualizes 3D scalar fields where each voxel contributes to the final image based on its density and optical properties. Unlike surface rendering, volumetric techniques capture internal structures, making them perfect for:

  • Atmospheric effects: Clouds, fog, and smoke simulation
  • Scientific visualization: Medical CT/MRI data, seismic volumes
  • VFX: Explosions, fire, and fluid simulations

Three.js provides the foundation, but advanced volumetric rendering requires custom GLSL shaders, ray-marching algorithms, and careful performance optimization.

Core Concepts: Ray-Marching and Volume Sampling

Ray-Marching Algorithm

Ray-marching steps through a 3D volume along rays from the camera, accumulating color and opacity:

// Fragment shader snippet for ray-marching
// (MAX_STEPS, stepSize, sampleDensity(), sampleColor() and accumulateColor()
// are assumed to be defined elsewhere in the shader)
vec4 raymarch(vec3 ro, vec3 rd, float maxDist) {
    float t = 0.0;          // Current distance along the ray
    vec4 color = vec4(0.0); // Accumulated color and opacity
    for(int i = 0; i < MAX_STEPS; i++) {
        vec3 pos = ro + t * rd;
        float density = sampleDensity(pos); // Sample the volume
        if(density > 0.0) {
            vec4 src = sampleColor(pos, density);
            color = accumulateColor(color, src, t);
            if(color.a > 0.99) break; // Early termination
        }
        t += stepSize / (1.0 + density); // Adaptive stepping: smaller steps in denser regions
        if(t > maxDist) break;
    }
    return color;
}

Volume Representation

Volumes can be stored as:

  • 3D Textures: Native WebGL2 sampler3D; the per-axis size limit is hardware-dependent (query gl.MAX_3D_TEXTURE_SIZE, commonly 2048); see the sketch after this list
  • 2D Texture Arrays: Sliced volumes for larger datasets
  • Procedural Noise: Generated on-the-fly using GLSL noise functions
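
For the 3D-texture path, a raw density field can be uploaded with THREE.Data3DTexture (WebGL2 only). The sketch below is a minimal, hypothetical example: size and densityData stand in for your own data, and the result is what a volumeTexture uniform would be fed later.

// Minimal sketch: upload a raw density field as a 3D texture
import * as THREE from 'three';

const size = 128; // Per-axis resolution (placeholder; check gl.MAX_3D_TEXTURE_SIZE)
const densityData = new Float32Array(size * size * size); // Fill with your density values

const volumeTexture = new THREE.Data3DTexture(densityData, size, size, size);
volumeTexture.format = THREE.RedFormat;        // Single-channel density
volumeTexture.type = THREE.FloatType;          // Linear filtering of float data needs OES_texture_float_linear
volumeTexture.minFilter = THREE.LinearFilter;  // Trilinear sampling in the shader
volumeTexture.magFilter = THREE.LinearFilter;
volumeTexture.wrapS = volumeTexture.wrapT = volumeTexture.wrapR = THREE.ClampToEdgeWrapping;
volumeTexture.needsUpdate = true;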

Implementation: Building the Volumetric Shader

1. Noise-Based Density Fields for Clouds and Smoke

// Fractal Brownian motion (fBm) built on a Perlin/simplex noise() function defined elsewhere
float fbm(vec3 p, int octaves) {
    float value = 0.0;
    float amplitude = 0.5;
    float frequency = 1.0;
    for(int i = 0; i < octaves; i++) {
        value += amplitude * noise(p * frequency);
        amplitude *= 0.5;
        frequency *= 2.0; // Scaling p as well would double-count the per-octave frequency
    }
    return value;
}
float cloudDensity(vec3 pos) {
    // Multi-scale noise for realistic clouds
    float baseNoise = fbm(pos * 0.1, 4);
    float detailNoise = fbm(pos * 1.0, 6);
    // Shape clouds with height falloff and erosion
    float heightFactor = 1.0 - smoothstep(0.0, 1.0, pos.y); // Density falls off with altitude
    float erosion = sin(pos.x * 0.1) * cos(pos.z * 0.1);
    return max(0.0, baseNoise * heightFactor + detailNoise * 0.1 + erosion * 0.05);
}

2. Lighting and Scattering

Implement phase functions for realistic light scattering:

// Henyey-Greenstein phase function for forward/backward scattering
float henyeyGreenstein(vec3 lightDir, vec3 viewDir, float g) {
  float cosTheta = dot(lightDir, viewDir);
  return (1.0 - g*g) / (4.0 * PI * pow(1.0 + g*g - 2.0*g*cosTheta, 1.5));
}

// Beer-Lambert law for absorption (stepSize and shadowOffset are uniforms)
vec4 accumulateLight(vec3 pos, vec3 lightDir, float density) {
  float opticalDepth = density * stepSize;
  float transmittance = exp(-opticalDepth);

  // Single-scattering shadow term: denser media toward the light receive less light
  float lightDensity = sampleDensity(pos + lightDir * shadowOffset);
  float lightTransmittance = exp(-lightDensity * shadowOffset);
  return vec4(vec3(lightTransmittance), 1.0 - transmittance);
}

3. Three.js Integration

// Custom volumetric material (the vertexShader and fragmentShader strings are defined elsewhere)
class VolumetricMaterial extends THREE.ShaderMaterial {
    constructor(options = {}) {
        super({
            uniforms: {
                volumeTexture: { value: null },
                noiseTexture: { value: null },
                // Note: `cameraPosition` is a built-in uniform that three.js
                // declares and updates automatically for ShaderMaterial,
                // so it is not redeclared here
                lightPosition: { value: new THREE.Vector3() },
                stepSize: { value: 0.01 },
                maxSteps: { value: 128 },
                // ... other uniforms
            },
            vertexShader: vertexShader,
            fragmentShader: fragmentShader,
            transparent: true,
            blending: THREE.AdditiveBlending,
            ...options
        });
    }
}
// Ray-marched volume renderer
class VolumetricRenderer {
    constructor(renderer, scene, camera) {
        this.renderer = renderer;
        this.scene = scene;
        this.camera = camera;
        this.volumeBox = new THREE.Mesh(
            new THREE.BoxGeometry(10, 10, 10),
            new VolumetricMaterial()
        );
        this.scene.add(this.volumeBox);
    }
    render() {
        // `cameraPosition` is declared and updated automatically by three.js for
        // ShaderMaterial, so it doesn't need to be copied by hand; update any
        // custom uniforms (light position, time, etc.) here before drawing
        this.renderer.render(this.scene, this.camera);
    }
}
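
The vertexShader referenced above is assumed to live elsewhere; a minimal sketch is shown below, along with hypothetical wiring code that drives the renderer in an animation loop (the fragmentShader string and the volumeTexture from the earlier sketch are assumed to exist).

import * as THREE from 'three';

// Hypothetical minimal vertex shader for the box proxy: it passes the world-space
// position so the fragment shader can march from cameraPosition toward each fragment
const vertexShader = /* glsl */ `
  varying vec3 vWorldPosition;
  void main() {
    vec4 worldPos = modelMatrix * vec4(position, 1.0);
    vWorldPosition = worldPos.xyz;
    gl_Position = projectionMatrix * viewMatrix * worldPos;
  }
`;

// Usage sketch
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, window.innerWidth / window.innerHeight, 0.1, 100);
camera.position.set(0, 0, 25);

const volumetric = new VolumetricRenderer(renderer, scene, camera);
volumetric.volumeBox.material.uniforms.volumeTexture.value = volumeTexture; // From the earlier sketch

function animate() {
  requestAnimationFrame(animate);
  volumetric.render();
}
animate();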

Advanced Optimization Techniques

1. Early Ray Termination

// Front-to-back alpha compositing with early exit
// (the caller breaks out of the marching loop once accum.a passes the threshold)
vec4 accumulateColor(vec4 accum, vec4 src, float dist) {
    // Convert the sample's extinction over one step into an opacity
    float alpha = 1.0 - exp(-src.a * stepSize);
    // Front-to-back "over" compositing: new samples are weighted by the
    // remaining transmittance (1.0 - accum.a)
    accum.rgb += (1.0 - accum.a) * alpha * src.rgb;
    accum.a   += (1.0 - accum.a) * alpha;
    // Early termination threshold
    if(accum.a > 0.99) {
        accum.a = 1.0;
    }
    return accum;
}

2. Adaptive Step Size

// Distance-based and density-adaptive stepping
float adaptiveStepSize(vec3 pos, float baseStep) {
    float density = sampleDensity(pos);
    float gradient = length(computeGradient(pos)); // Sobel or finite differences
    // Smaller steps in high-density regions
    float stepMultiplier = 1.0 / (1.0 + density * 10.0 + gradient * 5.0);
    // Minimum/maximum step constraints
    return clamp(baseStep * stepMultiplier, 0.001, baseStep * 2.0);
}

3. Temporal Reprojection and Denoising

// Temporal accumulation (sketch: the full-screen reprojection/blend shader is not shown)
class TemporalVolumeRenderer {
  constructor(renderer, width, height) {
    this.renderer = renderer;
    // Two history targets are ping-ponged so we never read and write the same texture
    this.historyA = new THREE.WebGLRenderTarget(width, height, {
      format: THREE.RGBAFormat,
      type: THREE.FloatType
    });
    this.historyB = this.historyA.clone();
    // Per-pixel motion vectors used to reproject the previous frame
    this.velocityBuffer = new THREE.WebGLRenderTarget(width, height);
    this.accumulationFactor = 0.9; // Weight given to the reprojected history
  }

  renderFrame(blendScene, blendCamera) {
    // The full-screen pass (blendScene/blendCamera) samples the current frame,
    // reprojects historyA via velocityBuffer and blends with accumulationFactor,
    // writing into historyB; the targets are then swapped for the next frame.
    this.renderer.setRenderTarget(this.historyB);
    this.renderer.render(blendScene, blendCamera);
    this.renderer.setRenderTarget(null);
    [this.historyA, this.historyB] = [this.historyB, this.historyA];
  }
}

4. Empty Space Leaping

// Skip empty regions using hierarchical min-max textures
// (cellSize is a uniform giving the acceleration-grid cell size; only level 0 is used here)
uniform sampler3D minMaxTexture; // Octree-like acceleration structure; .r stores each cell's max density

bool skipEmptySpace(vec3 pos, vec3 rd, float tMax, inout float t) {
  ivec3 coord = ivec3(floor(pos / cellSize));
  float maxDensity = texelFetch(minMaxTexture, coord, 0).r;

  if(maxDensity > 0.0) return false; // The cell may contain volume: march normally

  // Leap along the ray to the exit face of this empty cell (slab test)
  vec3 cellMin = vec3(coord) * cellSize;
  vec3 cellMax = cellMin + vec3(cellSize);
  vec3 tFar = max((cellMin - pos) / rd, (cellMax - pos) / rd);
  float leap = min(min(tFar.x, tFar.y), tFar.z) + 0.001; // Small bias to cross the boundary

  if(t + leap < tMax) {
    t += leap;
    return true;
  }

  return false;
}

Performance Optimization Strategies

GPU Memory Management

  1. Texture Compression: Use compressed formats such as BC7 (EXT_texture_compression_bptc) or ASTC where the WebGL extensions are available (see the sketch after this list)
  2. Texture Atlasing: Pack multiple small volumes into single texture
  3. LOD Streaming: Load high-res volumes only for focused regions
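
As a small sketch (assuming an existing THREE.WebGLRenderer named renderer), the compression extensions and the 3D texture size limit can be queried up front before deciding how to store the volume:

// Sketch: query GPU capabilities before choosing a volume storage format
const gl = renderer.getContext();

const maxSize3D = gl.getParameter(gl.MAX_3D_TEXTURE_SIZE);                 // Per-axis 3D texture limit
const hasBPTC = gl.getExtension('EXT_texture_compression_bptc') !== null;  // BC7
const hasASTC = gl.getExtension('WEBGL_compressed_texture_astc') !== null;

console.log(`3D texture limit: ${maxSize3D}, BC7: ${hasBPTC}, ASTC: ${hasASTC}`);
// Fall back to uncompressed (or 2D-sliced) storage when no compression is available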

Best Practices and Gotchas

  1. Precision Issues: Use highp precision qualifiers for 3D texture coordinates
  2. Texture Size Limits: The per-axis 3D texture limit is hardware-dependent (query gl.MAX_3D_TEXTURE_SIZE); use slicing for larger volumes
  3. Anisotropic Filtering: Enable for better volume texture quality
  4. Depth Writes: Disable depth writes for transparent volumes so blending works; keep depth testing so opaque geometry still occludes the volume
  5. Shader Compilation: Pre-compile complex shaders during app initialization (see the sketch after this list)
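
A minimal sketch of points 4 and 5, reusing the volumetric, renderer, scene and camera objects from the integration example:

// Sketch: depth settings and shader pre-compilation
volumetric.volumeBox.material.depthWrite = false; // Don't write depth for the transparent volume
volumetric.volumeBox.material.depthTest = true;   // ...but still test against opaque geometry

// Compile all materials in the scene up front to avoid a hitch on the first frame
renderer.compile(scene, camera);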

Conclusion

Volumetric rendering in Three.js transforms static 3D scenes into rich, immersive experiences. By mastering ray-marching, noise synthesis, and optimization techniques, developers can create:

  • Cinematic cloud systems with realistic light scattering
  • Medical visualization tools for surgical planning and research
  • Industrial applications for quality control and reverse engineering

The key to production-ready volumetric rendering lies in balancing visual fidelity with performance through adaptive techniques, LOD strategies, and GPU acceleration.
