Saturday, November 29, 2025

Building The Monolith: Composable Rendering Systems for a 13-Scene WebGL Epic





To build this monolithic project with 13 different scenes, several systems made up of reusable and composable components were developed inside React Three Fiber:

  1. Deferred Rendering & Outlines
  2. Composable Materials
  3. Composable Particle System
  4. Scene Transition System

The article begins with an overview of the concept art and early collaboration behind the project, then moves into dedicated sections that explain each system in detail. These sections describe the decisions behind Deferred Rendering and Outlines, the structure of the Composable Materials system, the logic behind the Composable Particle System, and the approach used for transitions between scenes.



Brief Intro & Concept Art

Kehan came to me directly, knowing me through a friend of a friend. He had a vision for the project and had already engaged Lin to illustrate several scenes. I told him the team I wanted, and we expanded into a full group of freelancers. Fabian joined as a shader developer, Nando as a creative, Henry as a 3D artist, Daisy as a producer, and HappyShip joined once Henry went on vacation.

Lin’s illustrations had such a distinctive and inspiring art style that translating them into 3D became an incredibly fun and exciting process. The team spent countless days and nights discussing how to bring the project to life, with a constant stream of new ideas and shared references (my bookmarks folder for the project now holds more than 50 links). It was a pleasure and a privilege to work with such a passionate and talented team.

1. Deferred Rendering & Outlines

A key feature of the art style is the use of colored outlines. After extensive research, we found three main ways to achieve this:

  1. Edge detection based on depth and normals
  2. Inverse hull
  3. Edge detection based on material IDs

We decided to use the first method for two reasons. With inverse hull, moving the camera closer to or farther from the object would cause the outline width to appear thicker or thinner. Material IDs would also not work well with the particle-based clouds.

Normals

https://themonolithproject.net/?depth

To use deferred rendering in Three.js, we need to set the count in WebGLRenderTarget, where each count represents a G-Buffer. For each G-Buffer, we can define the texture type and format to reduce memory usage.
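As a rough sketch of that setup (the sizes and attachment names here are illustrative, and the `count` option assumes a recent Three.js release):

```javascript
import * as THREE from 'three';

// A render target whose `count` color attachments act as G-Buffers.
const gBuffer = new THREE.WebGLRenderTarget(1920, 1080, {
  count: 2,
  depthBuffer: true,
});

// Each attachment can use its own type and format to save memory.
gBuffer.textures[0].name = 'color';
gBuffer.textures[0].type = THREE.UnsignedByteType;

gBuffer.textures[1].name = 'normal';
gBuffer.textures[1].type = THREE.HalfFloatType;
gBuffer.textures[1].format = THREE.RGFormat; // two channels suffice for encoded normals
```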

In our case, we used a G-Buffer for storing normals. We applied a memory optimization technique called octahedron normal vector encoding, which allows normals to be encoded into fewer bits at the cost of extra encoding and decoding time.
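The mapping itself can be illustrated in plain JavaScript (the project would do this in GLSL; the function names here are ours): a unit normal is projected onto an octahedron, the lower hemisphere is folded over the diagonals, and only two values need to be stored.

```javascript
// Octahedral encoding: maps a unit 3D normal to two values in [-1, 1].
function octEncode([x, y, z]) {
  const a = Math.abs(x) + Math.abs(y) + Math.abs(z);
  let u = x / a;
  let v = y / a;
  if (z < 0) {
    // Fold the lower hemisphere over the diagonals.
    const ou = u;
    const ov = v;
    u = (1 - Math.abs(ov)) * (ou >= 0 ? 1 : -1);
    v = (1 - Math.abs(ou)) * (ov >= 0 ? 1 : -1);
  }
  return [u, v];
}

// Decoding reverses the fold and renormalizes.
function octDecode([u, v]) {
  const z = 1 - Math.abs(u) - Math.abs(v);
  let x = u;
  let y = v;
  if (z < 0) {
    x = (1 - Math.abs(v)) * (u >= 0 ? 1 : -1);
    y = (1 - Math.abs(u)) * (v >= 0 ? 1 : -1);
  }
  const len = Math.hypot(x, y, z);
  return [x / len, y / len, z / len];
}
```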

Outline Colors

https://themonolithproject.net/?outlineColor

We also wanted different colored outlines for different objects, so we used an additional G-Buffer. Because we were only using a small number of colors, one optimization could have been to use a color lookup texture, reducing the G-Buffer to just a few bits. However, we kept things simple and easier to adjust by using the full RGB range.
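As a hypothetical sketch of that unused optimization (the project shipped the full-RGB approach), the idea is to store a small palette index per fragment and resolve it to a color later; the palette entries below reuse outline colors that appear elsewhere in this article:

```javascript
// A small outline-color palette; indices, not colors, would go in the G-Buffer.
const palette = [
  [0x54 / 255, 0x76 / 255, 0x94 / 255], // monolith outline (0x547694)
  [0x42 / 255, 0x83 / 255, 0x85 / 255], // leaf outline (0x428385)
  [1, 1, 1],                            // default white
];

// With 3 entries, 2 bits per fragment are enough (vs. 24 bits for full RGB).
const bitsNeeded = Math.ceil(Math.log2(palette.length));

function lookupOutlineColor(index) {
  return palette[Math.min(index, palette.length - 1)];
}
```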

Outlines

https://themonolithproject.net/?outline

Once the G-Buffers are ready, a convolution filter is applied to the depth and normal data to detect edges. We then apply the color from the outline color G-Buffer to those edges. Resources such as Moebius Style Post Processing by Maxime Heckel and Outline Styled Material by Visual Tech Art were immensely helpful.
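A minimal CPU-side sketch of the depth half of that idea (the real version runs in a fragment shader, and the normal buffer is treated the same way): compare each pixel against its diagonal neighbors and mark an edge wherever the difference exceeds a threshold.

```javascript
// Roberts-cross-style edge detection on a depth buffer.
// `depth` is a flat array of width * height values; returns a 0/1 edge mask.
function detectEdges(depth, width, height, threshold) {
  const edges = new Uint8Array(width * height);
  for (let y = 1; y < height - 1; y++) {
    for (let x = 1; x < width - 1; x++) {
      const i = y * width + x;
      // Differences across the two diagonals.
      const d1 = depth[i - width - 1] - depth[i + width + 1];
      const d2 = depth[i - width + 1] - depth[i + width - 1];
      if (Math.hypot(d1, d2) > threshold) edges[i] = 1;
    }
  }
  return edges;
}
```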

Gotchas

One issue with using count in Three.js WebGLRenderTarget is that all core materials, such as MeshBasicMaterial, won't render by default. A value must be assigned to the G-Buffer location for it to appear again. To avoid polluting the buffer with unwanted data, we can simply set it to itself.

layout(location = 1) out vec4 gNormal;

void main() {
  gNormal = gNormal;
}

2. Composable Materials

Since this project consists of many scenes with numerous objects using different materials, I wanted to create a system that encapsulates a piece of shader functionality, including any data and logic it requires, into a component. These components could then be combined to form a material. React and JSX make this kind of composability straightforward, resulting in a fast and intuitive developer experience.

Note: this project was developed in early 2024, before TSL was released. Things might be done differently today.

GBufferMaterial

The core of the system is the GBufferMaterial component. It's essentially a ShaderMaterial with useful uniforms and pre-calculated values, along with insertion points that modules can use to add extra shader code on top.

uniform float uTime;
/// insert <setup>

void main() {
  vec2 st = vUv;

  /// insert <main>
}
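The insertion points can be implemented with plain string replacement. This is an illustrative guess at the mechanism, not the project's actual code: each module contributes named chunks, and the material splices them in wherever a `/// insert <name>` marker appears.

```javascript
// Splice module chunks into a shader template at `/// insert <name>` markers.
function insertShaderChunks(template, chunks) {
  return template.replace(/\/\/\/ insert <(\w+)>/g, (_, name) =>
    (chunks[name] || []).join('\n')
  );
}

const fragmentTemplate = `
uniform float uTime;
/// insert <setup>

void main() {
  vec2 st = vUv;
  /// insert <main>
}`;

// The kind of contribution a color module would make.
const composed = insertShaderChunks(fragmentTemplate, {
  setup: ['uniform vec3 uColor;'],
  main: ['pc_fragColor.rgb = uColor;'],
});
```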

MaterialModule

A large array of reusable modules, along with several custom one-off modules, was created for this project. The most basic of these is the MaterialModuleColor.

export const MaterialModuleColor = forwardRef(({ color, blend = '' }, ref) => {
  // COLOR
  const _color = useColor(color);

  const { material } = useMaterialModule({
    name: 'MaterialModuleColor',
    uniforms: {
      uColor: { value: _color },
    },
    fragmentShader: {
      setup: /*glsl*/ `
        uniform vec3 uColor;
      `,
      main: /*glsl*/ `
        pc_fragColor.rgb ${blend}= uColor;
      `,
    },
  });

  useEffect(() => {
    material.uniforms.uColor.value = _color;
  }, [_color]);

  useImperativeHandle(ref, () => _color, [_color]);

  return <></>;
});

It simply adds a uColor uniform and writes it to the output color.

Use Case

For example, this is the code for the monolith:

<mesh geometry={nodes.MONOLITHClean.geometry}>
  <GBufferMaterial>
    <MaterialModuleNormal />
    <MaterialModuleOutline color={0x547694} />

    <MaterialModuleUVMap map={tUV} />
    <MaterialModuleGradient
      color1={'#71b3dd'}
      color2={'#acc9ad'}
      mixFunc={'st.y;'}
    />
    <MaterialModuleAnimatedGradient
      color1={'#71b3dd'}
      color2={'#acc9ad'}
      color3={'#ffffff'}
      speed={0.4}
      blend="*"
    />

    <MaterialModuleBrightness amount={2} />

    <MaterialModuleUVOriginal />
    <MaterialModuleMap
      map={tDetails}
      blend="*"
      oneMinus={true}
      color={lineColor}
      alphaTest={0.5}
    />

    <MaterialModuleFlowMap />
    <MaterialModuleFlowMapColor
      color={'#444444'}
      blend="+"
    />

  </GBufferMaterial>
</mesh>

All of these are generic modules that were reused across many different meshes throughout the site.

  • MaterialModuleNormal: encodes and writes the world normal to the normal G-Buffer
  • MaterialModuleOutline: writes the outline color to the outlineColor G-Buffer
  • MaterialModuleUVMap: sets the current st value based on the provided texture (affecting later modules that use st)
  • MaterialModuleGradient: draws a gradient color
  • MaterialModuleAnimatedGradient: draws an animated gradient
  • MaterialModuleBrightness: brightens the output color
  • MaterialModuleUVOriginal: resets st to the original UVs
  • MaterialModuleMap: draws a texture
  • MaterialModuleFlowMap: adds the flow map texture to the uniforms
  • MaterialModuleFlowMapColor: adds a color based on where the flow map is activated

Modules that affected the vertex shaders were also created, such as:

  • MaterialModuleWind: moves the vertex for a wind effect, used for trees, shrubs, etc.
  • MaterialModuleDistort: distorts the vertex, used for the planets

With this system, complex shader functionality, such as wind, is encapsulated into a reusable and manageable component. It can then be combined with other vertex and fragment shader modules to create a wide variety of materials with ease.

3. Composable Particle System

Similarly, the idea of making things composable and reusable extends to the ParticleSystem.

ParticleSystem

This is the core ParticleSystem component. Since it was written for WebGL, it includes logic to calculate position, velocity, rotation, and life data using the ping-pong rendering technique. Additional features include prewarming, the ability to start and stop the particle system naturally (allowing remaining particles to finish their lifetime), and a burst mode that ultimately wasn't used.

Just like the GBufferMaterial, the position, rotation, and life shaders contain insertion points for modules to use. For example:

void main() {
  vec4 currPosition = texture2D(texturePosition, uv);
  vec4 nextPosition = currPosition;

  if (needsReset) {
    /// insert <reset>
  }

  /// insert <pre_execute>

  nextPosition += currVelocity * uDelta;

  /// insert <execute>

  gl_FragColor = vec4(nextPosition);
}
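The ping-pong technique itself can be sketched on the CPU: because a texture cannot be read and written in the same pass, two buffers alternate roles every frame. This illustrative JavaScript version integrates positions the same way the position shader above does:

```javascript
// Two buffers alternate between being read from and written to each frame.
function createPingPong(size) {
  const buffers = [new Float32Array(size), new Float32Array(size)];
  let current = 0;
  return {
    read: () => buffers[current],
    write: () => buffers[1 - current],
    swap: () => { current = 1 - current; },
  };
}

// Integrate a single particle's position from a constant velocity.
const positions = createPingPong(3);
const velocity = [1, 0, 0];
const delta = 0.5;

for (let frame = 0; frame < 4; frame++) {
  const curr = positions.read();
  const next = positions.write();
  for (let i = 0; i < 3; i++) next[i] = curr[i] + velocity[i] * delta;
  positions.swap(); // this frame's output becomes next frame's input
}
```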

It supported two modes: points or instanced mesh.

ParticleSystemModule

The system is inspired by Unity, with modules that define the emission shape as well as modules that affect position, velocity, rotation, and scale.

Emission modules

For example, the EmissionPlane module allows us to set particle starting positions based on the size and position of a plane.

The EmissionSphere module allows us to set the particle starting positions on the surface of a sphere.

The most powerful module is the EmissionShape module. It allows us to pass in a geometry, and it calculates the starting positions using MeshSurfaceSampler.
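Under the hood, MeshSurfaceSampler picks random points on randomly chosen triangles (weighted by area). The core of that idea, sketched here for a single triangle with uniform random barycentric coordinates:

```javascript
// Sample a uniformly distributed point on triangle (a, b, c).
// `rand` is injectable so the sketch can be tested deterministically.
function samplePointOnTriangle(a, b, c, rand = Math.random) {
  let u = rand();
  let v = rand();
  if (u + v > 1) { // reflect back into the triangle for uniformity
    u = 1 - u;
    v = 1 - v;
  }
  const w = 1 - u - v;
  return [
    w * a[0] + u * b[0] + v * c[0],
    w * a[1] + u * b[1] + v * c[1],
    w * a[2] + u * b[2] + v * c[2],
  ];
}
```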

Place, Velocity, Rotation, and Scale modules

Other commonly used modules include:

  • VelocityAddDirection
  • VelocityAddOverTime
  • VelocityAddNoise
  • PositionAddMouse: adds to the position based on the mouse position, and can push or pull particles away from or toward the mouse
  • PositionSetSpline: sets a spline path for the particles to follow and ignores velocity
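As an illustrative sketch of the PositionAddMouse behavior (the parameter names mirror the props used in the leaf example, but this is an assumed implementation, not the project's shader): particles within `distance` of the mouse are pushed away with positive `strength`, or pulled toward it with negative `strength`.

```javascript
// Offset a 2D particle away from (or toward) the mouse, with linear falloff.
function addMouseOffset(particle, mouse, distance, strength) {
  const dx = particle[0] - mouse[0];
  const dy = particle[1] - mouse[1];
  const d = Math.hypot(dx, dy);
  if (d >= distance || d === 0) return particle; // out of range: unaffected
  const falloff = 1 - d / distance; // strongest at the cursor, zero at the edge
  return [
    particle[0] + (dx / d) * falloff * strength,
    particle[1] + (dy / d) * falloff * strength,
  ];
}
```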

Asteroids Use Case

For example, this is the asteroid belt:

<ParticleSystem
  _key={_key}
  enabled={true}
  maxParticles={count}
  lifeTime={[20, 200]}
  rate={count / 200}
  looping={true}
  prewarm={true}
  geometry={geometry}
  speed={0.2}
>
  <EmissionSphere radius={2} />

  <PositionSetSpline
    debug={debug}
    randomRotation={true}
    spline={[
      new Vector4(-distance, 0, 0, 0.1),
      new Vector4(0, -distance, 0, 0.8),
      new Vector4(distance, 0, 0, 2),
      new Vector4(0, distance, 0, 0.8),
    ]}
    closed={true}
  />

  <RotationSetRandom speed={1.5} />

  <GBufferMaterial
    depthWrite={false}
    transparent={true}
  >
    <MaterialModuleWorldPos />
    <MaterialModuleParticle />
    <MaterialModuleMap map={tAsteroids} />
    <MaterialModuleColor
      color={color}
      blend="*"
    />
    <MaterialModuleFlowMap />
    <MaterialModuleFlowMapColor blend="+" />
  </GBufferMaterial>

</ParticleSystem>

The particles are emitted from a small sphere, then follow a spline path with a random rotation.

It also works with the GBufferMaterial, allowing us to shade it using the same modules. This is how the mouseover flow map is applied to this particle system: the same material module used for the monolith is also used here.

Leafs Use Case

<ParticleSystem
  _key={`SceneSwampOverview-Leafs`}
  enabled={true}
  maxParticles={32 * 32 * 0.5}
  looping={true}
  prewarm={true}
  lifeTime={20}
  rate={(32 * 32 * 0.5) / 20}
  geometry={leafGeometry}
  {...props}
>
  <EmissionShape geometry={leafEmitterGeometry} />

  <VelocitySetDirection direction={[0, -1, 0]} />
  <VelocityAddOverTime direction={[-0.1, 0, 0]} />

  <RotationSetRandom speed={[0.3, 3]} />

  <PositionGroundLimit y={0.4} />
  <RotationGroundLimit y={0.4} />

  <PositionUtilCamera />
  <PositionAddMouse
    distance={0.5}
    strength={0.03}
  />
  <PositionAddAttractor
    ref={refParticleAttractor1}
    geometry={monolithGeometry}
    strength={0}
  />

  <ParticleSystemSpriteMaterial
    map={tLeafs}
    cols={8}
    alphaTest={0.5}
    outlineColor={0x428385}
  />
</ParticleSystem>

4. Scene Transition System

Because of the large number of scenes and the variety of transitions we wanted to create, we built another system specifically for scene transitions. Every transition in the project uses this system, including:

  • solar system > planet: wipe up
  • planet > bone: zoom blur
  • history > tablet: mask
  • tablet > fall: mask
  • fall > overview: zoom blur
  • desert > swamp: radial
  • winter > forest: sphere
  • world > ending: mask

First, we draw scene A with deferred rendering, including depth and normals. Then we do the same for scene B.

Next, we use a fullscreen triangle with a material responsible for blending the two scenes. We created four materials to support all of our transitions.

  • MaterialTransitionMix
  • MaterialTransitionZoom
  • MaterialTransitionRadialPosition
  • MaterialTransitionRaymarched

The simplest of these is MaterialTransitionMix, but it is also quite powerful. It takes the scene A texture, the scene B texture, and an additional grayscale mix texture, then blends them based on a progress value from 0 to 1.
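A per-pixel sketch of how such a blend could work, under the assumption that the grayscale mix texture acts as a per-pixel threshold, so darker regions reveal scene B first as progress increases:

```javascript
// Blend two scene colors for one pixel. `mixValue` is that pixel's value in
// the grayscale mix texture; `progress` runs from 0 (all A) to 1 (all B).
function transitionMix(sceneA, sceneB, mixValue, progress) {
  const showB = progress >= mixValue ? 1 : 0;
  return [
    sceneA[0] * (1 - showB) + sceneB[0] * showB,
    sceneA[1] * (1 - showB) + sceneB[1] * showB,
    sceneA[2] * (1 - showB) + sceneB[2] * showB,
  ];
}
```

A real shader would soften the threshold (e.g. with smoothstep) to avoid hard edges, but the thresholding idea is the same.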

For the solar system to planet transition, the mix texture is generated at runtime using a rectangle that moves upward.

For the history to tablet transition, the mix texture is also generated at runtime by rendering the same tablet scene in a special mask mode that outputs the tablet in a black-to-white range.

The tablet to fall transition, as well as the world to ending transition, were handled the same way, using mix textures generated at runtime.

Deferred Rendering, made composable

Using the same insertion technique as the composable material and particle systems, the deferred rendering workflow was made composable as well.

By the end of the project, we had created the following modules for our Deferred Rendering system:

  • DeferredOutline
  • DeferredLighting
  • DeferredChromaticAberration
  • DeferredAtmosphere: most visible in the desert intro
  • DeferredColorCorrect
  • DeferredMenuFilter

Use Case

For example, the solar system scene included the following modules:

<RenderTextureDeferred>
  <DeferredChromaticAberration maxDistortion={0.03} />
  <DeferredLighting ambient={1} />
  <DeferredOutline
    depthThreshold={0.1}
    normalThreshold={0.1}
  />
  <DeferredColorCorrect />
  <DeferredMenuFilter />

  <SceneSolar />
</RenderTextureDeferred>

Final thoughts

These systems help make development faster by being encapsulated, composable, and reusable.

This means features can be added and tested in isolation. No more massive material files with too many uniforms and hundreds of lines of GLSL. Fixing a specific feature no longer requires copying code across multiple materials. Any JS logic needed for a shader is tightly coupled with the snippet of vertex or fragment code that uses it.

And of course, because all of this is built with React, we get hot reloading. Being able to modify a specific shader for a specific scene and see the results instantly makes the workflow more fun, enjoyable, and productive.


