False Earth is an interactive WebGPU project and the sequel to my first work, Drift.
In Drift, an astronaut is lost in space, drifting, longing to return home. An AI-generated diary reflects his mental state — the loneliness, the melancholy, the memories of time spent with his family.
Now, the story continues. The astronaut has landed on a new planet. It looks like Earth — there is grass, there is sky, there is ground — but it feels strange and “false.” The grass never ends. Cosmic beams fall from the sky. Flowers bloom and die in seconds.
He made it somewhere. But this isn’t home.

The WebGL Prototypes
Before diving into False Earth, I ran two experiments to test what the browser could handle.
Vertex Animation Texture (VAT)
Rendering thousands of animated vertices was too heavy for the CPU. I moved the work to the GPU using instancing and VAT — a technique where animation data (positions and normals) is baked into a texture and replayed in the shader. In this demo, hundreds of flowers bloomed, grew, and died as the user moved the mouse.
Procedural Grass
Realistic grass is the soul of False Earth. I wanted blades that reacted to lighting and interaction, not just static geometry. Inspired by the Ghost of Tsushima team’s technical breakdown, I built a procedural system that gave me full control over every blade’s shape, color, and movement.
The WebGL Limit
But I quickly hit a wall. More flowers, more grass — and the frame rate dropped. Beyond draw calls, GPGPU in WebGL meant encoding data as pixels and juggling framebuffer object (FBO) read/write operations, which felt clunky and limited.
That led me to WebGPU. It was my first time using it together with TSL (Three.js Shading Language), and the difference was immediate. Storage buffers let me write structured data directly on the GPU and read it from any shader — no more pixel-packing workarounds. I could focus on the logic of the world rather than fighting the API.
The Infinite Grass Field
The Spatial Strategy
There is no way to generate enough individual blades to cover an entire world. Instead, I divided the field into a grid system. As the camera crosses a grid boundary, the entire vertex group snaps forward. This infinite scrolling trick keeps the grass surrounding the character no matter how far they travel. I used world position as a deterministic seed to generate parameters for each blade’s shape, color, and local elevation, which stays consistent across every grid snap.
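A rough sketch of that logic in plain JavaScript (the real version runs in TSL on the GPU; `CELL_SIZE` and the hash constants here are illustrative guesses, not values from the project):

```javascript
// Snap the grass field's origin to the grid cell the camera occupies,
// so the whole blade population teleports forward in cell-sized steps.
const CELL_SIZE = 4; // illustrative: width of one grid cell in world units

function snapOrigin(cameraX, cameraZ) {
  return {
    x: Math.floor(cameraX / CELL_SIZE) * CELL_SIZE,
    z: Math.floor(cameraZ / CELL_SIZE) * CELL_SIZE,
  };
}

// Deterministic per-blade seed from world position: the same world-space
// spot always produces the same blade, no matter how many snaps occurred.
function worldSeed(x, z) {
  const h = Math.sin(x * 12.9898 + z * 78.233) * 43758.5453;
  return h - Math.floor(h); // fract() — a stable pseudo-random value in [0, 1)
}
```

Because the seed depends only on world position, a blade that scrolls out one side and back in the other regenerates with exactly the same shape and color.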
The GPU Data Pipeline
I stored the parameters for each blade in a structured storage buffer, computed them in a compute shader, and passed the result into the rendering pipeline. Each blade’s data package is packed into four vec4 values (64 bytes per instance) for GPU-friendly alignment:
- Position and Type: World position (xyz) and a blade type index (w) for shape variation.
- Shape Parameters: Randomized width, height, bend curvature, and wind strength.
- Rotation and Seeds: Pre-computed sine/cosine for facing rotation, plus clump and per-blade hash seeds for color and sway variation.
- Compressed Normal and Interaction: Terrain normal stored as only two components (x, z) — the y component is reconstructed in the vertex shader through sqrt(1 − x² − z²), halving normal storage cost. The remaining two floats carry the character push vector.
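A minimal CPU-side sketch of that layout, assuming a flat Float32Array mirror of the storage buffer; the field names and exact ordering are illustrative, not the production layout:

```javascript
// Each blade occupies 16 floats (4 × vec4 = 64 bytes), matching the
// GPU-friendly alignment described above. Field order is illustrative.
const FLOATS_PER_BLADE = 16;

function packBlade(buffer, i, blade) {
  const o = i * FLOATS_PER_BLADE;
  // vec4 0: world position + blade type index
  buffer.set([blade.x, blade.y, blade.z, blade.type], o);
  // vec4 1: shape parameters
  buffer.set([blade.width, blade.height, blade.bend, blade.wind], o + 4);
  // vec4 2: pre-computed facing rotation + seeds
  buffer.set(
    [Math.sin(blade.angle), Math.cos(blade.angle), blade.clumpSeed, blade.hashSeed],
    o + 8
  );
  // vec4 3: compressed normal (x, z) + character push vector
  buffer.set([blade.nx, blade.nz, blade.pushX, blade.pushZ], o + 12);
}

// The y component of the normal is dropped; the vertex shader rebuilds it.
function reconstructNormalY(nx, nz) {
  return Math.sqrt(Math.max(0, 1 - nx * nx - nz * nz));
}
```

Dropping y is safe here because terrain normals always point upward, so the sign is known.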
To populate these fields with natural variation, I adopted a Voronoi clustering approach. Each blade finds its two nearest Voronoi centers and blends their parameters (height, width, bend) based on the distance to each center. This prevents hard seams at clump boundaries — blades near an edge transition gradually between their neighbors’ properties rather than snapping abruptly.
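A plain-JavaScript sketch of that two-center blend, with hypothetical center objects; the real version runs per blade in the compute shader:

```javascript
// Blend clump parameters from the two nearest Voronoi centers, weighted
// by distance, so blades near a clump border transition smoothly.
// Assumes at least two centers exist.
function blendClumpParams(blade, centers) {
  // Find the two nearest centers (a linear scan keeps the sketch simple).
  const sorted = [...centers].sort((a, b) => dist2(blade, a) - dist2(blade, b));
  const [c0, c1] = sorted;
  const d0 = Math.sqrt(dist2(blade, c0));
  const d1 = Math.sqrt(dist2(blade, c1));
  // Weight toward the closer center: w = 1 on top of c0, 0.5 on the border.
  const w = d1 / (d0 + d1);
  return {
    height: c0.height * w + c1.height * (1 - w),
    width: c0.width * w + c1.width * (1 - w),
    bend: c0.bend * w + c1.bend * (1 - w),
  };
}

function dist2(a, b) {
  const dx = a.x - b.x;
  const dz = a.z - b.z;
  return dx * dx + dz * dz;
}
```

On a clump border the weight lands at exactly 0.5, so both neighbors contribute equally and no seam appears.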
Vertex Deformation
Each blade is just a flat plane. To make it feel like organic matter, I layered multiple displacement passes in the vertex shader, each adding a different kind of motion:
Bending & Sway
Each blade follows a cubic Bézier curve with four control points. A parameter t runs from 0 at the root to 1 at the tip, and two inner control points (p1, p2) control how the blade bends:
// Cubic Bézier: four control points define the blade's curvature
const spine = bezier3(p0, p1, p2, p3, t);
const tangent = normalize(bezier3Tangent(p0, p1, p2, p3, t));
const side = normalize(cross(vec3(0.0, 0.0, 1.0), tangent));
Wind pushes the control points directly — the mid-points shift gently, the tip more aggressively — bending the whole blade in a natural arc rather than applying a flat offset:
// Wind pushes control points proportionally (root stable, tip strongest)
const p1Pushed = p1.add(windDir.mul(windScale.mul(height).mul(0.08)));
const p2Pushed = p2.add(windDir.mul(windScale.mul(height).mul(0.15)));
const p3Pushed = p3.add(windDir.mul(windScale.mul(height).mul(0.25)));
On top of the Bézier spine I layered a dual-frequency sine-wave sway: a low-frequency oscillation for the main body and a high-frequency flutter for fine detail. Both grow stronger toward the tip, and a slow “gust” envelope modulates the amplitude so the whole field appears to breathe. For distant blades, I fade the wind intensity with a distance-based falloff to keep things stable.
// Only affects vertices near the tip
const topSwayMask = smoothstep(float(0.5), float(1.0), t);
// Gust envelope — slow breathing that modulates overall amplitude
const gust = float(0.65).add(
float(0.35).mul(sin(uTime.mul(0.35).add(seed.mul(6.28318))))
);
// Low freq (main sway) + high freq (small flutter)
const low = sin(uTime.mul(baseFreq).add(phase).add(t.mul(2.2)));
const high = sin(uTime.mul(baseFreq.mul(5.0)).add(phase.mul(1.7)).add(t.mul(5.0)));
// Gust drives the low-frequency sway; the high frequency stays constant
const swayLow = amp.mul(gust).mul(uWindSwayStrength);
const swayHigh = amp.mul(0.8).mul(uWindSwayStrength);
const swayAmount = low.mul(swayLow).add(high.mul(swayHigh));
return side.mul(swayAmount).mul(topSwayMask);
Terrain Alignment
Blades need to sit flush with the terrain. I computed a rotation from the local up-vector (0, 1, 0) to the terrain normal using a cross-product axis and an acos(dot) angle. Since the normal is already in the storage buffer, grass, flowers, and the character all share the same elevation data.
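A dependency-free sketch of that alignment in plain JavaScript, applying the axis-angle result with Rodrigues’ rotation formula; array-based vectors are used for brevity:

```javascript
// Rotate the local up-vector (0, 1, 0) onto the terrain normal n:
// axis = cross(up, n), angle = acos(dot(up, n)), then apply
// Rodrigues' rotation to any blade-space vector v.
function alignToNormal(v, n) {
  const up = [0, 1, 0];
  const d = up[0] * n[0] + up[1] * n[1] + up[2] * n[2];
  const axis = [
    up[1] * n[2] - up[2] * n[1],
    up[2] * n[0] - up[0] * n[2],
    up[0] * n[1] - up[1] * n[0],
  ];
  const len = Math.hypot(axis[0], axis[1], axis[2]);
  if (len < 1e-6) return v.slice(); // already aligned (opposite case ignored here)
  const k = axis.map((c) => c / len);
  const angle = Math.acos(Math.min(1, Math.max(-1, d)));
  const cos = Math.cos(angle);
  const sin = Math.sin(angle);
  const kCrossV = [
    k[1] * v[2] - k[2] * v[1],
    k[2] * v[0] - k[0] * v[2],
    k[0] * v[1] - k[1] * v[0],
  ];
  const kDotV = k[0] * v[0] + k[1] * v[1] + k[2] * v[2];
  // v·cosθ + (k×v)·sinθ + k·(k·v)·(1 − cosθ)
  return v.map((vi, i) => vi * cos + kCrossV[i] * sin + k[i] * kDotV * (1 - cos));
}
```

A terrain normal pointing straight sideways would rotate an upright blade fully onto its side, which is the behavior the test below checks.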
The “Thick” Illusion
Since the blades are flat 2D planes, they can disappear when viewed edge-on. I added a view-dependent tilt that pushes vertices outward along the face normal, scaled by how edge-on the blade is to the camera. An edge mask (stronger at the edges) combined with a center mask (stronger at the base) keeps the effect from distorting the tip or adding bulk where it’s not needed.
// How edge-on is this blade to the camera?
const camDirLocalY = dot(camDirW, sideW);
// Edge mask: stronger at the edges when viewed head-on
const edgeMask = uvCoords.x.sub(0.5).mul(camDirLocalY);
edgeMask.mulAssign(pow(abs(camDirLocalY), float(1.2)));
// Center mask: stronger at the base, weaker at the tip
const centerMask = pow(float(1.0).sub(t), float(0.5)).mul(pow(t.add(0.05), float(0.33)));
// Push vertices outward along the face normal
return posObj.add(normalXZ.mul(thicknessStrength.mul(edgeMask).mul(centerMask)));
Interaction
I wanted the grass to part when the character walks through it. The compute shader gives each blade an outward push vector that falls off with distance from the character, and the vertex shader weights that displacement by t² — strongest at the tip, zero at the root. I also flatten the blade’s height toward the ground so it doesn’t just slide sideways but compresses down as the character passes over:
// Compute shader: radial falloff from character
const pushFactor = smoothstep(pushRadius, float(0.0), charDist);
const pushVector = safeCharDir.mul(pushFactor).mul(pushAmount);
// Vertex shader: push outward + flatten height
lpos = vec3(
lpos.x.add(pushVector.x.mul(pow(t, float(2.0)))),
lpos.y.mul(oneMinus(pushLen.mul(flattenAmount).mul(t))),
lpos.z.add(pushVector.y.mul(pow(t, float(2.0))))
);
In False Earth, cosmic beams fall from the sky and send energy waves rippling across the ground. Each wave’s origin, start time, radius, and lifetime live in a storage buffer, and multiple waves can overlap — the shader loops through all active entries per blade.
Each wave expands as a ring-shaped wavefront, not a filled circle. The ring width is 20% of the maximum radius, with a smoothstep falloff at the edge. A separate fade curve ramps intensity up at birth and down before death, so the ring appears, swells outward, and dissolves naturally.
The same wave strength drives both the emissive glow and the outward vertex push, so the visual ring and the physical ripple through the grass stay perfectly synchronized:
// Ring shape: distance from the expanding wavefront
const distFromWavefront = abs(dist.sub(currentRadius));
const ringWidth = maxRadius.mul(0.2);
const shape = smoothstep(ringWidth, float(0.0), distFromWavefront);
// Lifetime fade: ramp up at birth, fade out before death
const fade = smoothstep(float(1.0), float(0.5), progress)
.mul(smoothstep(float(0.0), float(0.1), progress));
const combinedStrength = shape.mul(fade);
// The same strength drives both glow (scalar) and push (vector)
result.strength += combinedStrength;
result.force += pushDir.mul(combinedStrength);
Procedural Shading
There are no textures on the blades — everything is computed in the shader. Here is how I achieved a convincing look without the overhead:
- Structural Normals: A real blade has a raised midrib and thin rims at the edges. I faked this by bending the shading normal across the blade’s width using the horizontal UV — a midrib inflection at the center, lifted rims near the edges — then blended it into the geometric normal at low strength so the blade catches light as if it had real cross-sectional curvature.
- Procedural Variation: I added color randomness driven by the clump and blade seeds so no two patches look identical.
- Height-Based AO: A simple ambient occlusion pass that darkens the roots based on blade height, adding depth for almost zero cost.
- Atmospheric Depth: Distant blades are gently desaturated to suggest atmospheric perspective.
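A rough sketch of the last two items in plain JavaScript; `aoStrength`, `fadeDistance`, and the Rec. 601 luminance weights are my own illustrative choices, not values from the project:

```javascript
// Height-based AO: darken toward the root (t = 0), full brightness at the tip.
// Distance desaturation: lerp the color toward its luminance as blades recede.
function shadeBlade(color, t, distance, aoStrength = 0.5, fadeDistance = 60) {
  const ao = aoStrength + (1 - aoStrength) * t; // t in [0, 1], root to tip
  const lum = 0.299 * color[0] + 0.587 * color[1] + 0.114 * color[2];
  const desat = Math.min(1, distance / fadeDistance) * 0.5; // up to 50% gray
  return color.map((c) => (c * (1 - desat) + lum * desat) * ao);
}
```

Both effects are per-vertex or per-fragment multiplies, which is why they add depth at almost no cost.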
VAT Flowers
Flowers in False Earth bloom wherever energy waves touch the ground. I wanted them to feel alive — appearing, growing, and dying on their own schedule — so all the spawning logic runs in a compute shader, with each instance getting its own position, lifetime, and seed.
The Circular Spawning System
To keep spawning GPU-side without CPU readbacks, I used a circular buffer backed by a storage index.
- Atomic Indexing: Each new flower increments a global index via atomicAdd.
- Automatic Recycling: The index wraps with a modulo against the maximum instance count, recycling slots so the lifecycle stays smooth even during heavy interaction.
// Each spawn atomically claims the next slot, wrapping to reuse old instances
const headIndex = atomicAdd(spawnStorage.get("index"), uint(1)).mod(uint(maxCount));
const instance = vatData.element(headIndex);
The Lifecycle State Machine
Once spawned, each flower moves through four phases driven by its normalized age (progress = age / lifetime). Phase boundaries (p1, p2, p3) vary per instance so they don’t all bloom in lockstep:
- Delay [0, p1): The dormant period before growth begins. VAT frame stays at 0.
- Grow [p1, p2): The blooming sequence where VAT progress scales from 0 to 1.
- Keep [p2, p3): The flower holds the final animated frame (1.0) for its peak duration.
- Die [p3, 1.0]: Progress reverses linearly from 1.0 back toward 0, so the flower withers and shrinks.
If(progress.lessThan(p1), () => {
currentFrame.assign(0.0);
}).ElseIf(progress.lessThan(p2), () => {
currentFrame.assign(progress.sub(p1).div(p2.sub(p1)));
}).ElseIf(progress.lessThan(p3), () => {
currentFrame.assign(1.0);
}).Else(() => {
const die = progress.sub(p3).div(float(1.0).sub(p3));
currentFrame.assign(float(1.0).sub(die));
});
When the sequence finishes, the instance is marked inactive and its slot is freed for the next spawn.

VAT Rendering
In the vertex shader, the VAT texture is sampled at the current frame index to reconstruct the animated positions. Per-instance size variation — derived from each instance’s seed — keeps the flowers from looking stamped out.
Like the grass, flowers are aligned to the terrain, affected by wind, and pushed by the character. Petals, stems, and leaves share one mesh, but I encoded a material mask into the vertex color R channel during preprocessing: 0.0 for stems, 0.5 for petals, 1.0 for leaves. In the fragment shader, a step(abs(value − threshold), 0.05) recovers each mask, letting me shade all three materials in a single draw call.
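A sketch of that mask recovery in plain JavaScript, mirroring GLSL’s step() semantics; the threshold values follow the encoding above, and the 0.05 tolerance comes from the text:

```javascript
// GLSL-style step: 0 when x < edge, 1 otherwise.
const step = (edge, x) => (x < edge ? 0 : 1);

// One binary mask per material, all recovered from the same R value.
// step(abs(r - threshold), 0.05) is 1 only when r is within 0.05
// of that material's encoded threshold.
function materialMasks(r) {
  return {
    stem: step(Math.abs(r - 0.0), 0.05),
    petal: step(Math.abs(r - 0.5), 0.05),
    leaf: step(Math.abs(r - 1.0), 0.05),
  };
}
```

Exactly one mask is 1 for any well-encoded vertex, so the three materials can be blended as `stemColor·stem + petalColor·petal + leafColor·leaf` in one pass.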

For the finishing touch, I combined a Fresnel rim glow with a traveling wave that sweeps across each petal over time. The wave offset comes from fract(time) per instance, so every flower pulses at its own rhythm — giving the flora a mystical, ethereal quality.
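One plausible shape for that per-instance offset, sketched in plain JavaScript; the 0.5 speed and 0.15 band width are invented for illustration and the seed offset is my assumption about how the instances are desynchronized:

```javascript
// fract() as used in shaders: the fractional part of x.
const fract = (x) => x - Math.floor(x);

// Per-instance traveling wave: each flower offsets the cycle by its seed,
// so the pulse sweeps its petals at a rhythm unique to that instance.
function petalWave(time, seed, uvAlongPetal) {
  const phase = fract(time * 0.5 + seed); // instance-local cycle in [0, 1)
  const d = Math.abs(uvAlongPetal - phase); // distance to the wavefront
  return Math.max(0, 1 - d / 0.15); // narrow band that travels along the petal
}
```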
Optimization
The Indirect Draw Architecture
Even with everything running on the GPU, drawing millions of blades is still expensive. The first thing I wanted was frustum culling — only submitting what the camera can actually see. But since blade positions live on the GPU, CPU-side culling was out of the question. WebGPU indirect drawing solved this. The same architecture described below is reused for both the grass field and the flowers.
Indirect draw lets the GPU itself decide how many instances to draw by reading from an indirect buffer. WebGPU defines this buffer as a Uint32Array with five fields:
- vertexCount: Number of vertices per instance.
- instanceCount: Number of instances to draw (updated atomically).
- firstVertex: Offset into the vertex buffer.
- firstInstance: Offset into the instance buffer.
- offset: Base instance offset.
In Three.js, you connect the geometry to that buffer with a single call:
geometry.setIndirect(drawBuffer)
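A plain-JavaScript sketch of filling those five fields; `makeIndirectArgs` is a hypothetical helper, and in Three.js the array still needs to be wrapped in the appropriate indirect storage attribute before being passed to `geometry.setIndirect`:

```javascript
// The indirect draw arguments live in a 5-element Uint32Array.
// Field order follows the list above; the vertex count is illustrative.
function makeIndirectArgs(vertexCount) {
  const args = new Uint32Array(5);
  args[0] = vertexCount; // vertexCount: vertices per instance
  args[1] = 0; // instanceCount: reset each frame, bumped by atomicAdd
  args[2] = 0; // firstVertex
  args[3] = 0; // firstInstance
  args[4] = 0; // offset: base instance offset
  return args;
}
```

The key design point is that slot 1 is never written by the CPU after setup: the culling compute pass owns it.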
GPU-Driven Culling & Filtering
With the indirect buffer in place, a compute pass decides what gets drawn each frame. It resets instanceCount to zero, checks each blade’s visibility, and appends surviving indices into a visibleIndicesBuffer while incrementing the count atomically. Blades very close to the camera are always included so nearby grass never pops out:
// Always include blades near the camera
const isCloseEnough = abs(diff.x).add(abs(diff.z)).lessThan(float(3));
const isVisible = isCloseEnough.or(performCulling(worldPos));
If(isVisible, () => {
const slot = atomicAdd(drawStorage.get("instanceCount"), uint(1));
visibleIndicesBuffer.element(slot).assign(uint(instanceIndex));
});
Then, in the vertex shader, visibleIndicesBuffer is read to resolve the true instance index:
const trueIndex = visibleIndicesBuffer.element(instanceIndex);
const data = grassData.element(trueIndex);

This means the vertex shader only runs on instances that are actually on screen. With blades spread all around the camera, a ~75° field of view covers roughly one-fifth of the surroundings — so around 80% of instances never reach the vertex shader at all.
GPU-Driven LOD
Culling alone was not enough — I also needed level of detail (LOD). I reused the same indirect pattern, but this time with multiple buffers — one per mesh density. Three tiers:
- High detail (15 segments): 0–5 m from the camera
- Medium detail (5 segments): 5–20 m
- Low detail (2 segments): 20 m to the horizon
The compute pass measures each blade’s distance and routes it into the appropriate bucket. Because the far ring covers vastly more area than the near ring, most visible blades land in the lowest tier (2 segments instead of 15) — drastically cutting triangle count while keeping nearby blades crisp.
Without care, the LOD tiers create a visible ring artifact. I broke it up with a per-instance noise jitter on the distance test:
// Jitter the distance test to soften LOD transition rings
const noiseSeed = fract(float(instanceIndex).mul(0.12345)).mul(2.0).sub(1.0);
const noisyDist = distToCamera.add(distToCamera.mul(noiseScale).mul(noiseSeed));
If(noisyDist.greaterThanEqual(minDist).and(noisyDist.lessThan(maxDist)), () => {
const lodIndex = atomicAdd(drawStorage.get("instanceCount"), uint(1));
indices.element(lodIndex).assign(uint(instanceIndex));
});
Non-blocking Startup: Async Compilation
Shader compilation can freeze the browser during load. I wanted the intro animation to stay smooth, so I wrapped every heavy component (grass, flowers, character) in an AsyncCompile wrapper that uses Three.js’s compileAsync and manages a three-stage pipeline:
- Stage 1 — Compile: All components call compileAsync in parallel so the shaders build concurrently. Each compile races against a timeout — on some mobile GPUs, compilation can stall. If the timeout fires, the component skips the wait and renders immediately, accepting a small stutter rather than a frozen loading screen.
- Stage 2 — Queue: Once compiled, each component joins a FIFO queue and waits its turn. Only one goes at a time — rendering everything at once on the first frame would overwhelm the driver.
- Stage 3 — Upload: The component becomes visible and renders for the first time — this is where the GPU driver transfers textures, geometry, and buffers into VRAM. Since there is no completion signal, the component holds the slot for a few frames before releasing it to the next in line.
The result: a clean idle → compiled → uploading → done state machine per component, and the loading animation runs uninterrupted throughout.
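A synchronous sketch of stages 2 and 3 as I understand them; `createUploadQueue` and `HOLD_FRAMES` are hypothetical names, and the real wrapper also tracks the compile state from stage 1:

```javascript
// FIFO upload queue: only one component renders for the first time at once,
// and it holds the slot for a few frames since VRAM upload has no
// completion signal.
const HOLD_FRAMES = 3; // illustrative hold duration

function createUploadQueue() {
  const queue = [];
  let current = null;
  let framesHeld = 0;
  return {
    enqueue(component) {
      queue.push(component);
    },
    // Call once per frame; returns the component allowed to upload this frame.
    tick() {
      if (current && ++framesHeld >= HOLD_FRAMES) current = null; // release slot
      if (!current && queue.length > 0) {
        current = queue.shift();
        framesHeld = 0;
      }
      return current;
    },
  };
}
```

Serializing the first-render of each component is what keeps the driver from doing all its VRAM transfers in one frame.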

The Final Polish
Post-Processing
False Earth has a first-person mode, so visitors can experience the world from the astronaut’s perspective. The post-processing chain builds up layer by layer: chromatic aberration (R, G, B sampled at slightly offset UVs for lens fringing), a vignette with a cool-blue tint to suggest a helmet visor, bloom for the emissive glow, and finally tone mapping for overall exposure.
In TSL, the whole chain is wired as a node graph — color and depth come out of the scene pass as nodes, and each effect plugs in as a transform:
const scenePass = pass(scene, camera);
const colorTex = scenePass.getTextureNode('output');
const depthTex = scenePass.getViewZNode();
// Each effect transforms the node chain
let finalNode = applyAberration(colorTex);
finalNode = applyVignette(finalNode);
finalNode = finalNode.add(bloom(finalNode));
const pp = new THREE.PostProcessing(renderer);
pp.outputNode = finalNode;
I also added depth of field to blur distant geometry. A side effect: thin, emissive elements like the cosmic beams got blurred too, losing their sharp, high-energy look.
To fix this, I render the beams in a separate beamScene. In post-processing, I compare the two depth buffers and composite with occlusion — beams stay sharp where they read as foreground, but still disappear behind terrain:
const depthDiff = beamDepth.sub(sceneDepth);
const beamOcclusion = smoothstep(float(0), float(10), depthDiff);
finalNode = finalNode.add(beamColor.mul(beamOcclusion));

Syncing Sound to Motion
Sound was the last piece that made the world feel tangible. Footsteps are triggered from the walk/run animation phases, so timing stays in sync at both speeds. I use five samples with ±200-cent pitch randomization to avoid obvious repetition, and volume attenuates with distance using the Web Audio API. Short-lived BufferSource nodes keep CPU overhead low even with many concurrent one-shots.
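The pitch math can be sketched directly: n cents corresponds to a frequency ratio of 2^(n/1200), which maps onto Web Audio’s playbackRate. The helper name is mine, and the commented playback outline assumes an already-created AudioContext and preloaded buffers:

```javascript
// ±200-cent pitch randomization as a playbackRate multiplier.
function randomPitchRate(maxCents = 200) {
  const cents = (Math.random() * 2 - 1) * maxCents; // in [-maxCents, +maxCents]
  return Math.pow(2, cents / 1200); // cents → frequency ratio
}

// With a real AudioContext (setup assumed), each footstep would be a
// short-lived AudioBufferSourceNode, e.g.:
//   const src = ctx.createBufferSource();
//   src.buffer = footstepBuffers[i % footstepBuffers.length];
//   src.playbackRate.value = randomPitchRate();
//   src.connect(gainNode).connect(ctx.destination);
//   src.start();
```

±200 cents keeps every footstep within about ±12% of the original pitch, enough to break repetition without sounding detuned.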
Conclusion
I started my career building interactive works for physical installations. Years ago, when I first discovered Three.js, I remember thinking: I can make something, and anyone with a browser can open it. No gallery, no special hardware. Just a link. That felt huge to me.
Moving from WebGL to WebGPU felt like that same leap again. Compute shaders, indirect drawing, and storage buffers gave me the tools to render over a million grass blades, drive flower lifecycles entirely on the GPU, and keep everything interactive — all inside a browser.
For me, False Earth is one step in learning what I can build beyond the constraints I used to work within. As web graphics keep advancing, I want to push further into immersive storytelling and build digital experiences that feel more alive, responsive, and emotionally grounded.
And the astronaut — I think he still has somewhere to go.