In this tutorial, we'll learn how to use Three.js and shaders to take an everyday video, pixelate the 2D feed, extrude the results into 3D voxels, and finally control them with the Rapier physics engine.
Let's kick things off by taking a look at the demo.
Whoa, what is this?! Or as we say in Japanese: "Nanja korya!!!"
Concept
Pixels
Pixels are the fundamental units that make up the "form" we see on digital screens, and the overall "shape" is nothing more than a collection of those pixels… I've always been fascinated by the concept of pixels, so I've been creating pixel art using Three.js and shaders. I love pursuing unique visuals that spark curiosity through unexpected expressions. For example, I've previously explored reconstructing live camera streams by replacing individual pixels with various concrete objects to create a playful digital transformation. While making things like that, a thought suddenly struck me: what would happen if I extruded these pixels into 3D space?
Gravity
Another major theme I've been exploring in recent years is "Gravity". Not anti-gravity! Just plain gravity! While building websites powered by physics engines, I've been chasing the chaos and harmony that unfold within them. I've also long wanted to take the Gravity I've been working with in 2D spaces and expand it into 3D.
All of these ideas and explorations form the starting point for this project. Just like with pixel art, I treat a single "InstancedMesh" unit in Three.js as one "pixel". I arrange these neatly in a grid so they form a flat plane… then turn each one into a 3D voxel. On top of that, I create a corresponding rigid body for each voxel and place them under a physics engine.
Tech stack used in this demo:
- Rendering: Three.js + Shaders
- Physics: Rapier
- Animation/Management: GSAP
- GUI: Tweakpane
Alright – let's get started!
Implementation
We'll implement this in three main steps.
1. Create a Flat Plane from InstancedMeshes and Rigid Bodies
Approach: The goal is to coordinate a collection of InstancedMesh units with their corresponding rigid bodies so that, together, they appear as a single, flat video plane.
Think of it as creating something like this: (For clarity, I've added small gaps between each InstancedMesh in the image below.)

Pixel Unit Settings
We determine the size of each pixel by dividing the video texture based on the desired number of columns. From this, we calculate the values needed to arrange the square InstancedMesh units properly.
We store all the necessary data in this.gridInfo. This calculation is handled by the calculateGridMetrics function inside utils/mathGrid.js.
// index.js
/**
 * Computes and registers grid metrics into this.gridInfo:
 * - Dimensions (cols, rows, count)
 * - Sizing (cellSize, spacing)
 * - Layout (width, height)
 */
buildScene() {
  //...
  this.gridInfo = calculateGridMetrics(
    this.videoController.width,
    this.videoController.height,
    this.app.camera,
    this.settings.cameraDistance,
    this.params,
    this.settings.fitMargin
  );
  //...
}
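The internals of calculateGridMetrics aren't shown here, but the idea is: measure the camera frustum's visible height at the plane's distance, fit the video's aspect ratio inside it, and derive square cells from the column count. Below is a minimal sketch under those assumptions – the real function takes the camera and params objects, while this hypothetical version takes plain numbers for clarity:

```javascript
// Hypothetical sketch of a grid-metrics helper (not the project's actual code).
// Fits a videoWidth x videoHeight plane into a perspective camera's view at a
// given distance, then splits it into square cells based on the column count.
function calculateGridMetrics(videoWidth, videoHeight, fovDeg, cameraDistance, cols, fitMargin = 0.9) {
  // Visible height of the frustum at cameraDistance: 2 * d * tan(fov / 2)
  const viewHeight = 2 * cameraDistance * Math.tan((fovDeg * Math.PI) / 360);
  const videoAspect = videoWidth / videoHeight;
  // Fit the video plane inside the view, leaving a margin
  const planeHeight = viewHeight * fitMargin;
  const planeWidth = planeHeight * videoAspect;
  // Square cells: the cell size comes from the column count
  const cellSize = planeWidth / cols;
  const rows = Math.round(planeHeight / cellSize);
  return {
    cols, rows, count: cols * rows,          // dimensions
    cellSize, spacing: cellSize,             // sizing
    width: planeWidth, height: planeHeight   // layout
  };
}
```

Note that rows falls out of the aspect ratio: for a 16:9 video and 32 columns, you get 18 rows regardless of the camera settings.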
Creating Rigid Bodies and InstancedMeshes
Using the grid information we obtained earlier, we'll now create the rigid bodies to place in the physics world, along with the corresponding InstancedMeshes. We'll start by creating and positioning the rigid bodies, then create the InstancedMeshes in sync with them.
The key point here is that – anticipating the transition to voxels – we instantiate each unit using BoxGeometry. However, we initially flatten them in the vertex shader by scaling the depth close to zero. This lets us initialize them so they visually appear as flat tiles, just like ones made with PlaneGeometry.
Each of these individual InstancedMeshes will represent one pixel.
The createGridBodies method is the foundation of everything. As soon as the gridInfo is determined, this method generates all the necessary components at once: creating the rigid bodies, setting their initial positions and rotations, and generating the noise and random values required for the vertex effects and box shapes.
// core/PhysicsWorld.js (simplified)
createGridBodies(gridInfo, params, settings) {
  const { cols, rows, count, cellSize, spacing } = gridInfo;
  const halfSize = cellSize / 2;
  // Center the grid around the origin
  const startX = (-(cols - 1) * spacing) / 2;
  const startY = (-(rows - 1) * spacing) / 2;
  // Prepare data structures (TypedArrays for performance)
  this.cachedUVs = new Float32Array(count * 2);
  this.noiseOffsets = new Float32Array(count);
  const randomDepths = new Float32Array(count);
  for (let i = 0; i < count; i++) {
    const col = i % cols;
    const row = Math.floor(i / cols);
    const x = startX + col * spacing;
    const y = startY + row * spacing;
    const u = (col + 0.5) / cols;
    const v = (row + 0.5) / rows;
    // Cache UVs for the cube drop
    this.cachedUVs[i * 2] = u;
    this.cachedUVs[i * 2 + 1] = v;
    // Generate noise and random depth scales
    this.noiseOffsets[i] = computeNoiseOffset(i);
    randomDepths[i] = params.shape === "random" ? (Math.random() < 0.2 ? 1.0 : 1.0 + Math.random() * 1.5) : 1.0;
    // Create RAPIER rigid bodies and colliders
    const depthScale = randomDepths[i];
    const rbDesc = RAPIER.RigidBodyDesc.fixed().setTranslation(x, y, 0);
    const rb = this.world.createRigidBody(rbDesc);
    const clDesc = RAPIER.ColliderDesc.cuboid(halfSize, halfSize, halfSize * depthScale);
    this.world.createCollider(clDesc, rb);
  }
}
We initialize each instance as a cube using BoxGeometry.
// view/PixelVoxelMesh.js
createGeometry() {
  return new THREE.BoxGeometry(
    this.gridInfo.cellSize, this.gridInfo.cellSize, this.gridInfo.cellSize
  );
}
We use setMatrixAt to synchronize the positions and rotations of the InstancedMesh instances with their corresponding rigid bodies.
// view/PixelVoxelMesh.js
setMatrixAt(index, pos, rot) {
  this.dummy.position.set(pos.x, pos.y, pos.z);
  this.dummy.quaternion.set(rot.x, rot.y, rot.z, rot.w);
  this.dummy.scale.set(1, 1, 1);
  this.dummy.updateMatrix();
  this.mesh.setMatrixAt(index, this.dummy.matrix);
}
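For illustration only, here is roughly what that dummy-object composition produces under the hood: a column-major 4x4 matrix built from the body's position and quaternion (assuming unit scale). This helper is hypothetical and not part of the project – Three.js does this for you in Object3D.updateMatrix:

```javascript
// Illustrative only: what one setMatrixAt call effectively stores.
// Composes a column-major 4x4 matrix from a position and a unit quaternion,
// the same math as Three.js's Matrix4.compose with scale (1, 1, 1).
function composeMatrix(pos, q) {
  const { x, y, z, w } = q;
  const x2 = x + x, y2 = y + y, z2 = z + z;
  const xx = x * x2, xy = x * y2, xz = x * z2;
  const yy = y * y2, yz = y * z2, zz = z * z2;
  const wx = w * x2, wy = w * y2, wz = w * z2;
  return [
    1 - (yy + zz), xy + wz, xz - wy, 0, // column 0 (rotated X basis)
    xy - wz, 1 - (xx + zz), yz + wx, 0, // column 1 (rotated Y basis)
    xz + wy, yz - wx, 1 - (xx + yy), 0, // column 2 (rotated Z basis)
    pos.x, pos.y, pos.z, 1              // column 3 (translation)
  ];
}
```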
Playing the Video on a Single Large Plane
We pass the video as a texture to the material of these InstancedMeshes, turning the collection of small units into one large, cohesive plane. In the vertex shader, we specify the corresponding video coordinates for each InstancedMesh, defining the exact width and height each instance is responsible for. At first glance, it looks like a regular video playing on a flat surface – which is how the visual seen at the beginning is achieved.
// view/PixelVoxelMesh.js
createMaterial(videoTexture) {
  return new THREE.ShaderMaterial({
    uniforms: {
      uMap: { value: videoTexture },
      uGridDims: { value: new THREE.Vector2(this.gridInfo.cols, this.gridInfo.rows) },
      uCubeSize: { value: new THREE.Vector2(this.gridInfo.cellSize, this.gridInfo.cellSize) },
      //...
// vertex.glsl
float instanceId = float(gl_InstanceID);
float col = mod(instanceId, uGridDims.x);
float row = floor(instanceId / uGridDims.x);
vec2 uvStep = 1.0 / uGridDims;
vec2 baseUv = vec2(col, row) * uvStep;
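To make the mapping concrete, here is the same per-instance UV arithmetic ported to plain JavaScript (a hypothetical helper for illustration, not part of the project):

```javascript
// Mirrors the vertex-shader math above: which UV tile does instance i own?
function instanceBaseUv(instanceId, cols, rows) {
  const col = instanceId % cols;             // mod(instanceId, uGridDims.x)
  const row = Math.floor(instanceId / cols); // floor(instanceId / uGridDims.x)
  return { u: col / cols, v: row / rows };   // vec2(col, row) * uvStep
}
```

Each instance then samples the video texture only inside its own tile, so the grid as a whole reassembles the full frame.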
2. Adding Effects with Shaders
Approach: In the vertex shader, we initially set the depth to a very small value (0.05 in this case) to keep everything flat. By gradually increasing this depth value over time, the original cube shape – created with BoxGeometry – gets restored.
This simple mechanism forms the core of our voxelization process.
One of the key points is how we animate this depth increase. For this demo, we'll do it via a "ripple effect." Starting from a specific point, a wave spreads outward, progressively turning the flat InstancedMeshes into full cubes.
On top of that, we'll mix in some glitch effects and RGB shifts during the ripple, then transition to a pixelated single-color look (where each entire InstancedMesh is filled with the color sampled from the center of its UV). Finally, the shapes settle into proper 3D voxels.
In the demo, I've prepared two different modes for this ripple effect. You can switch between them using the "Organic" toggle in the GUI.
Ripple Effect
JavaScript: Calculating the Wave Range
In controllers/SequenceController.js, the targetVal – the final goal of the animation – is calculated based on the distance from the wave's origin to the farthest InstancedMesh. This ensures the ripple covers every element before ending.
//controllers/SequenceController.js
startCollapse(u, v) {
  //...
  pixelVoxelMesh.setUniform("uRippleCenter", { x: u, y: v });
  const aspect = this.main.gridInfo.cols / this.main.gridInfo.rows;
  const centerX = u * aspect;
  const centerY = v;
  const corners = [ { x: 0, y: 0 }, { x: aspect, y: 0 }, { x: 0, y: 1 }, { x: aspect, y: 1 } ];
  const maxDist = Math.max(...corners.map((c) => Math.hypot(c.x - centerX, c.y - centerY)));
  // Calculate the final ripple radius based on the maximum distance and spread
  const targetVal = maxDist + params.effectSpread + 0.1;
  gsap.to(pixelVoxelMesh.material.uniforms.uVoxelateProgress, {
    value: targetVal,
    //...
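Condensed into a standalone function (a hypothetical mirror of the snippet above, for illustration), the farthest-corner search looks like this:

```javascript
// Final ripple radius: the distance from the origin (u, v) to the farthest
// grid corner in aspect-corrected UV space, plus the wave's spread and a
// small safety margin, so the wave is guaranteed to cover every instance.
function computeTargetVal(u, v, aspect, effectSpread) {
  const cx = u * aspect;
  const corners = [[0, 0], [aspect, 0], [0, 1], [aspect, 1]];
  const maxDist = Math.max(...corners.map(([x, y]) => Math.hypot(x - cx, y - v)));
  return maxDist + effectSpread + 0.1;
}
```

A ripple started in the exact center of a square grid needs the smallest radius; one started in a corner needs the full diagonal.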
Vertex Shader: Controlling the Transformation Progress
- Global Progress (uVoxelateProgress): This value increases up to targetVal, ensuring the wave expands until it reaches the furthest corners of the screen.
- Local Progress: Each InstancedMesh begins its transformation the moment the wave passes its position. By using functions like clamp and smoothstep, the progress value is locked at 1.0 once the wave has passed – this ensures that every mesh eventually completes its transition into a full cube.
We start with a flat 0.05 depth and scale up to the targetDepth during voxelization, applying the progress value from the selected mode to drive the transformation.
// vertex.glsl
// --- Progress ---
float progressOrganic = clamp((uVoxelateProgress - noisyDist) / uEffectSpread, 0.0, 1.0);
float progressSmooth = smoothstep(distFromCenter, distFromCenter + uEffectSpread, uVoxelateProgress);
// --- Coordinate Transformation (Depth) ---
float finalScaleVal = mix(valSmooth, valOrganic, uOrganicMode); // Blend between Smooth and Organic values based on the mode uniform
float baseDepth = 0.05;
float targetDepth = mix(1.0, aRandomDepth, uUseRandomDepth);
float currentDepth = baseDepth + (targetDepth - baseDepth) * finalScaleVal;
vec3 transformed = position;
transformed.z *= currentDepth;
Organic Mode On
The wave's position and propagation are driven by noise. The voxelization transforms with a bouncy, organic elasticity.
// vertex.glsl
// --- A. Organic Mode (Noise, Elastic) ---
// Add noise to the distance to create an irregular wave front
float noisyDist = max(0.0, distFromCenter + (rndPhase - 0.5) * 0.3);
// Calculate local progress based on voxelization progress and spread
float progressOrganic = clamp((uVoxelateProgress - noisyDist) / uEffectSpread, 0.0, 1.0);
// Apply elastic easing to the progress to get the organic shape value
float valOrganic = elasticOut(progressOrganic, 0.2 + rndPhase * 0.2, currentDamp);
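Note that elasticOut isn't a GLSL built-in; it's a custom easing defined in the shader. One common formulation is Penner's elastic-out, sketched here in JavaScript – this is an assumption for illustration, and the demo's exact function (and its amplitude/damping parameters) may differ:

```javascript
// A typical elastic-out easing: shoots past 1.0, oscillates, then settles.
// `amplitude` controls the overshoot, `period` the oscillation frequency.
function elasticOut(t, amplitude = 1.0, period = 0.3) {
  if (t <= 0) return 0;
  if (t >= 1) return 1;
  // Phase shift so the curve starts exactly at 0 when t = 0
  const s = (period / (2 * Math.PI)) * Math.asin(1 / Math.max(amplitude, 1));
  // Exponentially decaying sine wave offset around 1.0
  return amplitude * Math.pow(2, -10 * t) * Math.sin(((t - s) * 2 * Math.PI) / period) + 1;
}
```

Per-instance randomness (rndPhase in the shader) varies the amplitude, which is what makes neighboring cubes pop out with slightly different bounces.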
Organic Mode Off (Smooth Mode)
The wave propagates linearly across the grid in an orderly manner. The voxelization progresses with a clean, sine-based bounce, synchronized perfectly with the expansion.
// vertex.glsl
// --- B. Smooth Mode (Linear, Sine) ---
// Calculate smooth progress using a simple distance threshold
float progressSmooth = smoothstep(distFromCenter, distFromCenter + uEffectSpread, uVoxelateProgress);
// Calculate the bounce using a sine wave
float valSmoothBounce = sin(progressSmooth * PI) * step(progressSmooth, 0.99);
// Blend linear progress with the sine bounce
float valSmooth = progressSmooth + (valSmoothBounce * uEffectDepth);
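For illustration, here is the Smooth-mode math ported to plain JavaScript (a hypothetical mirror of the shader code above, including a JS version of GLSL's smoothstep):

```javascript
// GLSL smoothstep: 0 below edge0, 1 above edge1, cubic Hermite in between
function smoothstep(edge0, edge1, x) {
  const t = Math.min(Math.max((x - edge0) / (edge1 - edge0), 0), 1);
  return t * t * (3 - 2 * t);
}

// Smooth-mode value: linear progress plus a sine bounce. GLSL's
// step(p, 0.99) evaluates to 0 once p exceeds 0.99, which kills the
// bounce at the end so the value settles at exactly 1.0.
function smoothModeValue(progress, dist, spread, effectDepth) {
  const p = smoothstep(dist, dist + spread, progress);
  const bounce = Math.sin(p * Math.PI) * (p <= 0.99 ? 1 : 0);
  return p + bounce * effectDepth;
}
```

The sine term peaks mid-transition (the cube briefly over-extrudes) and vanishes at both ends, which is what gives the wave its visible crest.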
Within each mode, numerous parameters – RGB shift, wave color, width, and deformation depth – work together in harmony. Feel free to play around with these settings under the "VISUAL" section in the GUI. I think you'll discover some truly fascinating and fun Pixel-to-Voxel transformations!
3. Let’s get physics, physics. I wanna get physics.
Approach: So far, we've focused on the transformation of the InstancedMeshes – from flat tiles to full cubes. Next, we'll let the pre-created rigid bodies come to life inside the physics engine. The goal is this: the moment the shapes fully turn into cubes, the corresponding rigid bodies "wake up" and start responding to the physics engine's gravity, causing them to drop and scatter naturally.
Simulating the Ripple Effect in JavaScript
The transformation from flat to full cube happens entirely in the vertex shader, so JavaScript has no efficient way to detect exactly when this transformation occurs for each instance on the GPU. The vertex shader doesn't tell us which specific InstancedMeshes have completed their cube transformation. However, we need this timing information to know exactly when to "wake up" the corresponding rigid bodies so they can respond to gravity and begin their drop. To know exactly when each individual InstancedMesh has finished turning into a cube, we simulate the same ripple effect logic on the JavaScript side that is running in the vertex shader.
// vertex.glsl
float noisyDist = max(0.0, distFromCenter + (rndPhase - 0.5) * 0.3);
float progressOrganic = clamp((uVoxelateProgress - noisyDist) / uEffectSpread, 0.0, 1.0);
float progressSmooth = smoothstep(distFromCenter, distFromCenter + uEffectSpread, uVoxelateProgress);
We simulate the progression mentioned above on the JavaScript side.
// core/PhysicsWorld.js
checkAndActivateDrop(rippleCenter, currentProgress, aspectVecX, organicMode, spread, spacing, count, onDropCallback) {
  if (!this.dropFlags) return;
  const spreadThreshold = spread * 0.98;
  for (let i = 0; i < count; i++) {
    if (this.dropFlags[i] === 1) continue;
    const u = this.cachedUVs[i * 2];
    const v = this.cachedUVs[i * 2 + 1];
    const noiseOffset = organicMode ? this.noiseOffsets[i] : 0;
    const dx = (u - rippleCenter.x) * aspectVecX;
    const dy = v - rippleCenter.y;
    const distSq = dx * dx + dy * dy;
    const threshold = currentProgress - spreadThreshold - noiseOffset;
    // Replicate the vertex shader's ripple logic for physics synchronization
    if (threshold > 0 && threshold * threshold > distSq) {
      this.dropFlags[i] = 1;
      const force = spacing * 2.0;
      this.activateBody(i, force);
      onDropCallback(i);
    }
  }
}
Progress – Spread – Noise > Distance. How can a single JavaScript if statement stay perfectly in sync with the shader? The secret lies in the fact that both are looking for the exact same moment: when voxelization completes (the value reaches 1.0). Functions like clamp and smoothstep are merely used to clip the values so they don't exceed 1.0 for visual depth. Once you strip away these visual "clamps", the logic in JavaScript and the shader is identical.
JavaScript
Condition-Organic: currentProgress - spreadThreshold - noiseOffset > distance
Condition-Smooth: currentProgress - spreadThreshold > distance
In "Organic" mode, we use this.noiseOffsets – pre-calculated with the computeNoiseOffset function in utils/common.js – following the same logic used in the vertex shader.
Organic Mode
Logic: progressOrganic = clamp((uVoxelateProgress - noisyDist) / uEffectSpread, 0.0, 1.0)
When progressOrganic reaches 1.0: uVoxelateProgress - noisyDist = uEffectSpread
Rearranged: uVoxelateProgress - uEffectSpread - noiseOffset = distFromCenter
Smooth Mode
Logic: progressSmooth = smoothstep(distFromCenter, distFromCenter + uEffectSpread, uVoxelateProgress)
When progressSmooth reaches 1.0: uVoxelateProgress = distFromCenter + uEffectSpread
Rearranged: uVoxelateProgress - uEffectSpread = distFromCenter
One final tip: the logic that decides which cubes to drop currently loops through every instance. I've already optimized it quite a bit, but if the cube count grows much larger, we'll need more mathematical optimizations – for example, skipping calculations for grid cells outside the current wave propagation area (rectangular bounds).
Let me hear rigid body talk, body talk
Manipulating the rigid bodies can lead to all sorts of fun and diverse behaviors depending on the values you tweak. Since there are so many parameters to play with, I thought it might be overwhelming to show them all at once. For this demo and tutorial, I've focused the GUI controls on just "DropDelay" and "Dance" mode.
When dropping, the activateBody method applies a random initial velocity to each rigid body. This method is triggered based on the "DropDelay" timing. For the "Dance" mode, the updateAllPhysicsParams method updates the rigid bodies using the specified parameter values.
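As a sketch of that rectangular-bounds idea (hypothetical – not implemented in the demo), you could convert the wave's current outer radius into a row/column window and only run the drop check on instances inside it:

```javascript
// Hypothetical optimization: restrict the drop check to grid cells whose
// UV position can possibly lie inside the wave's current outer radius.
// `aspect` is the same aspect-correction factor used in the distance math.
function rippleCellBounds(center, radius, cols, rows, aspect) {
  // Convert the aspect-corrected radius back into UV extents along x
  const du = radius / aspect;
  const colMin = Math.max(0, Math.floor((center.x - du) * cols));
  const colMax = Math.min(cols - 1, Math.ceil((center.x + du) * cols));
  const rowMin = Math.max(0, Math.floor((center.y - radius) * rows));
  const rowMax = Math.min(rows - 1, Math.ceil((center.y + radius) * rows));
  return { colMin, colMax, rowMin, rowMax };
}
```

The per-instance loop would then iterate row by row over this window instead of over all count instances, which matters once the grid reaches tens of thousands of cells.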
// core/PhysicsWorld.js
activateBody(index, initialForceScale = 1.0) {
  const rb = this.rigidBodies[index];
  if (!rb) return;
  rb.setBodyType(RAPIER.RigidBodyType.Dynamic);
  rb.setGravityScale(0.01);
  rb.setLinearDamping(1.2);
  rb.wakeUp();
  const force = initialForceScale;
  // Apply a random initial velocity and rotation
  rb.setLinvel({
    x: (Math.random() - 0.5) * force, y: (Math.random() - 0.5) * force, z: (Math.random() - 0.5) * force
  }, true);
  rb.setAngvel({
    x: Math.random() - 0.5, y: Math.random() - 0.5, z: Math.random() - 0.5
  }, true);
}

updateAllPhysicsParams(bodyDamping, bodyRestitution) {
  this.rigidBodies.forEach((rb) => {
    rb.setLinearDamping(bodyDamping);
    rb.setAngularDamping(bodyDamping);
    const collider = rb.collider(0);
    if (collider) collider.setRestitution(bodyRestitution);
    rb.wakeUp();
  });
}
To join the conversation, you can interact with the rigid bodies using your mouse. These interactions are managed by controllers/InteractionController.js, which communicates with core/PhysicsWorld.js via dedicated methods that let you grab and move bodies with Three.js's Raycaster.
Restoring after the talk
When the "BACK TO PLANE" button is triggered, startReverse in controllers/SequenceController.js temporarily disables gravity for all rigid bodies. To restore the original grid, I implemented a staggered linear interpolation (Lerp/Slerp) that animates each body back to its initial position and rotation stored during creation. This creates a smooth, orderly transition from a chaotic state back to the flat plane.
// controllers/SequenceController.js
startReverse() {
  //...
  gsap.to(progressObj, {
    value: 1 + staggerStrength,
    duration: 1.0,
    onUpdate: () => {
      const globalT = progressObj.value;
      cachedData.forEach((data) => {
        // Calculate progress (t) with an individual stagger delay
        let t = Math.max(0, Math.min(1, globalT - data.delay * staggerStrength));
        if (t <= 0 || (t >= 1 && globalT < 1)) return;
        // Interpolate position and rotation
        const pos = {
          x: data.startPos.x + (data.endPos.x - data.startPos.x) * t,
          y: data.startPos.y + (data.endPos.y - data.startPos.y) * t,
          z: data.startPos.z + (data.endPos.z - data.startPos.z) * t
        };
        qTmp.copy(data.startRot).slerp(data.endRot, t);
        // Update the RAPIER RigidBody transform
        data.rb.setTranslation(pos, true);
        data.rb.setRotation(qTmp, true);
      });
      // Sync InstancedMesh
      physics.syncMesh(pixelVoxelMesh);
    }
    //...
Implementation Notes
Contact Cache (The Memory of the Talk): You might notice the second drop feels slightly less intense than the first. This is likely due to Rapier's "contact pairs" cache persisting within the bodies. For a perfectly fresh simulation, you could regenerate the rigid bodies during each restoration. I've kept it simple here since the visual impact is minor.
Syncing with the Render Cycle: You might also notice that the physics step() is called directly inside animate() in index.js, driven by requestAnimationFrame. While a fixed timestep with an accumulator is standard for consistency, I intentionally omitted it to synchronize with the monitor's refresh rate. Rather than settling for "one-size-fits-all" precision, I chose to harness the full refresh potential of the machine to achieve a more natural aesthetic flow and fluidity. Consequently, this app is quite CPU-intensive, and performance will vary depending on your machine. If things start feeling heavy, I recommend lowering the "Column" value in the GUI to decrease the overall instance count.
As for potential technical improvements, the current setup will hit bottlenecks if the cube count increases significantly. I feel the current count strikes a balance – letting us feel the transition from pixel to voxel – but handling tens of thousands of cubes would require a different approach.
Rendering Optimization: Instead of CPU-based matrix calculations, position and rotation data could be stored in DataTextures to handle the computations in shaders. This represents a substantial improvement and is a concrete next step for further optimization. Alternatively, employing Three.js's WebGPURenderer (with TSL) is another path for optimization.
Physics Strategy: While we could move physics entirely to the GPU – which excels at massive, simple simulations – it can often limit the logic-based flexibility required for specific interactions. For a project like this, using Rapier on the CPU is the right choice, as it provides the perfect balance between high-performance physics and precise, direct control.
Wrap-Up
Using these as a foundation, try creating your own original Pixel-Voxel-Drop variations! Start simple – swap out the video for a different one, or replace the texture with a still photo, for example.
From there, experiment by changing the geometry or shaders: turn the voxels into icy boxes, use Voronoi patterns to create a shattering glass effect, or make only cubes of a specific color drop… Alternatively, you could even integrate this effect into a website's hero section – which is exactly what I'm planning on doing next. The ideas will spread outward, just like the ripple effect itself!
Final Thoughts
Finally, I'd like to express my gratitude to Codrops for reaching out to me this time. Codrops has always inspired me, so it's truly an honor to be invited by them. Thank you so much.
A special huge thank-you goes to Manoela. I believe it was her single comment about the ripple effect that shaped the direction of this demo. I've even named this effect "Manoela's Ripple." Thank you for the wonderful hint and inspiration.
And also, about the video used in this demo: Shibuya Crossing Video from Pexels.com.
Watching the dynamic flow of Shibuya Crossing, it perfectly captures the vibe of how rigid bodies collide and interact within a physics engine. Just as pixels and voxels are the fundamental units of a digital shape, each person and each object exists as an independent entity, colliding and moving in their own way. In this gathering of individuals, there is harmony within chaos, and chaos within harmony – that is the very world we all live in.


