The Cast website started with a simple question: what if a studio website could feel like a place? Not just a scroll of thumbnails or a polished homepage, but a world you can actually step into and explore.
When I started Cast, I knew I wanted to merge my two passions: cars and technology. The automotive world is full of spaces that radiate purpose—garages, machine shops, fabrication labs—but the digital industry rarely creates anything that feels the same way. The site became a way to connect those two worlds, an experiment in building a digital space that captures the spirit of the physical one.
Instead of marketing pages, we built rooms: a garage, a desk, a workbench, a testing space. Each anchors a part of the story: who we are, what we make, and how it’s built. The goal wasn’t to showcase the work; it was to create a workshop you’d actually want to step into.
From Garage to Browser
The visual language came straight from the real world. Think Han’s garage from Tokyo Drift for the vibe, crossed with the clean-meets-industrial aesthetic of Singer or Gunther Werks. The Cast space borrows that same balance of polish and grit: concrete floors, metal shelving, soft reflections on paint and glass.

Early on, I mocked up concepts using AI image generation — not as final art, but as a fast way to show what was in my head: the garage exterior, the door rolling open, the contrast between rough and refined. That shortcut let us iterate visually before a single polygon was modeled.

From there, we built the full environment in Blender, modeling each element, dressing the scene with props, and baking lighting passes to capture realistic falloff and shadow. The challenge was to make everything feel cohesive, with multiple rooms forming one continuous workshop, all lit as if they belonged to the same physical space.
The Blender-to-WebGPU Workflow
Every scene on the Cast site starts in Blender but comes to life inside our custom WebGPU engine.
In Blender:
- Model and lay out each environment.
- Assign base materials (normal and roughness maps where needed).
- Bake lighting into high-resolution textures, packing as many objects per atlas as possible.

In WebGPU:
- Import the geometry (GLB with Draco compression) and lightmaps (separate KTX2 files).
- Assign custom shaders through an in-engine GUI.
- Tweak material behavior per scene using shared shader modules — PBR, microfacet BRDF, and reflection blending.
We don’t use the strict glTF PBR spec. Each material is tuned manually to balance performance and realism — part of what I call the “fake tracing” pipeline. Some objects, like the car, run full PBR with roughness and metallic maps. Others, like walls or furniture, just use baked lighting plus microfacet scattering for a hint of motion in the highlights.

The car is the only element using true PBR, but for a deep dive on rendering cars realistically, check out the Aura Colour Explorer case study.
Blender gives us the precision of true raytraced lighting; WebGPU gives us the real-time flexibility to make the space feel alive.
A simplified overview of the pipeline:
Blender → Bake Lighting → Compress (Draco + KTX2)
→ Import to WebGPU Engine → Assign Shaders → Tune Materials
Lighting the Workshop — The “Fake Tracing” Pipeline
From the start, the goal wasn’t just to make the Cast workshop look realistic, but to make it behave that way. You can fake almost everything on the web, but light is the hardest thing to fake convincingly.
I call it fake tracing. It’s a hybrid lighting approach: baked realism for global mood, and lightweight dynamic cues to keep the scene breathing. It isn’t physically correct; it’s perceptually correct.
Two Layers of Light
- Baked diffuse: All indirect light, color bleed, and ambient tone are baked in Blender. These maps handle the “world lighting”: the way the environment fills with soft reflections.
- Dynamic specular: A small real-time system inside the WebGPU engine handles only the highlights: the glints on metal, the roll-off across a leather chair, the shimmer on paint as you scroll.

Each material gets a simple blend control — a 0-to-1 value that decides how much the dynamic layer contributes on top of the baked base.
```glsl
vec3 baked = texture(Lightmap, uv).rgb;
vec3 spec  = MicrofacetBRDF(normal, viewDir, lightDir, roughness, F0);
vec3 color = mix(baked, baked + spec, uDynamicBlend);
```
Microfacet Distribution
The specular layer uses a standard microfacet BRDF with a custom roughness curve that exaggerates grazing highlights. It’s not a full PBR stack; it’s stripped down for speed.
```glsl
vec3 MicrofacetBRDF(vec3 N, vec3 V, vec3 L, float roughness, vec3 F0) {
    // Half vector
    vec3 H = normalize(V + L);
    float NdotV = max(dot(N, V), 0.0);
    float NdotL = max(dot(N, L), 0.0);
    float NdotH = max(dot(N, H), 0.0);
    float VdotH = max(dot(V, H), 0.0);

    // Avoid denormals / division issues
    const float EPS = 1e-5;

    // GGX normal distribution function (Trowbridge-Reitz)
    float a = max(roughness, 0.001); // perceptual roughness [0..1]
    float a2 = a * a;
    float d_denom = max((NdotH * NdotH) * (a2 - 1.0) + 1.0, EPS);
    float D = a2 / (3.14159265 * d_denom * d_denom);

    // Smith masking-shadowing with the Schlick-GGX k remapping
    float k = (a + 1.0);
    k = (k * k) / 8.0;
    float Gv = NdotV / max(NdotV * (1.0 - k) + k, EPS);
    float Gl = NdotL / max(NdotL * (1.0 - k) + k, EPS);
    float G = Gv * Gl;

    // Schlick Fresnel
    vec3 F = F0 + (1.0 - F0) * pow(1.0 - VdotH, 5.0);

    // Specular term; EPS guards the denominator at grazing angles
    vec3 spec = (D * G) * F / max(4.0 * NdotL * NdotV, EPS);

    // NdotL factor for light transport
    return spec * NdotL;
}
```
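For sanity-checking values outside the GPU, the same term ports directly to scalar JavaScript. This mirror of the GLSL above is for testing, not engine code, and F0 is a single float here (a dielectric 0.04) standing in for the shader’s vec3:

```javascript
// Scalar port of the shader's specular term; F0 defaults to a
// dielectric 0.04 (the GLSL version takes a vec3).
function microfacetBRDF(NdotL, NdotV, NdotH, VdotH, roughness, F0 = 0.04) {
  const EPS = 1e-5;
  const a = Math.max(roughness, 0.001);
  const a2 = a * a;
  // GGX / Trowbridge-Reitz normal distribution
  const dDenom = Math.max(NdotH * NdotH * (a2 - 1.0) + 1.0, EPS);
  const D = a2 / (Math.PI * dDenom * dDenom);
  // Smith masking-shadowing with the Schlick-GGX k remapping
  const k = ((a + 1.0) * (a + 1.0)) / 8.0;
  const Gv = NdotV / Math.max(NdotV * (1.0 - k) + k, EPS);
  const Gl = NdotL / Math.max(NdotL * (1.0 - k) + k, EPS);
  const G = Gv * Gl;
  // Schlick Fresnel
  const F = F0 + (1.0 - F0) * Math.pow(1.0 - VdotH, 5.0);
  // Specular term with the same EPS denominator guard
  const spec = (D * G * F) / Math.max(4.0 * NdotL * NdotV, EPS);
  return spec * NdotL; // NdotL factor for light transport
}
```

At a head-on angle (all dot products 1) with roughness 0.5, the D, G, and Fresnel terms collapse to a small constant, which makes a handy regression check when tweaking the shader.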
These highlights move with the camera, giving the illusion of depth and motion even when the baked light is static. It’s one of those tiny perceptual tricks that convinces your brain the world is real.
Why It Works
Fully baked scenes look beautiful until they move — then they feel dead. When light scatters and shifts just a little as you scroll, something fires in your brain: this is alive. That’s what fake tracing aims for.
Instead of chasing perfect physical accuracy, we chase that moment of disbelief where you stop thinking about shaders and just feel like you’re standing in a real room.
Scroll as a Directional Language
If lighting makes the Cast world believable, motion makes it feel tangible. Scroll is the one input every visitor shares, so instead of treating it as a trigger, we treated it as a language.
One Axis at a Time
Scroll isn’t a joystick. It’s a single-axis gesture, and the site respects that. Each scene moves in only one direction, either forward through space or downward through the structure, never both.

If you try to mix them, it feels wrong.
As soon as the camera curves or circles while tied to scroll, your brain expects game-style controls. By committing to one axis per section, the experience stays intuitive: the mouse wheel moves the world in a straight line, and your body instantly understands it.
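A minimal sketch of the one-axis-per-section idea, with a hypothetical section table (the names, axes, and `applyScroll` helper are invented for illustration, not the site’s actual code):

```javascript
// Hypothetical section table: each scene scrolls along exactly one axis.
const sections = [
  { name: "garage",    axis: "z" }, // forward through space
  { name: "workbench", axis: "y" }, // downward through the structure
];

// The wheel moves the world in a straight line along that one axis.
function applyScroll(camera, sectionIndex, delta) {
  const axis = sections[sectionIndex].axis;
  camera.position[axis] += delta;
  return camera;
}
```

Because each section declares a single axis up front, no code path can ever blend two directions of travel within one scene.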
The Cut
Transitions between spaces work like film edits. When you reach the stairs, for instance, the camera doesn’t climb them — it cuts to the new floor. It’s the same continuity trick you see in films or games: motion continues in one direction, but the geography resets.
That single cut keeps the rhythm smooth and reinforces the illusion that the whole site exists inside one continuous building.

Tuning the Rhythm
Getting the pacing right wasn’t about keyframes or storyboards; it was about distance.
Each scene has a defined height in view units — literally, how many screen-heights you scroll to travel through it.
Too fast? Add more view-heights. Too slow? Remove a few. After a handful of tweaks, the flow locked in.
```javascript
// simplified scroll-to-camera mapping
const sectionHeight = vh(300); // 3 view heights per section
camera.position.z = scrollProgress * sectionHeight;
```
That’s it. No physics, no easing equations — just proportion. Once those proportions felt right, the site moved with its own gravity: heavy enough to feel physical, light enough to glide.
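Expanded into something self-contained, the mapping looks like this; `vh()`, `viewportHeight`, and the plain `camera` object are stand-ins for the engine’s own helpers, assumed here for illustration:

```javascript
// Stand-ins for the engine's helpers: vh() converts view-height units
// to pixels, and the camera is a plain object rather than an engine type.
const viewportHeight = 800; // e.g. window.innerHeight in the browser
const vh = (units) => (units / 100) * viewportHeight;

const camera = { position: { z: 0 } };
const sectionHeight = vh(300); // 3 view-heights per section

// scrollProgress: normalized scroll position within the section (0..1)
function updateCamera(scrollProgress) {
  camera.position.z = scrollProgress * sectionHeight;
}
```

Tuning the pace is then a one-number change: raise or lower the `vh(300)` argument until the scene travels at the right speed.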
Post-Processing and Ambient Tricks
After the lighting and scroll systems were locked in, the final layer was about atmosphere — the finishing touches that make the world feel grounded in reality.
Tone Mapping and Camera Effects
A custom tone-mapper balances baked color with dynamic specular energy, followed by a soft vignette, a trace of RGB shift, and a hint of film grain. These small imperfections add a layer of analog warmth — the difference between something that looks rendered and something that feels shot.
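As a rough, scalar illustration of that grading stack: the Reinhard-style curve, vignette falloff, and grain amount below are assumptions for the sketch, since the site’s actual tone-mapper is custom.

```javascript
// Illustrative grading of a single HDR channel value; not the site's
// actual tone-mapper. dist is the distance from screen center (0..1).
function grade(hdr, { vignette = 0.2, grain = 0.03, dist = 0.5, rand = Math.random } = {}) {
  const mapped = hdr / (1.0 + hdr);         // Reinhard-style tone curve
  const vig = 1.0 - vignette * dist * dist; // soft falloff toward edges
  const noise = (rand() - 0.5) * grain;     // film-grain offset
  return Math.min(Math.max(mapped * vig + noise, 0), 1);
}
```

The order matters: tone-map first so the vignette and grain operate on display-range values rather than raw HDR.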
Reflections and Occlusion
To keep the space connected, we added real-time reflections across the floor — a subtle mirror pass that anchors everything to the same plane. On top of that runs screen-traced directional occlusion (SDDO), a lightweight shader that fakes ambient occlusion while also injecting a hint of light bounce.
This hybrid pass gives depth without killing performance — a little shadow under the tires, a glow around bright surfaces — the kind of detail your eye catches subconsciously.

Even with everything baked, light still needs to move. As you scroll, these fake reflections and occlusion passes shift just enough to make the space feel dynamic. It’s not physically correct, but perceptually it’s spot-on: the difference between watching a still render and standing inside one.
In the end, these post-processing tricks aren’t about decoration. They’re about deception — convincing your eyes that something impossible is happening right in a browser tab.
Performance Tuning
With so much happening in each scene — baked and dynamic lighting, reflections, occlusion, and scroll-driven motion — keeping the site responsive on every device came down to experience, not automation.
After building real-time graphics for more than a decade, I’ve developed a sense of what each class of hardware can handle. Instead of complex benchmarks, we rely on intuition and lightweight detection. High-end GPUs get everything; mid-tier devices drop the heavier passes; older phones and budget PCs still run the world, just with fewer dynamic tricks.
The guiding principle is simple: smooth motion first, fidelity second. If performance dips, we sacrifice visual luxuries long before scroll fluidity. The camera should always feel weightless.
| Tier | Key Features |
|---|---|
| High | All effects, full reflections, dynamic specular, SDDO |
| Mid | Half-res lightmaps, reduced reflection updates |
| Low | Baked lighting only, no reflections or post-FX |
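As a sketch of what that lightweight detection might look like in code: the signals, thresholds, and pass flags below are illustrative assumptions, not the site’s actual heuristics.

```javascript
// Illustrative tier picker: signals and thresholds are assumptions,
// not the site's actual detection logic.
function pickTier({ deviceMemoryGB = 4, isMobile = false, gpuTier = "mid" } = {}) {
  if (!isMobile && gpuTier === "high" && deviceMemoryGB >= 8) return "high";
  if (deviceMemoryGB >= 4 && gpuTier !== "low") return "mid";
  return "low";
}

// Which passes each tier enables, mirroring the table above.
const PASSES = {
  high: { reflections: "full",    dynamicSpecular: true,  sddo: true  },
  mid:  { reflections: "reduced", dynamicSpecular: true,  sddo: false },
  low:  { reflections: false,     dynamicSpecular: false, sddo: false },
};
```

Because the tier is chosen once at startup, the render loop never branches on hardware checks; it just reads the pass flags for the active tier.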
The result is performance that feels tuned by hand — like tuning an engine by ear rather than by numbers. You can hear when it’s running right, even if you can’t measure it.
Why WebGPU Changes Everything
The entire Cast site runs on WebGPU, and the biggest change from WebGL isn’t just theoretical—it’s physical.
In WebGL, you could hear when things got heavy. The fans would spin up. With WebGPU, they stay silent.
That quietness says everything about the difference in architecture. In WebGL, almost every off-screen effect—blurs, reflections, light passes—had to be processed through framebuffers, which meant treating every operation like a texture. It worked, but it came at a high cost.
With compute shaders and workgroups, WebGPU can perform those same calculations directly on the GPU with almost no detour. Tasks like light scattering, reflection prefiltering, and blur convolution now happen in parallel, without heating the machine.
The impact is real: smoother performance, cooler devices, and a much higher ceiling for what’s possible in a browser. It lets us run effects consistently that would have been prohibitive in WebGL—not just once per frame, but every frame.
For us, WebGPU isn’t just a renderer; it’s the core architecture behind everything Cast will build next—the foundation for websites, installations, and interactive films that all share the same creative engine.
WebGPU finally gives web experiences the same expressive range as native engines. The difference is that we can now deploy them instantly, anywhere, with no install and no compromise.
Conclusion
From the first baked test renders to the first live scroll pass, everything just… worked. The lighting, the layout, the transitions — they all clicked faster than expected.
That mattered, because this was the first Cast project. It set the tone for everything that came after. If the site had struggled or felt compromised, it would’ve been harder to believe in the six-week sprint that followed. But when the garage door opened for the first time and light spilled across the floor exactly as imagined, it was clear: we were building something real.
It’s not about showing off technology. It’s about reminding people that the web can still surprise you.


