At the beginning of 2025, I finally decided to build myself a new portfolio. I still pretty much liked the one I made back in 2021, but I felt the need to put to good use all the cool stuff I’ve learned these past couple of years working with WebGPU. Besides, half of the projects featured in my case studies had been taken offline anyway, so it was about time.
I didn’t really know where I was going at this point, except that:
- It would, of course, feature a bunch of procedurally generated WebGPU scenes. I already had a few concepts in mind to explore, like particles or boids simulations.
- I wanted to handle the design myself. It may seem weird, especially since I was very happy with what Gilles came up with for my last portfolio, and also because I do suck at design. But this would give me more freedom, and I’ve also always liked building things from scratch on my own.
- Last but not least, it had to be fun!
1. The journey
The (tough) design and content process
Don’t do this!
At first, I had no idea what to do design-wise. Fonts, colors: there are so many things that could go wrong.
I started with simple light and dark colors, kept the fonts Gilles had chosen for my previous portfolio and started copying and pasting its old text content. It didn’t feel that great, and it wasn’t fun for sure.

I definitely needed colors. I could have wasted a few hours (or days) picking the right pairing, but instead I decided this could be the right opportunity to use the random color palette generator utility I’d coded a few years ago. I cleaned up the code a bit, created a repo, published it to npm and added it to my project. I also slightly changed the tone of the copywriting, and that led me to something still not that great, but a bit more fun.

I let it sit for a while and started working on other parts of the site, such as integrating the CMS or experimenting with the WebGPU scenes. It’s only after a long iteration process that I finally set my mind on this kind of old-school retro video game vibe mixed with a more cheerful, cartoonish aesthetic, almost Candy Crush-esque. Impactful headings, popping animations, banded gradients… you name it.
Of course, I never went as far as creating a Figma project (I did pick a few reference images as a moodboard though) and just tested a ton of stuff directly with code until I felt it wasn’t that bad anymore. All in all, it was a very long and painful process, and I guess every designer would agree at this point: don’t do this!

Do you actually read portfolio content?
Another pain point was choosing the actual content and overall structure of the site. Do I need detailed case study pages? Do I need pages at all? Will users even read all those long blocks of text I’ll struggle to write?
In the end, I chose to drop the case study pages. I had a couple of reasons to do so:
- Oftentimes the project ends up being taken offline for various reasons, and you end up showcasing something the user can’t visit anymore. This is exactly what happened with my previous portfolio.
- Most of the client work I’ve been doing these past years has been for agencies, and I’m not always allowed to share it publicly. I have no problem with that, but it slightly reduced the number of projects I could highlight.
From there, it was a quick decision to just go with a single landing page. I’d put direct links to the projects I could highlight, plus small videos of all the other projects or personal works I wanted to feature. On top of that, I’d add a few “about” sections mixed in with my WebGPU scenes, and that would be the gist of it.
Speaking of the WebGPU scenes, I really wanted them to be meaningful, not just a technical demonstration of what I could do. But we’ll get to that later.
The final UX twist
After a few months, I felt like I was entering the final stage of development. The page structure was mostly complete, all my various sections were there and I was working on the final animations and micro-interaction tweaks.
So I took a step back and looked at my initial expectations. I had my WebGPU scenes showcasing my various technical skills. I had handled the design myself, and it wasn’t that bad. But were the flashy colors and animations enough to make it a really fun experience overall?
I think you already know the answer. Something was missing.
Apart from the random color palette switcher, the UX basically consisted of scroll-driven animations. Most of the 3D scene interactions were rudimentary. I needed an idea.
The design already had this cheerful video game look. So… what if I turned my whole portfolio into a game?
Once again, I started writing down my ideas:
- The user would need to interact with the different UI elements to unlock the theme switcher and color palette generator buttons.
- Each WebGPU scene could serve as a way to unlock the following content, acting as a very basic “puzzle” game.
- Keep track of the user’s overall progress (see the sketch right after this list).
- Allow the user to skip the whole game process if they want to.
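To give you an idea, the progress tracking can be as simple as a small localStorage-backed store. Here is a hypothetical sketch (the step names and storage shape are mine, not the actual implementation):

const STORAGE_KEY = "portfolio-game-progress";

interface GameProgress {
  unlockedSteps: string[]; // e.g. "theme-switcher", "palette-generator"...
  skipped: boolean;
}

const defaultProgress: GameProgress = { unlockedSteps: [], skipped: false };

function loadProgress(): GameProgress {
  try {
    const raw = localStorage.getItem(STORAGE_KEY);
    return raw ? (JSON.parse(raw) as GameProgress) : { ...defaultProgress };
  } catch {
    return { ...defaultProgress };
  }
}

function unlockStep(step: string) {
  const progress = loadProgress();
  if (!progress.unlockedSteps.includes(step)) {
    progress.unlockedSteps.push(step);
    localStorage.setItem(STORAGE_KEY, JSON.stringify(progress));
  }
}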
This means most users would never make it to the footer, or use this random palette generator tool I’d struggled to implement. This might very well be the riskiest, stupidest decision I’ve made so far. But it would give my portfolio the unique and fun touch I was looking for in the first place, so I went all in.
Of course, it goes without saying that this implied a major refactoring of the whole codebase, and that I needed to come up with original interaction ideas for the WebGPU scenes, but I like to think it was worth it.


2. Technical study
Now that you know all the whys, let’s take a look at the hows!
Tech stack
I decided to give Sanity Studio a try since I’d never worked with it before, and because I knew this would be a relatively small project, it felt like a perfect fit to start using it. Even though I’ve only scratched its surface, I liked the overall developer experience it provided. Nuxt3, on the other hand, was an easy choice, as I already had a good experience working with it.
No need to explain why I chose GSAP and Lenis: everyone knows these are great tools for shipping smooth animated websites.
Of course, the WebGPU scenes had to be built with gpu-curtains, the 3D engine I’ve spent so much time working on these past two years. It was a great way to test it in a real-life scenario, and it gave me the opportunity to fix a few bugs and add a couple of features along the way.
And since I wanted the whole process to be as transparent as possible, I’ve published the entire source code as a monorepo on GitHub.
Animations
I won’t go too deep into how I handled the various animations, simply because I essentially used CSS with a bit of GSAP here and there, mostly for the canvas animations, SplitText effects or the videos carousel built with ScrollTrigger’s observer.
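Just to give you a rough idea of the kind of pattern involved, here is a generic SplitText + ScrollTrigger sketch (not the actual site code):

import gsap from "gsap";
import { ScrollTrigger } from "gsap/ScrollTrigger";
import { SplitText } from "gsap/SplitText";

gsap.registerPlugin(ScrollTrigger, SplitText);

// Split a heading into characters and pop them in when it scrolls into view
const split = new SplitText(".heading", { type: "chars" });

gsap.from(split.chars, {
  scrollTrigger: {
    trigger: ".heading",
    start: "top 80%",
  },
  yPercent: 100,
  opacity: 0,
  stagger: 0.02,
  ease: "back.out(2)",
});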
The basic scenes
There are a lot of components on the site that needed to draw something onto a <canvas> and react to the theme and/or color palette changes.
To handle that, I created a Scene.ts class:
import type {
  ColorModelBase,
  ColorPalette,
} from "@martinlaxenaire/color-palette-generator";

export interface SceneParams {
  container: HTMLElement;
  progress?: number;
  palette?: ColorPalette;
  colors?: ColorModelBase[];
}

export class Scene {
  #progress: number;

  container: HTMLElement;
  colors: ColorModelBase[];
  isVisible: boolean;

  constructor({ container, progress = 0, colors = [] }: SceneParams) {
    this.container = container;
    this.colors = colors;
    this.#progress = progress;
    this.isVisible = true;
  }

  onResize() {}

  onRender() {}

  setSceneVisibility(isVisible: boolean = true) {
    this.isVisible = isVisible;
  }

  setColors(colors: ColorModelBase[]) {
    this.colors = colors;
  }

  get progress(): number {
    return this.#progress;
  }

  set progress(value: number) {
    this.#progress = isNaN(value) ? 0 : value;
    this.onProgress();
  }

  forceProgressUpdate(progress: number = 0) {
    this.progress = progress;
  }

  lerp(start = 0, end = 1, amount = 0.1) {
    return (1 - amount) * start + amount * end;
  }

  onProgress() {}

  destroy() {}
}
Since switching the theme from light to dark (or vice versa) also updates the color palette by slightly tweaking the HSV value component of the colors, I simply put a setColors() method in there to handle those changes.
The progress handling is actually a leftover from when the WebGPU scene animations were mostly scroll-driven (before I introduced the game mechanics), but since a few scenes still used it, I kept it in there.
All the 2D canvas scenes extend this class, including the WebGPU fallback scenes, the theme switcher button and the dynamic favicon generator (did you notice that?).
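To illustrate how it’s meant to be used, here is a hypothetical minimal subclass (not one of the actual site scenes, and the way a ColorModelBase converts to a CSS color string is an assumption):

import { Scene } from "./Scene";
import type { SceneParams } from "./Scene";

class SolidColorScene extends Scene {
  ctx: CanvasRenderingContext2D;

  constructor(params: SceneParams) {
    super(params);

    const canvas = document.createElement("canvas");
    this.container.appendChild(canvas);
    this.ctx = canvas.getContext("2d") as CanvasRenderingContext2D;
    this.onResize();
  }

  override onResize() {
    this.ctx.canvas.width = this.container.clientWidth;
    this.ctx.canvas.height = this.container.clientHeight;
  }

  override onRender() {
    if (!this.isVisible) return;

    // Fill the canvas with the first palette color (string conversion assumed)
    this.ctx.fillStyle = String(this.colors[0] ?? "#000");
    this.ctx.fillRect(0, 0, this.ctx.canvas.width, this.ctx.canvas.height);
  }
}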
The WebGPU scenes
One of the very cool features introduced by WebGPU is that you can render to multiple <canvas> elements using only one WebGPU device. I used this to build four different scenes (we’ll take a closer look at each of them below), which all extend a WebGPUScene.ts class:
import { GPUCurtains } from "gpu-curtains";
import type { ComputeMaterial, RenderMaterial } from "gpu-curtains";
import { Scene } from "./Scene";
import type { SceneParams } from "./Scene";
import {
  QualityManager,
  type QualityManagerParams,
} from "./utils/QualityManager";

export interface WebGPUSceneParams extends SceneParams {
  gpuCurtains: GPUCurtains;
  targetFPS?: QualityManagerParams["targetFPS"];
}

export class WebGPUScene extends Scene {
  gpuCurtains: GPUCurtains;
  qualityManager: QualityManager;
  quality: number;

  _onVisibilityChangeHandler: () => void;

  constructor({
    gpuCurtains,
    container,
    progress = 0,
    colors = [],
    targetFPS = 55,
  }: WebGPUSceneParams) {
    super({ container, progress, colors });

    this.gpuCurtains = gpuCurtains;

    this._onVisibilityChangeHandler =
      this.onDocumentVisibilityChange.bind(this);

    this.qualityManager = new QualityManager({
      label: `${this.constructor.name} quality manager`,
      updateDelay: 2000,
      targetFPS,
      onQualityChange: (newQuality) => this.onQualityChange(newQuality),
    });

    this.quality = this.qualityManager.quality.current;

    document.addEventListener(
      "visibilitychange",
      this._onVisibilityChangeHandler
    );
  }

  override setSceneVisibility(isVisible: boolean = true) {
    super.setSceneVisibility(isVisible);
    this.qualityManager.active = isVisible;
  }

  onDocumentVisibilityChange() {
    this.qualityManager.active = this.isVisible && !document.hidden;
  }

  compileMaterialOnIdle(material: ComputeMaterial | RenderMaterial) {
    if (!this.isVisible && "requestIdleCallback" in window) {
      window.requestIdleCallback(() => {
        material.compileMaterial();
      });
    }
  }

  override onRender(): void {
    super.onRender();
    this.qualityManager.update();
  }

  onQualityChange(newQuality: number) {
    this.quality = newQuality;
  }

  override destroy(): void {
    super.destroy();

    document.removeEventListener(
      "visibilitychange",
      this._onVisibilityChangeHandler
    );
  }
}
In the real version, this class also handles the creation of a Tweakpane GUI folder (useful for debugging or tweaking values), but for the sake of clarity I removed the related code here.
As you can see, each of these scenes closely monitors its own performance using a custom QualityManager class. We’ll talk about that later, in the performance section.
Okay, now that we have the basic architecture in mind, let’s break down each of the WebGPU scenes!
Since WebGPU is not fully supported yet, I’ve created a fallback version of each of the following scenes using the 2D canvas API and the Scene class we’ve seen above.
Hero scene
The scenes featured in the portfolio roughly follow a kind of complexity order, meaning the further you advance through the portfolio, the more technically involved the scenes become.
In that regard, the hero scene is by far the simplest technically speaking, but it had to look particularly striking and engaging to immediately grab the user’s attention. It was conceived as some kind of mobile puzzle game splash screen.
It’s made of a single, basic fullscreen quad. The idea is to first rotate its UV components each frame, map them to polar coordinates and use that to create colored triangle segments.
// Center UVs at (0.5, 0.5)
var centeredUV = uv - vec2f(0.5);

// Apply rotation using a 2D rotation matrix
let angleOffset = params.time * params.speed; // Rotation angle in radians
let cosA = cos(angleOffset);
let sinA = sin(angleOffset);

// Rotate the centered UVs
centeredUV = vec2<f32>(
  cosA * centeredUV.x - sinA * centeredUV.y,
  sinA * centeredUV.x + cosA * centeredUV.y
);

// Convert to polar coordinates
let angle = atan2(centeredUV.y, centeredUV.x); // Angle in radians
let radius = length(centeredUV);

// Map angle to triangle index
let totalSegments = params.numTriangles * f32(params.nbColors) * params.fillColorRatio;
let normalizedAngle = (angle + PI) / (2.0 * PI); // Normalize to [0,1]
let triIndex = floor(normalizedAngle * totalSegments); // Get triangle index

// Compute fractional part for blending
let segmentFraction = fract(normalizedAngle * totalSegments); // Value in [0,1] within segment

let isEmpty = (i32(triIndex) % i32(params.fillColorRatio)) == i32(params.fillColorRatio - 1.0);
let colorIndex = i32(triIndex / params.fillColorRatio) % params.nbColors; // Use half as many color indices
let color = select(vec4(params.colors[colorIndex], 1.0), vec4f(0.0), isEmpty);
There’s actually a wavy noise applied to the UVs beforehand using concentric circles, but you get the idea.
Interestingly enough, the most difficult part was achieving the rounded rectangle entrance animation while preserving the correct aspect ratio. This was done using this function:
fn roundedRectSDF(uv: vec2f, resolution: vec2f, radiusPx: f32) -> f32 {
  let aspect = resolution.x / resolution.y;

  // Convert pixel values to normalized UV space
  let marginUV = vec2f(radiusPx) / resolution;
  let radiusUV = vec2f(radiusPx) / resolution;

  // Adjust radius X for aspect ratio
  let radius = vec2f(radiusUV.x * aspect, radiusUV.y);

  // Center UV around (0,0) and apply scale (progress)
  var p = uv * 2.0 - 1.0; // [0,1] → [-1,1]
  p.x *= aspect; // fix aspect
  p /= max(0.0001, params.showProgress); // apply scaling
  p = abs(p);

  // Half size of the rounded rect
  let halfSize = vec2f(1.0) - marginUV * 2.0 - radiusUV * 2.0;
  let halfSizeScaled = vec2f(halfSize.x * aspect, halfSize.y);

  let d = p - halfSizeScaled;
  let outside = max(d, vec2f(0.0));
  let dist = length(outside) + min(max(d.x, d.y), 0.0) - radius.x * 2.0;

  return dist;
}
Highlighted videos slider scene
Next up is the highlighted videos slider. The original idea came from an old WebGL prototype I had built a few years ago and never used.
The idea is to displace the planes vertices to wrap them around a cylinder.
var position: vec3f = attributes.position;

// curve
let angle: f32 = 1.0 / curve.nbItems;
let cosAngle = cos(position.x * PI * angle);
let sinAngle = sin(position.x * PI * angle);

position.z = cosAngle * curve.itemWidth;
position.x = sinAngle;
I obviously used this for the year titles, whereas the videos and the trail effects behind them are distorted using a post-processing pass.
While this was originally tied to the vertical scroll values (and I really liked the feeling it produced), I had to update its behavior when I switched to the whole gamification idea, turning it into a horizontal carousel.
Thanks to gpu-curtains’ DOM-to-WebGPU syncing capabilities, it was relatively easy to set up the videos grid prototype using the Plane class.
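From memory, creating those planes roughly looks like this (a sketch written against the gpu-curtains API as I remember it; check the library docs for the exact parameters):

import { Plane } from "gpu-curtains";

// Assumes an existing GPUCurtains instance and the WGSL shader strings.
// One Plane per DOM element: gpu-curtains keeps each WebGPU quad in sync
// with its element's size and position.
document.querySelectorAll(".plane").forEach((element) => {
  const plane = new Plane(gpuCurtains, element as HTMLElement, {
    shaders: {
      vertex: { code: cylinderVertexShader }, // the wrapping code above
      fragment: { code: videoFragmentShader },
    },
  });
});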
The trail effect is done using a compute shader writing to a storage texture. The compute shader only runs when necessary, that is when the slider is moving. I’m sure it could have been done in a thousand different ways, but it was a good excuse to play with compute shaders and storage textures. Here’s the compute shader involved:
struct Rectangles {
  sizes: vec2f,
  positions: vec2f,
  colors: vec4f
};

struct Params {
  progress: f32,
  intensity: f32
};

@group(0) @binding(0) var backgroundStorageTexture: texture_storage_2d<rgba8unorm, write>;
@group(1) @binding(0) var<uniform> params: Params;
@group(1) @binding(1) var<storage, read> rectangles: array<Rectangles>;

fn sdfRectangle(center: vec2f, size: vec2f) -> f32 {
  let dxy = abs(center) - size;
  return length(max(dxy, vec2(0.0))) + max(min(dxy.x, 0.0), min(dxy.y, 0.0));
}

@compute @workgroup_size(16, 16) fn main(
  @builtin(global_invocation_id) GlobalInvocationID: vec3<u32>
) {
  let bgTextureDimensions = vec2f(textureDimensions(backgroundStorageTexture));

  if(f32(GlobalInvocationID.x) <= bgTextureDimensions.x && f32(GlobalInvocationID.y) <= bgTextureDimensions.y) {
    let uv = vec2f(f32(GlobalInvocationID.x) / bgTextureDimensions.x - params.progress,
      f32(GlobalInvocationID.y) / bgTextureDimensions.y);

    var color = vec4f(0.0, 0.0, 0.0, 0.0); // Default to black

    let nbRectangles: u32 = arrayLength(&rectangles);

    for (var i: u32 = 0; i < nbRectangles; i++) {
      let rectangle = rectangles[i];
      let rectDist = sdfRectangle(uv - rectangle.positions, vec2(rectangle.sizes.x * params.intensity, rectangle.sizes.y));
      color = select(color, rectangle.colors * params.intensity, rectDist < 0.0);
    }

    textureStore(backgroundStorageTexture, vec2<i32>(GlobalInvocationID.xy), color);
  }
}
I thought I was done here, but while running production build tests I stumbled upon an issue. Unfortunately, preloading all these videos to use them as WebGPU textures resulted in a huge initial payload and also significantly affected the CPU load. To mitigate that, I implemented sequential video preloading, where I wait for each video to have buffered enough data before loading the next one. This gave a huge boost in terms of initial load time and CPU overhead.
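The approach boils down to something like this (a simplified sketch of the idea, not the exact implementation):

// Load videos one at a time, waiting until each one has buffered enough
// data (readyState >= HAVE_ENOUGH_DATA) before starting the next one.
async function preloadVideosSequentially(videos: HTMLVideoElement[]) {
  for (const video of videos) {
    await new Promise<void>((resolve) => {
      if (video.readyState >= video.HAVE_ENOUGH_DATA) {
        resolve();
        return;
      }
      video.addEventListener("canplaythrough", () => resolve(), { once: true });
      video.load();
    });
  }
}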

Invoices scene
The third WebGPU scene was originally supposed to be my own take on 3D boids simulations, using instancing and a compute shader. After a bit of work, I had a bunch of instances following my mouse, but the end result was not living up to my expectations. The spheres were sometimes overlapping each other, or disappearing behind the edges of the screen. I kept improving it, adding self-collision, edge detection and attraction/repulsion mechanisms until I was happy enough with the result.
I like to call it the “invoices” scene, because the sphere instances actually represent all the invoices I issued during my freelance career, scaled based on their amounts. Since I’m using Google Sheets to handle most of my accounting, I made a little script that gathers all my invoice amounts in a single, separate private sheet every time I update my accounting sheets. I then fetch and parse that sheet to create the instances. It was a fun little side exercise, and it turns this scene into an ironically meaningful experiment: every time you click and hold, you kind of help me collect my money.
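Fetching the data could look something like this (the URL and column layout are placeholders; only the general approach matches what the site actually does):

// Hypothetical fetch of the private sheet exported as CSV.
async function fetchInvoiceAmounts(sheetCsvUrl: string): Promise<number[]> {
  const response = await fetch(sheetCsvUrl);
  const csv = await response.text();

  return csv
    .split("\n")
    .slice(1) // skip the header row
    .map((row) => parseFloat(row.split(",")[0])) // assumes amounts in the first column
    .filter((amount) => !Number.isNaN(amount));
}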
The compute shader uses a buffer ping-pong technique: you start with two identically filled buffers (e.g. packed raw data), then at each compute dispatch call you read the data from the first buffer and update the second one accordingly. Once done, you swap the two buffers before the next call and repeat the process.
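In raw WebGPU terms, the swap looks something like this (a stripped-down sketch assuming an existing device, compute pipeline and uniform buffer; gpu-curtains abstracts all of this away, but the principle is the same):

// Two identical storage buffers, and two bind groups referencing them
// in alternating read/write order (matching the shader bindings below).
const particleBuffers = [0, 1].map(() =>
  device.createBuffer({
    size: particlesByteLength, // size of the packed particle data
    usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_DST,
  })
);

const bindGroups = [0, 1].map((i) =>
  device.createBindGroup({
    layout: computePipeline.getBindGroupLayout(0),
    entries: [
      { binding: 0, resource: { buffer: simParamsBuffer } }, // uniforms
      { binding: 1, resource: { buffer: particleBuffers[i] } }, // read
      { binding: 2, resource: { buffer: particleBuffers[(i + 1) % 2] } }, // write
    ],
  })
);

let frameIndex = 0;

function dispatchCompute(encoder: GPUCommandEncoder, particleCount: number) {
  const pass = encoder.beginComputePass();
  pass.setPipeline(computePipeline);
  // Swapping the buffers simply means alternating bind groups each frame
  pass.setBindGroup(0, bindGroups[frameIndex % 2]);
  pass.dispatchWorkgroups(Math.ceil(particleCount / 64));
  pass.end();
  frameIndex++;
}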
If you’re familiar with WebGL, this is usually done with textures. WebGPU and compute shaders allow us to do it with buffers, which is way more powerful. Here is the complete compute shader code:
struct ParticleB {
  position: vec4f,
  velocity: vec4f,
  rotation: vec4f,
  angularVelocity: vec4f,
  data: vec4f
};

struct ParticleA {
  position: vec4f,
  velocity: vec4f,
  rotation: vec4f,
  angularVelocity: vec4f,
  data: vec4f
};

struct SimParams {
  deltaT: f32,
  mousePosition: vec3f,
  mouseAttraction: f32,
  spheresRepulsion: f32,
  boxReboundFactor: f32,
  boxPlanes: array<vec4f, 6>
};

@group(0) @binding(0) var<uniform> params: SimParams;
@group(0) @binding(1) var<storage, read> particlesA: array<ParticleA>;
@group(0) @binding(2) var<storage, read_write> particlesB: array<ParticleB>;

fn constrainToFrustum(pos: vec3<f32>, ptr_velocity: ptr<function, vec3<f32>>, radius: f32) -> vec3<f32> {
  var correctedPos = pos;

  for (var i = 0u; i < 6u; i++) { // Loop through 6 frustum planes
    let plane = params.boxPlanes[i];
    let dist = dot(plane.xyz, correctedPos) + plane.w;

    if (dist < radius) { // If inside the plane boundary (radius = 1)
      // Move the point inside the frustum
      let correction = plane.xyz * (-dist + radius); // Push inside the frustum

      // Apply the position correction
      correctedPos += correction;

      // Reflect velocity with damping
      let normal = plane.xyz;
      let velocityAlongNormal = dot(*(ptr_velocity), normal);

      if (velocityAlongNormal < 0.0) { // Ensure we only reflect if moving towards the plane
        *(ptr_velocity) -= (1.0 + params.boxReboundFactor) * velocityAlongNormal * normal;
      }
    }
  }

  return correctedPos;
}

fn quaternionFromAngularVelocity(omega: vec3f, dt: f32) -> vec4f {
  let theta = length(omega) * dt;
  if (theta < 1e-5) {
    return vec4(0.0, 0.0, 0.0, 1.0);
  }
  let axis = normalize(omega);
  let halfTheta = 0.5 * theta;
  let sinHalf = sin(halfTheta);
  return vec4(axis * sinHalf, cos(halfTheta));
}

fn quaternionMul(a: vec4f, b: vec4f) -> vec4f {
  return vec4(
    a.w * b.xyz + b.w * a.xyz + cross(a.xyz, b.xyz),
    a.w * b.w - dot(a.xyz, b.xyz)
  );
}

fn integrateQuaternion(q: vec4f, angularVel: vec3f, dt: f32) -> vec4f {
  let omega = vec4(angularVel, 0.0);
  let dq = 0.5 * quaternionMul(q, omega);
  return normalize(q + dq * dt);
}

@compute @workgroup_size(64) fn main(
  @builtin(global_invocation_id) GlobalInvocationID: vec3<u32>
) {
  var index = GlobalInvocationID.x;

  var vPos = particlesA[index].position.xyz;
  var vVel = particlesA[index].velocity.xyz;
  var collision = particlesA[index].velocity.w;

  var vQuat = particlesA[index].rotation;
  var angularVelocity = particlesA[index].angularVelocity.xyz;

  var vData = particlesA[index].data;
  let sphereRadius = vData.x;
  var newCollision = vData.y;

  collision += (newCollision - collision) * 0.2;
  collision = smoothstep(0.0, 1.0, collision);

  newCollision = max(0.0, newCollision - 0.0325);

  let mousePosition: vec3f = params.mousePosition;

  let minDistance: f32 = sphereRadius; // Minimum allowed distance between spheres

  // Compute attraction towards sphere 0
  var directionToCenter = mousePosition - vPos;
  let distanceToCenter = length(directionToCenter);

  // Slow down when close to the attractor
  var dampingFactor = smoothstep(0.0, minDistance, distanceToCenter);

  if (distanceToCenter > minDistance && params.mouseAttraction > 0.0) { // Only attract if outside the minimum distance
    vVel += normalize(directionToCenter) * params.mouseAttraction * dampingFactor;
    vVel *= 0.95;
  }

  // Collision handling: packing spheres instead of pushing them away
  var particlesArrayLength = arrayLength(&particlesA);

  for (var i = 0u; i < particlesArrayLength; i++) {
    if (i == index) {
      continue;
    }

    let otherPos = particlesA[i].position.xyz;
    let otherRadius = particlesA[i].data.x;
    let collisionMinDist = sphereRadius + otherRadius;

    let toOther = otherPos - vPos;
    let dist = length(toOther);

    if (dist < collisionMinDist) {
      let pushDir = normalize(toOther);
      let overlap = collisionMinDist - dist;
      let pushStrength = otherRadius / sphereRadius; // radius

      // Push away proportionally to overlap
      vVel -= pushDir * (overlap * params.spheresRepulsion) * pushStrength;

      newCollision = min(1.0, pushStrength * 1.5);

      let r = normalize(cross(pushDir, vVel));
      angularVelocity += r * length(vVel) * 0.1 * pushStrength;
    }
  }

  let projectedVelocity = dot(vVel, directionToCenter); // Velocity component towards mouse
  let mainSphereRadius = 1.0;

  if(distanceToCenter <= (mainSphereRadius + minDistance)) {
    let pushDir = normalize(directionToCenter);
    let overlap = (mainSphereRadius + minDistance) - distanceToCenter;

    // Push away proportionally to overlap
    vVel -= pushDir * (overlap * params.spheresRepulsion) * (2.0 + params.mouseAttraction);

    newCollision = 1.0;

    if(params.mouseAttraction > 0.0) {
      vPos -= pushDir * overlap;
    }

    let r = normalize(cross(pushDir, vVel));
    angularVelocity += r * length(vVel) * 0.05;
  }

  vPos = constrainToFrustum(vPos, &vVel, sphereRadius);

  // Apply velocity update
  vPos += vVel * params.deltaT;

  angularVelocity *= 0.98;
  let updatedQuat = integrateQuaternion(vQuat, angularVelocity, params.deltaT);

  // Write back
  particlesB[index].position = vec4(vPos, 0.0);
  particlesB[index].velocity = vec4(vVel, collision);
  particlesB[index].data = vec4(vData.x, newCollision, vData.z, vData.w);
  particlesB[index].rotation = updatedQuat;
  particlesB[index].angularVelocity = vec4(angularVelocity, 1.0);
}
One of my main inspirations for this scene was this awesome demo by Patrick Schroen. I spent a lot of time looking for the right rendering tricks to use, and finally settled on volumetric lighting. The implementation is quite similar to what Maxime Heckel explained in his excellent breakdown article. Funnily enough, I was already deep into my own implementation when he released that piece, and I owe him the idea of using a blue noise texture.
As a side note, during the development phase this was the first scene that required an actual user interaction, and it played a pivotal role in my decision to turn my folio into a game.
Open source scene
For the last scene, I wanted to experiment a bit more with particles and curl noise, because I’ve always liked how organic and beautiful it can get. I had already published an article using these concepts, so I had to come up with something different. Jaume Sanchez’ Polygon Shredder was definitely a major inspiration here.
Since this experiment was part of my open source commitment section, I had the idea of using my GitHub statistics as a data source for the particles. Each statistic (number of commits, followers, issues closed and so on) is assigned a color and turned into a bunch of particles. You can even toggle them on and off using the filters in the info pop-up. Once again, this turned a rather technical demo into something more meaningful.
While working on the portfolio, I was also exploring new rendering techniques with gpu-curtains, such as planar reflections. Traditionally used for mirror effects or floor reflections, the technique consists of rendering part of your scene a second time from a different camera angle and projecting it onto a plane. Having nailed it, I thought it would be a perfect fit there and added it to the scene.
Last but not least, and as a nod to the retro video game vibe, I wanted to add a pixelated mouse trail post-processing effect. I soon realized it would be way too much though, and ended up displaying it only when the user is actually drawing a line, making it more subtle.

Performance and accessibility
On such highly interactive and immersive pages, performance is key. Here are a few tricks I used to try to maintain the most fluid experience across all devices.
Dynamic imports
I used Nuxt’s dynamically imported components and lazy hydration for almost every non-critical component of the page. In the same way, all the WebGPU scenes are dynamically loaded only if WebGPU is supported. This significantly decreased the initial page load time.
// pseudo code
import type { WebGPUHeroScene } from "~/scenes/hero/WebGPUHeroScene";
import { CanvasHeroScene } from "~/scenes/hero/CanvasHeroScene";

let scene: WebGPUHeroScene | CanvasHeroScene | null;
const canvas = useTemplateRef("canvas");
const { colors } = usePaletteGenerator();

onMounted(async () => {
  const { $gpuCurtains, $hasWebGPU, $isReducedMotion } = useNuxtApp();

  if ($hasWebGPU && canvas.value) {
    const { WebGPUHeroScene } = await import("~/scenes/hero/WebGPUHeroScene");

    scene = new WebGPUHeroScene({
      gpuCurtains: $gpuCurtains,
      container: canvas.value,
      colors: colors.value,
    });
  } else if (canvas.value) {
    scene = new CanvasHeroScene({
      container: canvas.value,
      isReducedMotion: $isReducedMotion,
      colors: colors.value,
    });
  }
});
I’m not particularly fond of Lighthouse reports, but as you can see the test result is pretty good (note that it’s running without WebGPU though).

Monitoring WebGPU performance in real time
I briefly mentioned it earlier, but each WebGPU scene actually monitors its own performance by keeping track of its FPS rate in real time. To do so, I wrote two separate classes: FPSWatcher, which records the average FPS over a given period of time, and QualityManager, which uses an FPSWatcher to set a current quality rating on a 0 to 10 scale based on the average FPS.
This is what they look like:
export interface FPSWatcherParams {
  updateDelay?: number;
  onWatch?: (averageFPS: number) => void;
}

export default class FPSWatcher {
  updateDelay: number;
  onWatch: (averageFPS: number) => void;
  frames: number[];
  lastTs: number;
  elapsedTime: number;
  average: number;

  constructor({
    updateDelay = 1000, // ms
    onWatch = () => {}, // callback called every ${updateDelay}ms
  }: FPSWatcherParams = {}) {
    this.updateDelay = updateDelay;
    this.onWatch = onWatch;

    this.frames = [];

    this.lastTs = performance.now();
    this.elapsedTime = 0;

    this.average = 0;
  }

  restart() {
    this.frames = [];
    this.elapsedTime = 0;
    this.lastTs = performance.now();
  }

  update() {
    const delta = performance.now() - this.lastTs;
    this.lastTs = performance.now();
    this.elapsedTime += delta;

    this.frames.push(delta);

    if (this.elapsedTime > this.updateDelay) {
      const framesTotal = this.frames.reduce((a, b) => a + b, 0);

      this.average = (this.frames.length * 1000) / framesTotal;

      this.frames = [];
      this.elapsedTime = 0;

      this.onWatch(this.average);
    }
  }
}
It’s very basic: I just record the elapsed time between two render calls, push that into an array and run a callback every updateDelay milliseconds with the latest average FPS value.
It’s then used by the QualityManager class, which does all the heavy lifting to assign an accurate current quality score:
import type { FPSWatcherParams } from "./FPSWatcher";
import FPSWatcher from "./FPSWatcher";

export interface QualityManagerParams {
  label?: string;
  updateDelay?: FPSWatcherParams["updateDelay"];
  targetFPS?: number;
  onQualityChange?: (newQuality: number) => void;
}

export class QualityManager {
  label: string;
  fpsWatcher: FPSWatcher;
  targetFPS: number;
  #lastFPS: number | null;
  #active: boolean;
  onQualityChange: (newQuality: number) => void;

  quality: {
    current: number;
    min: number;
    max: number;
  };

  constructor({
    label = "Quality manager",
    updateDelay = 1000,
    targetFPS = 60,
    onQualityChange = (newQuality) => {},
  }: QualityManagerParams = {}) {
    this.label = label;
    this.onQualityChange = onQualityChange;

    this.quality = {
      min: 0,
      max: 10,
      current: 7,
    };

    this.#active = true;

    this.targetFPS = targetFPS;
    this.#lastFPS = null;

    this.fpsWatcher = new FPSWatcher({
      updateDelay,
      onWatch: (averageFPS) => this.onFPSWatcherUpdate(averageFPS),
    });
  }

  get active() {
    return this.#active;
  }

  set active(value: boolean) {
    if (!this.active && value) {
      this.fpsWatcher.restart();
    }

    this.#active = value;
  }

  onFPSWatcherUpdate(averageFPS = 0) {
    const lastFpsRatio = this.#lastFPS
      ? Math.round(averageFPS / this.#lastFPS)
      : 1;
    const fpsRatio = (averageFPS + lastFpsRatio) / this.targetFPS;

    // if fps ratio is over 0.95, we should upgrade
    // else we downgrade
    const boostedFpsRatio = fpsRatio / 0.95;

    // smooth change multiplier to avoid big changes in quality
    // except if we've seen a big change from the last FPS values
    const smoothChangeMultiplier = 0.5 * lastFpsRatio;

    // quality difference that should be applied (number with 2 decimals)
    const qualityDiff =
      Math.round((boostedFpsRatio - 1) * 100) * 0.1 * smoothChangeMultiplier;

    if (Math.abs(qualityDiff) > 0.25) {
      const newQuality = Math.min(
        Math.max(
          this.quality.current + Math.round(qualityDiff),
          this.quality.min
        ),
        this.quality.max
      );

      this.setCurrentQuality(newQuality);
    }

    this.#lastFPS = averageFPS;
  }

  setCurrentQuality(newQuality: number) {
    this.quality.current = newQuality;
    this.onQualityChange(this.quality.current);
  }

  update() {
    if (this.active) {
      this.fpsWatcher.update();
    }
  }
}
The most difficult part here is to smooth out the quality changes, to avoid huge drops or gains in quality. You also don’t want to fall into a loop where, for example:
- The average FPS is poor, so you degrade your current quality.
- You detect a quality loss and therefore decide to switch off an important feature, such as shadow mapping.
- Removing the shadow mapping gives you an FPS boost, and after the expected delay the current quality is upgraded.
- You detect a quality gain, decide to re-enable shadow mapping, and soon enough you’re back to step 1.
Typically, the quality rating is used to update things such as the scene’s current pixel ratio, frame buffer resolutions, number of shadow map PCF samples, volumetric raymarching steps and so on. In worst-case scenarios, it can even disable shadow mapping or post-processing effects entirely.
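As an illustration, the mapping could look like this (the thresholds are made up; the point is the gap between the “disable” and “re-enable” thresholds, which is the hysteresis that prevents the loop described above):

interface SceneSettings {
  pixelRatio: number;
  shadows: boolean;
  raymarchingSteps: number;
}

function settingsForQuality(quality: number, current: SceneSettings): SceneSettings {
  return {
    pixelRatio: Math.min(window.devicePixelRatio, 0.5 + quality * 0.1),
    // Disable shadows below 3, re-enable only from 6 up: in between, keep
    // the current state so the FPS boost gained by turning a feature off
    // doesn't immediately turn it back on.
    shadows: quality <= 3 ? false : quality >= 6 ? true : current.shadows,
    raymarchingSteps: Math.max(8, quality * 6),
  };
}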
Accessibility
Finally, the site had to respect at least the basic accessibility standards. I’m not an accessibility expert and I may have made a few mistakes here and there, but the key points are that the HTML is semantically correct, it’s possible to navigate using the keyboard, and the prefers-reduced-motion preference is respected. I achieved the latter by entirely disabling the gamification concept for those users, removing both the CSS and JavaScript animations, and making the scenes fall back to their 2D canvas versions, without being animated at all.
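The detection itself is just a media query check, along these lines (the actual wiring is a Nuxt plugin, simplified here):

// Scenes and animations read this flag to opt out of motion entirely.
const isReducedMotion = window.matchMedia(
  "(prefers-reduced-motion: reduce)"
).matches;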
Conclusion
Well, it was a long journey, wasn’t it?
Working on my portfolio these past 6 months has been a truly demanding task, technically but also emotionally. I still have a lot of self-doubt about the overall design, key UX decisions or level of creativity. But I also think it genuinely sums up who I am, both as a developer and as a person. In the end, that’s probably what matters most.
I hope you’ve learned a few things reading this case study, whether about the technical stuff or my own creative process. Thank you all, and remember: stay fun!


