In this tutorial, we'll learn how to bring motion and interactivity to your WebGL projects by combining GSAP with custom shaders. Working with the dev team at Adoratorio Studio, I'll guide you through four GPU-powered effects, from ripples that react to clicks to dynamic blurs that respond to scroll and drag.
We'll start by setting up a simple WebGL scene and syncing it with our HTML layout. From there, we'll move step by step through more advanced interactions, animating shader uniforms, blending textures, and revealing images through masks, until we turn everything into a scrollable, animated carousel.
By the end, you'll understand how to connect GSAP timelines with shader parameters to create fluid, expressive visuals that react in real time and form the foundation for your own immersive web experiences.
Creating the HTML structure
As a first step, we will set up the page using HTML.
We'll create a container without specifying its dimensions, allowing it to extend beyond the page width. Then, we will set the main container's overflow property to hidden, since the page will later be made interactive through the GSAP Draggable and ScrollTrigger functionalities.
<main>
  <section class="content">
    <div class="content__carousel">
      <div class="content__carousel-inner-static">
        <div class="content__carousel-image">
          <img src="/images/01.webp" alt="" role="presentation">
          <span>Lorem — 001</span>
        </div>
        <div class="content__carousel-image">
          <img src="/images/04.webp" alt="" role="presentation">
          <span>Ipsum — 002</span>
        </div>
        <div class="content__carousel-image">
          <img src="/images/02.webp" alt="" role="presentation">
          <span>Dolor — 003</span>
        </div>
        ...
      </div>
    </div>
  </section>
</main>
We'll style all of this and then move on to the next step.
Sync between HTML and Canvas
We will now start integrating Three.js into our challenge by making a Stage class liable for managing all 3D engine logic. Initially, this class will arrange a renderer, a scene, and a digital camera.
We’ll move an HTML node as the primary parameter, which can act because the container for our canvas.
Subsequent, we are going to replace the CSS and the primary script to create a full-screen canvas that resizes responsively and renders on each GSAP body.
import { WebGLRenderer, Scene, OrthographicCamera } from 'three';

export default class Stage {
  constructor(container) {
    this.container = container;
    this.DOMElements = [...this.container.querySelectorAll('img')];

    this.renderer = new WebGLRenderer({
      powerPreference: 'high-performance',
      antialias: true,
      alpha: true,
    });
    this.renderer.setPixelRatio(Math.min(1.5, window.devicePixelRatio));
    this.renderer.setSize(window.innerWidth, window.innerHeight);
    this.renderer.domElement.classList.add('content__canvas');
    this.container.appendChild(this.renderer.domElement);

    this.scene = new Scene();

    const { innerWidth: width, innerHeight: height } = window;
    this.camera = new OrthographicCamera(-width / 2, width / 2, height / 2, -height / 2, -1000, 1000);
    this.camera.position.z = 10;
  }

  resize() {
    // Update camera props to fit the canvas size
    const { innerWidth: screenWidth, innerHeight: screenHeight } = window;
    this.camera.left = -screenWidth / 2;
    this.camera.right = screenWidth / 2;
    this.camera.top = screenHeight / 2;
    this.camera.bottom = -screenHeight / 2;
    this.camera.updateProjectionMatrix();

    // Also update the plane sizes
    this.DOMElements.forEach((image, index) => {
      const { width: imageWidth, height: imageHeight } = image.getBoundingClientRect();
      this.scene.children[index].scale.set(imageWidth, imageHeight, 1);
    });

    // Update the renderer using the window sizes
    this.renderer.setSize(screenWidth, screenHeight);
  }

  render() {
    this.renderer.render(this.scene, this.camera);
  }
}
Back in our main.js file, we'll first handle the stage's resize event. After that, we'll synchronize the renderer's requestAnimationFrame (RAF) with GSAP by using gsap.ticker.add, passing the stage's render function as the callback.
// Update resize with the stage resize
function resize() {
  ...
  stage.resize();
}

// Add the render cycle to the gsap ticker
gsap.ticker.add(stage.render.bind(stage));
<style>
.content__canvas {
  position: absolute;
  top: 0;
  left: 0;
  width: 100vw;
  height: 100svh;
  z-index: 2;
  pointer-events: none;
}
</style>
It's now time to load all the images included in the HTML. For each image, we will create a plane and add it to the scene. To achieve this, we'll update the class by adding two new methods:
setUpPlanes() {
  this.DOMElements.forEach((image) => {
    this.scene.add(this.generatePlane(image));
  });
}

generatePlane(image) {
  const loader = new TextureLoader();
  const texture = loader.load(image.src);
  texture.colorSpace = SRGBColorSpace;

  const plane = new Mesh(
    new PlaneGeometry(1, 1),
    new MeshStandardMaterial({ map: texture }),
  );
  return plane;
}
We can then call setUpPlanes() within the constructor of our Stage class.
The result should resemble the following, depending on the camera's z-position and the planes' placement, both of which can be adjusted to fit our specific needs.

The next step is to position the planes precisely to correspond with the location of their associated images, and to update their positions on each frame. To achieve this, we will implement a utility function that converts screen space (CSS pixels) into world space, leveraging the OrthographicCamera, which is already aligned with the screen.
const getWorldPositionFromDOM = (element, camera) => {
  const rect = element.getBoundingClientRect();

  const xNDC = (rect.left + rect.width / 2) / window.innerWidth * 2 - 1;
  const yNDC = -((rect.top + rect.height / 2) / window.innerHeight * 2 - 1);

  const xWorld = xNDC * (camera.right - camera.left) / 2;
  const yWorld = yNDC * (camera.top - camera.bottom) / 2;

  return new Vector3(xWorld, yWorld, 0);
};
render() {
  this.renderer.render(this.scene, this.camera);

  // For each plane and each image, update the plane's position to match the DOM element's position on the page
  this.DOMElements.forEach((image, index) => {
    this.scene.children[index].position.copy(getWorldPositionFromDOM(image, this.camera));
  });
}
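To double-check the screen-to-world mapping without Three.js, here is a dependency-free sketch of the same math; the `viewport` and `camera` objects are hypothetical stand-ins for `window` and the orthographic camera above.

```javascript
// Standalone sketch of the screen-to-world conversion (no Three.js).
// `viewport` and `camera` are hypothetical stand-ins, not demo code.
const toWorld = (rect, viewport, camera) => {
  const xNDC = (rect.left + rect.width / 2) / viewport.width * 2 - 1;
  const yNDC = -((rect.top + rect.height / 2) / viewport.height * 2 - 1);
  return {
    x: xNDC * (camera.right - camera.left) / 2,
    y: yNDC * (camera.top - camera.bottom) / 2,
  };
};

const viewport = { width: 1000, height: 800 };
const camera = { left: -500, right: 500, top: 400, bottom: -400 };

// An element centered in the viewport lands at the world origin;
// one in the top-left quadrant gets a negative x and a positive y.
console.log(toWorld({ left: 450, top: 350, width: 100, height: 100 }, viewport, camera));
console.log(toWorld({ left: 0, top: 0, width: 100, height: 100 }, viewport, camera));
```

Because the orthographic frustum matches the viewport one-to-one, a plane scaled to the image's pixel size and moved to this world position overlaps its DOM image exactly.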

By hiding the original DOM carousel, we can now display only the images as planes within the canvas. Create a simple class extending ShaderMaterial and use it in place of MeshStandardMaterial for the planes.
const plane = new Mesh(
  new PlaneGeometry(1, 1),
  new PlanesMaterial(),
);

...

import { ShaderMaterial } from 'three';
import baseVertex from './base.vert?raw';
import baseFragment from './base.frag?raw';

export default class PlanesMaterial extends ShaderMaterial {
  constructor() {
    super({
      vertexShader: baseVertex,
      fragmentShader: baseFragment,
    });
  }
}
// base.vert
varying vec2 vUv;

void main() {
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
  vUv = uv;
}

// base.frag
varying vec2 vUv;

void main() {
  gl_FragColor = vec4(vUv.x, vUv.y, 0.0, 1.0);
}

We can then replace the shader output with texture sampling based on the UV coordinates, passing the texture to the material and shaders as a uniform.
...
const plane = new Mesh(
  new PlaneGeometry(1, 1),
  new PlanesMaterial(texture),
);
...

export default class PlanesMaterial extends ShaderMaterial {
  constructor(texture) {
    super({
      vertexShader: baseVertex,
      fragmentShader: baseFragment,
      uniforms: {
        uTexture: { value: texture },
      },
    });
  }
}
// base.frag
varying vec2 vUv;
uniform sampler2D uTexture;

void main() {
  vec4 diffuse = texture2D(uTexture, vUv);
  gl_FragColor = diffuse;
}

Click on the images for a ripple and coloring effect
This step breaks down the creation of an interactive grayscale transition effect, emphasizing the connection between JavaScript (using GSAP) and GLSL shaders.
Step 1: Instant Color/Grayscale Toggle
Let's start with the simplest version: clicking the image makes it instantly switch between color and grayscale.
The JavaScript (GSAP)
At this stage, GSAP's role is to act as a simple "on/off" switch, so let's create a GSAP Observer to watch for the mouse click interaction:
this.observer = Observer.create({
  target: document.querySelector('.content__carousel'),
  type: 'touch,pointer',
  onClick: e => this.onClick(e),
});
Here's what happens, step by step:
- Click Detection: We use an Observer to detect a click on our plane.
- State Management: A boolean flag, isBw (is Black and White), is toggled on each click.
- Shader Update: We use gsap.set() to instantly change a uniform in our shader. We'll call it uGrayscaleProgress.
  - If isBw is true, uGrayscaleProgress becomes 1.0.
  - If isBw is false, uGrayscaleProgress becomes 0.0.
onClick(e) {
  if (intersection) {
    const { material, userData } = intersection.object;
    userData.isBw = !userData.isBw;

    gsap.set(material.uniforms.uGrayscaleProgress, {
      value: userData.isBw ? 1.0 : 0.0
    });
  }
}
The Shader (GLSL)
The fragment shader is very simple. It receives uGrayscaleProgress and uses it as a switch.
uniform sampler2D uTexture;
uniform float uGrayscaleProgress; // Our "switch" (0.0 or 1.0)
varying vec2 vUv;

vec3 toGrayscale(vec3 color) {
  float gray = dot(color, vec3(0.299, 0.587, 0.114));
  return vec3(gray);
}

void main() {
  vec3 originalColor = texture2D(uTexture, vUv).rgb;
  vec3 grayscaleColor = toGrayscale(originalColor);

  vec3 finalColor = mix(originalColor, grayscaleColor, uGrayscaleProgress);
  gl_FragColor = vec4(finalColor, 1.0);
}
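The dot() weights above are the standard Rec. 601 luma coefficients, so the blend is easy to verify outside the shader. Here is a plain-JavaScript port, purely for illustration (toGrayscale and mix are named after their GLSL counterparts; not part of the demo code):

```javascript
// JS port of the shader's toGrayscale(): Rec. 601 luma weights.
// A color is [r, g, b] with components in 0..1, like the GLSL vec3.
const toGrayscale = ([r, g, b]) => {
  const gray = 0.299 * r + 0.587 * g + 0.114 * b;
  return [gray, gray, gray];
};

// mix(a, b, t) is GLSL's linear interpolation.
const mix = (a, b, t) => a.map((v, i) => v + (b[i] - v) * t);

const original = [0.2, 0.6, 0.4];
const gray = toGrayscale(original);

// uGrayscaleProgress = 0 → untouched, 1 → fully gray, 0.5 → halfway.
console.log(mix(original, gray, 0));
console.log(mix(original, gray, 1));
console.log(mix(original, gray, 0.5));
```

Because uGrayscaleProgress is the third argument of mix(), any value GSAP feeds it between 0 and 1 produces a partially desaturated frame, which is what makes the animated version in the next step work.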
Step 2: Animated Circular Reveal
An instant switch is boring. Let's make the transition a smooth, circular reveal that expands from the center.
The JavaScript (GSAP)
GSAP's role now changes from a switch to an animator.
Instead of gsap.set(), we use gsap.to() to animate uGrayscaleProgress from 0 to 1 (or 1 to 0) over a set duration. This sends a continuous stream of values (0.0, 0.01, 0.02, …) to the shader.
gsap.to(material.uniforms.uGrayscaleProgress, {
  value: userData.isBw ? 1 : 0,
  duration: 1.5,
  ease: 'power2.inOut'
});
The Shader (GLSL)
The shader now uses the animated uGrayscaleProgress to define the radius of a circle.
void main() {
  // 1. Measure each pixel's distance from the center.
  float dist = distance(vUv, vec2(0.5));

  // 2. Create a circular mask.
  float mask = smoothstep(uGrayscaleProgress - 0.1, uGrayscaleProgress, dist);

  // 3. Mix the colors based on the mask's value for each pixel.
  vec3 finalColor = mix(originalColor, grayscaleColor, mask);
  gl_FragColor = vec4(finalColor, 1.0);
}
How smoothstep works here: pixels where dist is less than uGrayscaleProgress − 0.1 get a mask value of 0. Pixels where dist is greater than uGrayscaleProgress get a value of 1. In between, the value ramps smoothly, creating the soft edge.
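If you want to poke at those mask values outside the shader, smoothstep is easy to reproduce in plain JavaScript (a throwaway sketch, not part of the demo):

```javascript
// GLSL's smoothstep(edge0, edge1, x): clamp to 0..1, then apply 3t^2 - 2t^3.
const smoothstep = (edge0, edge1, x) => {
  const t = Math.min(Math.max((x - edge0) / (edge1 - edge0), 0), 1);
  return t * t * (3 - 2 * t);
};

// With uGrayscaleProgress = 0.5, the soft edge lives between dist 0.4 and 0.5:
console.log(smoothstep(0.4, 0.5, 0.3));  // inside the circle → 0
console.log(smoothstep(0.4, 0.5, 0.45)); // on the edge → ~0.5
console.log(smoothstep(0.4, 0.5, 0.6));  // outside → 1
```

The 0.1 offset between the two edges is the width of the soft ring; widen it for a blurrier reveal, shrink it for a harder one.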
Step 3: Originating from the Mouse Click
The effect is much more engaging if it starts from the exact point of the click.
The JavaScript (GSAP)
We need to tell the shader where the click happened.
- Raycasting: We use a Raycaster to find the precise (u, v) texture coordinate of the click on the mesh.
- uMouse Uniform: We add a uniform vec2 uMouse to our material.
- GSAP Timeline: Before the animation starts, we use .set() on our GSAP timeline to update the uMouse uniform with the intersection.uv coordinates.
if (intersection) {
  const { material, userData } = intersection.object;
  material.uniforms.uMouse.value = intersection.uv;

  gsap.to(material.uniforms.uGrayscaleProgress, {
    value: userData.isBw ? 1 : 0
  });
}
The Shader (GLSL)
We simply replace the hardcoded center with our new uMouse uniform.
...
uniform vec2 uMouse; // The (u,v) coordinates of the click
...

void main() {
  ...
  // 1. Calculate the distance from the MOUSE CLICK, not the center.
  float dist = distance(vUv, uMouse);
}
Important detail: to make sure the circular reveal always covers the full plane, even when clicking in a corner, we calculate the maximum possible distance from the click point to any of the four corners (getMaxDistFromCorners) and normalize our dist value with it: dist / maxDist.
This ensures the animation always completes fully.
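getMaxDistFromCorners isn't shown in the snippets, but the idea is simple enough to sketch. Here is one plausible plain-JavaScript version, an assumption about the implementation rather than the demo's exact code:

```javascript
// Distance from a click point (in 0..1 UV space) to its farthest corner.
// Dividing dist by this value guarantees the normalized reveal reaches 1.0
// everywhere on the plane, wherever the click lands.
const getMaxDistFromCorners = (uv) => {
  const corners = [[0, 0], [1, 0], [0, 1], [1, 1]];
  return Math.max(...corners.map(([x, y]) => Math.hypot(uv.x - x, uv.y - y)));
};

console.log(getMaxDistFromCorners({ x: 0.5, y: 0.5 })); // center click → ~0.707
console.log(getMaxDistFromCorners({ x: 0, y: 0 }));     // corner click → ~1.414
```

In the shader this would be computed from uMouse with the same four-corner max; the JS version just makes the numbers easy to inspect.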
Step 4: Adding the Final Ripple Effect
The last step is to add the 3D ripple effect that deforms the plane. This requires modifying the vertex shader.
The JavaScript (GSAP)
We need one more animated uniform to control the ripple's lifecycle.
- uRippleProgress Uniform: We add a uniform float uRippleProgress.
- GSAP Keyframes: In the same timeline, we animate uRippleProgress from 0 to 1 and back to 0. This makes the wave rise up and then settle back down.
gsap.timeline({ defaults: { duration: 1.5, ease: 'power3.inOut' } })
  .set(material.uniforms.uMouse, { value: intersection.uv }, 0)
  .to(material.uniforms.uGrayscaleProgress, { value: 1 }, 0)
  .to(material.uniforms.uRippleProgress, {
    keyframes: { value: [0, 1, 0] } // Rise and fall
  }, 0);
The Shaders (GLSL)
High-Poly Geometry: to see a smooth deformation, the PlaneGeometry in Three.js must be created with many segments (e.g., new PlaneGeometry(1, 1, 50, 50)). This gives the vertex shader more points to manipulate.
generatePlane(image) {
  ...
  const plane = new Mesh(
    new PlaneGeometry(1, 1, 50, 50),
    new PlanesMaterial(texture),
  );
  return plane;
}
Vertex Shader: this shader now calculates the wave and moves the vertices.
#define PI 3.141592653589793

uniform float uRippleProgress;
uniform float uTime;
uniform vec2 uMouse;
varying float vRipple; // Pass the ripple intensity to the fragment shader

void main() {
  vec3 pos = position;
  float dist = distance(uv, uMouse);

  float ripple = sin(-PI * 10.0 * (dist - uTime * 0.1));
  ripple *= uRippleProgress;

  pos.y += ripple * 0.1;

  vRipple = ripple;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.0);
}
Fragment Shader: we can use the ripple intensity to add a final touch, like making the wave crests brighter.
varying float vRipple; // Received from the vertex shader

void main() {
  // ... (all the color and mask logic from before)
  vec3 color = mix(color1, color2, mask);

  // Add a highlight based on the wave's height
  color += vRipple * 2.0;

  gl_FragColor = vec4(color, diffuse.a);
}
By layering these techniques, we create a rich, interactive effect where JavaScript and GSAP act as the puppet master, telling the shaders what to do, while the shaders handle the heavy lifting of drawing it beautifully and efficiently on the GPU.
Step 5: Reverse effect on the previous tile
As a final step, we set up a reverse animation for the current tile whenever a new tile is clicked. Let's start by creating the reset animation that reverses the uniform animations:
resetMaterial(object) {
  // Reset all shader uniforms to their default values
  gsap.timeline({
    defaults: { duration: 1, ease: 'power2.out' },
    onUpdate() {
      object.material.uniforms.uTime.value += 0.1;
    },
    onComplete() {
      object.userData.isBw = false;
    }
  })
  .set(object.material.uniforms.uMouse, { value: { x: 0.5, y: 0.5 } }, 0)
  .set(object.material.uniforms.uDirection, { value: 1.0 }, 0)
  .fromTo(object.material.uniforms.uGrayscaleProgress, { value: 1 }, { value: 0 }, 0)
  .to(object.material.uniforms.uRippleProgress, { keyframes: { value: [0, 1, 0] } }, 0);
}
Now, on each click, we need to store the current tile so we can pass its material to the reset animation. Let's modify the onClick function like this and analyze it step by step:
if (this.activeObject && intersection.object !== this.activeObject && this.activeObject.userData.isBw) {
  this.resetMaterial(this.activeObject);

  // Stop the timeline if it's active
  if (this.activeObject.userData.tl?.isActive()) this.activeObject.userData.tl.kill();

  // Clean up the timeline
  this.activeObject.userData.tl = null;
}

// Set up the active object
this.activeObject = intersection.object;
- If this.activeObject exists (it's initially set to null in the constructor), we proceed to reset it to its initial black and white state
- If there's a running animation on the active tile, we use GSAP's kill method to avoid conflicts and overlapping animations
- We reset userData.tl to null (it will be assigned a new timeline if the tile is clicked again)
- We then set this.activeObject to the object selected via the Raycaster
This way, we get a double ripple animation: one on the clicked tile, which will be colored, and one on the previously active tile, which will be reset to its original black and white state.
Texture reveal mask effect
In this section, we will create an interactive effect that blends two images on a plane when the user hovers over or touches it.
Step 1: Setting Up the Planes
Unlike the previous examples, in this case we need different uniforms for the planes, since we're going to blend a visible front texture with another texture that will be revealed through a mask that "cuts through" the first one.
Let's start by modifying the index.html file, adding a data attribute to all the images where we'll specify the underlying texture:
<img src="/images/front-texture.webp" alt="" role="presentation" data-back="/images/back-texture.webp">
Then, inside our Stage.js, we'll modify the generatePlane method, which is used to create the planes in WebGL. We'll start by retrieving the second texture to load via the data attribute, and we'll pass the plane material both textures and the aspect ratio of the images:
generatePlane(image) {
  const loader = new TextureLoader();
  const texture = loader.load(image.src);
  const textureBack = loader.load(image.dataset.back);
  texture.colorSpace = SRGBColorSpace;
  textureBack.colorSpace = SRGBColorSpace;

  const { width, height } = image.getBoundingClientRect();

  const plane = new Mesh(
    new PlaneGeometry(1, 1),
    new PlanesMaterial(texture, textureBack, height / width),
  );
  return plane;
}
Step 2: Material Setup
import { ShaderMaterial, Vector2 } from 'three';
import baseVertex from './base.vert?raw';
import baseFragment from './base.frag?raw';

export default class PlanesMaterial extends ShaderMaterial {
  constructor(texture, textureBack, imageRatio) {
    super({
      vertexShader: baseVertex,
      fragmentShader: baseFragment,
      uniforms: {
        uTexture: { value: texture },
        uTextureBack: { value: textureBack },
        uMixFactor: { value: 0.0 },
        uAspect: { value: imageRatio },
        uMouse: { value: new Vector2(0.5, 0.5) },
      },
    });
  }
}
Let's quickly go over the uniforms passed to the material:
- uTexture and uTextureBack are the two textures shown on the front and through the mask
- uMixFactor represents the blending value between the two textures inside the mask
- uAspect is the aspect ratio of the images, used to calculate a circular mask
- uMouse represents the mouse coordinates, updated to move the mask within the plane
Step 3: The JavaScript (GSAP)
this.observer = Observer.create({
  target: document.querySelector('.content__carousel'),
  type: 'touch,pointer',
  onMove: e => this.onMove(e),
  onHoverEnd: () => this.hoverOut(),
});
Briefly, let's create a GSAP Observer to watch the mouse movement, passing it two functions:
- onMove checks, using the Raycaster, whether a plane is being hit, in order to manage the opening of the reveal mask
- onHoverEnd is triggered when the cursor leaves the target area, so we'll use this method to reset the reveal mask's expansion uniform back to 0.0
Let's go into more detail on the onMove function to explain how it works:
onMove(e) {
  const normCoords = {
    x: (e.x / window.innerWidth) * 2 - 1,
    y: -(e.y / window.innerHeight) * 2 + 1,
  };
  this.raycaster.setFromCamera(normCoords, this.camera);

  const [intersection] = this.raycaster.intersectObjects(this.scene.children);

  if (intersection) {
    this.intersected = intersection.object;
    const { material } = intersection.object;

    gsap.timeline()
      .set(material.uniforms.uMouse, { value: intersection.uv }, 0)
      .to(material.uniforms.uMixFactor, { value: 1.0, duration: 3, ease: 'power3.out' }, 0);
  } else {
    this.hoverOut();
  }
}
In the onMove method, the first step is to normalize the mouse coordinates to the -1 to 1 range so the Raycaster works with the correct coordinates.
On each frame, the Raycaster is then updated to check whether any object in the scene is intersected. If there is an intersection, the code saves the hit object in a variable.
When an intersection occurs, we move on to animating the shader uniforms.
Specifically, we use GSAP's set method to update the mouse position in uMouse, and then animate the uMixFactor value from 0.0 to 1.0 to open the reveal mask and show the underlying texture.
If the Raycaster doesn't find any object under the pointer, the hoverOut method is called.
hoverOut() {
  if (!this.intersected) return;

  // Stop any running tweens on the uMixFactor uniform
  gsap.killTweensOf(this.intersected.material.uniforms.uMixFactor);

  // Animate uMixFactor back to 0 smoothly
  gsap.to(this.intersected.material.uniforms.uMixFactor, { value: 0.0, duration: 0.5, ease: 'power3.out' });

  // Clear the intersected reference
  this.intersected = null;
}
This method handles closing the reveal mask once the cursor leaves the plane.
First, we rely on the killTweensOf method to prevent conflicts or overlaps between the mask's opening and closing animations by stopping all ongoing tweens on uMixFactor.
Then, we animate the mask closed by setting the uMixFactor uniform back to 0.0 and reset the variable that was tracking the currently highlighted object.
Step 4: The Shader (GLSL)
uniform sampler2D uTexture;
uniform sampler2D uTextureBack;
uniform float uMixFactor;
uniform vec2 uMouse;
uniform float uAspect;

varying vec2 vUv;

void main() {
  vec2 correctedUv = vec2(vUv.x, (vUv.y - 0.5) * uAspect + 0.5);
  vec2 correctedMouse = vec2(uMouse.x, (uMouse.y - 0.5) * uAspect + 0.5);
  float distance = length(correctedUv - correctedMouse);
  float influence = 1.0 - smoothstep(0.0, 0.5, distance);
  float finalMix = uMixFactor * influence;

  vec4 textureFront = texture2D(uTexture, vUv);
  vec4 textureBack = texture2D(uTextureBack, vUv);

  vec4 finalColor = mix(textureFront, textureBack, finalMix);
  gl_FragColor = finalColor;
}
Inside the main() function, we start by normalizing the UV coordinates and the mouse position relative to the image's aspect ratio. This correction is applied because we're using non-square images, so the vertical coordinates must be adjusted to keep the mask's proportions correct and ensure it stays circular. Therefore, the vUv.y and uMouse.y coordinates are modified so they're "scaled" vertically according to the aspect ratio.
At this point, the distance is calculated between the current pixel (correctedUv) and the mouse position (correctedMouse). This distance is a numeric value indicating how close or far the pixel is from the mouse center on the surface.
We then move on to actually creating the mask. The influence value must go from 1 at the cursor's center to 0 as we move away from it. We use the smoothstep function to recreate this falloff and obtain a soft, gradual transition between the two values, so the effect fades naturally.
The final value for the mix between the two textures, finalMix, is the product of the global factor uMixFactor (a single numeric value passed to the shader) and this local influence value. So the closer a pixel is to the mouse position, the more its color will be influenced by the second texture, uTextureBack.
The last part is the actual blending: the two colors are mixed using the mix() function, which performs a linear interpolation between the two textures based on the value of finalMix. When finalMix is 0, only the front texture is visible.
When it's 1, only the back texture is visible. Intermediate values create a gradual blend between the two textures.
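To see how the aspect correction and influence falloff combine numerically, here is the same math as a plain-JavaScript sketch (the helper names are mine, not part of the demo code):

```javascript
// GLSL smoothstep, reproduced for the sketch.
const smoothstep = (e0, e1, x) => {
  const t = Math.min(Math.max((x - e0) / (e1 - e0), 0), 1);
  return t * t * (3 - 2 * t);
};

// Mirrors the fragment shader: scale y by the aspect ratio, measure the
// distance to the (corrected) mouse, and fade influence out over radius 0.5.
const finalMix = (uv, mouse, aspect, uMixFactor) => {
  const cy = (uv.y - 0.5) * aspect + 0.5;
  const my = (mouse.y - 0.5) * aspect + 0.5;
  const dist = Math.hypot(uv.x - mouse.x, cy - my);
  const influence = 1 - smoothstep(0, 0.5, dist);
  return uMixFactor * influence;
};

const mouse = { x: 0.5, y: 0.5 };
console.log(finalMix({ x: 0.5, y: 0.5 }, mouse, 1.77, 1)); // at the cursor → 1
console.log(finalMix({ x: 0.5, y: 0.9 }, mouse, 1.77, 1)); // far above → 0
```

Note how a tall aspect ratio (1.77 here, an assumed example value) stretches vertical distances, which is exactly what keeps the on-screen mask circular instead of elliptical.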
Click & Hold mask reveal effect
This section breaks down the creation of an interactive effect that transitions an image from color to grayscale. The effect starts from the user's click, expanding outwards with a ripple distortion.
Step 1: The "Move" (Hover) Effect
In this step, we'll create an effect where one image transitions to another as the user hovers over it. The transition will originate from the pointer's position and expand outwards.
The JavaScript (GSAP Observer for onMove)
GSAP's Observer plugin is the perfect tool for tracking pointer movements without the boilerplate of traditional event listeners.
- Setup Observer: We create an Observer instance that targets our main container and listens for touch and pointer events. We only need the onMove and onHoverEnd callbacks.
- onMove(e) Logic:
  - When the pointer moves, we use a Raycaster to determine whether it's over one of our interactive images.
  - If an object is intersected, we store it in this.intersected.
  - We then use a GSAP timeline to animate the shader's uniforms.
  - uMouse: We instantly set this vec2 uniform to the pointer's UV coordinate on the image. This tells the shader where the effect should originate.
  - uMixFactor: We animate this float uniform from 0 to 1. It controls the blend between the two textures in the shader.
- onHoverEnd() Logic:
  - When the pointer leaves the object, Observer calls this function.
  - We kill any ongoing animations on uMixFactor to prevent conflicts.
  - We animate uMixFactor back to 0, reversing the effect.
Code Example: the "Move" effect
This code shows how Observer is configured to handle the hover interaction.
import { gsap } from 'gsap';
import { Observer } from 'gsap/Observer';
import { Raycaster } from 'three';

gsap.registerPlugin(Observer);

export default class Effect {
  constructor(scene, camera) {
    this.scene = scene;
    this.camera = camera;
    this.intersected = null;
    this.raycaster = new Raycaster();

    // 1. Create the Observer
    this.observer = Observer.create({
      target: document.querySelector('.content__carousel'),
      type: 'touch,pointer',
      onMove: e => this.onMove(e),
      onHoverEnd: () => this.hoverOut(), // Called when the pointer leaves the target
    });
  }

  hoverOut() {
    if (!this.intersected) return;

    // 3. Animate the effect out
    gsap.killTweensOf(this.intersected.material.uniforms.uMixFactor);
    gsap.to(this.intersected.material.uniforms.uMixFactor, {
      value: 0.0,
      duration: 0.5,
      ease: 'power3.out'
    });
    this.intersected = null;
  }

  onMove(e) {
    // ... (Raycaster logic to find the intersection)
    const [intersection] = this.raycaster.intersectObjects(this.scene.children);

    if (intersection) {
      this.intersected = intersection.object;
      const { material } = intersection.object;

      // 2. Animate the uniforms on hover
      gsap.timeline()
        .set(material.uniforms.uMouse, { value: intersection.uv }, 0) // Set the origin point
        .to(material.uniforms.uMixFactor, { // Animate the blend
          value: 1.0,
          duration: 3,
          ease: 'power3.out'
        }, 0);
    } else {
      this.hoverOut(); // Reset if not hovering over anything
    }
  }
}
The Shader (GLSL)
The fragment shader receives the uniforms animated by GSAP and uses them to draw the effect.
- uMouse: Used to calculate the distance of each pixel from the pointer.
- uMixFactor: Used as the interpolation value in a mix() function. As it animates from 0 to 1, the shader smoothly blends from textureFront to textureBack.
- smoothstep(): We use this function to create a circular mask that expands from the uMouse position. The radius of this circle is controlled by uMixFactor.
uniform sampler2D uTexture;     // Front image
uniform sampler2D uTextureBack; // Back image
uniform float uMixFactor;       // Animated by GSAP (0 to 1)
uniform vec2 uMouse;            // Set by GSAP on move
// ...

void main() {
  // ... (code to correct for the aspect ratio)

  // 1. Calculate the distance of the current pixel from the mouse
  float distance = length(correctedUv - correctedMouse);

  // 2. Create a circular mask that expands as uMixFactor increases
  float influence = 1.0 - smoothstep(0.0, 0.5, distance);
  float finalMix = uMixFactor * influence;

  // 3. Read the colors from both textures
  vec4 textureFront = texture2D(uTexture, vUv);
  vec4 textureBack = texture2D(uTextureBack, vUv);

  // 4. Mix the two textures based on the animated value
  vec4 finalColor = mix(textureFront, textureBack, finalMix);
  gl_FragColor = finalColor;
}
Step 2: The "Click & Hold" Effect
Now, let's build a more engaging interaction. The effect will start when the user presses down, "charge up" while they hold, and either complete or reverse when they release.
The JavaScript (GSAP)
Observer makes this complex interaction straightforward by providing clear callbacks for each state.
- Setup Observer: This time, we configure Observer to use onPress, onMove, and onRelease.
- onPress(e):
  - When the user presses down, we find the intersected object and store it in this.active.
  - We then call onActiveEnter(), which starts a GSAP timeline for the "charging" animation.
- onActiveEnter():
  - This function defines the multi-stage animation. We use await with a GSAP tween to create a sequence.
  - First, it animates uGrayscaleProgress to a midpoint (e.g., 0.35) and holds it. This is the "hold" part of the interaction.
  - If the user keeps holding, a second tween completes the animation, transitioning uGrayscaleProgress to 1.0.
  - An onComplete callback then resets the state, preparing for the next interaction.
- onRelease():
  - If the user releases the pointer before the animation completes, this function is called.
  - It calls onActiveLeve(), which kills the "charging" animation and animates uGrayscaleProgress back to 0, effectively reversing the effect.
- onMove(e):
  - This is still used to continuously update the uMouse uniform, so the shader's noise effect tracks the pointer even during the hold.
  - Crucially, if the pointer moves off the object, we call onRelease() to cancel the interaction.
Code Example: Click & Hold
This code demonstrates the press, hold, and release logic managed by Observer.
import { gsap } from 'gsap';
import { Observer } from 'gsap/Observer';
// ...

export default class Effect {
  constructor(scene, camera) {
    // ...
    this.active = null; // Currently active (pressed) object
    this.raycaster = new Raycaster();

    // 1. Create the Observer for press, move, and release
    this.observer = Observer.create({
      target: document.querySelector('.content__carousel'),
      type: 'touch,pointer',
      onPress: e => this.onPress(e),
      onMove: e => this.onMove(e),
      onRelease: () => this.onRelease(),
    });

    // Continuously update uTime for the procedural effect
    gsap.ticker.add(() => {
      if (this.active) {
        this.active.material.uniforms.uTime.value += 0.1;
      }
    });
  }

  // 3. The "charging" animation
  async onActiveEnter() {
    gsap.killTweensOf(this.active.material.uniforms.uGrayscaleProgress);

    // First part of the animation (the "hold" phase)
    await gsap.to(this.active.material.uniforms.uGrayscaleProgress, {
      value: 0.35,
      duration: 0.5,
    });

    // Second part, completes after the hold
    gsap.to(this.active.material.uniforms.uGrayscaleProgress, {
      value: 1,
      duration: 0.5,
      delay: 0.12,
      ease: 'power2.in',
      onComplete: () => { /* ... reset state ... */ },
    });
  }

  // 4. Reverses the animation on early release
  onActiveLeve(mesh) {
    gsap.killTweensOf(mesh.material.uniforms.uGrayscaleProgress);
    gsap.to(mesh.material.uniforms.uGrayscaleProgress, {
      value: 0,
      onUpdate: () => {
        mesh.material.uniforms.uTime.value += 0.1;
      },
    });
  }

  // ... (getIntersection logic) ...

  // 2. Handle the initial press
  onPress(e) {
    const intersection = this.getIntersection(e);
    if (intersection) {
      this.active = intersection.object;
      this.onActiveEnter(this.active); // Start the animation
    }
  }

  onRelease() {
    if (this.active) {
      const prevActive = this.active;
      this.active = null;
      this.onActiveLeve(prevActive); // Reverse the animation
    }
  }

  onMove(e) {
    const intersection = this.getIntersection(e);
    if (intersection) {
      // 5. Keep uMouse updated while holding
      const { material } = intersection.object;
      gsap.set(material.uniforms.uMouse, { value: intersection.uv });
    } else {
      this.onRelease(); // Cancel if the pointer leaves
    }
  }
}
The Shader (GLSL)
The fragment shader for this effect is more complex. It uses the animated uniforms to create a distorted, noisy reveal.
- uGrayscaleProgress: This is the main driver, animated by GSAP. It controls both the radius of the circular mask and the strength of a "liquid" distortion effect.
- uTime: This is continuously updated by gsap.ticker as long as the user is pressing. It is used to add movement to the noise, making the effect feel alive and dynamic.
- noise() function: A standard GLSL noise function generates procedural, organic patterns. We use it to distort both the shape of the circular mask and the image texture coordinates (UVs).
// ... (uniforms and helper functions)
void main() {
  // 1. Generate a noise value that changes over time
  float noisy = (noise(vUv * 25.0 + uTime * 0.5) - 0.5) * 0.05;

  // 2. Create a distortion that pulses with the main progress animation
  float distortionStrength = sin(uGrayscaleProgress * PI) * 0.5;
  vec2 distortedUv = vUv + vec2(noisy) * distortionStrength;

  // 3. Read the texture using the distorted coordinates for a liquid effect
  vec4 diffuse = texture2D(uTexture, distortedUv);

  // ... (grayscale logic)

  // 4. Calculate distance from the mouse, but add noise to it
  float dist = distance(vUv, uMouse);
  float distortedDist = dist + noisy;

  // 5. Create the circular mask using the distorted distance and progress
  float maxDist = getMaxDistFromCorners(uMouse);
  float mask = smoothstep(uGrayscaleProgress - 0.1, uGrayscaleProgress, distortedDist / maxDist);

  // 6. Mix between the original and grayscale colors
  vec3 color = mix(color1, color2, mask);
  gl_FragColor = vec4(color, diffuse.a);
}
This shader combines noise-based distortion, smooth circular masking, and real-time uniform updates to create a liquid, organic transition that radiates from the click position. As GSAP animates the shader's progress and time values, the effect feels alive and tactile: a perfect example of how animation logic in JavaScript can drive complex visual behavior directly on the GPU.
Dynamic blur effect carousel
Step 1: Create the carousel
In this final demo, we will build one more implementation, turning the image grid into a scrollable carousel that can be navigated both by dragging and by scrolling.
First we will set up the Draggable plugin by registering it and targeting the appropriate <div>
with the desired configuration. Make sure to handle boundary constraints and update them accordingly when the window is resized.
const carouselInnerRef = document.querySelector('.content__carousel-inner');

const draggable = new Draggable(carouselInnerRef, {
  type: 'x',
  inertia: true,
  dragResistance: 0.5,
  edgeResistance: 0.5,
  throwResistance: 0.5,
  throwProps: true,
});

function resize() {
  const innerWidth = carouselInnerRef.scrollWidth;
  const viewportWidth = window.innerWidth;
  maxScroll = Math.abs(Math.min(0, viewportWidth - innerWidth));
  draggable.applyBounds({ minX: -maxScroll, maxX: 0 });
}

window.addEventListener('resize', debounce(resize));
We will also link GSAP Draggable to the scroll functionality using the GSAP ScrollTrigger plugin, allowing us to synchronize scroll and drag behavior within the same container. Let's explore this in more detail:
let maxScroll = Math.abs(Math.min(0, window.innerWidth - carouselInnerRef.scrollWidth));

const scrollTriggerInstance = ScrollTrigger.create({
  trigger: carouselWrapper,
  start: 'top top',
  end: `+=${2.5 * maxScroll}`,
  pin: true,
  scrub: 0.05,
  anticipatePin: 1,
  invalidateOnRefresh: true,
});

...

resize() {
  ...
  scrollTriggerInstance.refresh();
}
Now that ScrollTrigger is configured on the same container, we can focus on synchronizing the scroll position between both plugins, starting from the ScrollTrigger instance:
onUpdate(e) {
  const x = -maxScroll * e.progress;
  gsap.set(carouselInnerRef, { x });
  draggable.x = x;
  draggable.update();
}
We then move on to the Draggable instance, which is updated inside both its onDrag and onThrowUpdate callbacks using the scrollPos variable. This variable serves as the final scroll position for both the window and the ScrollTrigger instance.
onDragStart() {},
onDrag() {
  const progress = gsap.utils.normalize(draggable.maxX, draggable.minX, draggable.x);
  scrollPos = scrollTriggerInstance.start + (scrollTriggerInstance.end - scrollTriggerInstance.start) * progress;
  window.scrollTo({ top: scrollPos, behavior: 'instant' });
  scrollTriggerInstance.scroll(scrollPos);
},
onThrowUpdate() {
  const progress = gsap.utils.normalize(draggable.maxX, draggable.minX, draggable.x);
  scrollPos = scrollTriggerInstance.start + (scrollTriggerInstance.end - scrollTriggerInstance.start) * progress;
  window.scrollTo({ top: scrollPos, behavior: 'instant' });
},
onThrowComplete() {
  scrollTriggerInstance.scroll(scrollPos);
}
Step 2: Material setup
export default class PlanesMaterial extends ShaderMaterial {
  constructor(texture) {
    super({
      vertexShader: baseVertex,
      fragmentShader: baseFragment,
      uniforms: {
        uTexture: { value: texture },
        uBlurAmount: { value: 0 },
      },
    });
  }
}
Let's quickly analyze the uniforms passed to the material:
- uTexture is the base texture rendered on the plane
- uBlurAmount represents the blur strength based on the distance from the window center
Step 3: The JavaScript (GSAP)
constructor(scene, camera) {
  ...
  this.callback = this.scrollUpdateCallback;
  this.centerX = window.innerWidth / 2;
  ...
}
In the constructor we set up two pieces we'll use to drive the dynamic blur effect:
- this.callback references the function used inside ScrollTrigger's onUpdate to refresh the blur amount
- this.centerX represents the window center on the X axis and is updated on each window resize
Let's dive into the callback passed to ScrollTrigger:
scrollUpdateCallback() {
  this.tiles.forEach(tile => {
    const worldPosition = tile.getWorldPosition(new Vector3());
    const vector = worldPosition.clone().project(this.camera);
    const screenX = (vector.x * 0.5 + 0.5) * window.innerWidth;

    const distance = Math.abs(screenX - this.centerX);
    const maxDistance = window.innerWidth / 2;

    const blurAmount = MathUtils.clamp(distance / maxDistance * 5, 0.0, 5.0);

    gsap.to(tile.material.uniforms.uBlurAmount, {
      value: Math.round(blurAmount / 2) * 2,
      duration: 1.5,
      ease: 'power3.out'
    });
  });
}
Let's dive deeper into this:
- vector projects each plane's 3D position into normalized device coordinates; .project(this.camera) converts to the -1..1 range, which is then scaled to actual screen pixel coordinates.
- screenX is the resulting 2D screen-space x coordinate.
- distance measures how far the plane is from the screen center.
- maxDistance is the maximum possible horizontal distance from the center (half the viewport width).
- blurAmount computes blur strength based on distance from the center; it's clamped between 0.0 and 5.0 to avoid extreme values that could hurt visual quality or shader performance.
- The uBlurAmount uniform is animated toward the computed blurAmount. Rounding to the nearest even number (Math.round(blurAmount / 2) * 2) helps avoid overly frequent tiny changes that could cause visually unstable blur.
Step 4: The Shader (GLSL)
uniform sampler2D uTexture;
uniform float uBlurAmount;
varying vec2 vUv;

vec4 kawaseBlur(sampler2D tex, vec2 uv, float offset) {
  vec2 texelSize = vec2(1.0) / vec2(textureSize(tex, 0));
  vec4 color = vec4(0.0);
  color += texture2D(tex, uv + vec2(offset, offset) * texelSize);
  color += texture2D(tex, uv + vec2(-offset, offset) * texelSize);
  color += texture2D(tex, uv + vec2(offset, -offset) * texelSize);
  color += texture2D(tex, uv + vec2(-offset, -offset) * texelSize);
  return color * 0.25;
}

vec4 multiPassKawaseBlur(sampler2D tex, vec2 uv, float blurStrength) {
  vec4 baseTexture = texture2D(tex, uv);

  vec4 blur1 = kawaseBlur(tex, uv, 1.0 + blurStrength * 1.5);
  vec4 blur2 = kawaseBlur(tex, uv, 2.0 + blurStrength);
  vec4 blur3 = kawaseBlur(tex, uv, 3.0 + blurStrength * 2.5);

  float t1 = smoothstep(0.0, 3.0, blurStrength);
  float t2 = smoothstep(3.0, 7.0, blurStrength);

  vec4 blurredTexture = mix(blur1, blur2, t1);
  blurredTexture = mix(blurredTexture, blur3, t2);

  float mixFactor = smoothstep(0.0, 1.0, blurStrength);
  return mix(baseTexture, blurredTexture, mixFactor);
}

void main() {
  vec4 color = multiPassKawaseBlur(uTexture, vUv, uBlurAmount);
  gl_FragColor = color;
}
This fragment shader receives a texture (uTexture) and a dynamic value (uBlurAmount) indicating how much the plane should be blurred. Based on this value, the shader applies a multi-pass Kawase blur, an efficient technique that produces a soft, pleasing blur while staying performant.
Let's examine the kawaseBlur function, which applies a gentle blur by sampling four points around the current pixel (uv), each offset positively or negatively.
- texelSize computes the size of one pixel in UV coordinates, so offsets refer to "pixel amounts" regardless of texture resolution.
- Four samples are taken in a diagonal cross pattern around uv.
- The four colors are averaged (multiplied by 0.25) to return a balanced result.
This function is a lightweight single pass. To achieve a stronger effect, we apply it multiple times.
The multiPassKawaseBlur function does exactly that, progressively increasing the blur and then blending the passes:
vec4 blur1 = kawaseBlur(tex, uv, 1.0 + blurStrength * 1.5);
vec4 blur2 = kawaseBlur(tex, uv, 2.0 + blurStrength);
vec4 blur3 = kawaseBlur(tex, uv, 3.0 + blurStrength * 2.5);
This produces a progressive, visually smooth result.
Next, we blend the different blur levels using two separate smoothsteps:
float t1 = smoothstep(0.0, 3.0, blurStrength);
float t2 = smoothstep(3.0, 7.0, blurStrength);
vec4 finalBlur = mix(blur1, blur2, t1);
finalBlur = mix(finalBlur, blur3, t2);
The first mix blends blur1 and blur2, while the second blends that result with blur3. The resulting finalBlur represents the Kawase-blurred texture, which we finally mix with the base texture passed via the uniform.
Finally, we mix the blurred texture with the original texture based on blurStrength, using another smoothstep from 0 to 1:
float mixFactor = smoothstep(0.0, 1.0, blurStrength);
return mix(baseTexture, finalBlur, mixFactor);
Final Words
Bringing together GSAP's animation power and the creative freedom of GLSL shaders opens up a whole new layer of interactivity for the web. By animating shader uniforms directly with GSAP, we can blend smooth motion design principles with the raw flexibility of GPU rendering, crafting experiences that feel alive, fluid, and tactile.
From simple grayscale transitions to ripple-based deformations and dynamic blur effects, each step in this tutorial shows how motion and graphics can respond naturally to user input, creating interfaces that invite exploration rather than mere observation.
While these techniques push the boundaries of front-end development, they also highlight a growing trend: the convergence of design, code, and real-time rendering.
So take these examples, remix them, and make them your own, because the most exciting part of working with GSAP and shaders is that the canvas is quite literally infinite.


