This article was generated 100% by AI, using GPT-5.5.

Introduction

This site has a faint transparent-watercolor background behind the writing. It looks like a quiet decoration, but it is not a single image. The current implementation draws paper with WebGPU, moves wet pigment a little with WebGL2, keeps a Canvas 2D fallback, and switches to a static background on mobile.

The important point is that this is not an attempt to build a physically correct watercolor simulator in the browser. For a writing site, that would be the wrong goal. The background should not become more important than the text.

The effect only needs a few cues.

  • the paper has relief and fiber

  • pigment is not just a uniform translucent color

  • wet pigment moves for a short time

  • dry pigment stays fixed to the paper

  • environments that cannot run the effect fail quietly

  • text, links, and scrolling remain untouched

After rereading the implementation, the watercolor feeling comes less from one spectacular shader and more from this split. Paper, wet paint, dry paint, input, and fallbacks each have their own boundary. That boundary is the design.

The Background Stays Weak

The visual stack is roughly this.

paper fallback span
paper fallback canvas
paper WebGPU canvas
brush canvas
page content
controls

The Vue template mostly just declares the layers.

<div class="watercolor-bg-container" aria-hidden="true">
  <span ref="fallbackLayerElement" class="paper-fallback" />
  <canvas ref="fallbackCanvasElement" class="paper-fallback-canvas" aria-hidden="true" />
  <canvas ref="paperCanvasElement" class="paper-canvas" aria-hidden="true" />
  <canvas ref="brushCanvasElement" class="brush-canvas" aria-hidden="true" />
</div>

The whole thing is aria-hidden because it is not content. It is material behind the content.

The container itself is not a giant fixed overlay. The canvases are fixed, while the static container contributes no size of its own to the layout.

.watercolor-bg-container {
  position: static;
  pointer-events: auto;
  background: transparent;
}

.paper-fallback-canvas,
.paper-canvas,
.brush-canvas {
  position: fixed;
  inset: 0;
  pointer-events: none;
}

pointer-events: none matters. The background covers the screen visually, but it cannot steal links or scrolling. It is visually large and behaviorally weak. That is where a writing-site decoration should be.

Do Not Wake Every Island

The background is a Vuerend island, but it does not hydrate on every viewport.

const desktopIslandMedia = "(min-width: 769px)";

export const WatercolorBackgroundIsland = defineIsland("watercolor-background", {
  component: WatercolorBackgroundIslandView,
  load: () => import("./features/ShellWatercolorBackgroundIslandLoader"),
  hydrate: "media",
  media: desktopIslandMedia,
});

Only desktop-width screens load the client code. On mobile, CSS hides the canvases and uses bg-sp.png.

@media (max-width: 768px) {
  html {
    background: #f1eedf url("/bg-sp.png") center center / cover no-repeat;
  }

  .paper-fallback {
    opacity: 1 !important;
    background: #f1eedf url("/bg-sp.png") center center / cover no-repeat;
  }

  .paper-fallback-canvas,
  .paper-canvas,
  .brush-canvas {
    display: none;
  }
}

That is not just a compromise. It is part of the background contract. On touch-first screens, interactive watercolor has more risk than value. Static paper is enough.

Pointer input is gated the same way.

export function canUseFinePointerPaint(): boolean {
  return window.matchMedia("(hover: hover) and (pointer: fine)").matches;
}

The implementation decides not only what it can do, but where it should not run.

Draw Paper First

The WebGPU paper shader initializes asynchronously. Before that path finishes, a Canvas 2D fallback paper is drawn.

renderPaperFallback(fallbackCanvasElement.value, maxDpr);
brushLayer.resize();
removeViewportListeners = addShellViewportListeners(queueResize);

The fallback is not a flat color. It layers radial wash, shadow patches, highlight patches, fibers, and a grain tile.

context.fillStyle = "#f1eedf";
context.fillRect(0, 0, width, height);
drawRadialWash(context, width, height);

context.globalCompositeOperation = "multiply";
drawPaperPatches(context, width, height, random, Math.round(820 * areaScale), "shadow");

context.globalCompositeOperation = "screen";
drawPaperPatches(context, width, height, random, Math.round(520 * areaScale), "highlight");

When WebGPU succeeds, the fallback opacity is reduced. It is not removed entirely.

target.fallbackLayer?.style.setProperty("opacity", "0.1");
target.fallbackCanvas?.style.setProperty("opacity", "0.16");

This order matters. If WebGPU is fast, the shader paper appears. If it is slow, fallback paper is already visible. If it fails, paper remains. That is a good failure mode for a background.
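That ordering can be sketched as one async init function. `drawFallbackPaper`, `initWebGpuPaper`, and `setFallbackOpacity` are hypothetical names standing in for the site's real wiring; only the order and the 0.16 opacity come from the implementation above.

```typescript
// A minimal sketch of the fallback-first ordering, with hypothetical
// function names standing in for the site's real wiring.
type PaperResult = "webgpu" | "fallback";

async function initPaper(
  drawFallbackPaper: () => void,           // synchronous Canvas 2D paper
  initWebGpuPaper: () => Promise<boolean>, // resolves false (or throws) on failure
  setFallbackOpacity: (value: number) => void,
): Promise<PaperResult> {
  // 1. Fallback paper first, so something is visible immediately.
  drawFallbackPaper();
  try {
    // 2. Then try the async WebGPU path.
    if (await initWebGpuPaper()) {
      // 3. On success, sink the fallback under the shader paper; do not remove it.
      setFallbackOpacity(0.16);
      return "webgpu";
    }
  } catch {
    // 4. On failure, do nothing: the fallback paper is already there.
  }
  return "fallback";
}
```

Whether WebGPU is fast, slow, or absent, the reader always has paper.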

Bake One Paper Layer With WebGPU

The WebGPU path imports ShellSitePaper.wgsl as raw shader source.

import sitePaperShaderSource from "./ShellSitePaper.wgsl?raw";

export const sitePaperShader = sitePaperShaderSource;

The adapter asks for low-power.

const adapter = await gpu.requestAdapter({ powerPreference: "low-power" });

Drawing paper behind text does not need the strongest possible GPU. The render itself is a single fullscreen triangle.

var positions = array<vec2f, 3>(
  vec2f(-1.0, -1.0),
  vec2f(3.0, -1.0),
  vec2f(-1.0, 3.0)
);

DPR is capped at 1.25.

const dpr = Math.min(window.devicePixelRatio || 1, maxDpr);

The paper grain does not need native DPR to read as paper. The useful resolution for a background is not the same as the physical display resolution.
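A minimal sketch of what the cap means for the backing store. `backingSize` is a hypothetical helper; at devicePixelRatio 2 with a cap of 1.25, the canvas allocates (2 / 1.25)² = 2.56 times fewer fragments.

```typescript
// Hypothetical helper: size a canvas backing store under a DPR cap.
function backingSize(
  cssWidth: number,
  cssHeight: number,
  maxDpr: number,
  devicePixelRatio: number,
): { width: number; height: number; dpr: number } {
  // Same clamp as the implementation: never exceed maxDpr.
  const dpr = Math.min(devicePixelRatio || 1, maxDpr);
  return {
    width: Math.round(cssWidth * dpr),
    height: Math.round(cssHeight * dpr),
    dpr,
  };
}
```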

Keep The Paper Still

The paper shader receives no time uniform. Its coordinates come from viewport shape and seed.

let aspect = uniforms.resolution.x / max(uniforms.resolution.y, 1.0);
let centered = (input.uv - 0.5) * vec2f(aspect * 1.72, 1.72);
let point =
  rotate(centered * vec2f(1.0, 1.04), -0.06)
  + vec2f(uniforms.seed * 6.4, uniforms.seed * 3.2);

If the paper moves while someone reads, it stops feeling like paper and becomes an effect. For a background, stillness is stronger.

The shader is also not just color noise. It builds a height field from material roles like embossField, microGrainField, fiberField, celluloseNetworkField, coldPressToothField, crumpleField, and wrinkleLineField. Those fields are mixed into paperHeight.

return macroRelief * 0.34
  + detail.x * 2.16
  + micro * (0.76 + uniforms.fiberDensity * 0.16 + uniforms.grainScale * 0.1)
  + (emboss - 0.18) * (0.18 + uniforms.embossDepth * 0.24)
  + (tooth.y - 0.28) * (0.46 + uniforms.embossDepth * 0.34)
  - (tooth.x - 0.18) * (0.34 + uniforms.shadowStrength * 0.22)
  + (grain - 0.5) * 0.018
  + (pulp - 0.3) * 0.018;

tooth.y acts like raised paper. tooth.x acts like a cavity. That sign difference matters because pigment needs a surface to settle into. The paper is not a white backdrop; it is terrain.

The paper shader can be read as this algorithm.

1. Convert uv into paper coordinates
2. Build low-frequency macro relief
3. Build fiber / fibril / cellulose / tooth / pulp fields separately
4. Mix those fields into a height field
5. Sample the height field nearby to produce a normal
6. Build cavity and ridge masks from the normal, relief, and tooth fields
7. Mix cavity shadow and ridge highlight into a warm white paper color

The normal is not analytic. It is derived by sampling paperHeight at small offsets.

fn paperNormal(point: vec2f) -> vec3f {
  let epsilon = 0.00076;
  let center = paperHeight(point);
  let dx = paperHeight(point + vec2f(epsilon, 0.0)) - center;
  let dy = paperHeight(point + vec2f(0.0, epsilon)) - center;
  let strength = 3.1 + uniforms.embossDepth * 2.2 + uniforms.grainScale * 1.1;
  return normalize(vec3f(-dx * strength, -dy * strength, 1.0));
}

That normal and paper tooth are then split into low spots and raised ridges.

let cavityMask = saturate(
  max(0.0, -localRelief * 1.84)
    + tooth.x * 0.68
    + (1.0 - normal.z) * 0.4
    + max(0.0, -diffuse) * 0.16
    + crumple.x * 0.08
    + wrinkleLines.x * 0.05
);

let ridgeMask = saturate(
  max(0.0, localRelief * 1.42)
    + tooth.y * 0.62
    + detail.y * 0.06
    + max(0.0, diffuse) * 0.18
    + crumple.y * 0.06
    + wrinkleLines.y * 0.04
);

The cavity shadow is intentionally limited. This is still a paper background, so relief should not steal text contrast. The final tone is clamped with clamp(tone, vec3f(0.72), vec3f(1.0)) to keep it from sinking too far.

The Brush Is Wet State, Not A Stamp

One tempting path is to draw organic Canvas 2D shapes and call it watercolor. The fallback brush does use that kind of drawing. But by itself, it still feels like translucent images being stamped onto the page. Watercolor needs a little time after the stroke.

The normal path uses a small WebGL2 solver, but the solver is created lazily.

const fallback = createBrushLayer(options);
let engine: FluidBrushEngine | undefined;
let triedEngine = false;

function ensureEngine(): FluidBrushEngine | undefined {
  if (engine || triedEngine) {
    return engine;
  }

  triedEngine = true;
  engine = FluidBrushEngine.create();
  return engine;
}

If WebGL2 or float framebuffer support is missing, the Canvas 2D brush is used instead.

if (!gl || !gl.getExtension("EXT_color_buffer_float")) {
  return undefined;
}

The solver does not run at full viewport resolution. Its base simulation size is 560.

const simulationBaseSize = 560;

For background bleeding, a slightly soft field is better than a precise high-resolution simulation. Lower resolution helps both performance and appearance.

The fluid section can be read as one sentence first.

the brush deposits wet -> velocity carries wet -> pressure calms the flow -> old wet becomes dry

The only thing to follow visually is the blue wet stain. The velocity arrows move that stain, the red pressure field is the brake that keeps the flow from bursting, and the orange dry field is pigment fixed into the paper. This way of reading the solver was inspired by a three.js post on X, then reframed around this site's own watercolor background implementation.

One frame of the brush solver looks roughly like this.

curl
vorticity confinement
advect velocity
boundary velocity
advect wet
boundary wet
dry wet into dry
remove dried wet
divergence
pressure solve
subtract pressure gradient
boundary velocity
display
Figure: one frame of the WebGL2 brush solver at a glance, from curl through the 8-iteration pressure solve to the final composite of wet and dry back into the transparent brush canvas.

This is not a complete Navier-Stokes solver. It adds a little vorticity back into the velocity field, moves wet pigment with semi-Lagrangian advection, and uses pressure projection to reduce divergence. For a background, that stable-fluids subset is enough.
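The frame can also be held as plain data, which makes the order easy to audit. The labels are descriptive strings mirroring the step list above, not the engine's real method names; the eight pressure entries match the solver's iteration count.

```typescript
// The per-frame pass order as data. Labels are descriptive, not the
// engine's real method names.
const framePasses: string[] = [
  "curl",
  "vorticity confinement",
  "advect velocity",
  "boundary velocity",
  "advect wet",
  "boundary wet",
  "dry wet into dry",
  "remove dried wet",
  "divergence",
];
for (let i = 0; i < 8; i += 1) {
  framePasses.push("pressure"); // the Jacobi solve runs 8 iterations
}
framePasses.push("subtract pressure gradient", "boundary velocity", "display");
```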

First, There Are No Particles

When people hear "fluid," it is easy to imagine many little water particles moving around. This implementation does not track particles. It creates a texture grid smaller than the screen, and each pixel stores "which way water is moving here" and "how much wet pigment is here."

So the thing being simulated is a field, not a set of particles.

Each pixel:
  velocity = water direction at this position
  wet      = wet pigment at this position
  dry      = dried pigment at this position

One stroke changes roughly like this.

Figure: how one stroke separates into wet, velocity, and dry fields. Blue is moving wet, arrows are velocity, red is the pressure correction, and orange is dry pigment fixed on the paper.

In this picture, the solver mostly moves velocity and wet every frame. dry is removed from that moving system. That is why pigment first rides the water, then gradually becomes fixed into the paper.

The Fluid Fields Are Textures

The solver does not keep arrays on the CPU. It keeps fields as WebGL2 RGBA16F textures. Each pass draws one fullscreen triangle, and the fragment shader writes the next texture. Because a pass cannot safely read and write the same texture, velocity, wet, dry, and pressure are ping-pong buffers with read and write sides.

velocity.xy   velocity
velocity.zw   unused / spare alpha
wet.rgb       absorption from wet pigment
wet.a         wet amount
dry.rgb       absorption from dried pigment
dry.a         dried amount
pressure.x    pressure
divergence.x  velocity divergence
curl.x        2D curl
Figure: the texture fields owned by the fluid solver. Each pass reads one texture and writes the next; after the pass, the write texture becomes the next read.
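The ping-pong pattern itself can be sketched on the CPU. `PingPong` and `runPass` here are illustrative stand-ins, not the solver's real classes; the point is only that a pass never reads and writes the same side, and that swapping makes the new data the next pass's input.

```typescript
// CPU sketch of the ping-pong pattern used for the solver's field textures.
class PingPong<T> {
  constructor(public read: T, public write: T) {}
  swap(): void {
    const previousRead = this.read;
    this.read = this.write;
    this.write = previousRead;
  }
}

// One "pass": derive the next field from the current one, then swap so
// the result becomes the input of the following pass.
function runPass(field: PingPong<number[]>, shader: (value: number) => number): void {
  for (let i = 0; i < field.read.length; i += 1) {
    field.write[i] = shader(field.read[i]);
  }
  field.swap();
}
```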

As a fluid algorithm, one frame is roughly this.

u = velocity
w = wet pigment
d = dry pigment

u <- add vorticity(u)
u <- advect(u, u)
w <- advect(w, u)
d, w <- dry(w, d)
u <- project(u)

dry is not advected. It represents pigment fixed into paper. The watercolor feeling comes as much from deciding what cannot move as from moving the wet field.

Advection Samples Backward

Advection uses a semi-Lagrangian step. The amount that arrives at the current pixel is assumed to have been upstream one step earlier. So the shader samples backward from vUv.

Figure: backtracing in semi-Lagrangian advection. The current pixel at vUv samples the upstream position coord = vUv - dt * velocity * texelSize.
vec2 vel = texture(uVelocity, vUv).xy;
vec2 coord = vUv - dt * vel * uTexelSize;
outColor = dissipation * texture(uSource, coord);

The same shader moves both velocity and wet.

this.run("advect", this.velocity.write, {
  dissipation: 0.986,
  dt,
  uSource: this.velocity.readTexture,
  uVelocity: this.velocity.readTexture,
});

this.run("advect", this.wet.write, {
  dissipation: 0.998,
  dt,
  uSource: this.wet.readTexture,
  uVelocity: this.velocity.readTexture,
});

Velocity dissipates at 0.986, while wet pigment dissipates at 0.998. Motion fades sooner than pigment. Semi-Lagrangian advection is stable, but it blurs. For background bleeding, that blur is usually useful.
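The backtrace is easy to verify on a tiny CPU grid. This sketch uses nearest-neighbour sampling for clarity; the GPU version samples with bilinear filtering, which is exactly where the useful blur comes from.

```typescript
// CPU sketch of one semi-Lagrangian advection step on a small scalar grid.
// Velocity is in cells per step; sampling is nearest-neighbour with clamping.
function advect(
  source: number[], width: number, height: number,
  velX: number, velY: number, dt: number, dissipation: number,
): number[] {
  const next = new Array<number>(source.length);
  for (let y = 0; y < height; y += 1) {
    for (let x = 0; x < width; x += 1) {
      // Trace backward: what arrives here was upstream one step earlier.
      const sx = Math.min(width - 1, Math.max(0, Math.round(x - dt * velX)));
      const sy = Math.min(height - 1, Math.max(0, Math.round(y - dt * velY)));
      next[y * width + x] = dissipation * source[sy * width + sx];
    }
  }
  return next;
}
```

With a rightward velocity, a blob of pigment shifts downstream by one cell per step, scaled by the dissipation factor.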

Add Back A Little Curl

Semi-Lagrangian advection tends to lose small rotations. The solver first computes curl, then uses vorticity confinement to reintroduce a little swirl.

outColor = vec4(
  0.5 * (right.y - left.y - (top.x - bottom.x)),
  0.0,
  0.0,
  1.0
);

In 2D, curl can be stored as a scalar. The next pass looks at the gradient of abs(curl) and pushes around the curl center.

vec2 force = 0.5 * vec2(abs(top.x) - abs(bottom.x), abs(right.x) - abs(left.x));
force /= length(force) + 0.0001;
force *= strength * center * vec2(1.0, -1.0);
vec2 vel = texture(uVelocity, vUv).xy;
outColor = vec4(vel + force * dt, 0.0, 1.0);

strength is 18. Higher values make the motion showier, but also louder as a background. The goal is not visible water turbulence; it is slight uneven pigment escape.

Boundaries Depend On The Field

After several passes, the solver applies a boundary pass. Near an edge, it samples the neighboring texel and multiplies by a scale.

if (uv.x < uTexelSize.x) data = scale * texture(uTarget, uv + vec2(uTexelSize.x, 0.0));
if (uv.x > 1.0 - uTexelSize.x) data = scale * texture(uTarget, uv - vec2(uTexelSize.x, 0.0));
if (uv.y < uTexelSize.y) data = scale * texture(uTarget, uv + vec2(0.0, uTexelSize.y));
if (uv.y > 1.0 - uTexelSize.y) data = scale * texture(uTarget, uv - vec2(0.0, uTexelSize.y));

For velocity, scale = -1, so velocity reflects at the wall. For wet, scale = 0, so wet pigment that reaches the outside disappears. For pressure, scale = 1, so pressure continues across the edge. One boundary shader is reused, but the sign depends on the meaning of the field.
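That sign convention can be written down directly. `boundaryScale` is a hypothetical helper naming the three cases from the text.

```typescript
// The one boundary shader is reused with a per-field scale; the sign
// encodes what the wall means for that field.
type FieldKind = "velocity" | "wet" | "pressure";

function boundaryScale(field: FieldKind): number {
  switch (field) {
    case "velocity":
      return -1; // reflect: flow bounces off the wall
    case "wet":
      return 0; // absorb: pigment that reaches the edge disappears
    case "pressure":
      return 1; // copy: pressure continues across the edge
  }
}
```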

Pressure Projection Reduces Divergence

Splatting and vorticity can make velocity expand or collapse. The solver first computes divergence.

outColor = vec4(
  0.5 * (right.x - left.x + top.y - bottom.y),
  0.0,
  0.0,
  1.0
);

Divergence marks places where flow is appearing or disappearing. To reduce it, pressure is solved with Jacobi iterations.

Figure: pressure projection. Before projection the flow has large divergence; solving pressure and subtracting its gradient reduces the divergence and softens the burst.
float div = texture(uDivergence, vUv).x;
outColor = vec4(
  (left.x + right.x + bottom.x + top.x - div) * 0.25,
  0.0,
  0.0,
  1.0
);

Then the pressure gradient is subtracted from velocity.

vec2 vel = texture(uVelocity, vUv).xy;
outColor = vec4(
  vel - 0.5 * vec2(right.x - left.x, top.x - bottom.x),
  0.0,
  1.0
);

This does not make the field perfectly incompressible. The pressure solve only runs for 8 iterations. For a watercolor background, strict volume conservation matters less than keeping the stroke edge from looking like it explodes.

Split Wet And Dry

The WebGL2 solver mainly owns velocity, wet, dry, and pressure.

velocity  water motion
wet       pigment that can still move
dry       pigment fixed into the paper
pressure  field used to keep velocity coherent

Incoming color is not written directly into dry. It is splatted into wet. The stroke first turns pointer delta into direction, radius, force, and opacity.

const speed = Math.hypot(stroke.deltaX, stroke.deltaY);
const directionX = speed > 0.00001 ? stroke.deltaX / speed : 1;
const directionY = speed > 0.00001 ? stroke.deltaY / speed : 0;
const radius = 0.00018 + stroke.pressure * 0.00034;
const force = Math.min(46, 8 + speed * 2200);
const opacity = Math.min(0.42, 0.18 + stroke.alpha * 3.8 + stroke.pressure * 0.045);

One stroke updates both velocity and wet pigment. The velocity splat carries motion; the wet splat carries pigment absorption.

this.splatBuffer(
  this.velocity,
  stroke.x,
  stroke.y,
  directionX * force,
  directionY * force,
  0,
  1,
  radius * 1.05,
  0,
  directionX,
  directionY,
);

this.splatBuffer(
  this.wet,
  stroke.x,
  stroke.y,
  (1 - stroke.red) * opacity,
  (1 - stroke.green) * opacity,
  (1 - stroke.blue) * opacity,
  opacity,
  radius,
  speed > 0.0009 ? 1 : 0,
  directionX,
  directionY,
);

The solver stores 1 - rgb, treating pigment as light absorption rather than paint added on top of paper. It also adds six small outward velocity splats around the center. This is not a physically exact capillary model, but it gives the stroke edge somewhere to bleed.

for (let index = 0; index < 6; index += 1) {
  const angle = (index / 6) * Math.PI * 2;
  const offset = Math.sqrt(radius) * 0.42;
  const bloomX = clamp01(stroke.x + Math.cos(angle) * offset);
  const bloomY = clamp01(stroke.y + Math.sin(angle) * offset);
  this.splatBuffer(
    this.velocity,
    bloomX,
    bloomY,
    Math.cos(angle) * 28,
    Math.sin(angle) * 28,
    0,
    1,
    radius * 0.62,
    0,
    Math.cos(angle),
    Math.sin(angle),
  );
}
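The 1 - rgb absorption framing behaves like multiplicative glazing: each layer attenuates the paper light, so layer order does not matter and thin layers compose into one denser layer. `glaze` is a hypothetical one-channel sketch reusing the 2.2 factor that appears in the display shader.

```typescript
// Sketch of the absorption model: pigment stores absorption, and layers
// attenuate the paper light multiplicatively (Beer-Lambert style).
function glaze(paper: number, absorptions: number[]): number {
  // paper * exp(-a1 * 2.2) * exp(-a2 * 2.2) * ...
  return absorptions.reduce((light, a) => light * Math.exp(-a * 2.2), paper);
}
```

Because exponents add, two thin washes equal one wash of their combined absorption, which is what makes transparent layering read correctly.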

Each frame advects velocity and wet. It does not advect dry.

vec2 vel = texture(uVelocity, vUv).xy;
vec2 coord = vUv - dt * vel * uTexelSize;
outColor = dissipation * texture(uSource, coord);

This separation does most of the work. Wet pigment moves. Dry pigment does not. That turns "a transparent image spreading on screen" into something closer to pigment settling into paper.

The velocity field still needs projection. The solver computes divergence, iterates pressure, and subtracts the pressure gradient.

this.run("divergence", this.divergence, {
  uVelocity: this.velocity.readTexture,
});
this.clearTarget(this.pressure.read);

for (let index = 0; index < pressureIterations; index += 1) {
  this.run("pressure", this.pressure.write, {
    uDivergence: this.divergenceTexture,
    uPressure: this.pressure.readTexture,
  });
  this.swap(this.pressure);
  this.applyBoundary(this.pressure, 1);
}

The pressure solve stops at 8 iterations. That is rough as fluid simulation, but good enough for bleeding pigment. A slightly loose field often reads as paper irregularity instead of error.

Drying is split between dryAccum and dryWet.

float density = max(max(w.r, w.g), w.b);
float thicknessFactor = 1.0 / (1.0 + density * thicknessK);
float wet1 = max(wet0 - dryRate * thicknessFactor * dt, 0.0);

Dense pigment dries more slowly. Thin water disappears earlier, while pooled pigment remains as edges and unevenness. It is not exact physics, but it is the right perceptual shortcut.
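That shortcut is small enough to state directly. `dryStep` mirrors the shader math above; the parameter values in the test are illustrative.

```typescript
// Sketch of the thickness-aware dry rate: dense pigment dries more slowly.
function dryStep(
  wet: number, density: number,
  dryRate: number, thicknessK: number, dt: number,
): number {
  const thicknessFactor = 1 / (1 + density * thicknessK);
  return Math.max(wet - dryRate * thicknessFactor * dt, 0);
}
```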

The dry accumulation pass also mixes addition and replacement.

vec3 transfer = w.rgb * frac * 1.015;
float coverage = 1.0 - exp(-density * 1.8);
vec3 headroom = max(vec3(dryCap) - d.rgb, vec3(0.0));
vec3 added = d.rgb + min(transfer, headroom);
vec3 replaced = mix(d.rgb, w.rgb, frac);
outColor = vec4(mix(added, replaced, coverage), min(d.a + dried, 4.0));

Pure addition makes repeated strokes sink toward mud. Pure replacement erases transparent glazing. Thin pigment stays closer to additive layering; dense pigment moves closer to replacement. That keeps both pale washes and heavier pools.

Return To A Transparent Canvas

Inside the solver, wet and dry pigment are first composited against an assumed paper color.

vec3 paper = vec3(0.96, 0.945, 0.92);
vec3 afterDry = paper * exp(-dry.rgb * 2.2);
vec3 wetColor = paper * exp(-wet.rgb * 2.2);
vec3 glazed = afterDry * exp(-wet.rgb * 2.2);

The actual page already has a separate WebGPU paper layer, so the final brush output must stay transparent.

float density = max(wetDensity, dryDensity);
float edge = smoothstep(0.002, 0.018, length(vec2(dFdx(density), dFdy(density))));
float alpha = clamp(density * 1.15 + wetAmt * 0.18 + edge * 0.12, 0.0, 0.62);
outColor = vec4(color, alpha);

The offscreen WebGL canvas is copied back into the visible 2D canvas.

context.clearRect(0, 0, canvas.width, canvas.height);
context.drawImage(this.canvas, 0, 0, canvas.width, canvas.height);

CSS then sinks it into the paper.

.brush-canvas {
  mix-blend-mode: multiply;
}

The paper and brush are not combined into one renderer because their failure scopes are different. Paper can survive without brush. Brush can fall back to Canvas 2D. Mobile can skip both dynamic paths. Separate layers make those choices simple.

The Canvas 2D Fallback Keeps The Same Contract

When WebGL2 is unavailable, createBrushLayer is used. It is not a fluid solver, but it keeps the wet/dry idea.

dryLayer
wetLayer
wetBlooms
wetDryFrames

Strokes are drawn into wetLayer, then slowly moved into dryLayer.

dryContext.globalCompositeOperation = "multiply";
dryContext.globalAlpha = 0.026;
dryContext.drawImage(wetLayer, 0, 0);

wetContext.globalCompositeOperation = "destination-out";
wetContext.fillStyle = "rgba(0, 0, 0, 0.018)";
wetContext.fillRect(0, 0, wetLayer.width, wetLayer.height);

The WebGL2 solver is not the essence of the effect. The essence is that pigment can be wet, can dry, and those two states have a time gap. The fallback preserves that contract.

Do Not Change Color Too Fast

Brush color changes every 36 draws.

const brushColorHoldDraws = 36;

Changing color on every pointermove creates a rainbow marker, not watercolor. One pigment needs time to dilute and spread. The palette also stays low-alpha, around 0.04 to 0.05.

For a reading background, layered density is better than loud hue changes.
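Holding one color for a fixed draw count is only a few lines. `makeColorPicker` is a hypothetical sketch built around the `brushColorHoldDraws` constant; the real brush also dilutes and layers the held color.

```typescript
// Sketch of holding one brush color for a fixed number of draws.
const brushColorHoldDraws = 36;

function makeColorPicker(palette: string[]): () => string {
  let draws = 0;
  return function nextColor(): string {
    // Advance through the palette only once per 36 draws.
    const color = palette[Math.floor(draws / brushColorHoldDraws) % palette.length];
    draws += 1;
    return color;
  };
}
```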

Keep Renderer State Out Of Vue

The Vue state is intentionally small.

const fallbackLayerElement = ref<HTMLElement>();
const fallbackCanvasElement = ref<HTMLCanvasElement>();
const paperCanvasElement = ref<HTMLCanvasElement>();
const brushCanvasElement = ref<HTMLCanvasElement>();
const isDrawingEnabled = ref(false);

WebGPU devices, WebGL contexts, textures, framebuffers, wet/dry buffers, and previous pointer positions are not reactive state. They are renderer state.

From Vue's side, the brush layer is just an imperative object with draw, resize, and clear.

const brushLayer = createFluidBrushLayer({
  canvas: () => brushCanvasElement.value,
  isDrawingEnabled: () => isDrawingEnabled.value,
  maxDpr,
});

When graphics live inside a UI framework, it is tempting to make everything reactive. But state that never updates the template only adds tracking cost and confusion. The renderer should own renderer state.

Cleanup Matters

The background attaches window-level listeners. That makes cleanup part of the feature.

onUnmounted(() => {
  if (resizeFrame !== 0) {
    cancelAnimationFrame(resizeFrame);
  }
  removeViewportListeners?.();
  removeBrushListeners?.();
  paperCleanup?.();
});

The WebGPU cleanup calls unconfigure() too.

return () => {
  disposed = true;
  if (paperResizeFrame !== 0) {
    cancelAnimationFrame(paperResizeFrame);
  }
  removeViewportListeners();
  gpuContext.unconfigure();
};

A background feels minor, but it touches the whole page. Stopping it cleanly is part of implementing it.

Test Properties, Not Beauty

It is hard to automatically test whether watercolor looks beautiful. It is much easier to test the properties that should not regress.

The Playwright checks look for things like these.

  • the fallback canvas is not blank

  • .watercolor-bg-container does not cover content

  • the paper canvas initializes when WebGPU is available

  • mobile uses bg-sp.png and hides the canvases

  • the tooltip does not stay visible after hover

  • brush color does not change too quickly within one stroke

  • the bleeding edge expands or darkens after a delay

  • pointer-event bursts finish within a bounded time

The tests are not pixel-perfect. They protect the behavioral and perceptual contracts. For a background, that is more useful than locking down an exact image.
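The "not blank" check, for example, reduces to a property over pixels rather than an image comparison. `isBlank` is a sketch of that property; in the real tests the pixel data would come from the canvas under Playwright.

```typescript
// Property-style check: a transparent brush canvas is "blank" only if
// every pixel's alpha channel is zero. No reference image is needed.
function isBlank(rgbaPixels: Uint8ClampedArray): boolean {
  for (let i = 3; i < rgbaPixels.length; i += 4) {
    if (rgbaPixels[i] !== 0) return false;
  }
  return true;
}
```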

Summary

The tech behind transparent watercolor on the Web is not just about solving fluid accurately. On this site, these choices matter more.

  • split paper and brush into separate layers

  • draw Canvas 2D fallback paper before WebGPU starts

  • bake paper once with WebGPU and keep it still

  • cap DPR so the background does not pay for unnecessary fragments

  • enable brush input only for fine-pointer desktop contexts

  • lazily initialize the WebGL2 solver and fall back to Canvas 2D

  • split pigment into wet and dry, and never advect dry

  • keep color stable long enough to feel like one pigment

  • keep renderer state out of Vue reactivity

  • use a static background on mobile

  • test properties instead of pixels

Transparent watercolor does not require solving water, paper, and pigment perfectly. It only needs to make the reader feel paper, wetness, and drying over a short time. And when the environment cannot support that, it should do nothing. For a background, quiet failure is part of the implementation.