Shader Development for Technical Artists
Shaders are the programs that run on the GPU to determine how every pixel on screen is drawn. For a technical artist, understanding shaders is a superpower - it lets you create custom visual effects, optimize rendering performance, and bridge the gap between art direction and real-time graphics. In this project you'll write GLSL shaders from scratch, covering the fundamentals and building three portfolio-worthy effects.
Introduction to Shaders for TAs
A shader is a small program that runs massively in parallel on the GPU. The two most common types are:
- Vertex shaders - run once per vertex. They transform positions from object space to screen space and pass data (normals, UVs, colours) to the next stage.
- Fragment shaders (pixel shaders) - run once per fragment (pixel candidate). They compute the final colour that appears on screen.
As a TA, you don't need to write a full rendering engine. What you do need is the ability to read, modify, and author individual shaders - whether that's a custom toon shader for stylised characters, a procedural texture for environments, or a dissolve effect for gameplay transitions.
All shaders in this guide are written in GLSL (OpenGL Shading Language). The concepts translate directly to HLSL (DirectX/Unity) and Unreal's material nodes - we'll cover porting at the end.
GLSL Basics: Vertex & Fragment Shaders
Here's a minimal shader pair that transforms geometry and outputs a solid colour. This is the "hello world" of shader programming.
// basic_vertex.glsl - Minimal vertex shader
#version 330 core
layout(location = 0) in vec3 aPosition;
layout(location = 1) in vec3 aNormal;
layout(location = 2) in vec2 aTexCoord;
uniform mat4 uModel;
uniform mat4 uView;
uniform mat4 uProjection;
out vec3 vWorldPos;
out vec3 vNormal;
out vec2 vTexCoord;
void main() {
vec4 worldPos = uModel * vec4(aPosition, 1.0);
vWorldPos = worldPos.xyz;
vNormal = normalize(mat3(transpose(inverse(uModel))) * aNormal);
vTexCoord = aTexCoord;
gl_Position = uProjection * uView * worldPos;
}
// basic_fragment.glsl - Minimal fragment shader with diffuse lighting
#version 330 core
in vec3 vWorldPos;
in vec3 vNormal;
in vec2 vTexCoord;
uniform vec3 uLightDir;
uniform vec3 uLightColor;
uniform vec3 uBaseColor;
uniform vec3 uCameraPos;
out vec4 FragColor;
void main() {
// Renormalize - interpolated normals denormalise across a triangle
vec3 normal = normalize(vNormal);
// Ambient
vec3 ambient = 0.1 * uBaseColor;
// Diffuse (Lambert) - uLightDir points from the light toward the surface
float diff = max(dot(normal, normalize(-uLightDir)), 0.0);
vec3 diffuse = diff * uLightColor * uBaseColor;
// Specular (Blinn-Phong)
vec3 viewDir = normalize(uCameraPos - vWorldPos);
vec3 halfDir = normalize(viewDir + normalize(-uLightDir));
float spec = pow(max(dot(normal, halfDir), 0.0), 64.0);
vec3 specular = spec * uLightColor * 0.5;
FragColor = vec4(ambient + diffuse + specular, 1.0);
}
Test your shaders in real time using Shadertoy, The Book of Shaders editor, or a local tool like glslViewer. Fast iteration is key to learning shaders.
Writing a Toon / Cel Shader
Toon shading replaces smooth lighting gradients with discrete bands, creating a stylised, hand-drawn look. The trick is quantising the diffuse value into a small number of steps and adding an edge outline.
// toon_fragment.glsl - Cel shader with configurable bands and rim light
#version 330 core
in vec3 vWorldPos;
in vec3 vNormal;
in vec2 vTexCoord;
uniform vec3 uLightDir;
uniform vec3 uBaseColor;
uniform vec3 uCameraPos;
uniform int uBands; // number of shading bands (e.g. 3 or 4)
uniform float uRimPower; // rim light falloff (e.g. 3.0)
uniform vec3 uRimColor; // rim light colour
out vec4 FragColor;
float quantize(float value, int steps) {
// floor gives hard band edges; note the brightest band is only hit at value == 1.0
return floor(value * float(steps)) / float(steps);
}
void main() {
vec3 normal = normalize(vNormal);
vec3 lightDir = normalize(-uLightDir);
vec3 viewDir = normalize(uCameraPos - vWorldPos);
// --- Quantised diffuse ---
float NdotL = max(dot(normal, lightDir), 0.0);
float toonDiffuse = quantize(NdotL, uBands);
// --- Rim lighting ---
float rim = 1.0 - max(dot(normal, viewDir), 0.0);
rim = pow(rim, uRimPower);
vec3 rimContrib = rim * uRimColor;
// --- Edge detection (simple view-angle approach) ---
float edge = (dot(normal, viewDir) < 0.15) ? 0.0 : 1.0;
// --- Combine ---
vec3 color = uBaseColor * (0.15 + 0.85 * toonDiffuse);
color += rimContrib;
color *= edge; // darken silhouette edges
FragColor = vec4(color, 1.0);
}
For a production toon shader, you'd typically use a post-process pass with a Sobel or Roberts Cross filter for cleaner outlines. The edge detection above is a quick approximation good for prototyping.
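A minimal sketch of that post-process approach, assuming the scene colour and depth have already been rendered to textures. The uniform names (uSceneTex, uDepthTex, uTexelSize, uEdgeThreshold) are illustrative, not part of the shaders above:

```glsl
// outline_post.glsl - Sobel edge detection over a depth texture (sketch)
#version 330 core
in vec2 vTexCoord;
uniform sampler2D uSceneTex;  // rendered scene colour
uniform sampler2D uDepthTex;  // scene depth (view-space normals also work)
uniform vec2 uTexelSize;      // 1.0 / resolution
uniform float uEdgeThreshold; // e.g. 0.02
out vec4 FragColor;

float depthAt(vec2 uv) {
    return texture(uDepthTex, uv).r;
}

void main() {
    // Sample the 3x3 neighbourhood around this pixel
    float tl = depthAt(vTexCoord + uTexelSize * vec2(-1.0,  1.0));
    float t  = depthAt(vTexCoord + uTexelSize * vec2( 0.0,  1.0));
    float tr = depthAt(vTexCoord + uTexelSize * vec2( 1.0,  1.0));
    float l  = depthAt(vTexCoord + uTexelSize * vec2(-1.0,  0.0));
    float r  = depthAt(vTexCoord + uTexelSize * vec2( 1.0,  0.0));
    float bl = depthAt(vTexCoord + uTexelSize * vec2(-1.0, -1.0));
    float b  = depthAt(vTexCoord + uTexelSize * vec2( 0.0, -1.0));
    float br = depthAt(vTexCoord + uTexelSize * vec2( 1.0, -1.0));
    // Sobel kernels: horizontal and vertical depth gradients
    float gx = (tr + 2.0 * r + br) - (tl + 2.0 * l + bl);
    float gy = (tl + 2.0 * t + tr) - (bl + 2.0 * b + br);
    float edge = step(uEdgeThreshold, length(vec2(gx, gy)));
    vec3 scene = texture(uSceneTex, vTexCoord).rgb;
    // Paint detected edges black over the scene
    FragColor = vec4(mix(scene, vec3(0.0), edge), 1.0);
}
```

Because this runs over the final image, outlines appear at depth discontinuities - silhouettes and interpenetrations - rather than only where surfaces curve away from the camera.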
Procedural Textures with Noise Functions
Procedural textures are generated entirely in the shader - no image files needed. The foundation is a noise function. Below is a complete fragment shader that layers multiple octaves of 3D gradient (Perlin-style) noise to create organic patterns.
// procedural_texture.glsl - fBm noise for organic procedural textures
#version 330 core
in vec3 vWorldPos;
in vec3 vNormal;
in vec2 vTexCoord;
uniform float uTime;
uniform float uScale; // base noise scale (e.g. 4.0)
uniform int uOctaves; // number of noise layers (e.g. 6)
uniform vec3 uColorA; // low value colour
uniform vec3 uColorB; // high value colour
out vec4 FragColor;
// Hash function for pseudo-random gradient generation
vec3 hash3(vec3 p) {
p = vec3(
dot(p, vec3(127.1, 311.7, 74.7)),
dot(p, vec3(269.5, 183.3, 246.1)),
dot(p, vec3(113.5, 271.9, 124.6))
);
return -1.0 + 2.0 * fract(sin(p) * 43758.5453123);
}
// 3D gradient noise
float gradientNoise(vec3 p) {
vec3 i = floor(p);
vec3 f = fract(p);
vec3 u = f * f * (3.0 - 2.0 * f); // smoothstep
return mix(
mix(mix(dot(hash3(i + vec3(0,0,0)), f - vec3(0,0,0)),
dot(hash3(i + vec3(1,0,0)), f - vec3(1,0,0)), u.x),
mix(dot(hash3(i + vec3(0,1,0)), f - vec3(0,1,0)),
dot(hash3(i + vec3(1,1,0)), f - vec3(1,1,0)), u.x), u.y),
mix(mix(dot(hash3(i + vec3(0,0,1)), f - vec3(0,0,1)),
dot(hash3(i + vec3(1,0,1)), f - vec3(1,0,1)), u.x),
mix(dot(hash3(i + vec3(0,1,1)), f - vec3(0,1,1)),
dot(hash3(i + vec3(1,1,1)), f - vec3(1,1,1)), u.x), u.y),
u.z
);
}
// Fractal Brownian Motion - layers of noise at increasing frequency
float fbm(vec3 p, int octaves) {
float value = 0.0;
float amplitude = 0.5;
float frequency = 1.0;
for (int i = 0; i < octaves; i++) {
value += amplitude * gradientNoise(p * frequency);
frequency *= 2.0;
amplitude *= 0.5;
}
return value;
}
void main() {
vec3 pos = vWorldPos * uScale;
// Animate slowly for a living texture
pos.z += uTime * 0.1;
float noise = fbm(pos, uOctaves);
float t = noise * 0.5 + 0.5; // remap from [-1,1] to [0,1]
vec3 color = mix(uColorA, uColorB, t);
// Simple diffuse so the texture reads on geometry
vec3 lightDir = normalize(vec3(0.5, 1.0, 0.3));
float diff = max(dot(normalize(vNormal), lightDir), 0.0);
color *= (0.3 + 0.7 * diff);
FragColor = vec4(color, 1.0);
}
Experiment with the noise output: warp domain coordinates with another noise call (fbm(p + fbm(p))) for marble or lava effects. Small changes to the frequency multiplier and amplitude falloff create wildly different textures.
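One way to implement that domain warping, reusing the fbm function above (the channel offsets and the 0.35 warp strength are arbitrary starting values - tweak them freely):

```glsl
// Domain warping: feed fbm's output back into its own input coordinates.
// Drop this into main() in place of the plain fbm call above.
vec3 pos = vWorldPos * uScale;
vec3 warp = vec3(
    fbm(pos, uOctaves),
    fbm(pos + vec3(5.2, 1.3, 2.8), uOctaves), // offsets decorrelate the channels
    fbm(pos + vec3(9.7, 4.1, 6.3), uOctaves)
);
float noise = fbm(pos + 0.35 * warp, uOctaves); // 0.35 = warp strength
```

Increasing the warp strength pushes the pattern from gentle marbling toward turbulent, fluid-like swirls.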
Building a Dissolve / Reveal Effect Shader
A dissolve effect is a classic real-time technique where geometry appears to burn away (or materialise) over time. It works by comparing noise values to a threshold and discarding fragments below it.
// dissolve_fragment.glsl - Dissolve effect with glowing edge
#version 330 core
in vec3 vWorldPos;
in vec3 vNormal;
in vec2 vTexCoord;
uniform sampler2D uAlbedoMap; // base colour texture
uniform sampler2D uNoiseMap; // greyscale noise texture (or use procedural)
uniform float uDissolveAmount; // 0.0 = fully visible, 1.0 = fully dissolved
uniform float uEdgeWidth; // width of the glowing edge (e.g. 0.05)
uniform vec3 uEdgeColor; // glow colour (e.g. orange-red)
uniform float uEdgeIntensity; // HDR multiplier for the edge glow
uniform vec3 uLightDir;
uniform vec3 uLightColor;
out vec4 FragColor;
void main() {
// Sample the noise pattern
float noise = texture(uNoiseMap, vTexCoord).r;
// Discard fragments below the threshold - this creates the dissolve
if (noise < uDissolveAmount) {
discard;
}
// --- Base lighting ---
vec3 albedo = texture(uAlbedoMap, vTexCoord).rgb;
vec3 normal = normalize(vNormal);
float diff = max(dot(normal, normalize(-uLightDir)), 0.0);
vec3 litColor = albedo * (0.15 + diff * uLightColor);
// --- Glowing edge ---
float edgeFactor = 1.0 - smoothstep(0.0, uEdgeWidth, noise - uDissolveAmount);
vec3 edgeGlow = uEdgeColor * edgeFactor * uEdgeIntensity;
FragColor = vec4(litColor + edgeGlow, 1.0);
}
To use this shader, animate uDissolveAmount from 0.0 to 1.0 over time. The noise texture controls the dissolve pattern - a Perlin noise texture gives organic erosion, while a grid pattern gives a digital disintegration look.
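For quick previews in a standalone viewer such as glslViewer, you can derive the threshold from uTime inside the shader instead of animating a CPU-side uniform (a sketch; uCycleLength is an illustrative name):

```glsl
// Preview trick: loop the dissolve every uCycleLength seconds.
uniform float uCycleLength; // e.g. 3.0 seconds per cycle
// Inside main(), replace reads of uDissolveAmount with:
float dissolveAmount = fract(uTime / uCycleLength);
```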
The discard keyword breaks early-Z optimisation on most GPUs. For production dissolve effects, consider writing to the alpha channel instead and using alpha-to-coverage or a separate transparency pass.
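A sketch of that alpha-based alternative, replacing the discard block in the fragment shader above. It assumes the host application has enabled alpha-to-coverage (GL_SAMPLE_ALPHA_TO_COVERAGE) on an MSAA target, and the 0.01 band half-width is an arbitrary starting value:

```glsl
// Instead of discarding, write coverage to alpha and let
// alpha-to-coverage resolve visibility per sample.
float noise = texture(uNoiseMap, vTexCoord).r;
// Smooth 0..1 coverage across a narrow band around the threshold
float coverage = smoothstep(uDissolveAmount - 0.01, uDissolveAmount + 0.01, noise);
// ... lighting and edge glow as before ...
FragColor = vec4(litColor + edgeGlow, coverage);
```

Because no fragment is discarded, the GPU can keep early depth testing enabled, and the soft coverage band also antialiases the dissolve boundary.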
Shader Debugging Tips
Shaders don't have print() statements, so debugging requires creative techniques:
- Visualise intermediates as colour - output vec4(vNormal * 0.5 + 0.5, 1.0) to see normals, or vec4(vec3(noise), 1.0) to see a noise value as greyscale.
- Use solid colours for branches - temporarily return red/green/blue to confirm which code path is executing.
- RenderDoc - an essential free tool. Capture a frame, inspect every draw call, and step through shaders with full variable inspection.
- GPU validation layers - enable OpenGL debug output or Vulkan validation layers to catch errors the driver silently ignores.
- Simplify, then rebuild - strip the shader down to the minimum that reproduces the bug. Add complexity back one line at a time.
// debug_helpers.glsl - Utility functions for visual debugging
#version 330 core
// Visualise a float value as a heatmap (blue -> green -> red)
vec3 heatmap(float t) {
t = clamp(t, 0.0, 1.0);
vec3 color;
if (t < 0.5) {
color = mix(vec3(0.0, 0.0, 1.0), vec3(0.0, 1.0, 0.0), t * 2.0);
} else {
color = mix(vec3(0.0, 1.0, 0.0), vec3(1.0, 0.0, 0.0), (t - 0.5) * 2.0);
}
return color;
}
// Visualise a direction vector (maps [-1,1] to [0,1] per channel)
vec3 debugDirection(vec3 dir) {
return dir * 0.5 + 0.5;
}
// Checkerboard pattern - useful for verifying UV coordinates
float checkerboard(vec2 uv, float scale) {
vec2 grid = floor(uv * scale);
return mod(grid.x + grid.y, 2.0);
}
// Usage examples (uncomment one to debug):
// FragColor = vec4(heatmap(diff), 1.0); // visualise diffuse
// FragColor = vec4(debugDirection(vNormal), 1.0); // visualise normals
// FragColor = vec4(vec3(checkerboard(vTexCoord, 10.0)), 1.0); // verify UVs
Keep a personal library of debug helper functions. Copy-pasting the heatmap and checkerboard functions into any new shader saves time and gives you instant visual feedback.
Porting Shaders Between Engines
GLSL is the most portable starting point, but real-world production means working in Unity (HLSL / ShaderLab) or Unreal (Material Editor / HLSL). Here are the key differences you'll encounter:
GLSL -> HLSL Quick Reference
- vec2/vec3/vec4 -> float2/float3/float4
- mat4 -> float4x4
- mix() -> lerp()
- fract() -> frac()
- texture(sampler, uv) -> tex2D(sampler, uv) (or sampler.Sample() in SM5)
- in/out varyings -> semantic-annotated structs (POSITION, TEXCOORD0, etc.)
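As a concrete example, here's the toon shader's quantize helper from earlier with a hand-written HLSL port alongside in comments (the mapping is mechanical - types and casts change, the logic doesn't):

```glsl
// GLSL original (from the toon shader above):
float quantize(float value, int steps) {
    return floor(value * float(steps)) / float(steps);
}
// HLSL port - same logic, HLSL-style casts:
// float quantize(float value, int steps) {
//     return floor(value * (float)steps) / (float)steps;
// }
// Vector example: GLSL  mix(a, b, t)  with vec3 a, b
// becomes HLSL    lerp(a, b, t)  with float3 a, b.
```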
Unity Considerations
- Unity uses ShaderLab as a wrapper around HLSL. Your fragment shader logic goes inside a CGPROGRAM or HLSLPROGRAM block.
- For URP/HDRP, write custom shader graph nodes using the Custom Function node and paste your HLSL logic directly.
- Matrix multiplication uses a function: mul(matrix, vector) in HLSL corresponds to matrix * vector in GLSL. The real gotcha is storage order - HLSL defaults to row-major packing while GLSL is column-major, so matrices uploaded from the CPU may need transposing.
Unreal Considerations
- Unreal's Material Editor is node-based, but you can inject HLSL via the Custom Expression node.
- World position, normals, and UVs are available as built-in nodes - no vertex shader boilerplate needed.
- Unreal uses a deferred rendering pipeline by default, which limits some transparency and forward-shading techniques.
When porting the dissolve shader to Unity, your uDissolveAmount uniform becomes a _DissolveAmount property exposed in the Inspector. The noise texture becomes a _NoiseTex sampler. Everything else maps almost 1:1.
For your portfolio, include screenshots or short videos of each shader running in at least two environments (e.g. a WebGL viewer and Unity). This demonstrates your ability to work across platforms - a critical skill for any TA.